Do Nvidia drivers cause planned obsolescence?

The graphics card market is highly competitive, and users can become extremely vocal in their support for one side or the other. Look at the comments on my recent AMD RX Vega review, for example, and sure enough you'll find accusations of fanboyism on both sides of the fence. Something else you'll find in the comments of almost every GPU review: claims that Nvidia drivers intentionally cripple older hardware to force users to upgrade. This is the so-called "planned obsolescence" approach to hardware, and I've always felt it was more than a bit bogus. After all, in a highly competitive market, deliberately making your existing products look worse than the competition would be self-defeating, not to mention such sabotage would be extremely easy to prove or disprove with benchmarks.
The only things you need are the right hardware and a patient soul to run through all the benchmarks. Thankfully, I don't even have to do this, as LinusTechTips posted a video today where they've done the dirty work for me! It covers hundreds of benchmarks across five years of hardware, from the GTX 680 through the current-generation GTX 1080 Ti. For the drivers, each card was tested starting with the release that came one step after its launch driver, then moving forward through subsequent releases to the present. The test games are The Witcher 3 (with and without HairWorks), Metro: Last Light, and BioShock Infinite.
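LinusTechTips didn't publish its test harness, but the shape of the experiment is easy to picture: a card × driver × game matrix. Here's a minimal Python sketch of that loop; install_driver, run_benchmark, and the driver version strings are hypothetical placeholders of my own, not their actual tooling.

```python
# Hypothetical sketch of the card x driver x game test matrix.
# install_driver() and run_benchmark() stand in for whatever scripted
# installer and canned benchmark passes were actually used.

GAMES = ["The Witcher 3", "The Witcher 3 (HairWorks)",
         "Metro: Last Light", "BioShock Infinite"]

# For each card, every driver from the first post-launch release onward.
# Version strings here are placeholders, not a verified release history.
DRIVER_HISTORY = {
    "GTX 680":     ["301.42", "306.23"],
    "GTX 1080 Ti": ["381.65", "382.05"],
}

def run_matrix(install_driver, run_benchmark):
    """Return {(card, driver, game): average_fps} for the whole matrix."""
    results = {}
    for card, drivers in DRIVER_HISTORY.items():
        for driver in drivers:
            install_driver(card, driver)  # reboot/cleanup steps elided
            for game in GAMES:
                results[(card, driver, game)] = run_benchmark(card, game)
    return results
```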
The results of the testing: in general, all the tested cards show a flat or upward trend in performance over time. Or in other words, claims that Nvidia is intentionally sabotaging the performance of its older hardware are bogus. Unless, of course, you subscribe to the conspiracy theory that LinusTechTips deliberately picked games where it knew performance didn't drop; three games is admittedly a very limited sample size.
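To make "flat or upward trend" concrete, one simple check is the least-squares slope of average fps across successive driver releases: a slope at or above zero means the driver series never degraded the card. This is my own illustration, not the video's methodology, and the numbers below are made-up placeholders.

```python
def fps_trend(fps_by_release):
    """Least-squares slope of fps vs. driver release index.
    A result >= 0 means performance held steady or improved over time."""
    n = len(fps_by_release)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(fps_by_release) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, fps_by_release))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Illustrative numbers only -- not taken from the video.
print(fps_trend([41.0, 41.5, 40.8, 42.3, 43.1]))  # positive: no sign of sabotage
```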
Also note that a driver that intentionally cripples performance on old hardware isn't the same thing as the occasional bad/buggy driver release, like 375.86. Other problems come and go, like SLI performance/support and 3D Vision issues, which can be very game specific. And finally, Nvidia may simply not fully optimize new drivers for hardware that's several years old, but that's neglect rather than sabotage.
Bottom line: poor performance in newer games on older hardware appears to come from those games using features that perform substantially better on newer GPUs. Anisotropic filtering used to cause a big performance hit, but on current hardware the difference between no AF and 16x AF is maybe 1-2 percent. SSAO likewise used to exact a much larger toll, and tessellation performance has improved over time thanks to faster geometry processing in hardware. Eventually, we'll have $300 graphics cards that deliver better performance than today's $700 GTX 1080 Ti.
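For reference, the percentage hit quoted above is just the relative fps drop when a feature is enabled. A quick sketch of that arithmetic, with placeholder fps values rather than measured data:

```python
def feature_cost(fps_off, fps_on):
    """Percent performance hit from enabling a feature."""
    return (fps_off - fps_on) / fps_off * 100

# Placeholder numbers, not measurements: on a modern card,
# 16x AF costing roughly 1-2 percent looks like this.
print(feature_cost(100.0, 98.5))  # ~1.5 percent hit
```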

Source: https://goo.gl/bWqP9p