GPU Planned Obsolescence
#1
Do you guys think that graphics card manufacturers purposefully make it so that your graphics card eventually slows down or weakens so much that it's barely able to run games or anything of the sort anymore? I feel like this is the case with NVIDIA cards (at least the older ones), and it seems I'm not alone in thinking that, since the new drivers they push out appear to have a detrimental effect on performance. Anyway, just wanted to hear all of your thoughts.



#2
I always blamed my issues with old GPUs on the fact that new games have much higher performance requirements these days, and I can't expect NVIDIA to optimize for 10-year-old products. The hardware itself is probably suffering from wear and tear and a lot of dust. I'd be interested if someone actually tested your theory... I wonder how they'd do that?
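To test it you'd want to keep everything constant except the driver: same card, same game, same settings, run the benchmark a bunch of times on the old driver and again on the new one, then compare the averages against the run-to-run noise. Rough sketch of the comparison part (the driver versions and fps numbers here are made up, just to show the idea):

```python
# Compare benchmark runs across driver versions on the same card/settings.
# The numbers below are hypothetical placeholders, not real measurements.
from statistics import mean, stdev

# average fps per benchmark run, keyed by driver version
runs = {
    "old driver (e.g. 390.xx)": [62.1, 61.8, 62.4, 61.9, 62.0],
    "new driver (e.g. 470.xx)": [60.9, 61.2, 60.7, 61.0, 61.1],
}

for driver, fps in runs.items():
    print(f"{driver}: mean {mean(fps):.1f} fps, stdev {stdev(fps):.2f}")

# If the gap between drivers is clearly bigger than the run-to-run noise
# (the stdev), that's evidence worth digging into; a fps difference inside
# the noise band isn't.
```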



#3
I highly doubt this. Do you clean and repaste your GPU? I feel like a lot of people underestimate how much repasting and cleaning an old GPU (and CPU, for that matter) actually does. A lot of chips will underclock themselves at high temperatures, and that's exactly what happens under load, when temps go up. What you're attributing to obsolescence by driver might just be poor cleaning and heat management.
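If you want to check whether it's really thermal throttling before blaming drivers, just log temps and clocks while something heavy is running. Rough Python sketch, assuming an NVIDIA card with nvidia-smi on the PATH (exact query fields can vary a bit between driver versions):

```python
# Poll nvidia-smi while a game/benchmark runs and watch whether the core
# clock drops as temperature climbs (a sign of thermal throttling).
import subprocess
import time

FIELDS = "temperature.gpu,clocks.sm,utilization.gpu"

def sample():
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    # first line = first GPU; values come back as "temp, clock, util"
    temp, clock, util = (v.strip() for v in out.splitlines()[0].split(","))
    return int(temp), int(clock), int(util)

if __name__ == "__main__":
    for _ in range(60):  # ~5 minutes at one sample every 5 seconds
        temp, clock, util = sample()
        print(f"temp {temp} C | SM clock {clock} MHz | load {util} %")
        time.sleep(5)
```

If the clocks drop every time the temperature spikes, it's dust and dried-out paste, not the driver.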

It could also be caused by a virus, a bad installation, a driver conflict, etc.



#4
I honestly am not even sure about this



#5
Nah, I'm still using a GTX 680 for modern games and that was released 7 years ago. People just always want to upgrade, even if their current rig is fine.



#6
That's so sad for us consumers...
They place some critical parts right next to heat sources and pretty much use the cheapest components they can to keep costs down :/



#7
I'm using my 1080 until the 6080 comes out. You just need to buy the most future-proof card and maintain it by cleaning it and replacing the thermal paste.