Is driver degradation a thing or just made up?

Caporegime
Joined
18 Oct 2002
Posts
32,618
Every card has a best-before date regardless of manufacturer; eventually the GPU grunt isn't going to cut it, no matter how many software improvements you chuck at it. And as you've seen from the graphs, there have been improvements from new drivers with Nvidia, so it's not that they have full performance from day one; they just typically have better performance than AMD on release.

Indeed, and it is also natural for Nvidia to find the most improvements in the newest architecture. Maxwell was quite a big change from Kepler, so it was blatantly obvious that Nvidia would find new ways of extracting increased performance. It is also normal for a company to concentrate on the newest hardware first and work its way back through progressively older hardware generations.


Another important factor when comparing newer games to older games is that the actual engines change: shader loads alter and compute requirements have increased. This is particularly important with regard to Nvidia's performance. Recent Nvidia cards have had very well balanced geometry, tessellation, texture fill rates, ROPs and raw compute. AMD have had excess compute, with an impressive shader count but with bottlenecks elsewhere in the command processor and geometry engines. Modern games have increased the raw compute load, which will naturally favour the GCN architecture and have an impact on older Nvidia cards with less raw compute performance. That is good for AMD, and clearly there will be increased compute loads in the future, so Nvidia will design architectures with increased capabilities there. The point, though, is that since compute loads on the GPU have changed, the relative performance between generations shifts.

Nvidia designs GPUs for today and the near future, so when new games come out with GPU loads in line with Nvidia's predictions, the new GPUs see a big performance increase compared to old architectures. This isn't gimping drivers at all; it's just designing a GPU for the games that will exist in the GPU's lifetime.
 
Associate
Joined
9 Jul 2016
Posts
150
Location
Here
Ha! :D WTF are they thinking, running it with PhysX on? On AMD it runs on the CPU and kills performance; on Nvidia it doesn't.
That depends on what you mean by "PhysX". There are games where it runs exclusively on the CPU regardless of the GPU used. Project CARS springs to mind. The same applies to The Witcher 3 and many other recent titles, and this helps to explain why raw CPU performance has become such a large determining factor in minimum FPS when gaming nowadays.

PhysX has no additional performance impact on an AMD graphics card compared to an Nvidia card when it's running on the CPU, which it's probably doing a lot more often than you think.
 
Caporegime
Joined
20 Jan 2005
Posts
45,712
Location
Co Durham
Every card has a best-before date regardless of manufacturer; eventually the GPU grunt isn't going to cut it, no matter how many software improvements you chuck at it. And as you've seen from the graphs, there have been improvements from new drivers with Nvidia, so it's not that they have full performance from day one; they just typically have better performance than AMD on release.

Try saying that to my three-year-old 290X, which has just had a 20% boost in performance...
 
Caporegime
Joined
4 Jun 2009
Posts
31,149
Yup, when I bought my 290, the only other choice within my price range that was the equivalent from Nvidia was the 780, which was neck and neck for performance: win some, lose some.

Back then you could say the 780 was the better buy "overall", with better power efficiency, less heat and less noise, but I went with the 290 mainly for the extra GB of VRAM, as well as it being cheaper, especially by the time I sold the four games I got with it.

Whereas now a 290 is matching a 780 Ti, and even beating it a few times in the latest titles over the last year, not to mention how things are looking for AMD with DX12. The 290 is looking good for at least another year, whereas I imagine 780 users will be wanting/needing to upgrade pretty soon.

So the way I see it is that I have got a free performance boost, which I wasn't expecting at the time of purchase.
 
Soldato
Joined
6 Feb 2010
Posts
14,595
Yup, when I bought my 290, the only other choice within my price range that was the equivalent from Nvidia was the 780, which was neck and neck for performance: win some, lose some.

Back then you could say the 780 was the better buy "overall", with better power efficiency, less heat and less noise, but I went with the 290 mainly for the extra GB of VRAM, as well as it being cheaper, especially by the time I sold the four games I got with it.

Whereas now a 290 is matching a 780 Ti, and even beating it a few times in the latest titles over the last year, not to mention how things are looking for AMD with DX12. The 290 is looking good for at least another year, whereas I imagine 780 users will be wanting/needing to upgrade pretty soon.

So the way I see it is that I have got a free performance boost, which I wasn't expecting at the time of purchase.
I was in exactly the same position. I was waiting and hoping that the EVGA 780 Classified was going to drop below the £400 mark, but it didn't happen, so I got a 290X instead, which turned out to be a very good decision in the long run. Now I am on an Acer 34" XR341CK 21:9 Predator FreeSync monitor without having to spend an extra £300+ for the G-Sync equivalent (with a slightly higher refresh rate) that won't matter for a single card. Seriously, after using FreeSync, it makes me question what the hell has been wrong with monitors and PC gaming all these years!! :p
 
Soldato
Joined
30 Mar 2010
Posts
13,068
Location
Under The Stairs!
There seem to be a few who believe that Nvidia have gimped performance and, as seen from this thread, that isn't the case. By all means, feel free to start a thread about Nvidia ignoring EOL GPUs, but that wasn't what this thread was about.



Forgetting the AMD crowd for a minute, some Nvidia users in this thread believe there is a brick wall hit with optimisation when Nvidia release a new architecture; have you got all of them on ignore so you can't see their posts?
 
Man of Honour
Joined
13 Oct 2006
Posts
91,371
I have a relatively modest, yet rock solid OC on older NV drivers with my 780Ti. As soon as I try the later drivers, it crashes.

Coincidence?

I think not.

Since, IIRC, 358.59, every other release hasn't played nice with Kepler cards for me, mostly in older DX9 games. Usually enabling K-Boost and/or disabling any overclock, factory or end-user, fixes it. At first I thought it was my 780 degrading, but it's rock solid on older drivers, and I can replicate it on three other Kepler-based cards, albeit strangely not always on the same driver version.
 
Associate
Joined
30 Jan 2016
Posts
75
Try saying that to my three-year-old 290X, which has just had a 20% boost in performance...

Well, not quite three years :p but well done, your card has finally caught up on DX11 performance. I didn't say when that expiry date is; obviously it's going to vary from person to person depending on how frequently they upgrade, and some would consider performance straight out of the gate better than being 20%+ behind now. But in two years I can expect maybe 30%+; depends on how you look at it.
 
Soldato
Joined
7 May 2006
Posts
12,192
Location
London, Ealing
Well, not quite three years :p but well done, your card has finally caught up on DX11 performance. I didn't say when that expiry date is; obviously it's going to vary from person to person depending on how frequently they upgrade, and some would consider performance straight out of the gate better than being 20%+ behind now. But in two years I can expect maybe 30%+; depends on how you look at it.

But people who bought the 290X bought it for its performance at the time of purchase, so they were happy with its DX11 performance then, and what they are getting now is a free bonus.
 
Soldato
Joined
30 Mar 2010
Posts
13,068
Location
Under The Stairs!
+1,

Everyone knows what performance is on the table when you purchase your GPU: the 290 was 10% slower than a Titan then, and now it's faster than the Titan.

Only in the OcUK universe can the card that was faster at the time be spun as a positive over the then-slower card that is now faster. :D
 
Soldato
Joined
14 Jan 2010
Posts
2,966
Location
London
It sort of is, and there have been a few saying that Nvidia are gimping performance, hence me looking into it and finding this. I see no evidence myself of any purposeful gimping going on. I thought a thread showing some decent testing over many games was a fair call, and Kepler would be the one that suffers the most as it is now two generations old.

Some people still argue the earth is flat.

I get frustrated every time I come on this forum now, the same old tools derail any attempts at decent discussion, mods do nothing, even when it's a topic you want to read, it's usually 5 pages of bile and the useful bits get lost.

Don't know how you have the energy to bother with all this tbh.
 
Associate
Joined
27 Dec 2014
Posts
1,686
Location
Southampton
Some people still argue the earth is flat.

I get frustrated every time I come on this forum now, the same old tools derail any attempts at decent discussion, mods do nothing, even when it's a topic you want to read, it's usually 5 pages of bile and the useful bits get lost.

Don't know how you have the energy to bother with all this tbh.

It's not worth it, that's for sure.

It doesn't take long to know exactly what everyone thinks and what they will post.

It gets mind-numbingly boring to see every discussion degenerate into exactly the same poor-quality repeats, as if a lot of people have nothing better to do.

Even arguing accomplishes nothing, so you might as well get some popcorn and hope you find one or two interesting discussions every now and then.
 
Soldato
Joined
19 Feb 2007
Posts
14,387
Location
ArcCorp
Linus did a good video testing a GTX 480 with drivers from when the 480 was released in 2010 through to 2015, and there isn't really any driver degradation, at least not with that card -

 
Caporegime
OP
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
Some people still argue the earth is flat.

I get frustrated every time I come on this forum now, the same old tools derail any attempts at decent discussion, mods do nothing, even when it's a topic you want to read, it's usually 5 pages of bile and the useful bits get lost.

Don't know how you have the energy to bother with all this tbh.

You know it. I really enjoy these forums as a rule, and apart from the odd guy who spoils it, for the most part people see what is what and can hopefully see through the twisting of the topic at hand.

Cheers Dice and a good watch that :)
 
Associate
Joined
27 Oct 2013
Posts
313
There isn't degradation, but there certainly is a lack of new game support and optimisation. If the 780 Ti beat the 290X before, it could do it again, but they didn't keep the drivers up to date, and now the AMD cards are still getting better. Good business plan really, but bad for non-hardcore customers who don't want to upgrade every single time. This is why I prefer AMD, unless I become rich. I predict the 390/390X will exceed the 970/980 in the years to come, maybe even the 980 Ti in three years at this rate, lol.
 
Soldato
Joined
7 Aug 2013
Posts
3,510
It is utterly bizarre that a 780 Ti would *gain* in performance yet a Titan would not. They are essentially the same GPU, and driver optimisations made for one would inherently target both.

I'd like to see more tests, really. This just doesn't make any sense at all.
 
Soldato
Joined
7 Aug 2013
Posts
3,510
Love this bit...



Ha! :D WTF are they thinking, running it with PhysX on? On AMD it runs on the CPU and kills performance; on Nvidia it doesn't.
So should any benches/games that use async compute be tested with that turned off to compare Nvidia vs AMD fairly?

I'm always puzzled by this strange double standard. If Nvidia cards are good at certain things that are implemented in games, it doesn't count. But when AMD cards are good at certain things, it's fair game for some reason.
 