
Has Nvidia Forgotten Kepler? The GTX 780 Ti vs. the 290X Revisited

Has Nvidia neglected Kepler? This evaluation revisits 13 games from our GTX 780 Ti launch review, originally published on November 7, 2013. That review focused on its performance against the then recently released R9 290X, and against the original $1000 Titan. We will run those same 13 games with the GTX 780 Ti launch drivers and compare them with the very latest drivers available today, using the same three video cards. We will also feature these cards in our latest 25-game benchmark suite to compare them with the top cards of today for a much bigger picture.


http://www.babeltechreviews.com/nvidia-forgotten-kepler-gtx-780-ti-vs-290x-revisited/view-all/
 
So on average the GTX 780 Ti went from around 10-20% faster in their 2013 launch gaming suite to 10-20% slower in their up-to-date gaming suite, and they don't think anything of it. That's a huge turnaround.
 
So on average the GTX 780 Ti went from around 10-20% faster in their 2013 launch gaming suite to 10-20% slower in their up-to-date gaming suite, and they don't think anything of it. That's a huge turnaround.

I'm not quite sure where you are getting that from. Looking down the chart, there are a few outliers like Metro, where the 290X has gone from 42 to 80 fps since launch while the 780 Ti has only gone from 65 to 69 fps, but for most of the titles listed the 780 Ti has improved from launch till now and still gets a higher score than the 290X.

The one comment they made was that in the newer games (not the original 13), the settings they run to test newer cards are limited by the 3GB of VRAM, so the FPS tanks... but that's not a driver issue, that's a game-settings issue, which they comment in the article is fixed by lowering the settings slightly to reduce the VRAM pressure.

The point of the article was to see if the 780 Ti has been "forgotten" by Nvidia, but there are improvements there across the board.
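For what it's worth, the relative gains being argued about can be worked out quickly. A throwaway sketch using only the Metro fps figures quoted in this thread (launch vs. now):

```python
def pct_gain(launch_fps: float, now_fps: float) -> float:
    """Percentage improvement from launch-driver fps to current-driver fps."""
    return (now_fps - launch_fps) / launch_fps * 100

# Metro figures quoted above: 290X went 42 -> 80 fps, 780 Ti went 65 -> 69 fps.
print(f"290X:   {pct_gain(42, 80):.0f}%")   # prints "290X:   90%"
print(f"780 Ti: {pct_gain(65, 69):.0f}%")   # prints "780 Ti: 6%"
```

So for that one outlier the 290X gained roughly 90% from drivers alone, against about 6% for the 780 Ti; the other titles in the chart are much closer than this.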
 
Wish they'd list the actual boost clocks, as yet again those results look like they match the "guaranteed" boost clock rather than what you'd get with actual Boost 2.0 behaviour.

The only mention is kind of vague and doesn't really make it clear how they were tested:

GTX 780 Ti, 3GB, reference clocks, supplied by Nvidia
GTX Titan, 6GB, reference clocks, supplied by Nvidia

PowerColor R9 290X PCS+ 4GB; clocked at 1030MHz (Uber clocks)
 
I'm not quite sure where you are getting that from. Looking down the chart, there are a few outliers like Metro, where the 290X has gone from 42 to 80 fps since launch while the 780 Ti has only gone from 65 to 69 fps, but for most of the titles listed the 780 Ti has improved from launch till now and still gets a higher score than the 290X.

The one comment they made was that in the newer games (not the original 13), the settings they run to test newer cards are limited by the 3GB of VRAM, so the FPS tanks... but that's not a driver issue, that's a game-settings issue, which they comment in the article is fixed by lowering the settings slightly to reduce the VRAM pressure.

They never stated at what resolution the VRAM was limiting it. At 1440p the Titan and the 780 Ti are taking a good old pounding. Even at AMD's weakest resolution, 1080p, there are no signs of VRAM being a limit judging by the Titan results. There was no real reason to test old games, as we knew the Ti would still be fast in old games; in new games it fares much worse, which is what we have all been pointing out. No other site is mentioning VRAM being a limiting factor in the majority of games. The original Titan, which has no VRAM limiting its performance, has gone backwards as well.
 
Wish they'd list the actual boost clocks, as yet again those results look like they match the "guaranteed" boost clock rather than what you'd get with actual Boost 2.0 behaviour.

The only mention is kind of vague and doesn't really make it clear how they were tested:

I think the AMD card is wrong, as a PCS+ does not have an Uber mode and the cooler does not need to spin up to 100% to prevent throttling. The PCS+ model was not around at launch either. I think it was all reference cards.
 
Even so, they haven't done any averages, and looking down the list there are some games with the 290X ahead and some with the 780 Ti; it's not as clear-cut as you saying "it was 20% ahead and now it's 20% behind".

It's good that AMD have improved the drivers on the 290X; it's a shame for them that they seem to always release such poor launch-day drivers, as having double the fps in some of those games at launch might have made for a completely different story.
 
I think the AMD card is wrong, as a PCS+ does not have an Uber mode and the cooler does not need to spin up to 100% to prevent throttling. The PCS+ model was not around at launch either. I think it was all reference cards.

Thing is though, ignoring any throttling, most 290X cards you could buy would be running around 1000MHz actual clocks out of the box - some a tiny bit under, but most models were around 1000-1050MHz (and, critically, that matches up with what was used to benchmark).

Most 780 Tis you'd actually buy would be boosting to around 1140MHz actual (a fair bit faster than the reference clocks) - compared to the reference boost clock of 928MHz. What an actual reference card would boost to is another matter, but those results seem to indicate the card is running a little under 1000MHz boost at most (it's hard to pin it down exactly, but with the experience I have I can get a rough idea of the ballpark).

If it wasn't for the fact it would be totally silly, I sometimes half-wonder if nVidia doesn't lean on benchmarkers to make Kepler look worse than it is, lol - in several benchmarks with easily reproducible results I've found their numbers match up to what you'd get with the cards limited to the guaranteed boost clock (which used to be a factor for Boost 1.0 benchmarking but is totally irrelevant to Boost 2.0 on the 700 series).
 
Even so, they haven't done any averages, and looking down the list there are some games with the 290X ahead and some with the 780 Ti; it's not as clear-cut as you saying "it was 20% ahead and now it's 20% behind".

It's good that AMD have improved the drivers on the 290X; it's a shame for them that they seem to always release such poor launch-day drivers, as having double the fps in some of those games at launch might have made for a completely different story.

Yeah, let's turn this into AMD doing a poor job at launch with drivers rather than them improving things over time and Nvidia doing the opposite. Start from Total War down, which is a year's worth of games, and take the Titan results, as there is no VRAM limiting it. I think I am being generous with the 10-20% average. The Titan and 290X were neck and neck at launch, so it's an even better comparison.
 
This is pretty obvious reasoning here:

The 780 Ti is very different from the 900 series.

The 290 series is very close to the 390, which is still on sale.

AMD work on 390 drivers and incidentally help 290 performance. Nvidia do not.

Not that that's an excuse, but that's why.
 
Thing is though, ignoring any throttling, most 290X cards you could buy would be running around 1000MHz actual clocks out of the box - some a tiny bit under, but most models were around 1000-1050MHz.

Most 780 Tis you'd actually buy would be boosting to around 1140MHz actual - compared to the reference boost clock of 928MHz. What an actual reference card would boost to is another matter, but those results seem to indicate the card is running a little under 1000MHz boost at most (it's hard to pin it down exactly, but with the experience I have I can get a rough idea of the ballpark).

If it wasn't for the fact it would be totally silly, I sometimes half-wonder if nVidia doesn't lean on benchmarkers to make Kepler look worse than it is, lol - in several benchmarks with easily reproducible results I've found their numbers match up to what you'd get with the cards limited to the guaranteed boost clock (which used to be a factor for Boost 1.0 benchmarking but is totally irrelevant to Boost 2.0 on the 700 series).

For their comparison to work, they had to use the same cards as they did back in the day, since they are reusing charts from then. They will be using reference cards that were most likely supplied by Nvidia, so presumably hand-picked by Nvidia. They could be retail cards, but I understand why they would try to use the same cards as they did back in 2013.
 
^^ Which produces a skewed picture compared to actual real-world experiences of people running GK110-series cards versus Maxwell and Grenada/Hawaii-based cards. (EDIT: Other than the delta changes, obviously.)
 
The games that show sub-50 fps are irrelevant to me, as I always tune settings for my old 60Hz monitor anyway to get as close to 60 fps as possible with the highest IQ I can.

I do think Nvidia could do more for the older-gen cards, but meh... it's lasted me as long as I wanted it to.
 
^^ Which produces a skewed picture compared to actual real-world experiences of people running GK110-series cards versus Maxwell and Grenada/Hawaii-based cards. (EDIT: Other than the delta changes, obviously.)

Just about every other site is showing the same though. This article is about comparing launch performance to now. Whichever way it's looked at, Kepler is not faring well, for whatever reason. They go on about VRAM but then have the Titan in there, which shows the same performance drop-off with no VRAM limits.
 
This is pretty obvious reasoning here:

The 780 Ti is very different from the 900 series.

The 290 series is very close to the 390, which is still on sale.

AMD work on 390 drivers and incidentally help 290 performance. Nvidia do not.

Not that that's an excuse, but that's why.

That is part of the equation, but the underlying fact is the Kepler cards are faster now than at launch, so the myth is completely busted. What AMD have achieved is completely irrelevant and unrelated; Nvidia are not in charge of AMD's driver team, so relative driver comparisons are completely pointless. Especially since AMD have a record of poor launch drivers, of course they can get additional performance down the line. The huge jump in Metro just highlights the fact that AMD frequently have driver issues at launch, something that's been the case for the last 20 or so years of AMD/ATI.
 
That is part of the equation, but the underlying fact is the Kepler cards are faster now than at launch, so the myth is completely busted. What AMD have achieved is completely irrelevant and unrelated; Nvidia are not in charge of AMD's driver team, so relative driver comparisons are completely pointless. Especially since AMD have a record of poor launch drivers, of course they can get additional performance down the line. The huge jump in Metro just highlights the fact that AMD frequently have driver issues at launch, something that's been the case for the last 20 or so years of AMD/ATI.

The problem has never been about old games though. It's newer games where the performance is not great and does not look right for the power of the cards. AMD have improved more, yeah, but if you look at games from The Witcher 3 down, you start to see a lot of games where the performance is way off compared to the games before it.
 
Well it certainly silences the conspiracies about nVidia purposefully gimping Kepler.

AMD's drivers have come on leaps and bounds since the 200-series launch, so it's no surprise to see their cards improving more. Their DX11 overhead was way worse a few years back.
 
Yeah, let's turn this into AMD doing a poor job at launch with drivers rather than them improving things over time and Nvidia doing the opposite. Start from Total War down, which is a year's worth of games, and take the Titan results, as there is no VRAM limiting it. I think I am being generous with the 10-20% average. The Titan and 290X were neck and neck at launch, so it's an even better comparison.

Eh? Nvidia drivers have still improved from the old ones to the new ones, but you have to ask yourself why a 780 Ti improves by 10-20% over time while a 290X improves, in some cases, by close to 100%.

Surely getting half the fps at launch compared to what we now know it's capable of is a bit of an issue.
 