
Has Nvidia Forgotten Kepler? The GTX 780 Ti vs. the 290X Revisited

The problem has never been about old games though. It's newer games where the performance is not great and does not look right for the power of the cards. AMD have improved more, yes, but if you look at games from The Witcher 3 onwards you start to see a lot of titles where the performance is way off compared to the games before it.


Nvidia isn't responsible for the way developers create games, so I fail to see a big conspiracy here. The article hasn't compared the old drivers to the new drivers with new games, so I don't see how you can arrive at your conclusion.
 
The problem has never been about old games though. It's newer games where the performance is not great and does not look right for the power of the cards. AMD have improved more, yes, but if you look at games from The Witcher 3 onwards you start to see a lot of titles where the performance is way off compared to the games before it.

This..

It's well known AMD is built to last.
 
Eh? Nvidia drivers have still improved from the old ones to the new ones, but you have to ask yourself why a 780 Ti improves by 10-20% over time while a 290X improves by, in some cases, 100%.

Surely getting half the fps at launch compared to what we now know it's capable of is a bit of an issue (the arithmetic is sketched below).

Look at the last chart of where we are today. The games are in order of release date. Up until The Witcher 3 arrived, the performance of the Nvidia cards was usually within 10%, or they were even winning, at 1440p. Games released after The Witcher 3 are on average way slower on the Nvidia cards, apart from the odd few where AMD don't perform well.
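
As a quick sketch of the arithmetic behind those uplift figures (the fps numbers below are invented for illustration, not taken from the article), a 100% improvement simply means the launch-day result was half of what the card manages today:

```python
# Hypothetical fps figures, purely to illustrate how the uplift percentages
# quoted above are worked out (not the article's actual data).
launch_fps = {"GTX 780 Ti": 50.0, "R9 290X": 30.0}
current_fps = {"GTX 780 Ti": 57.5, "R9 290X": 60.0}

for card in launch_fps:
    uplift = (current_fps[card] - launch_fps[card]) / launch_fps[card] * 100
    print(f"{card}: {launch_fps[card]:.0f} -> {current_fps[card]:.0f} fps "
          f"(+{uplift:.0f}% since launch)")
# GTX 780 Ti: 50 -> 58 fps (+15% since launch)
# R9 290X: 30 -> 60 fps (+100% since launch)
```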
 
Look at the last chart of where we are today. The games are in order of release date. Up until The Witcher 3 arrived, the performance of the Nvidia cards was usually within 10%, or they were even winning, at 1440p. Games released after The Witcher 3 are on average way slower on the Nvidia cards.

Did anyone ever buy a 780 Ti with 4K in mind?
I'm really puzzled as to why anyone would expect the latest games to run at 4K on a three-year-old GPU.

Big whoop: a 290X gets 25fps and the 780 Ti languishes at 19fps... neither is playable.

They are running all of those games at settings where neither card provides a compelling experience; with averages in the 40s and 50s, the minimums are going to be sub-30fps.
Of course three-year-old cards aren't going to be getting 100fps, so who cares?
 
Did anyone ever buy a 780 Ti with 4K in mind?
I'm really puzzled as to why anyone would expect the latest games to run at 4K on a three-year-old GPU.

Big whoop: a 290X gets 25fps and the 780 Ti languishes at 19fps... neither is playable.

They are running all of those games at settings where neither card provides a compelling experience; with averages in the 40s and 50s, the minimums are going to be sub-30fps.
Of course three-year-old cards aren't going to be getting 100fps, so who cares?

Who mentioned 4K? The article even states these cards are not good for 4K. I was taking the 1440p results as a middle ground. Even then, they are only benchmarking, so gameplay doesn't come into it, as we all like to use different settings and game at an fps that might not be to others' liking. The results are just that: results for us to analyse.
 
TBH, regardless of the perceived cause and how people want to explain it away, it paints an attractive case for anyone looking to hold on to a card for 2+ years: competitively priced and equivalently performing cards, with one vendor's product (solely according to one historical example, of course) improving more than the other's over the useful lifetime of the product.
 
Whilst the continued improvements the 200 series of cards from AMD are seeing are very welcome, as others have said, this exceptional longevity seems to stem from the fact that the underlying GCN architecture between the 200 and 300 series cards is the same. I've got no issue with either manufacturer prioritising the optimisation of their drivers with newer cards in mind, as long as older cards are kept functioning perfectly well for new game releases for several years after their release.
 
What I would be asking if I owned a 290X is why it takes AMD so long to get drivers up to speed. People seem happy that it has taken a couple of years to get this performance.

How about that for a different spin? It makes a change from the Nvidia-gimping-the-780 theories. Maybe Nvidia just got the most they could out of the card from the start?
 
Perhaps because you knew the performance figures when you bought it, and the probability that it was way, way cheaper than equivalent Nvidia products?

Likelihood is users know AMD will probably keep supporting the tech; look at the 7900 series vs the 680, especially as the 'pro' users said year after year that VRAM doesn't matter. That's an even bigger embarrassment for the locals in here, they know who they are. :o

However, I'm not convinced Fiji's 4GB is going to hold up as well this time in the long run, which may be a blessing in disguise for AMD, as I do think they need to knock it on the head a touch; only one of them has cards flying off the shelves each time a new gen arrives.
 
Perhaps because you knew the performance figures when you bought it, and the probability that it was way, way cheaper than equivalent Nvidia products?

Likelihood is users know AMD will probably keep supporting the tech; look at the 7900 series vs the 680, especially as the 'pro' users said year after year that VRAM doesn't matter. That's an even bigger embarrassment for the locals in here, they know who they are. :o

:D:D:D:D:D

It's not like performance was bad back then either. The 290X was a match for Nvidia's £800 Titan.
 
NVIDIA have historically pushed their drivers to the limit and maximised performance from the off. AMD have previously done the same with 12.11 and the latest Crimson drivers. The gains people expect to see on Kepler/Fermi only really come about through game performance patches.
 
Perhaps because you knew the performance figures when you bought it, and the probability that it was way, way cheaper than equivalent Nvidia products?

Likelihood is users know AMD will probably keep supporting the tech; look at the 7900 series vs the 680, especially as the 'pro' users said year after year that VRAM doesn't matter. That's an even bigger embarrassment for the locals in here, they know who they are. :o

However, I'm not convinced Fiji's 4GB is going to hold up as well this time in the long run, which may be a blessing in disguise for AMD, as I do think they need to knock it on the head a touch; only one of them has cards flying off the shelves each time a new gen arrives.

Yeah, I would love to play a new game on my old 2GB 670 using 1.9GB of memory at high settings at 30fps... wait!!! No I wouldn't. :p
 
Yeah, I would love to play a new game on my old 2GB 670 using 1.9GB of memory at high settings at 30fps... wait!!! No I wouldn't. :p

Yet that guy is still playing at higher settings / >30fps / >1.9GB VRAM on his cheaper 7900 series card, because he used a bit of savvy and listened. :p
 
Most 7970s were cheaper than most 670s for the majority of their pp.

In the middle of the 7900 series' lengthy career, one of my 7950s (after punting the crazy game package) cost £165 new, half the price of a lot of 670s; it would probably rub shoulders with the 780 in plenty of titles now. :eek:

As stated earlier, no wonder AMD don't sell as many cards in comparison to the lean green milking machine.
 
Most 7970s were cheaper than most 670s for the majority of their pp.

In the middle of the 7900 series' lengthy career, one of my 7950s (after punting the crazy game package) cost £165 new, half the price of a lot of 670s; it would probably rub shoulders with the 780 in plenty of titles now. :eek:

As stated earlier, no wonder AMD don't sell as many cards in comparison to the lean green milking machine.

That's fine, and AMD were the longer-lived cards to buy back then, but not because 2GB was not enough for the 670; it was because the 670 didn't have the grunt, as was said at the time.

I am all for longevity, but some cards are pointless to certain people with the games they play, as the 79xx was back then for more money, and as my 670 was when I wanted more IQ at 60fps.

The reason I skipped the 980 Ti was that the 780 was fine for everything I play and I could not justify the hefty price.
I was tempted by the Fury cards, but 4GB for a card that (at launch) was only marginally better than what I had at 1200 res was also not justifiable for me, on top of the hefty price tag.
 