
AMD: Our Next Generation (Arctic Islands) GPUs will offer Double the Performance Per Watt of the Current Generation

Yeah, I thought the same, as I suspect they are using a reference 290X, and anyone can agree that cooler is shocking and held the 290X back. On AnandTech's bench under the uber profile it gets better results than that site's. So that makes me believe it was a reference 290X. Like I said, it would have been nice to see the test setup.

The conclusion appears to be mostly about the 780 Ti, yet the title says 780 Ti vs 290X. Essentially the test is there to debunk the claims that the 290X has improved and Kepler has lost performance. It says the 290X is stock, but there is no mention of what the 780 Ti is running at.

What will be more interesting is when someone tests the same cards in newer titles such as Battlefront and Fable Legends. I'm sure the 290X even beats the 980 in some of the newest games, especially under DX12, where AMD's driver overhead is on a par with Nvidia's.
 
The conclusion appears to be mostly about the 780 Ti, yet the title says 780 Ti vs 290X. Essentially the test is there to debunk the claims that the 290X has improved and Kepler has lost performance. It says the 290X is stock, but there is no mention of what the 780 Ti is running at.

What will be more interesting is when someone tests the same cards in newer titles such as Battlefront and Fable Legends. I'm sure the 290X even beats the 980 in some of the newest games, especially under DX12, where AMD's driver overhead is on a par with Nvidia's.

Well, some others went through that review with a fine-toothed comb.
http://forums.anandtech.com/showthr...ight=hardwarecanucks+gtx+780+290x+the+rematch
 
Well, here's Battlefront with old and new cards tested on Guru3D.

http://www.guru3d.com/articles_page...ta_vga_graphics_performance_benchmarks,5.html

As we can see, the GTX 780 Ti is not faring well. Even at 1080p, where Nvidia are strong, it's getting soundly beaten. There are probably other games released this year that had to be tested on the older-generation cards, like The Witcher 3 below. Again, the 780 Ti is slower than a 290.

http://www.guru3d.com/articles_pages/the_witcher_3_graphics_performance_review,5.html

Here is GTA 5, which again shows Kepler getting beaten soundly.

http://www.guru3d.com/articles_pages/gta_v_pc_graphics_performance_review,5.html

BF Hardline again shows Kepler getting well beaten. I think that will do for now, and yes, other sites could show different results, but it was so easy to find GPU performance on a lot of older cards using Guru3D.

http://www.guru3d.com/articles_pages/battlefield_hardline_vga_graphics_performance_review,5.html

One more for the road, lol: Dying Light, with the 290X being faster again.

http://www.guru3d.com/articles_pages/dying_light_vga_graphics_performance_review,8.html

All games released or being released this year.
 
I've had four different 980 Tis, and today I'll get the fifth (I'm an idiot); all of them held 1450+.
I think that is quite consistent. 1090MHz is what the reference spec guarantees; all the rest is extra.

Bringing this back on topic, power efficiency should matter even for those who just want better performance; that's what gave Maxwell the edge.

A reference guarantee is not the same as actual boost clocks. It is illogical and wrong to use it as the baseline against the maximum overclocked boost when judging a 980 Ti overclock. No 980 Ti will come anywhere near those minimum clocks, so it is not a valid baseline.

Also, I never disputed that 980 Tis can reach 1450+ core clocks; my dispute is that the normal maximum boost should be used as the baseline. I can guarantee that not one of your 980 Ti cards ran lower than 1250-1300 as an actual boost clock.

My own experience with a 980 Ti SC+ ACX is that it never went lower than 1250 at stock, even after prolonged (hours of) The Witcher 3 at 4K. The actual overclock reached was 1450+, but it dropped to 1330-1350 after the same prolonged gaming in The Witcher 3 at 4K. The only way I could negate this thermal capping was to fit an AIO cooler.

Again, I stress that 980 Tis are good overclockers, and I agree that 1450+ is normal; it's just that this generally equates to a ~17% core overclock, not the 23% some believe. Overclocks should be measured as the actual in-game % increase, IMHO.

The actual link I was initially responding to shows an average real performance increase of ~15% for the 980 Ti, even though the article claims they are getting a 23%+ overclock. That's still way better than the overclocking headroom of the Fury/X right now, but from my experience 15% is not as exceptional as it has been made out to be.
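
To put rough numbers on that baseline argument, here's a quick sketch. The 1090MHz and ~1250MHz figures are the ones quoted in this thread; the 1178MHz "advertised boost" is just an assumed typical figure for illustration, not something from the posts above:

```python
# Quick sketch of how the claimed overclock % depends on the baseline.
# 1090 and 1250 MHz are figures quoted in this thread; 1178 MHz is an
# assumed typical advertised boost, used purely for illustration.

def oc_percent(oc_mhz, baseline_mhz):
    """Overclock expressed as a percentage gain over a chosen baseline."""
    return (oc_mhz / baseline_mhz - 1) * 100

oc = 1450  # the sustained overclock figure discussed above

print(f"vs. 1090 MHz reference guarantee: {oc_percent(oc, 1090):.0f}%")  # ~33%
print(f"vs. 1178 MHz advertised boost:    {oc_percent(oc, 1178):.0f}%")  # ~23%
print(f"vs. 1250 MHz actual stock boost:  {oc_percent(oc, 1250):.0f}%")  # ~16%
```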
 
Yeah, I thought the same, as I suspect they are using a reference 290X, and anyone can agree that cooler is shocking and held the 290X back. On AnandTech's bench under the uber profile it gets better results than that site's. So that makes me believe it was a reference 290X. Like I said, it would have been nice to see the test setup.

From the article:

HWCanucks Articles said:
For comparison purposes I’ve added a GTX 980 for good measure and our standard benchmark runs and test setup are being used. One important note is that our R9 290X is custom cooled but stock clocked so it will have no problem avoiding the throttling that plagued reference designs. Drivers being used are NVIDIA's 355.69 and AMD's 15.7.1.

In the comments the author mentions it is a PowerColor PCS+, albeit with a stock-clock BIOS.

A cross-comparison to another site's results will be pointless unless they are running exactly the same test setup and benchmark scenarios.
 
Well, here's Battlefront with old and new cards tested on Guru3D.

http://www.guru3d.com/articles_page...ta_vga_graphics_performance_benchmarks,5.html

As we can see, the GTX 780 Ti is not faring well. Even at 1080p, where Nvidia are strong, it's getting soundly beaten. There are probably other games released this year that had to be tested on the older-generation cards, like The Witcher 3 below. Again, the 780 Ti is slower than a 290.

http://www.guru3d.com/articles_pages/the_witcher_3_graphics_performance_review,5.html

Here is GTA 5, which again shows Kepler getting beaten soundly.

http://www.guru3d.com/articles_pages/gta_v_pc_graphics_performance_review,5.html

BF Hardline again shows Kepler getting well beaten. I think that will do for now, and yes, other sites could show different results, but it was so easy to find GPU performance on a lot of older cards using Guru3D.

http://www.guru3d.com/articles_pages/battlefield_hardline_vga_graphics_performance_review,5.html

One more for the road, lol: Dying Light, with the 290X being faster again.

http://www.guru3d.com/articles_pages/dying_light_vga_graphics_performance_review,8.html

All games released or being released this year.

So glad I went with the 290 over the 780!

I have always noticed this with AMD & Nvidia: AMD generally seem to support their older GPUs a lot better than Nvidia do.
 
This might interest you guys, as it was done recently (last month) to see how the 780 Ti and 290X stack up today.



http://www.hardwarecanucks.com/foru...iews/70125-gtx-780-ti-vs-r9-290x-rematch.html

It is only vocal supporters of AMD who claim Nvidia is letting Kepler performance slide, either purposely or through inaction. Time and time again it has been disproven with facts.

What's more, most of AMD's lineup is still the same Hawaii architecture, while Nvidia has shifted to a completely new architecture yet still extensively supports older Kepler cards.
 
It is only vocal supporters of AMD who claim Nvidia is letting Kepler performance slide, either purposely or through inaction. Time and time again it has been disproven with facts.

What's more, most of AMD's lineup is still the same Hawaii architecture, while Nvidia has shifted to a completely new architecture yet still extensively supports older Kepler cards.

 
It is only vocal supporters of AMD who claim Nvidia is letting Kepler performance slide, either purposely or through inaction. Time and time again it has been disproven with facts.

What's more, most of AMD's lineup is still the same Hawaii architecture, while Nvidia has shifted to a completely new architecture yet still extensively supports older Kepler cards.

True that. I remember there was one game where Kepler wasn't doing so well, and there were some vocal people who weren't happy about it, but Nvidia said that performance was sub-par in that game on Kepler and that they were working on sorting it, which they did.

Kepler is still going strong, as seen from Nasha's post.
 
Well, here's Battlefront with old and new cards tested on Guru3D.

http://www.guru3d.com/articles_page...ta_vga_graphics_performance_benchmarks,5.html

As we can see, the GTX 780 Ti is not faring well. Even at 1080p, where Nvidia are strong, it's getting soundly beaten. There are probably other games released this year that had to be tested on the older-generation cards, like The Witcher 3 below. Again, the 780 Ti is slower than a 290.

http://www.guru3d.com/articles_pages/the_witcher_3_graphics_performance_review,5.html

Here is GTA 5, which again shows Kepler getting beaten soundly.

http://www.guru3d.com/articles_pages/gta_v_pc_graphics_performance_review,5.html

BF Hardline again shows Kepler getting well beaten. I think that will do for now, and yes, other sites could show different results, but it was so easy to find GPU performance on a lot of older cards using Guru3D.

http://www.guru3d.com/articles_pages/battlefield_hardline_vga_graphics_performance_review,5.html

One more for the road, lol: Dying Light, with the 290X being faster again.

http://www.guru3d.com/articles_pages/dying_light_vga_graphics_performance_review,8.html

All games released or being released this year.

Pretty odd results, those, especially Dying Light: a 780 is less than 10% more powerful than a 770, and then a 780 Ti is nearly 20% more powerful than a 780...

In fact, in GTA as well, the 780 is only 10-15% faster than a 770, which seems very odd considering the specs, and I'm pretty sure that at release the 780 was 30-40% faster than a 770.
 
Pretty odd results, those, especially Dying Light: a 780 is less than 10% more powerful than a 770, and then a 780 Ti is nearly 20% more powerful than a 780...

In fact, in GTA as well, the 780 is only 10-15% faster than a 770, which seems very odd considering the specs, and I'm pretty sure that at release the 780 was 30-40% faster than a 770.

No, as can be seen here, it was only around 15% at 1200p.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_770/27.html

Loads of games tested.
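
For anyone wanting to check these gaps themselves, the "X% faster" figures are just ratios of average frame rates. A minimal sketch; the FPS values below are made up purely to show the arithmetic, not taken from any review:

```python
# How the "X% faster" figures are derived from average FPS.
# The FPS values below are invented purely to show the arithmetic.

def faster_by(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1) * 100

fps_770 = 60.0  # hypothetical 770 average FPS
fps_780 = 69.0  # hypothetical 780 average FPS

print(f"780 vs 770: {faster_by(fps_780, fps_770):.0f}% faster")  # ~15%
```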
 
Yeah, it could just be -50% power usage, with every card having half the TDP rather than any extra performance.

I wonder if there is any technical reason why it wouldn't be possible to make a standard card with three 8-pin connectors, giving a total power budget of around 500W from the factory (not onboard CrossFire), assuming a stock cooler could be made that is capable enough. The core clock could be 2GHz or something.
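
For what it's worth, the arithmetic on that connector layout comes out slightly above 500W if you use the standard PCIe power limits (75W from the slot, 150W per 8-pin connector):

```python
# Power budget for the hypothetical three-connector card described
# above, using the standard PCIe limits: 75 W from the x16 slot and
# 150 W per 8-pin connector.

SLOT_W = 75
EIGHT_PIN_W = 150
connectors = 3

total_w = SLOT_W + connectors * EIGHT_PIN_W
print(f"Factory power budget: {total_w} W")  # 525 W, a little over the 500 W figure
```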
 
True that. I remember there was one game where Kepler wasn't doing so well, and there were some vocal people who weren't happy about it, but Nvidia said that performance was sub-par in that game on Kepler and that they were working on sorting it, which they did.

Kepler is still going strong, as seen from Nasha's post.

lol, those vocal people were the thousands of posts on Nvidia's forums over six months wondering why all the new AAA titles performed like crap. The performance got SO bad in The Witcher 3, with the performance difference being so embarrassingly large, that even more Nvidia users piled into that thread on Nvidia's forums demanding action.

Then let's get history straight: when people mentioned it here, the usual group of Nvidia people who post and then agree with each other constantly kept saying Kepler was slow in The Witcher 3 because it was an old architecture, and banged on about tessellation and how it was holding back Kepler performance. You acknowledged and excused the poor performance.

Then, after a couple of weeks of drivers that were both unstable and awful for performance, Nvidia publicly admitted Kepler performance was lacking and magically came up with a new driver that 'fixed' performance in both The Witcher 3 and several of the other games Nvidia users had been complaining about. So the poor performance that the usual suspects had excused as down to architecture was removed... so it was clearly not down to architecture. Since Nvidia was forced by their own users into finally addressing poor Kepler performance, you are all now attempting to rewrite history by pretending the performance was never poor in the first place. The only trouble there is the Nvidia users who complained about Kepler performance for months, and Nvidia publicly admitting Kepler performance was sub-par.

Even now D.P. is trying to excuse the lacking performance as an old architecture, because apparently, with 'top support' and higher card prices, you can't expect good drivers for an older architecture six months after the release of a new one. As for Hardware Canucks, someone else already posted a link to people basically laughing at those results entirely; the results from that website flatly contradict most other websites, and they gloss over the fact, as both of you have done, that Nvidia were forced into releasing drivers to fix the terrible performance.

So if performance is dire for six months and Nvidia finally release new drivers drastically improving it, that six months didn't happen as long as you only test the launch drivers and the current ones? Sure.


EDIT:-

https://forums.geforce.com/default/...kepler-gpus-performance-in-favor-of-maxwell-/

Just ONE of the threads with examples of MULTIPLE games, released over a period of months, where slower 'new gen' cards were all of a sudden beating faster Kepler-based cards. Since the driver updates, performance is magically back where it should be; so the supposed architectural differences that were excused as the reason for lacking Kepler performance are suddenly no longer the difference with new drivers? Yes, on their own forums you see those same types of people making every excuse possible as to why Nvidia wasn't to blame in any way for poor performance.

There is proof of this literally everywhere, not least from Nvidia users, from Nvidia themselves when they admitted they needed better Kepler drivers, and from the sudden large increase in Kepler performance when they finally addressed the problems. But right, now that Nvidia have actually fixed the problem and it can't be seen in the current drivers, it never happened. Funny how, for something that never happened, you were all throwing out every excuse possible for the lacking performance at the time... none of which stand up now that the problem has been fixed.
 
Oh yeah, surprising that. I really thought the 780 was more powerful than that at the time.

My GHz edition is about 30-40% faster than a 770 (out of the box), with a ~19% performance increase over a stock 780 out of the box.
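
As a quick sanity check, those two figures are consistent with each other, because percentage gains multiply rather than add. A sketch using the numbers quoted in this thread (~15% for a 780 over a 770, ~19% for the GHz edition over a stock 780):

```python
# Sanity check: percentage gains multiply, they don't add. Using the
# figures quoted in this thread: a 780 ~15% faster than a 770, and the
# GHz edition ~19% faster than a stock 780.

gain_780_over_770 = 1.15
gain_ghz_over_780 = 1.19

total = gain_780_over_770 * gain_ghz_over_780
print(f"GHz edition vs 770: ~{(total - 1) * 100:.0f}% faster")  # ~37%, inside the 30-40% range
```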

Quite a few people, including in the link Final8y gave, are making the same mistake as r7slayer: referencing benchmarks that reuse old numbers for a lot of cards rather than constantly retesting two dozen cards. The reason the more recent 780 Ti/290X comparison focuses on the 780 Ti is that it was looking into claims that nVidia have been sabotaging Kepler performance in recent drivers, not comparing the cards head to head to find a winner. It still gives a good indication that the 290X doesn't destroy the 780 Ti overall.
 
lol, those vocal people were the thousands of posts on Nvidia's forums over six months wondering why all the new AAA titles performed like crap. The performance got SO bad in The Witcher 3, with the performance difference being so embarrassingly large, that even more Nvidia users piled into that thread on Nvidia's forums demanding action.

Then let's get history straight: when people mentioned it here, the usual group of Nvidia people who post and then agree with each other constantly kept saying Kepler was slow in The Witcher 3 because it was an old architecture, and banged on about tessellation and how it was holding back Kepler performance. You acknowledged and excused the poor performance.

Then, after a couple of weeks of drivers that were both unstable and awful for performance, Nvidia publicly admitted Kepler performance was lacking and magically came up with a new driver that 'fixed' performance in both The Witcher 3 and several of the other games Nvidia users had been complaining about. So the poor performance that the usual suspects had excused as down to architecture was removed... so it was clearly not down to architecture. Since Nvidia was forced by their own users into finally addressing poor Kepler performance, you are all now attempting to rewrite history by pretending the performance was never poor in the first place. The only trouble there is the Nvidia users who complained about Kepler performance for months, and Nvidia publicly admitting Kepler performance was sub-par.

Even now D.P. is trying to excuse the lacking performance as an old architecture, because apparently, with 'top support' and higher card prices, you can't expect good drivers for an older architecture six months after the release of a new one. As for Hardware Canucks, someone else already posted a link to people basically laughing at those results entirely; the results from that website flatly contradict most other websites, and they gloss over the fact, as both of you have done, that Nvidia were forced into releasing drivers to fix the terrible performance.

So if performance is dire for six months and Nvidia finally release new drivers drastically improving it, that six months didn't happen as long as you only test the launch drivers and the current ones? Sure.


EDIT:-

https://forums.geforce.com/default/...kepler-gpus-performance-in-favor-of-maxwell-/

Just ONE of the threads with examples of MULTIPLE games, released over a period of months, where slower 'new gen' cards were all of a sudden beating faster Kepler-based cards. Since the driver updates, performance is magically back where it should be; so the supposed architectural differences that were excused as the reason for lacking Kepler performance are suddenly no longer the difference with new drivers? Yes, on their own forums you see those same types of people making every excuse possible as to why Nvidia wasn't to blame in any way for poor performance.

There is proof of this literally everywhere, not least from Nvidia users, from Nvidia themselves when they admitted they needed better Kepler drivers, and from the sudden large increase in Kepler performance when they finally addressed the problems. But right, now that Nvidia have actually fixed the problem and it can't be seen in the current drivers, it never happened. Funny how, for something that never happened, you were all throwing out every excuse possible for the lacking performance at the time... none of which stand up now that the problem has been fixed.

^
Soo much this.

How much upgrade cash swelled the books in Santa Clara during the 'lost' performance debacle, during the wait for a DRIVER fix...

Got to hand it to Nvidia, they currently have 82% of potential gullible customers; that is how you run a business. :D
 
Yes, actually, most hardcore PC gamers do tend to buy the best hardware to play games on; £500 for a GPU is common now because that's the market.

There is a big difference between 'hardcore gamers off computer forums who like to argue about things' and PC gamers in general.

0.84% of gamers on Steam use an Nvidia 980, 3.5% use a 970, and 1.03% use an R9 200 series card. I would hardly call that 'the market'.

http://store.steampowered.com/hwsurvey/videocard/?sort=name
 