Both RX Vega 64 & 56 Beat GTX 1080 Ti in Forza 7 DX12 benchmarks

Yes it's a shock to the system, but it just shows how far behind AMD have been. On paper, though, compare the specs of a 570 and a 1080 and ignore the price for a moment: is there anything there that would tell you a 570 shouldn't be around 1080 performance? Everyone is just used to the 1080 being marketed as a high-end card, which in reality it's not; it's Nvidia's mid-range chip that's been promoted to the high end due to the lack of competition, so seeing results like we're seeing in Forza is just a gentle reminder to everyone of how things should be working.

You mean apart from Nvidia having close to double the GPU performance?
 
You mean apart from Nvidia having close to double the GPU performance?

I meant looking at the technical specifications, notably the die size and power draw, not benchmarks.

In games that's right, the 1080 stomps all over the Polaris lineup: the Pascal cards hardly have any wasted GPU cycles, they don't get bottlenecked by CPU draw calls, and their geometry engine is far stronger. The bottom line is Nvidia gets the most out of the 1080's ~8 TFLOPs of compute performance and turns it into FPS. The 570/580 is around 6 TFLOPs if I recall, so on paper not that far behind, but due to the software limitations of DX11 not all of the GCN shaders are used and a lot of potential performance is wasted every cycle. In Forza we are seeing what happens when games are coded correctly for GCN hardware/DX12. If I were AMD I would want to send people around to every development studio and get them to build games around these lower-level APIs ASAP.
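To put rough numbers on those paper figures: peak FP32 throughput is just shader count × 2 ops per clock (one FMA) × clock speed. A quick back-of-the-envelope sketch, using the reference boost clocks (assumed reference figures; partner cards clock higher, so treat the outputs as ballpark only):

```cpp
#include <cstdio>

// Peak FP32 throughput in TFLOPs: shaders x 2 ops/clock (one FMA) x clock in GHz,
// divided by 1000 to go from GFLOPs to TFLOPs. Shader counts and boost clocks
// below are the reference-card figures, so the results are approximate.
double peak_tflops(int shaders, double boost_ghz) {
    return shaders * 2.0 * boost_ghz / 1000.0;
}

int main() {
    std::printf("GTX 1080: %.1f TFLOPs\n", peak_tflops(2560, 1.733)); // ~8.9
    std::printf("RX 580:   %.1f TFLOPs\n", peak_tflops(2304, 1.340)); // ~6.2
    std::printf("RX 570:   %.1f TFLOPs\n", peak_tflops(2048, 1.244)); // ~5.1
    return 0;
}
```

On paper the 1080's lead over a 580 works out to roughly 40-45%, well short of the near-2x gaps DX11 benchmarks often show, which is the whole point about wasted GCN cycles.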
 
Not just a single source though.

Unless I've missed something I can't see any other links to Forza 7 benchmarks here; a quick Google just now only came up with reposts of the computerbase.de benchmarks.

I meant looking at the technical specifications, notably the die size and power draw, not benchmarks.

In games that's right, the 1080 stomps all over the Polaris lineup: the Pascal cards hardly have any wasted GPU cycles, they don't get bottlenecked by CPU draw calls, and their geometry engine is far stronger. The bottom line is Nvidia gets the most out of the 1080's ~8 TFLOPs of compute performance and turns it into FPS. The 570/580 is around 6 TFLOPs if I recall, so on paper not that far behind, but due to the software limitations of DX11 not all of the GCN shaders are used and a lot of potential performance is wasted every cycle. In Forza we are seeing what happens when games are coded correctly for GCN hardware/DX12. If I were AMD I would want to send people around to every development studio and get them to build games around these lower-level APIs ASAP.

Yet other DX12 games like GOW4 show Nvidia on top by a margin. Heck, even the 980 Ti beats out the Fury X despite the latter having asynchronous compute switched on, which, from how some people talk, makes massive swings in performance.
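For anyone unsure what "asynchronous compute switched on" actually means at the API level: in DX12 a game can submit work through separate graphics and compute queues that the GPU may overlap. A minimal sketch of the setup side only (not how any particular game does it; `device` is assumed to be created elsewhere and all error handling is omitted):

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Minimal sketch of the DX12 side of async compute: one DIRECT (graphics) queue
// and one COMPUTE queue on the same device. Work submitted to the two queues
// *may* execute concurrently on the GPU; whether it actually overlaps is down
// to the hardware/driver, which is why the gains differ so much between GCN
// and Maxwell/Pascal hardware.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};

    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;  // graphics + compute + copy work
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&graphicsQueue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute + copy work only
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));
}
```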
 
I meant looking at the technical specifications, not benchmarks.

In games that's right, the 1080 stomps all over the Polaris lineup: the Pascal cards hardly have any wasted GPU cycles, they don't get bottlenecked by CPU draw calls, and their geometry engine is far stronger. The bottom line is Nvidia gets the most out of the 1080's ~8 TFLOPs of compute performance and turns it into FPS. The 570/580 is around 6 TFLOPs if I recall, so on paper not that far behind, but due to the software limitations of DX11 not all of the GCN shaders are used and a lot of potential performance is wasted every cycle. In Forza we are seeing what happens when games are coded correctly for GCN hardware/DX12. If I were AMD I would want to send people around to every development studio and get them to build games around these lower-level APIs ASAP.

I think AMD have been doing that since 2012 with Mantle, haven't they?
 
If this situation is how it seems, it will be interesting to see how much it costs to build a PC with Nvidia to match an Xbox One X, if DX12 is so well tuned on the Xbox.
 
Not just a single source though.

Can you link to a different credible source?

Seriously, I think there's going to be a lot of egg on a lot of people's faces, jumping in delight at one set of results from some obscure website. Let's see some proper results from big-name UK sites, and if it's the same I'll be just as excited as you guys. But really, a 570 > 1080?
 
Can you link to a different credible source?

Seriously, I think there's going to be a lot of egg on a lot of people's faces, jumping in delight at one set of results from some obscure website. Let's see some proper results from big-name UK sites, and if it's the same I'll be just as excited as you guys. But really, a 570 > 1080?

Yeah, you'd expect a 1080 to beat a 570 into dust in any game with Windows drivers.

I think "jumping in delight" needs a special mention, but the scary thought is maybe AMD haven't got an optimised driver either.
 
but the scary thought is maybe AMD haven't got an optimised driver either.

ReLive 17.9.3 does provide support for Forza 7, but as for it being more optimised rather than just supported, I'm not sure. I suppose it will be to a degree. Anyway, it's still good news to see the card performing well. All we can really do is wait for an official release and see what Nvidia can do with optimisations. Also, I am looking forward to Far Cry 5 and Wolfenstein II: The New Colossus, as they have been touted as using specific Vega features.

I am also not sure why Nvidia users scream out "but it's optimised for Vega features". Well, it's about time IMHO, as AMD have brought some promising features over the years to push GFX tech forward, only to be held back by devs choosing not to use them (you can make your own mind up why this has been happening ;)). I have always believed in devs using all the features of GFX cards regardless of make, so that we all get the best out of each and every game. That is when tech becomes competitive on a level playing field, and then it's just down to the tech itself: one side will no doubt triumph over the other, then the opposition develops something better and we leapfrog just like we used to, and things go forward at a faster pace. We all gain from that. :)
 
This FineWine stuff we hear about, I believe it's more devs getting a better understanding of the GPU.
Vega will be no different. I firmly believe that come a year or two, RX Vega will still be playing games while the GTX 10 series fades away, so who really wins? Nvidia in the short run or AMD in the long run? :p

Who buys a £600 GPU in the hope that it'll be performing well in a couple of years? LOL.

If I spend that sort of cash I want top-level performance right now, not to pray and hope it comes at some point in the future.

Look at it this way: you buy a Vega 64 now for what, £550? Then in two years' time it's properly dialled in, and yet in that time a mid-level Nvidia GPU has been released at £350 and trashes it in all the latest games. Where is the logic?

I'm not an Nv fanboy, but I hate this approach AMD has where the customer basically pays through the nose for a half-finished product and then has to wait 18 months to two years for it to be optimised. Who in their right mind thinks that's a good use of their hard-earned money?!
 
'Bout time, AMD; Nvidia needs a wake-up call. Fingers crossed more titles use all Vega has to offer.
All this talk of Nvidia having the most powerful cards, yet AMD cards have been king at mining for a long time. I believe there's some truth in it being the game developers that have it wrong, not AMD.
 
Can you link to a different credible source?

Seriously, I think there's going to be a lot of egg on a lot of people's faces, jumping in delight at one set of results from some obscure website. Let's see some proper results from big-name UK sites, and if it's the same I'll be just as excited as you guys. But really, a 570 > 1080?

I did, a couple of pages back: the CPU benchmarks.
 
Who buys a £600 GPU in the hope that it'll be performing well in a couple of years? LOL.

If I spend that sort of cash I want top-level performance right now, not to pray and hope it comes at some point in the future.

Look at it this way: you buy a Vega 64 now for what, £550? Then in two years' time it's properly dialled in, and yet in that time a mid-level Nvidia GPU has been released at £350 and trashes it in all the latest games. Where is the logic?

I'm not an Nv fanboy, but I hate this approach AMD has where the customer basically pays through the nose for a half-finished product and then has to wait 18 months to two years for it to be optimised. Who in their right mind thinks that's a good use of their hard-earned money?!

Those that don't upgrade £600 graphics cards every year?
 
'Bout time, AMD; Nvidia needs a wake-up call. Fingers crossed more titles use all Vega has to offer.
All this talk of Nvidia having the most powerful cards, yet AMD cards have been king at mining for a long time. I believe there's some truth in it being the game developers that have it wrong, not AMD.

There must be something in it. I mean, from a business point of view, why would devs spend more time putting in those extra features for only 30% of users? Nvidia's 70% must also give them a nice bit of leverage, and the odd whisper in a developer's ear from the overwhelming leader in the market can go a long way. Hence why Nvidia still led the field even when AMD had the better cards. Groundbreaking features not being used by devs has certainly held AMD back in the mindshare department and the performance arena. I think GameWorks has also contributed to this in its own way.

In a reverse of the Spock notion: "The needs of the few (Nvidia) outweigh the needs of the many (pushing the boundaries of graphics tech for everyone)." Nvidia, being the monopoly, have slowed down the development of faster tech just like Intel have. I am sure devs would have picked up DX12/async compute faster if Nvidia had cards that could fully implement it in hardware. We shall see if this comes to pass if/when Volta has these features built into the hardware. Time will tell, it always does. :D
 