Please just stop it, stop buying.

Since when was UserBenchmark something we used in here for price/performance? Game benchmarks and the actual price at the time are far more reliable. New games keep coming, and it's clear to see in them what performs well. A quick look at the price at the time tells you all you need to know. Nvidia right at this moment are just about as bad as it gets, end of story. The main problem for the high end after Vega is that it's NV v NV.

I was looking for an easy-to-obtain number that is standardised across all cards. It's not perfect, which is why I used two different generic benchmark sources.

Taking an average fps across a representative basket of games and varying resolutions would of course reveal much better results, and I think it would show AMD in a slightly better light with the Vega 64 - but I think this chart shows that the price/performance outrage is a little overdone (now that RTX 2080 Tis are selling at ~£1000). At £1400+ it was obviously insane.
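
To make that concrete, here's a minimal sketch of the kind of calculation I mean, in Python; every fps figure and price below is a hypothetical placeholder, not a real benchmark result:

```python
# Price/performance as average fps across a basket of games per GBP.
# ALL numbers here are hypothetical placeholders for illustration;
# substitute real benchmark results and current street prices.
cards = {
    # card: ([fps in game A, game B, game C], price in GBP)
    "Card X": ([120, 95, 140], 1000),
    "Card Y": ([95, 75, 110], 650),
    "Card Z": ([80, 65, 95], 400),
}
for name, (fps, price) in cards.items():
    avg = sum(fps) / len(fps)
    print(f"{name}: {avg:.0f} avg fps, {100 * avg / price:.1f} fps per £100")
```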
 
I think at current prices there is not much difference between the 1080 Ti and the 2080. But you were saying that the 2070 / 1070 Ti / Vega 56 are the best value for money. I'm saying you need to consider the resolution you're playing at.

Completely agree, which is why I find all this price/performance stuff a little strange. If you want to make use of a G-Sync 4K 144Hz monitor, then a 2080 Ti is better bang for buck than SLI 1080 Tis. At that point you have already invested so much money in the rest of the setup that a couple of hundred quid here or there is meaningless.
 
Nice music, but how on earth is a Vega keeping up with a 2080?

Frostbite is a well-written engine that taps into the true power of the GPUs, especially when async compute is used with DX12, showing that Vega 64 is very powerful when it isn't held back by some archaic engine. Look at other games written with Frostbite, like Battlefront 1 & 2.
Also, many new games on new engines show Vega 64 is comparable to the 1080 Ti/2080, not the 1080 (Forza 4 and MHW, for example), and more games are going to follow now that RTX cards have proper async compute, which will greatly benefit Vega as well. Let's not forget Vega 64 is a ~13 TFLOP GPU, almost as much as the RTX 2080 Ti (13.4 TFLOPs), and it's left sleeping in most games.
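
For anyone wondering where those TFLOP figures come from, it's just the standard peak-FP32 formula (two FLOPs per shader per clock, via FMA) applied to the published shader counts and reference boost clocks; a quick sketch:

```python
# Theoretical peak FP32 throughput: each shader retires 2 FLOPs per
# clock (one fused multiply-add), so TFLOPs = 2 * shaders * clock.
# Clocks are the reference boost clocks; partner cards run higher.
cards = {
    "Vega 64 (reference)": (4096, 1546),  # (shader count, boost MHz)
    "Vega 64 Liquid":      (4096, 1677),
    "RTX 2080 Ti":         (4352, 1545),
}
for name, (shaders, mhz) in cards.items():
    tflops = 2 * shaders * mhz * 1e6 / 1e12
    print(f"{name}: {tflops:.1f} TFLOPs")
# Vega 64 (reference): 12.7 TFLOPs
# Vega 64 Liquid: 13.7 TFLOPs
# RTX 2080 Ti: 13.4 TFLOPs
```

Peak TFLOPs is of course a theoretical ceiling, which is exactly why async compute matters: it helps keep those shaders fed.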

In addition, in the above benchmark the Nitro is used, not the reference card. I've written numerous times that you should look for Red Devil/Nitro/LC benchmarks to compare performance, not the reference or Strix cards, which are the worst samples of Vega.

Finally, many old engines somehow clock Vega 64 far below its normal speed in 32-bit mode (they don't support 64-bit). If the user sets the min speed the same as the max speed on the P7 state (just a click in Wattman), the card maintains its max clocks, improving performance by 70-100% (Elder Scrolls Online, SWG, and Clausewitz-engine-based games, for example).
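
Wattman covers this on Windows; for anyone on Linux, the amdgpu driver exposes the same idea through sysfs. A rough sketch, assuming the stock amdgpu interface and that the Vega is card0 (needs root):

```python
# Pin the GPU core clock to its highest DPM state (the Wattman "P7" trick),
# so old 32-bit engines can't drag the clocks down mid-game.
# Assumes the stock amdgpu driver and that the GPU is card0; run as root.
from pathlib import Path

dev = Path("/sys/class/drm/card0/device")

# Switch clock management from automatic to manual control.
(dev / "power_dpm_force_performance_level").write_text("manual")

# pp_dpm_sclk lists one core-clock state per line (0..7 on Vega 64).
states = (dev / "pp_dpm_sclk").read_text().splitlines()
top = len(states) - 1  # index of the highest state, i.e. P7

# Restrict the allowed states to just the top one.
(dev / "pp_dpm_sclk").write_text(str(top))
print((dev / "pp_dpm_sclk").read_text())  # the active state is marked '*'
```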
 
Frostbite is a well-written engine that taps into the true power of the GPUs, especially when async compute is used with DX12

Yeah right... :confused:

[Attached benchmark chart: DX.png]


"Using DX12 the game suffered major stuttering with both AMD and Nvidia GPUs. This is something we saw with Battlefield 1 upon release, so we guess it’s not entirely surprising. Developer EA DICE clearly isn’t prioritizing the low level API."
 
People bought the 2080ti and 2080 for the feature set.

Why buy the 2080 instead of the 1080ti? For the feature set, as they are both the same in terms of performance and yet the 2080 is £200 more.

What don't you get?

Now that the RTX series is out you will see the performance of 1080ti decline over time. Give it 2 more years and the RTX 2080 will pull ahead of the 1080ti by a significant margin. Always been the case with NVIDIA when the new gen drops. Hence I never buy nvidia’s older tech.
 
That's not the point. The point is that Battlefield V runs worse in DX12 than it does in DX11.

Yup, DX12 is a broken mess. Just look at Hitman 2, they've dropped support for it altogether!! I don't know how anyone can expect RTX to ever stand a chance of working properly if they can't get DX12 sorted... and it's been out how long now??! :rolleyes:
 
Yup, DX12 is a broken mess. Just look at Hitman 2, they've dropped support for it altogether!! I don't know how anyone can expect RTX to ever stand a chance of working properly if they can't get DX12 sorted... and it's been out how long now??! :rolleyes:
Such a shame about DX12. I wonder if it is the way it is designed or just developers not putting in the time and resources to get it to work right?

I would like to see devs stop using DX11, or even better, move to Vulkan. It has been ages now. Maybe they need to admit that even though there is a bit of overhead in the DX11 model, it makes life much easier for devs, or maybe make an improved version of that model in DX13?

Anyone have any thoughts on what the issue is?
 
Such a shame about DX12. I wonder if it is the way it is designed or just developers not putting in the time and resources to get it to work right?

I would like to see devs stop using DX11, or even better, move to Vulkan. It has been ages now. Maybe they need to admit that even though there is a bit of overhead in the DX11 model, it makes life much easier for devs, or maybe make an improved version of that model in DX13?

Anyone have any thoughts on what the issue is?
My guess is they're re-using old code bases to save time and £, so DX12 is a bit of a tack-on. Same as RT at the moment, really.
Developing with DX12 from the ground up will happen when they hit the limitations of their existing engines, I think? Which is probably just about now.
 
Now that the RTX series is out you will see the performance of 1080ti decline over time. Give it 2 more years and the RTX 2080 will pull ahead of the 1080ti by a significant margin. Always been the case with NVIDIA when the new gen drops. Hence I never buy nvidia’s older tech.

I won’t have a 1080ti in 2 years, and the new gen will be out, rendering the 20 series obsolete.

Btw my question was rhetorical.
 
My guess is they're re-using old code bases to save time and £, so DX12 is a bit of a tack-on. Same as RT at the moment, really.
Developing with DX12 from the ground up will happen when they hit the limitations of their existing engines, I think? Which is probably just about now.
You could be right.

I get the feeling DX12 will not overtake DX11 for a long time yet :(
 
Vulkan runs well on AMD and Nvidia, but I doubt we will see many games using it. A shame that DX12 is poo at present, but I hope it gets fixed, as it is the future, so to speak.
DX12 isn't exactly poo. Most of the current engines are designed for DX11, or DX9 in the case of Bethesda. Very few games have been designed with a DX12 implementation from the start, or had it properly integrated. The other issue is that Frostbite games aren't really a good DX12 baseline, as the engine is a PITA, with DX12 still technically in beta for the game even though it was at the forefront with Mantle in BF4 (another failed beta test).

There are a few good DX12 applications but not many...

EDIT: Come to think of it, the only two good implementations I can think of are Forza 7 and Forza Horizon 4; most of the others have a DX12 implementation without much substance.
 
EDIT: Come to think of it, the only two good implementations I can think of are Forza 7 and Forza Horizon 4; most of the others have a DX12 implementation without much substance.
Noticed who the publisher is?
Maybe fairly soon we'll have some ground-up DX12 games built. 3 1/2 years seems like a long time, but building a new game engine around DX12 would, I assume (not having done it myself), take a number of years. Microsoft would have a good advantage there, being the developers of DX12.
 
Vulkan runs well on AMD and Nvidia, but I doubt we will see many games using it. A shame that DX12 is poo at present, but I hope it gets fixed, as it is the future, so to speak.

DX12 isn't exactly poo. Most of the current engines are designed for DX11, or DX9 in the case of Bethesda. Very few games have been designed with a DX12 implementation from the start, or had it properly integrated. The other issue is that Frostbite games aren't really a good DX12 baseline, as the engine is a PITA, with DX12 still technically in beta for the game even though it was at the forefront with Mantle in BF4 (another failed beta test).

There are a few good DX12 applications but not many...

EDIT: Come to think of it, the only two good implementations I can think of are Forza 7 and Forza Horizon 4; most of the others have a DX12 implementation without much substance.

https://www.guru3d.com/news-story/codemasters-to-release-directx-12-support-for-f1-2018.html

F1 2018 now has DX12 support, but Guru3D's benchmarks showed DX12 is slower than DX11 in F1 2018.
 