Radeon VII a win or fail?

Caporegime (OP) · Joined: 24 Sep 2008 · Posts: 38,322 · Location: Essex innit!
Gregster: AMD said they were looking at it. That doesn't mean soon, however.

4K: DLSS is a worthy feature whether you like it or not. A lot of people put value in it.
Totally agreed, and the sooner the better, as it really does add to the game. Sure, it's in a primitive state at present, but I can see a healthy future for it.
 
Permabanned · Joined: 12 Sep 2013 · Posts: 9,221 · Location: Knowhere
Nasha, I genuinely like the Radeon VII, and anyone who is into PC tech knows AMD have been hitting the CPU space hard and doing a fantastic job at it. My take is that the VII is a rushed release, though, and I personally feel it isn't quite ready (or the drivers aren't), maybe because AMD have been putting their resources into the CPU side of things? I say the same about NVidia and RTX: whilst I really like the effect in BFV, I want to see it in more games, and DLSS so far is a 4K thing, which I have no use for but would love to use at 3440x1440.

Both the VII and RTX have their pluses and minuses, and whilst it is great to see AMD competing, I don't quite feel they have done enough thus far.

You're right, AMD definitely haven't done enough; I don't think they've been able to. I think the VII happened because (a) sales of the MI50 weren't keeping up with how many they were producing, meaning they had enough spare to add an RX card to the line-up, and (b) Nvidia helped make it possible with their pricing. As you said, Zen's been the focus, but now that it's available and doing fairly well, Radeon's R&D is getting a boost (CEO Lisa Su said this). We'll still need to wait a couple of years before the extra R&D trickles down to the consumer.
 
Soldato · Joined: 25 Nov 2011 · Posts: 20,639 · Location: The KOP
I suck at maths, but since you're being rather obnoxious I decided to painstakingly work out the difference. This is what I did:

BFV = 6.4% AMD
Civ VI = 24% NVidia
Darksiders 3 = 23.5% NVidia
Deus Ex = 7.1% AMD
Divinity Original Sin = 14.7% NVidia
Dragon Quest XI = 35.8% NVidia
F1 2018 = 9.7% NVidia
FarCry 5 = 8% AMD
Ghost Recon Wildlands = 2.3% NVidia
GTA V = 7.1% NVidia
Hellblade = 10% NVidia
Hitman 2 = 22.5% NVidia
Just Cause 4 = 11.61% NVidia
Monster Hunter World = 18% NVidia
Middle Earth 2 = 5% NVidia
Rainbow Six = 14.7% NVidia
Shadow of the Tomb Raider = 3% NVidia
Strange Brigade = 22% AMD
The Witcher 3 = 10.6% NVidia
Wolfenstein 2 = 12.1% NVidia
Overall = 9.5% in favour of NVidia

So overall, 9.5% in favour of NVidia, and if I had gone to the fourth decimal place it probably works out closer to 10%. You said it was 8%, or am I wrong on this?

If someone wants to prove me wrong and show me where I've gone wrong, I will listen.
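If anyone does want to check the working, here's a minimal sketch of the same calculation in Python (the figures are copied straight from the list above, with AMD wins negative and NVidia wins positive). A plain mean of these exact numbers comes out a touch above 9%, so in the same ballpark:

```python
# Signed per-game deltas from the list above:
# positive = NVidia ahead, negative = AMD ahead.
deltas = {
    "BFV": -6.4, "Civ VI": 24.0, "Darksiders 3": 23.5,
    "Deus Ex": -7.1, "Divinity Original Sin": 14.7,
    "Dragon Quest XI": 35.8, "F1 2018": 9.7, "FarCry 5": -8.0,
    "Ghost Recon Wildlands": 2.3, "GTA V": 7.1, "Hellblade": 10.0,
    "Hitman 2": 22.5, "Just Cause 4": 11.61,
    "Monster Hunter World": 18.0, "Middle Earth 2": 5.0,
    "Rainbow Six": 14.7, "Shadow of the Tomb Raider": 3.0,
    "Strange Brigade": -22.0, "The Witcher 3": 10.6,
    "Wolfenstein 2": 12.1,
}

# Plain mean of the signed deltas across all 20 games.
overall = sum(deltas.values()) / len(deltas)
print(f"Overall: {overall:.2f}% in favour of NVidia")  # 9.06 with these figures
```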

Benchmarks can also be very misleading! Take this run from Wolfenstein 2, where the VEGA 7 is winning.


Either way, I'm done with all this "2080 is better, VEGA 7 is better!" I will just focus on my own upgrade path, and if the next GPU or the VEGA 7 is better than what I'm running and worth the upgrade, I'll buy.

VEGA 56/64 were slated on release as one of AMD's worst GPU releases; now look at them, lol. They're priced very competitively, and the 64 beats the GTX 1080 in most benchmark videos I watch.
 
Permabanned · Joined: 12 Sep 2013 · Posts: 9,221 · Location: Knowhere
If you could easily and reliably drop the voltage and fan speed across all cards, wouldn't AMD have shipped them like that? Do you really think they just dialled everything up to the max and QA said "yeah, that's fine, just ship it; let the customers live with it or try to fix it"? The following is from the Tom's Hardware review:

"we approached AMD about our findings and were told that this behavior is intended. Although Radeon VII exhibits similar maximum-load acoustic performance as the reference Radeon RX Vega 64, that quick ramp from idle to 2,900 RPM explains why the new card seems so much louder. Again, the proposed solution is a manual adjustment in WattMan. Unfortunately, any attempt to relax Radeon VII's ability to cool itself off is going to cost you sustainable clock rates, negatively affecting performance."


You'd have thought so, wouldn't you? It was the same with the 14nm Vegas: you could tweak them for a decent improvement across the board. From what I'm currently seeing, the VII hasn't got the same amount of headroom the other Vegas had, but I've been able to lower my voltage and overclock the memory, so I'm now getting a small performance increase with a decent dB improvement compared to stock. That's good, as I was concerned about how well the heatsink would work in the run-up to release. Thank God for undervolting. :)
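For context, the "manual adjustment in WattMan" from the Tom's quote is essentially setting your own temperature-to-RPM fan curve. Here's a toy sketch of the idea; every number below is invented for illustration, not AMD's actual defaults:

```python
# Toy fan curve: linearly interpolate RPM between user-set points,
# instead of ramping straight from idle to 2,900 RPM.
# The (temp C, RPM) points are made-up examples.
def fan_rpm(temp_c, curve=((40, 1200), (60, 1700), (75, 2200), (90, 2900))):
    """Return the fan speed for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, r0), (t1, r1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Interpolate between the two surrounding curve points.
            return r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # above the last point, pin to max

print(round(fan_rpm(70)))  # 2033, rather than jumping straight to 2,900
```

The trade-off Tom's describes still applies, though: cap the curve too low and the card can't hold its sustained clocks.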
 
Permabanned · Joined: 12 Sep 2013 · Posts: 9,221 · Location: Knowhere
Because they just like making them look bad?
Surely just a die-shrink to 7nm, as AMD have done, is the first step towards faster/smaller? Sure, the Ti cards will be horribly expensive, but people will still buy them. I see the Radeon VII is still in stock, even with the pitiful launch stock levels.

That's likely down to people not being willing to pay the premium some brands think they can add over the recommended price. The £650 cards sold quickly, leaving the stupidly priced models from Asus, MSI & Gigabyte on the shelves. That's gotta be a good thing.
 
Caporegime (OP) · Joined: 24 Sep 2008 · Posts: 38,322 · Location: Essex innit!
Benchmarks can also be very misleading! Take this run from Wolfenstein 2, where the VEGA 7 is winning.


Either way, I'm done with all this "2080 is better, VEGA 7 is better!" I will just focus on my own upgrade path, and if the next GPU or the VEGA 7 is better than what I'm running and worth the upgrade, I'll buy.

VEGA 56/64 were slated on release as one of AMD's worst GPU releases; now look at them, lol. They're priced very competitively, and the 64 beats the GTX 1080 in most benchmark videos I watch.
What settings were used, and why no SBS (side-by-side)? You stated previously that you like to see that or it's useless, so I'm surprised you show this.
 
Soldato · Joined: 8 Jun 2018 · Posts: 2,827
Benchmarks can also be very misleading! Take this run from Wolfenstein 2, where the VEGA 7 is winning.


Either way, I'm done with all this "2080 is better, VEGA 7 is better!" I will just focus on my own upgrade path, and if the next GPU or the VEGA 7 is better than what I'm running and worth the upgrade, I'll buy.

VEGA 56/64 were slated on release as one of AMD's worst GPU releases; now look at them, lol. They're priced very competitively, and the 64 beats the GTX 1080 in most benchmark videos I watch.


Some OC.
 
Soldato · Joined: 25 Nov 2011 · Posts: 20,639 · Location: The KOP
What settings were used, and why no SBS (side-by-side)? You stated previously that you like to see that or it's useless, so I'm surprised you show this.

I said VEGA 64, not 7 :p
And that feature is an Nvidia setting; it's there for Nvidia to gain extra performance. He is right to disable it, just as you wouldn't bench a game with PhysX enabled vs AMD, or a GameWorks feature against AMD.
 
Caporegime (OP) · Joined: 24 Sep 2008 · Posts: 38,322 · Location: Essex innit!
I said VEGA 64, not 7 :p
And that feature is an Nvidia setting; it's there for Nvidia to gain extra performance. He is right to disable it, just as you wouldn't bench a game with PhysX enabled vs AMD, or a GameWorks feature against AMD.
Huh?

So he disabled a setting that gives NVidia better performance? Seems flawed to me. If the option is there, turn it on, unless you wish to make something look better than it actually is. So: no side-by-side comparison, and settings disabled to give an advantage to AMD. A terrible video to use as a basis for benchmarking.
 
Soldato · Joined: 25 Nov 2011 · Posts: 20,639 · Location: The KOP
And for the record, @Gregster, you do realise that adaptive shading gains performance by reducing image quality across parts of the screen? So for the best possible image quality, this setting should remain off.
DF (Digital Foundry) does an excellent video on this.
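For anyone wondering what that trade-off looks like mechanically, here's a toy sketch of the content-adaptive idea behind it. The tile size, thresholds and rate labels are invented for illustration; they are not NVidia's actual heuristics:

```python
import numpy as np

def pick_shading_rates(prev_luma, tile=16, t_low=2.0, t_high=8.0):
    """Toy content-adaptive shading: pick a shading rate per screen tile.

    prev_luma: 2D array of last frame's luminance. Flat, low-contrast
    tiles (fog, plain walls) get shaded coarsely to save work; detailed
    tiles keep full rate. The thresholds here are made-up numbers.
    """
    h, w = prev_luma.shape
    rates = {}
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            contrast = prev_luma[y:y + tile, x:x + tile].std()
            if contrast < t_low:
                rates[(y, x)] = "2x2"  # one shade per 4 pixels
            elif contrast < t_high:
                rates[(y, x)] = "1x2"  # one shade per 2 pixels
            else:
                rates[(y, x)] = "1x1"  # full rate
    return rates

# A perfectly flat frame ends up entirely coarse-shaded:
flat = np.full((64, 64), 128.0)
print(set(pick_shading_rates(flat).values()))  # {'2x2'}
```

That coarser shading in low-detail tiles is exactly where the extra frames, and the image-quality cost, come from.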


 
Soldato · Joined: 25 Nov 2011 · Posts: 20,639 · Location: The KOP
Huh?

So he disabled a setting that gives NVidia better performance? Seems flawed to me. If the option is there, turn it on, unless you wish to make something look better than it actually is. So: no side-by-side comparison, and settings disabled to give an advantage to AMD. A terrible video to use as a basis for benchmarking.

So would you benchmark a GameWorks feature against AMD? No, you wouldn't. Why do you think benchmarks state 'NO GAMEWORKS'? Simple:
because the code isn't optimised to run on AMD GPUs, which reduces performance and makes Nvidia look even better, or in this case reduces image quality to make Nvidia look better.

I see right through this BS.
 
Caporegime (OP) · Joined: 24 Sep 2008 · Posts: 38,322 · Location: Essex innit!
And for the record, @Gregster, you do realise that adaptive shading gains performance by reducing image quality across parts of the screen? So for the best possible image quality, this setting should remain off.
DF (Digital Foundry) does an excellent video on this.


Hey, show me what you like, but like I said, it's a poor video to use, and I based my argument on what 4k8k linked to. Don't shoot the messenger; maybe pull him up instead?
 
Soldato · Joined: 27 Mar 2010 · Posts: 3,069
It could well be the top dog in AMD’s line-up for some time. But Nvidia already beat it on performance and power usage at 12nm. Just imagine how that will compare when Nvidia release 7nm in 2020.

Well, really they did it on 16nm with GP102, and
Benchmarks can also be very misleading! Take this run from Wolfenstein 2, where the VEGA 7 is winning.


Either way, I'm done with all this "2080 is better, VEGA 7 is better!" I will just focus on my own upgrade path, and if the next GPU or the VEGA 7 is better than what I'm running and worth the upgrade, I'll buy.

VEGA 56/64 were slated on release as one of AMD's worst GPU releases; now look at them, lol. They're priced very competitively, and the 64 beats the GTX 1080 in most benchmark videos I watch.

Either way, the RTX 2080 and the VII are overpriced and lacklustre. Depending on the game, I'm seeing a 5-8 fps gain over my LC 64 with an undervolted, tweaked VII, so it's not even worth my time.

Because Vega is at EOL pricing; it was unsustainable to produce Vega at current prices. Just because it's considered affordable doesn't suddenly cancel out the factors which contributed to Vega 56/64 being a failure overall; I'm not going to repeat them.
If, however, the only metric for comparison is fps performance, then sorry, but that's flawed and there's more detail being left out.
 
Caporegime (OP) · Joined: 24 Sep 2008 · Posts: 38,322 · Location: Essex innit!
So would you benchmark a GameWorks feature against AMD? No, you wouldn't. Why do you think benchmarks state 'NO GAMEWORKS'? Simple:
because the code isn't optimised to run on AMD GPUs, which reduces performance and makes Nvidia look even better, or in this case reduces image quality to make Nvidia look better.

I see right through this BS.
Seems a bit harsh to me. Leaving GameWorks off, even when it works on AMD, defeats the idea of benchmarking. Somebody could well think, "Oooo, that AMD GPU performs really well in that vid, I will grab that", then turn all the settings on, only to find they are not getting the frames the vid got. I find that disingenuous; if it works on both AMD and NVidia, it should be turned on.
 
Soldato · Joined: 25 Nov 2011 · Posts: 20,639 · Location: The KOP
Seems a bit harsh to me. Leaving GameWorks off, even when it works on AMD, defeats the idea of benchmarking. Somebody could well think, "Oooo, that AMD GPU performs really well in that vid, I will grab that", then turn all the settings on, only to find they are not getting the frames the vid got. I find that disingenuous; if it works on both AMD and NVidia, it should be turned on.

Completely disagree! HairWorks is massively in Nvidia's favour because it's coded with so much tessellation that, at the time, Nvidia GPUs handled it far better than AMD's (not sure how Vega handles it now); AMD users had to add a custom Radeon Settings profile and force the tessellation factor lower.

You're also saying that if the setting is there, it should be enabled. OK, would you enable PhysX on AMD for Metro or Batman when benching both GPUs? These games give AMD the option to switch it on. You know damn well the AMD GPU has no chance.
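On that tessellation override: the mechanism is conceptually very simple. A toy sketch, where the cap of 16 is just an example of the sort of value users set, not AMD's default or a recommendation:

```python
# Toy version of a driver-level tessellation override: the game asks
# for a factor, the driver clamps it to the user's cap. 64 is the
# API maximum; the cap of 16 is only an example value.
def effective_tess_factor(requested, user_cap=16):
    return min(requested, user_cap, 64)

print(effective_tess_factor(64))  # a HairWorks-style 64x request becomes 16x
```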
 
Caporegime (OP) · Joined: 24 Sep 2008 · Posts: 38,322 · Location: Essex innit!
Completely disagree! HairWorks is massively in Nvidia's favour because it's coded with so much tessellation that, at the time, Nvidia GPUs handled it far better than AMD's (not sure how Vega handles it now); AMD users had to add a custom Radeon Settings profile and force the tessellation factor lower.

You're also saying that if the setting is there, it should be enabled. OK, would you enable PhysX on AMD for Metro or Batman when benching both GPUs? These games give AMD the option to switch it on. You know damn well the AMD GPU has no chance.
It goes back to the 'too much tessellation is being used and it's not fair' argument. NVidia have historically been good at tessellation, but because Crytek used it too much, the AMD owners kicked off, and the same again in many games after. Tessellation adds to a game's IQ. VRS clearly takes the IQ and lowers it (noticeably?). I am sure you will say yes, but you want things done fairly in benchmarks, so if PhysX works on AMD (something I remember you even saying you wanted), shouldn't it be turned on?

I like to see fair benching, that is all, and like I said, it was 4k8k who linked that TPU article, so if you have an issue with it, take it up with him.

I like things to be equal and fair in testing.
 
Permabanned · Joined: 15 Oct 2011 · Posts: 6,311 · Location: Nottingham Carlton
Hi, that's a big assumption to make if you haven't owned a Vega card, or have you? I'm a gamer; my PC and adaptive-sync ultrawide monitor are first and foremost for gaming. I've owned both 14nm and 7nm Vega cards. I'm still getting to know the Vega II, but the Vega 64s, of which I've had numerous versions, are great gaming cards. As for the Vega II bringing a level of performance that's late to the market: that doesn't mean it's a poor gaming card, just that it's late, and in reality it's not even that late; if it were, it would be offering much lower performance than it does on today's totem pole. You're running a Pascal Titan; is that now a rubbish gaming card? From what I can tell it has similar performance to the Vega II, but it cost a lot more, £300 more or less? They still sell Pascal cards, so does that mean anyone who's bought a Pascal card since Turing released is buying a rubbish gaming card? They don't seem to think so, and Pascal's still selling well; even the 1080 Ti, which offers Vega II levels of performance at a higher price point, has continued to sell well.
14nm Vega released in a poor state, and 7nm Vega's the same. With 14nm Vega we learned how to maximise and capitalise on its performance, and as someone who's owned a non-reference GTX 1080 as well as a few Vega 64s, I found that a tweaked Vega is the better choice, which is something none of us believed would happen at Vega's launch (at least not those of us who could look at it objectively). As for long-term viability, AMD's GCN architecture has proven itself the better product again and again. That said, we are getting closer to leaving GCN-based architectures behind, and it will be interesting to see how well GCN products are supported after we move on.
I was waiting for Vega with liquid cooling, but I wasn't stupid enough to buy one after the benchmarks, when for £650 I got a 3-month-old Titan X with an EK waterblock on it :p A mate upgrading from a Titan X Pascal to a Titan Xp sold me his. I've also got both Vegas, as you can see, in his spare for-fun rig :D

I got some benchmarks from my mate who had the FE and now has the VII:
[benchmark screenshots]

8K numbers
 
Suspended · Joined: 30 Mar 2010 · Posts: 13,068 · Location: Under The Stairs!
It goes back to the 'too much tessellation is being used and it's not fair' argument. NVidia have historically been good at tessellation, but because Crytek used it too much, the AMD owners kicked off, and the same again in many games after.

Nv users kicked off with Nv gimping their last-gen GPUs through the overuse of tessellation.

As AMD provide a driver-level override, CDPR had to add a slider in W3 so Nv users could override Nv's top-heavy factory tessellation settings.
 