Radeon VII a win or fail?

Caporegime · Joined: 24 Sep 2008 · Posts: 38,322 · Location: Essex innit!
Seen quite a few mixed opinions on this new card, with some forum users and reviewers liking it and others disliking it, so what does everyone here think?

Personally, I am on the fence. It is AMD's fastest gaming card ever and does have good performance, and 16GB of VRAM is a plus, but the noise is meh! It looks a decent 4K card in truth, but lower resolutions do kind of make me look at other cards. The price is fair in my opinion, although an 8GB variant at £100 cheaper would have been sexy. If I was after a new GPU, would I have jumped on it? Probably not in truth, but that doesn't mean it isn't a decent card. Overclocking needs looking at, and if Roman couldn't do it, something is wrong there. Anyways, a good launch?
 
Caporegime (OP)
Some interesting and mostly fair responses. Thinking on it, I do think the limited supply (5,000?) shows that it is a stopgap card prior to Navi launching. Clearly they have taken the MI50/60 cards and repurposed them as a gaming card, and basically it is a die-shrunk Vega. Not a bad move from AMD: a good way of selling off failed chips and getting themselves back into the high-end gaming segment. I do feel the launch is meh though, and the lack of overclocking (or the usual undervolting) doesn't help. I also don't feel we will be seeing "fine wine" status with these cards, as Vega is quite long in the tooth now, but saying that, it is AMD and they have eked out performance before. Hopefully the overclocking part gets sorted very quickly, so those who have already purchased one can enjoy having some fun with them.

16GB is nice for the 4K gamers out there, and the VRAM adds some decent longevity to the card; with the decent grunt it has, it should last a good few years in someone's PC. I do love the look of the card too, but man, what were they thinking with those noise levels?

To those with one: nice one, I know you will enjoy it, and sure, it is quite pricey, but YOLO and all that, so crack on, brothers.

I am looking forward to Navi; hopefully it isn't too far away and kicks some serious butt.

Shankly asked earlier whether it would be a win or fail if it was 10 fps faster (or something like that), and I say a win then, as I think the majority of us PC gamers want smoother gaming, and higher frame rates plus sync technology deliver that.
 
Caporegime (OP)
I have explained it multiple times. I see things which you don't :D

And come back in 3 or 5 years with your Radeon VII 16 GB. We will speak then about how great it is, and what a fail the RTX 2080 will be.

Actually, you don't see things that others don't; you use the placebo effect to see what you want.

That's because it's fantasy.

In 3 or 5 years I'd hope to have a much better GPU than either the 2080 or the VII.

Then again, given the way the market's going, I could just go entirely next-gen console. I've already got the 4K OLED ready for next gen.

I am feeling the same way in truth. Getting a bit peeved at the high prices and might just say **** it and grab a PS5.

I think nvidia uses some type of compression technique to increase the framerates and worsen the image quality.
Technically, it should be the opposite of what HEVC/H.265 or H.264 do as video codecs: those provide good image quality in fast-moving scenery, but when you stop or pause, you get tremendous blur.
With nvidia in games it is the opposite: if you don't move, the graphics look slightly worse than Radeon's, but when you move fast, the image quality takes a significant hit.


What is DLSS? A cheat to mimic higher resolution with less processing power. A "legal" way to decrease the image quality.

Seriously, you have to stop this. As someone who still owns a 290X, I have never seen any difference in IQ between AMD and NVidia once the monitor has been set to how I like it. There is no compression used by NVidia, and again, your placebo is letting you see what you want (or even read what you want, as I can't imagine you own an NVidia card).
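For what it's worth, eyeballing IQ is exactly where placebo creeps in, in both directions. If anyone genuinely wants to test the "compression" claim, capture the same frame on both cards at identical settings as lossless PNGs and diff them numerically instead of squinting. A minimal sketch, assuming you have scikit-image installed; the file names are just placeholders:

```python
# Diff two lossless captures of the same frame instead of judging by eye.
# Requires: pip install scikit-image
from skimage.io import imread
from skimage.metrics import structural_similarity

# Placeholder file names: the same frame, same settings, one capture per card
amd = imread("radeon_vii_frame.png")
nv = imread("rtx_2080_frame.png")

# SSIM of 1.0 = structurally identical images; any real compression
# artefacts would pull the score visibly below 1.0
score = structural_similarity(amd, nv, channel_axis=-1)
print(f"SSIM: {score:.4f}")
```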
 
Caporegime (OP)
At 1440p, it's even more embarrassing. It loses to the 1080 Ti by 5% and to the 2080 by 14%, and is just 6% faster than the 2070. It's closer to the 2070 than the 2080.

[Image: relative-performance_2560-1440.png, relative performance chart at 2560x1440]
Ouch! I forgot about the 1080 Ti, and this shows it is still a beast of a card, even at two years old.
 
Caporegime (OP)
lol, I can't believe you can be so naive. Type the numbers game by game into Excel and let Excel show you the real difference.
Whoever posted those graphs is an idiot.
I am crap at maths, hence why I ask. I am sure someone decent at it will take the time to do it, but not me. Even if you are correct, 8% faster for the 2080 makes it the sensible choice, even if a Freesync screen is being used... faster, quieter, cheaper, less power hungry... just using logic!
 
Caporegime (OP)
If you're going to go right up to 1440p, is it not fair to also show 4K? I don't know what the reviews say and, to be fair, I haven't really read any reviews. I bought it because I knew exactly what chip I was buying (Vega 20/MI50) and what it is capable of. As an owner, 4K and compute are really what I bought the card for.
I did show 4K; someone else posted 1440p, and I don't knock anyone for buying one at all. I just don't like someone using blatant bias to suit an argument, like 4k8k does.
 
Caporegime (OP)
My bias? OK, let's see techpowerup's bias. You have 13 cards and 10 of them are nvidia's.
You are missing the RX 580, RX 570, RX 590... https://www.techpowerup.com/reviews/AMD/Radeon_VII/32.html

Fan noise can easily be controlled via Radeon Settings.
Power consumption can easily be controlled by reducing the core voltage.

Come again with the bias plot.

The card we are discussing is the Radeon VII, and sure, bring in the 2080 and even the 2080 Ti if you like. I merely went to the link you posted and grabbed a 4K chart, which shows the 2080 as 10% faster overall across their vast suite of game tests. Sure, Radeon wins some and loses more, but you can grab a 2080 for less money than a Radeon VII. It is less power hungry, quieter (using your own chart), cheaper and has more features (even if some of them still need some love).

You say noise can easily be controlled via Radeon Settings, and power consumption too, which is fair enough I guess, but would that make me want to buy a VII over a 2080? No chance. @Vince has bought one for very valid reasons and fair play (OpenCL has always been great with AMD).
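To be fair to him, the undervolting point is at least grounded in how silicon works: dynamic power scales roughly with the square of the core voltage at a fixed clock, so even a modest drop pays off. A back-of-envelope sketch; the voltages below are illustrative placeholders, not measured Radeon VII figures:

```python
# Back-of-envelope: dynamic power scales roughly as P ~ C * V^2 * f,
# so at a fixed clock an undervolt cuts power by the voltage ratio squared.
def power_scale(v_stock: float, v_undervolt: float) -> float:
    """Fraction of stock dynamic power remaining after an undervolt."""
    return (v_undervolt / v_stock) ** 2

# Illustrative numbers only, not actual Radeon VII voltages
print(f"~{power_scale(1.10, 1.00):.0%} of stock dynamic power")  # ~83%
```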

Not sure why you are posting a link to TPU when, as soon as someone goes to that link, you basically say, "nope, they are biased because I don't like the results". Can you not see the irony in that?
 
Caporegime (OP)
Then Nvidia will continue releasing very expensive products because you'll buy them anyway. We need to make a stand!
Indeed, but what is the alternative? "Buy secondhand", I read a couple of posts later, but wouldn't the person selling be ploughing money back into NVidia or AMD anyway? Erm, buy the Radeon VII? It is more expensive than the 2080. Don't buy at all and spoil your own fun?

I get very limited game time, as I am sure many of us do. I want to enjoy my downtime when I get the chance, and whilst I don't make you wrong on pricing, I am not prepared to cut off my nose to spite my face.
 
Caporegime (OP)
The RTX 2080 is the better overall gaming card at the moment. The only reason I went with the Vega II instead of the RTX was Freesync. I know Nvidia have told us they now support adaptive sync, but they also said that of the 400 monitors they tested, only 5 worked properly, then refused to tell us which ones were tested and found not to work correctly, leaving Nvidia's support as nothing more than a stab in the dark. That made my decision to go with the Vega II all the easier.

If the 1080 Ti is still seen as a beast of a card, how can you class a card with roughly the same performance and more RAM at a similar price any differently? Yes, it's two years late, but it still offers a good level of performance, with only the high-end Nvidia cards offering more.
Nasha, I genuinely like the Radeon VII, and anyone who is into PC tech knows AMD have been hitting the CPU space hard and doing a fantastic job of it. My take is that the VII is a rushed release though, and I personally feel it isn't quite ready (or the drivers aren't), maybe because AMD have been putting their resources into the CPU side of things? I say the same about NVidia and RTX: whilst I really like the effect in BFV, I want to see it in more games, and DLSS so far is a 4K thing, which I have no use for but would love to use at 3440x1440.

Both the VII and RTX have their pluses and minuses, and whilst it is great to see AMD competing, I don't quite feel they have done enough thus far.
 
Caporegime (OP)
Again you fail to understand that Radeon provides better image quality, and AMD has the ability to unlock ray-tracing acceleration via the compute shaders of the VII 16 GB whenever they would like to do so.
DLSS? Forget it... I don't count it as a feature.
No, I provided a video with extensive testing in response to you claiming that AMD has better IQ, and it shows there really isn't a difference. As for AMD unlocking RT acceleration via the compute shaders, why haven't they done it? I don't care for bias (and even I have mine), but you are seriously deluded.

When do you think AMD will be doing ray tracing?
 
Caporegime (OP)
I told you that the difference is not 10% but you again fail to listen to me.

I suck at maths but decided to painstakingly work out the difference, as you are being rather obnoxious, and this is what I did.

BFV = 6.4% AMD
Civ VI = 24% NVidia
Darksiders 3 = 23.5% NVidia
Deus Ex = 7.1% AMD
Divinity Original Sin = 14.7% NVidia
Dragon Quest XI = 35.8% NVidia
F1 2018 = 9.7% NVidia
FarCry 5 = 8% AMD
Ghost Recon Wildlands = 2.3% NVidia
GTA V = 7.1% NVidia
Hellblade = 10% NVidia
Hitman 2 = 22.5% NVidia
Just Cause 4 = 11.61% NVidia
Monster Hunter World = 18% NVidia
Middle Earth 2 = 5% NVidia
Rainbow Six = 14.7% NVidia
Shadow of the Tomb Raider = 3% NVidia
Strange Brigade = 22% AMD
The Witcher 3 = 10.6% NVidia
Wolfenstein 2 = 12.1% NVidia
Overall = 9.5% favour of NVidia

So overall, it is 9.5% in favour of NVidia, and if I had gone to the fourth decimal place, it probably works out closer to 10%. You said it was 8%, or am I wrong on this?

If someone else wants to make me wrong and show me where I have gone wrong, I will listen.
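For anyone who wants to sanity-check the sums, here is a quick Python version of the same averaging, using the figures from my list above (AMD wins positive, NVidia wins negative). A plain arithmetic mean lands a touch over 9% in NVidia's favour, so whichever way you round it, the gap is closer to 10% than 8%:

```python
# Signed per-game deltas from the list above: AMD ahead = positive,
# NVidia ahead = negative
deltas = {
    "BFV": +6.4, "Civ VI": -24.0, "Darksiders 3": -23.5, "Deus Ex": +7.1,
    "Divinity Original Sin": -14.7, "Dragon Quest XI": -35.8,
    "F1 2018": -9.7, "FarCry 5": +8.0, "Ghost Recon Wildlands": -2.3,
    "GTA V": -7.1, "Hellblade": -10.0, "Hitman 2": -22.5,
    "Just Cause 4": -11.61, "Monster Hunter World": -18.0,
    "Middle Earth 2": -5.0, "Rainbow Six": -14.7,
    "Shadow of the Tomb Raider": -3.0, "Strange Brigade": +22.0,
    "The Witcher 3": -10.6, "Wolfenstein 2": -12.1,
}

mean = sum(deltas.values()) / len(deltas)
print(f"Arithmetic mean: {mean:+.2f}% (negative = NVidia ahead)")  # ~-9.06%
```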
 
Caporegime (OP)
Gregster: AMD said they were looking at it. That doesn't mean soon, however.

4K: DLSS is a worthy feature whether you like it or not. A lot of people put value in it.
Totally agreed, and the sooner the better, as it really does add to the game. Sure, we are in a primitive state at present, but I can see a healthy future for this.
 
Caporegime (OP)
Benchmarks can also be very misleading! Take this run from Wolfenstein 2, where the VEGA 7 is winning.

Either way, I am done with all this "2080 is better, VEGA 7 is better"! I will just focus on my own upgrade path, and if the next GPU or the VEGA 7 is better than what I am running and worth the upgrade, I'll buy.

VEGA 56/64 were slated on release as one of AMD's worst GPU launches; now look at them, lol. They are priced very competitively and the 64 beats the GTX 1080 in most benchmark videos I watch.
What settings were used, and why no SBS? You stated previously that you like to see that or it is useless, so I am surprised you are showing this.
 
Caporegime (OP)
I said VEGA 64, not 7 :p
And that feature is an Nvidia setting; it's there for Nvidia to gain extra performance. He is right to disable it, just like you wouldn't bench a game with PhysX enabled vs AMD, or a GameWorks feature against AMD.
Huh?

So he disabled a setting that gives NVidia better performance? Seems flawed to me. If the option is there, turn it on, unless you wish to make something look better than it actually is. So: no side-by-side comparison, and a setting disabled to give an advantage to AMD. A terrible video to use as a basis for benchmarking.
 
Caporegime (OP)
And for the record, @Gregster, you do realise that adaptive shading gains performance by reducing image quality in parts of the view? So for the best possible image quality, this setting should remain off.
DF did an excellent video on this.


Hey, show me what you like, but like I said, it is a poor video to use, and I have based my argument on what 4k8k linked to. Don't shoot the messenger; maybe pull him up on it instead?
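For anyone curious what adaptive shading actually does under the hood, the rough idea is simple enough to sketch: tiles of the frame with little detail get shaded at a coarser rate, which saves work at some cost to image quality in those regions. A toy illustration only, nothing to do with NVIDIA's actual implementation:

```python
import numpy as np

def pick_shading_rates(frame: np.ndarray, tile: int = 16, thresh: float = 20.0):
    """Toy content-adaptive shading: detailed tiles keep full 1x1 shading,
    flat tiles drop to coarse 2x2 shading (a quarter of the shading work)."""
    h, w = frame.shape
    rates = {}
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            detail = frame[y:y + tile, x:x + tile].std()  # local contrast
            rates[(y, x)] = "1x1" if detail > thresh else "2x2"
    return rates

# Synthetic frame: flat left half (coarse-shaded), noisy right half (full rate)
frame = np.zeros((64, 64))
frame[:, 32:] = np.random.rand(64, 32) * 255
print(pick_shading_rates(frame))
```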
 