Radeon VII a win or fail?

Soldato
Joined
8 Jun 2018
Posts
2,827
I think you don't realise the cost of HBM2 memory. They aren't pricing it to gouge, they have to factor in HBM2 cost as well as the usual associated costs. 16gig of the stuff is WAY expensive over GDDR5 or GDDR6
I'm curious to know something. Were you aware that AMD helped create HBM? And that in later reports, after release, they clarified the allegations that they were taking royalties to prevent Nvidia from using HBM?
You can research it. But when you do, could you explain why AMD would create a product and then price it so that it's cost-prohibitive for its own creators to use in their own products?
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
I'm curious to know something. Were you aware that AMD helped create HBM? And that in later reports, after release, they clarified the allegations that they were taking royalties to prevent Nvidia from using HBM?
You can research it. But when you do, could you explain why AMD would create a product and then price it so that it's cost-prohibitive for its own creators to use in their own products?

Cost of production doesn't disappear because you helped create it; that's not how production works. It's not a case of "oh, you helped create it, so we'll do these wafers on the cheap, practically for free, and you can get them 80% cheaper".

More complex things cost more money to produce.
 
Soldato
Joined
28 Sep 2014
Posts
3,430
Location
Scotland
I have explained it multiple times. I see things which you don't :D

And come back in 3 or 5 years with your Radeon VII 16 GB. We will speak then about how great it is, and how much of a fail the RTX 2080 will be.

Martini1991 is right that you are seeing things in your own fantasy dream.

The truth is that 3 or 5 years from now, neither the Radeon VII 16GB nor the RTX 2080 8GB will run 2022 or 2024 games at 1440p and 4K on maximum graphics settings as well as they run 2019 games.

Look at what happened to the Titan X Maxwell, launched back in 2015. It was the fastest card for 4K gaming at maximum settings, with a massive 12GB of GDDR5 memory, and people said that 4K at 60+ fps with 12GB of VRAM made the Titan X Maxwell future proof. Fast forward to now and the Titan X Maxwell really struggles to run 2018 and 2019 games at 4K 60 fps; it lags behind at less than 30 fps at 4K, slower than a GTX 1070 8GB, and the massive 12GB of VRAM did not help. Look at what happened to the Radeon RX Vega Frontier Edition 16GB, launched nearly two years ago in 2017, too: it does not run 2019 games well at 4K maximum settings either, despite its massive 16GB of HBM2 memory.

Now wake up from your fantasy dream and back to reality, and you will see the Radeon VII 16GB excluded from graphics card reviews in 2022 because it lacks hardware ray tracing and cannot compete at 4K on maximum settings with next-generation hardware ray tracing GPUs like an Intel Xe 1080, GeForce RTX 4080 and Radeon RX 4080, or in 2024 with even more powerful ray tracing GPUs that can push 50 gigarays on screen, like an Intel Xe 2080, GeForce RTX 5080 and Radeon RX 5080.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
To borrow from the comments there... it is funny how you ignore things that don't fit your preconceived narrative - even repeating the same fallacious information after it has been proved wrong.

I think nvidia uses some type of compression technique to increase the framerates and worsen the image quality.
Technically, it should be the opposite of what HEVC H.265 or H.264 do as video codecs. These provide good image quality in fast moving scenery but when you stop or pause, you get tremendous blur.
With nvidia in games it is the opposite. If you don't move, the graphics look slightly worse than Radeon's, but when you move fast, the image quality takes a significant hit.


What is DLSS? A cheat to mimic higher resolution with less processing power. A "legal" way to decrease the image quality.
 
Permabanned
Joined
15 Oct 2011
Posts
6,311
Location
Nottingham Carlton
I think nvidia uses some type of compression technique to increase the framerates and worsen the image quality.
Technically, it should be the opposite of what HEVC H.265 or H.264 do as video codecs. These provide good image quality in fast moving scenery but when you stop or pause, you get tremendous blur.
With nvidia in games it is the opposite. If you don't move, the graphics look slightly worse than Radeon's, but when you move fast, the image quality takes a significant hit.


What is DLSS? A cheat to mimic higher resolution with less processing power. A "legal" way to decrease the image quality.
Reading stuff like this makes me feel more stupid.

DLSS renders at 1440p and uses an AI algorithm (digital imagination, if you like) to figure out how the frame would look at 2160p. So it increases the 1440p rendering quality to make it closer to 2160p.

Like sending an artist a photo and having him paint an image off that photo, but the other way around.


Is that a basic enough explanation of how an AI upscaler works?
If anything, that's what it should be called: AI Upscaler, not Deep Learning Super Sampling.
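
To make that concrete, here is a rough Python sketch of the render-low, reconstruct-high idea. It is purely illustrative and not Nvidia's actual method: real DLSS runs a trained neural network on the tensor cores, so plain bicubic upscaling is just standing in for the AI step here.

    # Illustrative sketch only: bicubic upscaling stands in for DLSS's neural network.
    import numpy as np
    from PIL import Image

    RENDER = (2560, 1440)   # resolution the GPU actually shades
    NATIVE = (3840, 2160)   # resolution shown on screen

    # Stand-in for a rendered 1440p frame: random RGB noise.
    frame_1440p = np.random.randint(0, 256, (RENDER[1], RENDER[0], 3), dtype=np.uint8)

    # "Reconstruct" a 2160p image from the 1440p render. DLSS would infer extra
    # detail with a neural network; bicubic interpolation just smooths it out.
    upscaled = Image.fromarray(frame_1440p).resize(NATIVE, Image.BICUBIC)

    print(upscaled.size)                                    # (3840, 2160)
    print(RENDER[0] * RENDER[1] / (NATIVE[0] * NATIVE[1]))  # ~0.44, so only ~44% of the output pixels were shaded

Either way, the performance gain comes from shading far fewer pixels than native resolution; how much detail the reconstruction recovers is the part everyone argues about.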


I also assume you missed the part where NV cards default to 8-bit output instead of 10-bit, and that's why the image quality at default settings can look worse.


As for the Vega VII in 3-5 years... God help us, no one including AMD will remember the Vega cards, possibly the worst architecture ATI/AMD ever made. Give us a break: a 7nm card that is worse than equally priced 12nm cards. It could have 64GB of HBM2 and it would still be crap for JUST gaming.
 
Caporegime
Joined
18 Oct 2002
Posts
32,615
I think nvidia uses some type of compression technique to increase the framerates and worsen the image quality.
Technically, it should be the opposite of what HEVC H.265 or H.264 do as video codecs. These provide good image quality in fast moving scenery but when you stop or pause, you get tremendous blur.
With nvidia in games it is the opposite. If you don't move, the graphics look slightly worse than Radeon's, but when you move fast, the image quality takes a significant hit.


What is DLSS? A cheat to mimic higher resolution with less processing power. A "legal" way to decrease the image quality.

You may think that, but you are wrong.
If the compression were lossy then the drivers would fail to be certified by Microsoft. The DX standard specifies tolerances and limits on precision.

The fact is Nvidia don't use lossy compression; they use lossless compression, just like a zip file or a PNG. The output is absolutely identical.

Moreover, AMD uses the exact same compression techniques; they're merely a generation or two behind Nvidia. Vega matches Maxwell for compression ability, and I'm sure Navi will match Pascal, perhaps even Turing.
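
For anyone unsure what lossless means in practice, here is a tiny Python sketch with zlib standing in for lossless compression in general. GPU delta colour compression is a hardware feature and works nothing like zlib internally, but the property being claimed is the same: the decompressed data is bit-for-bit identical to the original, so there is nothing for image quality to lose.

    # Illustrative sketch only: zlib stands in for "lossless compression" in general,
    # not for how GPU framebuffer compression is actually implemented.
    import zlib
    import numpy as np

    # A stand-in 256x256 RGBA framebuffer with a simple red gradient.
    framebuffer = np.zeros((256, 256, 4), dtype=np.uint8)
    framebuffer[..., 0] = np.arange(256, dtype=np.uint8)  # red ramps 0..255 across each row
    framebuffer[..., 3] = 255                             # fully opaque alpha
    original = framebuffer.tobytes()

    compressed = zlib.compress(original)    # fewer bytes to shift around...
    restored = zlib.decompress(compressed)  # ...and an exact reconstruction

    print(len(original), "->", len(compressed), "bytes")
    print("bit-for-bit identical:", restored == original)  # always True

"Compression" on its own says nothing about quality loss; only lossy compression does, and that is what the DX certification tolerances are there to police.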
 
Caporegime
OP
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
I have explained it multiple times. I see things which you don't :D

And come back in 3 or 5 years with your Radeon VII 16 GB. We will speak then about how great it is, and how much of a fail the RTX 2080 will be.

Actually, you don't see things that others don't; you use the placebo effect to see what you want.

That's because it's fantasy.

In 3 or 5 years I'd hope to have a much better GPU than either the 2080 or the VII.

Then again, given the way the market's going, I could just go entirely next-gen console. I've already got the 4K OLED ready for next gen.

I am feeling the same way in truth. Getting a bit peeved at the high prices and might just say **** it and grab a PS5.

I think nvidia uses some type of compression technique to increase the framerates and worsen the image quality.
Technically, it should be the opposite of what HEVC H.265 or H.264 do as video codecs. These provide good image quality in fast moving scenery but when you stop or pause, you get tremendous blur.
With nvidia in games it is the opposite. If you don't move, the graphics look slightly worse than Radeon's, but when you move fast, the image quality takes a significant hit.


What is DLSS? A cheat to mimic higher resolution with less processing power. A "legal" way to decrease the image quality.

Seriously, you have to stop this. As someone who still owns a 290X, I have never seen any difference in IQ between AMD and NVidia once the monitor has been set to how I like it. There is no lossy compression from NVidia and again, your placebo is letting you see what you want (or even read what you want, as I can't imagine you own an NVidia card).
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
placebo effect

There is no reason for a so-called placebo effect, especially when thousands, if not millions, of other people report the same quality differences.

Also, you cannot make the Nvidia image quality equal by changing your monitor setup. This is stupid even as a simple idea.
 
Associate
Joined
14 Dec 2016
Posts
958
All the GPU manufacturers are doing is pushing more and more users towards consoles, which is a win for AMD lol; Nvidia are pushing some of their customers to line AMD's pockets.

Next gen consoles will be good. I have the Xbox One X and it's a great bit of kit; it recently added keyboard and mouse support (albeit devs have to implement it), and MS is pushing cross-play with other platforms. So if the next round of consoles is significantly better, I can see even more PC gamers switching to consoles.

The fact that for the price of a 2080 Ti you can buy a 55" 4K TV, a console, games, subs to services etc. shows you how out of whack PC component prices are becoming.

For the price of one current high-end PC with an Intel CPU, an Nvidia or AMD GPU, a screen etc., I can buy both my kids a 4K TV and a console each, plus a year's sub to some services.
 
Soldato
Joined
30 Aug 2014
Posts
5,960
I've just had a look at VII reviews which include the GTX1080/V64 (You'd hope with current results)
I'd say my V64 still trades blows with the GTX1080 exactly the same as it did at launch.

Is a Fury X now a 980Ti killer? The Fury X seemed to die an absolute death and is basically never heard about, yet it's later GCN than the former cards I've mentioned.

Sure, we can say that AMD might gain more percentage over time. But I'd certainly say finewine's a rather overhyped concept.

I think the "finewine" lark is really only a thing for the R9 290/R9 290X/390X/390 and when the 7970 surged ahead of the GTX680, whereas at launch they traded blows. Albeit some of this was done with a soft relaunch of the card.

It could also be, of course, that Nvidia just drop their support sooner and don't bother optimizing the drivers for newer games on their older GPUs. Which is obviously terrible. But I'd say that's more Nvidia cards becoming "rotten milk" rather than AMD being "fine wine". Although AMD can be commended for their support.

Of course there are always games that are the exception. But that's called cherry picking.
'Finewine' is not something to boast about. If your card is underperforming on release due to your own incompetence then that is a problem that should have been addressed many years ago and never repeated, not something to be celebrated.
 
Soldato
Joined
30 Aug 2014
Posts
5,960
I think nvidia uses some type of compression technique to increase the framerates and worsen the image quality.
Technically, it should be the opposite of what HEVC H.265 or H.264 do as video codecs. These provide good image quality in fast moving scenery but when you stop or pause, you get tremendous blur.
With nvidia in games it is the opposite. If you don't move, the graphics look slightly worse than Radeon's, but when you move fast, the image quality takes a significant hit.


What is DLSS? A cheat to mimic higher resolution with less processing power. A "legal" way to decrease the image quality.
DLSS is not a cheat; it is an option for the user, should they desire to use it, and the only way that ray tracing can be used at a playable framerate. If it were forced on us without our knowledge, as Nvidia infamously did in 3DMark03, then you would be right and they should be dragged over the coals for it.

I'm just disappointed and frustrated as an ATI and AMD fan that I feel forced to buy Nvidia at the high end over the last 4-5 years due to incompetence and a lack of competition from AMD.
 
Soldato
Joined
25 Apr 2007
Posts
5,255
'Finewine' is not something to boast about. If your card is underperforming on release due to your own incompetence then that is a problem that should have been addressed many years ago and never repeated, not something to be celebrated.

But the card's performance at the point you buy it is relevant. If it didn't perform adequately at the moment you agree to trade money for that product, then why buy it in the first place? The fact that AMD usually improve matters as time goes on is only a bonus, rather than having the company drop their older products like they were a ginger-haired 780ti.
 
Soldato
Joined
30 Aug 2014
Posts
5,960
But the card's performance at the point you buy it is relevant. If it didn't perform adequately at the moment you agree to trade money for that product, then why buy it in the first place? The fact that AMD usually improve matters as time goes on is only a bonus, rather than having the company drop their older products like they were a ginger-haired 780ti.
It should be performing to its full potential, or close to it, if AMD want to gain maximum market share. Imagine a card that theoretically destroys Nvidia's best but loses to their two top models on release because AMD's driver is worse than Nvidia's; think about all the lost sales and the people who have now bought into Nvidia's ecosystem, like G-Sync. It's just not good business.

Also, consider that the Nvidia 780 Ti may not have been neglected; its full potential may have been reached from the start, which is why there wasn't much they could do to improve its performance. By the time AMD has extracted all that lost potential from their 290X, it's too late: it has been superseded by better Nvidia cards and many more people have already bought 780 Tis anyway.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
It should be performing to its full potential or close to it if AMD want to gain maximum market share.

No.
The software needs time to catch up with the hardware - it is always like this. When you spend 600 pounds, you want your product to last 3 years, 5 years, maybe 10 years.
Not to consume it for 1 year and then jump to the next most expensive card. That is a waste of money.

Actually, you must clarify for yourself what exactly you want from a card and what the Radeon VII 16 GB gives you.
AMD gives you 8 GB of additional memory for free and you are still complaining it isn't enough. Super rude and greedy.
 
Soldato
Joined
30 Aug 2014
Posts
5,960
No.
The software needs time to catch up with the hardware - it is always like this. When you spend 600 pounds, you want your product to last 3 years, 5 years, maybe 10 years.
Not to consume it for 1 year and then jump to the next most expensive card. That is a waste of money.

Actually, you must clarify for yourself what exactly you want from a card and what the Radeon VII 16 GB gives you.
AMD gives you 8 GB of additional memory for free and you are still complaining it isn't enough. Super rude and greedy.
It won't last 3+ years as a high-end card because it's too slow. I'm not rude or greedy; to expect competition is neither of those things, and I won't apologise for it. It's AMD's job to produce a product so good that people will want to switch from Nvidia, and not for the first time they haven't done this, I'm sorry.

Putting more memory on a card than it actually needs is not a new tactic; manufacturers have done it for years as a selling point on budget cards that were too slow to take advantage of the extra RAM. The Radeon VII is too slow at 4K, and by the time games need more than 8 GB there will be better cards out.

The one positive for the Radeon is in professional workloads; now, with the pro drivers and high FP64 performance, it is a good deal.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Cost of production doesn't disappear because you helped create it; that's not how production works. It's not a case of "oh, you helped create it, so we'll do these wafers on the cheap, practically for free, and you can get them 80% cheaper".

More complex things cost more money to produce.
But that's the point in and of itself. One doesn't know what agreement was made regarding HBM. Sure, one can assume there is a cost associated with producing HBM2 and using it, because "it's not free to produce". However, it's an equally false equivalence to believe they are paying the same as those who weren't involved in the creation of HBM, unless there is documentation to support it.

So I ask: is there any documentation showing what HBM2 costs AMD per video card versus anyone else who wants to use HBM2 (and who didn't have any involvement in its creation)?
Or am I supposed to believe that the cost to AMD (who created HBM) is equal to or greater than Nvidia's cost?
 