
Radeon VII a win or fail?

Discussion in 'Graphics Cards' started by Gregster, Feb 8, 2019.

  1. EastCoastHandle

    Gangster

    Joined: Jun 8, 2018

    Posts: 322

    I'm curious to know something. Were you aware that AMD helped create HBM? And that in later reports, after release, they addressed allegations that they were taking royalties to prevent Nvidia from using HBM?
    You can research it. But when you do, could you explain why AMD would create a product and then price it so that it's cost-prohibitive for its own creators to use in their products?
     
  2. drunkenmaster

    Caporegime

    Joined: Oct 18, 2002

    Posts: 32,854

    The cost of production doesn't disappear just because you helped create it; that's not how manufacturing works. It's not a case of "oh, you helped create it, so we'll run these wafers on the cheap, basically for free, and you can have them 80% cheaper."

    More complex things cost more money to produce.
     
  3. AthlonXP1800

    Mobster

    Joined: Sep 28, 2014

    Posts: 2,593

    Location: Scotland

    Martini1991 is right that you are seeing things in your own fantasy dream.

    The truth is that 3 to 5 years from now, neither the Radeon VII 16GB nor the RTX 2080 8GB will run 2022 or 2024 games at 1440p and 4K on maximum graphics settings as well as they run 2019 games.

    Look at what happened to the Titan X Maxwell, launched back in 2015. It was the fastest card for 4K gaming at maximum settings, with a massive 12GB of GDDR5, and people said that 4K at 60+ fps with 12GB of VRAM made the Titan X Maxwell future proof. Fast forward to now and the Titan X Maxwell really struggles to run 2018 and 2019 games at 4K 60 fps; it lags behind at less than 30 fps at 4K, slower than a GTX 1070 8GB, and the massive 12GB of VRAM didn't help. Look at what happened to the Radeon RX Vega Frontier Edition 16GB too, launched nearly 2 years ago in 2017: it doesn't run 2019 games well at 4K maximum settings either, despite its massive 16GB of HBM2.

    Now wake up from the fantasy dream and come back to reality, and you will see the Radeon VII 16GB excluded from graphics card reviews in 2022 because it lacks hardware ray tracing. It won't run well enough to compete at 4K on maximum settings with next generation hardware ray tracing GPUs like an Intel Xe 1080, GeForce RTX 4080 and Radeon RX 4080, nor in 2024 with even more powerful ray tracing GPUs that can push 50 giga rays on screen, like an Intel Xe 2080, GeForce RTX 5080 and Radeon RX 5080.
     
  4. 4K8KW10

    Soldato

    Joined: Sep 2, 2017

    Posts: 5,097

  5. Rroff

    Man of Honour

    Joined: Oct 13, 2006

    Posts: 60,739

  6. 4K8KW10

    Soldato

    Joined: Sep 2, 2017

    Posts: 5,097

    I think Nvidia uses some type of compression technique to increase framerates at the cost of image quality.
    Technically, it should be the opposite of what HEVC (H.265) or H.264 do as video codecs. Those provide good image quality in fast-moving scenery, but when you stop or pause, you get tremendous blur.
    With Nvidia in games it is the opposite: if you don't move, the graphics look slightly worse than Radeon's, but when you move fast, the image quality takes a significant hit.


    What is DLSS? A cheat to mimic higher resolution with less processing power. A "legal" way to decrease the image quality.
     
    Last edited: Feb 12, 2019
  7. Zeed

    Mobster

    Joined: Oct 15, 2011

    Posts: 4,325

    Location: Nottingham Carlton

    Reading stuff like this makes me feel more stupid.

    DLSS renders at 1440p and uses an AI algorithm to figure out (to "digitally imagine", if you like) how it would look at 2160p. So it increases the quality of the 1440p render to make it closer to 2160p.

    It's like you send an artist a photo and he paints an image from that photo, but the other way around.


    Is that a basic enough explanation of how an AI upscaler works?
    If anything, that's what it should be called: AI Upscaler, not Deep Learning Super Sampling.
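
    If you want it even more basic, here is the rough shape of it in code (just a toy sketch with made-up names; a plain resize stands in for the trained network, which is obviously not how Nvidia actually does it):

    ```python
    # Toy sketch of the data flow only: the GPU "renders" at 1440p and a model
    # infers the 2160p frame. A plain resize stands in for the trained network,
    # so this shows the shape of the process, not the quality gain.
    from PIL import Image

    RENDER_RES = (2560, 1440)   # what actually gets rendered each frame
    TARGET_RES = (3840, 2160)   # what ends up on screen

    def fake_dlss(frame_1440p: Image.Image) -> Image.Image:
        # The real thing feeds the low-res frame (plus motion vectors) through a
        # network trained against high-res reference images; we just resize.
        return frame_1440p.resize(TARGET_RES)   # bicubic resample by default

    low_res = Image.new("RGB", RENDER_RES)      # stand-in for a rendered frame
    high_res = fake_dlss(low_res)
    print(high_res.size)                        # (3840, 2160)
    ```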


    I also assume you missed the part where NV cards default to 8-bit output colour depth instead of 10-bit, and that's why the image quality at default settings can look worse.


    As for the Vega VII in 3-5 years... God help us, no one including AMD will remember the Vega cards; it's possibly the worst architecture ATI/AMD ever made. Give us a break: a 7nm card that's worse than equally priced 12nm cards. It could have 64GB of HBM2 and it would still be poor for JUST gaming.
     
    Last edited: Feb 12, 2019
  8. D.P.

    Caporegime

    Joined: Oct 18, 2002

    Posts: 29,515

    You may think that, but you are wrong.
    If the compression was lossy then the drivers would fail to be certified by Microsoft. The DX standard specifies tolerances and limits on precision.

    The fact is Nvidia don't use lossy compression, they use lossless compression just like a zip file or PNG. The output is absolutely identical.
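
    To show what "lossless" means here (a rough analogy only: zlib standing in for the GPU's colour compression, and the buffer is just made-up data):

    ```python
    # Compress, decompress, and the data comes back bit-for-bit identical.
    import zlib

    framebuffer = bytes(range(256)) * 64       # stand-in for pixel data
    compressed = zlib.compress(framebuffer)    # smaller in memory / on the bus
    restored = zlib.decompress(compressed)

    assert restored == framebuffer             # lossless: output is identical
    print(len(framebuffer), "->", len(compressed), "bytes")
    ```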

    Moreover, AMD uses the exact same compression techniques; it's just that AMD are about a generation or two behind Nvidia. Vega matches Maxwell for compression ability, and I'm sure Navi will match Pascal, perhaps even Turing.
     
  9. Gregster

    Caporegime

    Joined: Sep 24, 2008

    Posts: 37,548

    Location: Essex innit!

    Actually, you don't see things that others don't; you use the placebo effect to see what you want.

    I am feeling the same way in truth. Getting a bit peeved at the high prices and might just say **** it and grab a PS5.

    Seriously, you have to stop this. As someone who still owns a 290X, I have never seen any difference in IQ between AMD or NVidia once the monitor has been set to how I like it. There is no compression used by NVidia and again, your placebo is letting you see what you want (or even read what you want, as I can't imagine you own an NVidia card).
     
  10. 4K8KW10

    Soldato

    Joined: Sep 2, 2017

    Posts: 5,097

    There is no reason for a so-called placebo effect, especially when thousands, if not millions, of other people report the same quality differences.

    Also, you cannot make the Nvidia image quality equal by changing your monitor setup. That is stupid even as a basic idea.
     
  11. Gregster

    Caporegime

    Joined: Sep 24, 2008

    Posts: 37,548

    Location: Essex innit!

    Exaggerate much?

     
  12. G J

    Hitman

    Joined: Oct 3, 2008

    Posts: 732

    In terms of prepping AMD fans for higher pricing similar to Nvidia's, then sure, it's a win.
     
  13. KentMan

    Gangster

    Joined: Dec 14, 2016

    Posts: 329

    All the GPU manufacturers are doing is pushing more and more users towards consoles, which is a win for AMD lol; Nvidia are pushing some of their customers to line AMD's pockets.

    Next gen consoles will be good. I have the Xbox One X and it's a great bit of kit; it recently added KB and mouse support (although devs have to implement it), and MS is pushing cross play with other platforms. So if the next round of consoles are significantly better, I can see even more PC gamers switching to consoles.

    The fact that for the price of a 2080 Ti you can buy a 55" 4K TV, a console, games, subs to services etc. shows you how out of whack PC component prices are becoming.

    For the price of one current high end PC with an Intel CPU, Nvidia or AMD GPU, screen etc., I can buy both my kids a 4K TV and a console each, plus a year's sub to some services.
     
  14. ltron

    Wise Guy

    Joined: Aug 30, 2014

    Posts: 1,133

    'Finewine' is not something to boast about. If your card is underperforming on release due to your own incompetence then that is a problem that should have been addressed many years ago and never repeated, not something to be celebrated.
     
  15. ltron

    Wise Guy

    Joined: Aug 30, 2014

    Posts: 1,133

    DLSS is not a cheat, it is an option for the user should they desire to use it and the only way that raytracing can be used at a playable framerate. If it were forced on us without our knowledge as Nvidia infamously did in 3dmark 03 then you would be right and they should be dragged over the coals for it.

    I'm just disappointed and frustrated, as an ATI and AMD fan, that I've felt forced to buy Nvidia at the high end over the last 4-5 years due to incompetence and a lack of competition from AMD.
     
  16. mulpsmebeauty

    Mobster

    Joined: Apr 25, 2007

    Posts: 3,152

    But the card's performance at the point you buy it is what's relevant. If it doesn't perform adequately at the moment you agree to trade money for the product, then why buy it in the first place? The fact that AMD usually improve matters as time goes on is only a bonus, rather than having the company drop their older products like they were a ginger-haired 780ti.
     
  17. ltron

    Wise Guy

    Joined: Aug 30, 2014

    Posts: 1,133

    It should be performing to its full potential, or close to it, if AMD want to gain maximum market share. Imagine a card that theoretically destroys Nvidia's best but loses to their two top models on release because AMD's driver is worse than Nvidia's: think about all the lost sales and the people who have now bought into Nvidia's ecosystem like G-Sync. It's just not good business.

    Also, consider that the Nvidia 780 Ti may not have been neglected, its full potential may have been reached from the start which is why there wasn't much they could do to improve its performance. By the time AMD has extracted all that lost potential from their 290X it's too late as it has been superseded by better Nvidia cards and many more people have already bought 780 Ti's anyway.
     
  18. 4K8KW10

    Soldato

    Joined: Sep 2, 2017

    Posts: 5,097

    No.
    The software needs time to catch up with the hardware; it is always like this. When you spend 600 pounds, you want your product to last: 3 years, 5 years, maybe 10 years.
    Not to use it for 1 year and then jump on the next most expensive card. That is a waste of money.

    Actually, you need to be clear with yourself about what exactly you want from a card and what the Radeon VII 16 GB gives you.
    AMD gives you 8 GB of additional memory for free and you are still complaining it isn't enough. Super rude and greedy.
     
  19. ltron

    Wise Guy

    Joined: Aug 30, 2014

    Posts: 1,133

    It won't last 3+ years as a high end card because it's too slow. I'm not rude or greedy; expecting competition is neither of those things, and I won't apologise for it. It's AMD's job to produce a product so good that people will want to switch from Nvidia, and not for the first time, they haven't done this, I'm sorry.

    Putting more memory on a card than it actually needs is not a new tactic; manufacturers have done it for years as a selling point on budget cards that were too slow to take advantage of the extra RAM. The Radeon VII is too slow at 4K, and by the time games need more than 8 GB there will be better cards out.

    The one positive for the Radeon is professional workloads; now, with the pro drivers and the high FP64 performance, it is a good deal there.
     
  20. EastCoastHandle

    Gangster

    Joined: Jun 8, 2018

    Posts: 322

    But that's the point in and of itself. We don't know what agreement was made regarding HBM. Sure, one can assume there is a cost associated with creating and using HBM2 because "it's not free to produce". However, it's an equally false equivalence to believe AMD are paying the same as those who weren't involved in the creation of HBM, unless there is documentation to support it.

    So I ask: is there any documentation showing AMD's per-card cost for using HBM2 versus that of anyone else who wants to use HBM2 (and who had no involvement in its creation)?
    Or am I supposed to believe that AMD's cost (as a creator of HBM) is equal to or greater than Nvidia's?
     
    Last edited: Feb 12, 2019