
AMD announces FSR Redstone for RDNA4: Neural Radiance Caching, ML Ray Regeneration and Frame Generation

It was just an example, not based on any figures I've seen. I just meant that having it is one thing, but that doesn't mean it's comparable/competitive. Maybe it is, or maybe it's much better.

There is nothing that an equivalent AMD GPU does only 30% as well as Nvidia, let alone Intel.

The 5070 Ti is 15% better in Cyberpunk with RT; it's not a huge difference.
 
It was just an example, not based on any figures I've seen. I just meant that having it is one thing, but that doesn't mean it's comparable/competitive. Maybe it is, or maybe it's much better.

I don’t want this to sound like a dig, but this demonstrates the type of mindshare Nvidia have. People don’t know and don’t bother to find out the facts; they just assume, usually incorrectly, that AMD are “much worse”.
 
There is nothing that an equivalent AMD GPU does only 30% as well as Nvidia, let alone Intel.

The 5070 Ti is 15% better in Cyberpunk with RT; it's not a huge difference.
Nvidia is currently matching or surpassing AMD with a much smaller transistor count; it's just that they don't want to completely obliterate them. AMD is not really in a great position if Nvidia actually decides to go all out, which puts Nvidia in an extraordinarily comfortable position from all angles (not just mindshare, but actual product and engineering capability).
 
Is that 44 games that natively support FSR4, or games that support FSR3 that can be overridden via the drivers to use FSR4?
To give some comparison, do you know how many games support DLSS4 in the same way (either natively or via driver override)?
There's probably not a new version of XeSS to compare to, is there?
Hmm, yeah, I'm not sure; the AMD table just says 44 games and lists them. I think I saw a mention that they want over 75 by the end of the year. Given they have gone from 30 titles at launch (I think) to 44 in just three months, it is achievable if they keep the same pace: they would hit 72 by the end of the year. Nvidia had, I think, 75 games at launch, whilst AMD had 30 titles.
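A quick back-of-the-envelope check of that pace (the launch count, current count and timeline are the rough figures recalled above, not official numbers):

```python
# Linear extrapolation of the FSR4 title count, using the rough figures from the post above
launch_titles = 30      # titles at the March launch (as recalled above)
current_titles = 44     # titles roughly 3 months later
months_elapsed = 3

rate_per_month = (current_titles - launch_titles) / months_elapsed  # ~4.7 titles/month

months_remaining = 6    # roughly June to the end of the year
projected = current_titles + rate_per_month * months_remaining
print(f"Projected FSR4 titles by year end: {projected:.0f}")  # -> 72, just shy of the 75 target
```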
 
Nvidia is currently matching or surpassing AMD with a much smaller transistor count; it's just that they don't want to completely obliterate them. AMD is not really in a great position if Nvidia actually decides to go all out, which puts Nvidia in an extraordinarily comfortable position from all angles (not just mindshare, but actual product and engineering capability).

It's less than worthless to a consumer that the company with 90% market share is comfortably holding its share and increasing prices.
 
Nvidia is currently matching or surpassing AMD with a much smaller transistor count; it's just that they don't want to completely obliterate them. AMD is not really in a great position if Nvidia actually decides to go all out, which puts Nvidia in an extraordinarily comfortable position from all angles (not just mindshare, but actual product and engineering capability).

Nvidia are using vastly more expensive GDDR7, so Nvidia would be in a worse position if AMD had gone all out and used GDDR7 too. The RX 9070 XT hits memory bandwidth limitations in certain scenarios.

An RTX 5080 has 960GB/s, an RTX 5070 Ti has 896GB/s, and the RX 9070 XT only 640GB/s, which is less than an RTX 5070 at 672GB/s. The RX 7800 XT has 624GB/s of bandwidth. GDDR7 also consumes less power than GDDR6/GDDR6X: it is between 30% and 50% more power efficient than GDDR6 according to Samsung, Micron and SK Hynix.
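For anyone who wants to sanity-check those figures, peak bandwidth is just bus width times effective data rate. A minimal sketch, assuming the commonly published bus widths and effective memory speeds for each card:

```python
# Peak memory bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps)
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

cards = [
    ("RTX 5080",    256, 30.0),  # GDDR7
    ("RTX 5070 Ti", 256, 28.0),  # GDDR7
    ("RTX 5070",    192, 28.0),  # GDDR7
    ("RX 9070 XT",  256, 20.0),  # GDDR6
    ("RX 7800 XT",  256, 19.5),  # GDDR6
]
for name, bus, rate in cards:
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# -> 960, 896, 672, 640 and 624 GB/s respectively
```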

So AMD basically cheaped out by using GDDR6. It shows how AMD really didn't want to compete too hard with Nvidia. Not surprising, as AMD has limited TSMC wafers too and would rather make more profitable CPUs. They are just doing the bare minimum, and RDNA4 is probably a result of their PS5 Pro efforts, which Sony partially funded.

The fact that AMD, with far less R&D spend (subsidised by consoles), using ancient memory, etc., is managing to compete on a shoestring budget shows how Nvidia has stopped caring about PC gamers and how much of an afterthought Blackwell is as a gaming-orientated design. This even goes down to DLSS and the like, where AMD simply has fewer software engineers yet appears to have mostly caught up.

You should be livid at what Nvidia is paying your loyalty back with. This has nothing to do with AMD: Nvidia have stopped caring, and you should be holding them to a higher standard instead of blaming AMD.

Doesn't help when they're in cahoots.

Agreed, especially when it's clear AMD didn't even bother to use GDDR7 and went with el-cheapo GDDR6. If they had used GDDR7, we might have had more of an RTX 5080 challenger. AMD only looks "good" because Nvidia CBA with the gaming market currently and ends up with half-arsed releases like the RTX 5070 and RTX 5060.

AMD are just doing the bare minimum too; it shows you how much Nvidia has stopped caring, and their fans still defend them. This is probably the main reason the GPU market is the way it is. It reminds me of Apple fans explaining away every move Apple makes, which encourages competitors like Samsung to start trying the same moves.

If it were not for the Chinese smartphone makers, the phone market would also be rubbish.
 
There is nothing that an equivalent AMD GPU does only 30% as well as Nvidia, let alone Intel.

The 5070 Ti is 15% better in Cyberpunk with RT; it's not a huge difference.

I don’t want this to sound like a dig, but this demonstrates the type of mindshare Nvidia have. People don’t know and don’t bother to find out the facts; they just assume, usually incorrectly, that AMD are “much worse”.
There may have been some misunderstanding here. I didn't say that AMD are worse; what I'm saying is that having feature parity is only really relevant if the feature performs at a similar level (or better).

I think some people are so keen to have an argument that they read what they want to read rather than what was actually written.
 
There are 44 games that now support FSR4. That's pretty good, I think, for a GPU lineup that was only released in March. Not sure which of those use PT, mind you. Still early days.
Not really, but either way it's still too little (and very late). Worse than that, if you really wanted to enjoy FSR 4 in the titles where you'd need it most, you'd have to rely on OptiScaler, and even now the piggybacking FSR 4 does needs whitelisting for the driver toggle. To me that's simply unacceptable: why deal with that headache when I can just go Nvidia and have DLSS with no such issues? Obviously Nvidia does this to a certain extent for the transformer model, but that's a different matter: a slight increase in quality (transformer vs CNN) versus usable vs nigh-unusable (FSR 4 vs 3).

What's most indicative that they're not serious about this is how lackluster the support has been in Cyberpunk (though the same applies to other mega-popular titles), whether we're talking late adoption (we had to wait many months for FSR 2, which itself took many months to launch in the first place), late updating, or indeed still waiting: for FSR 3.1/4, and for proper denoising. And don't tell me "oh, but it's Nvidia sponsored", because there's a 0% chance AMD went up to CDPR with a bag of money, said "let's update the game for support", and CDPR turned them down. If they don't see the marketing and brand value of competing in the most benchmarked GPU title and a 30+ million seller, then what do they see value in? But then, AMD's marketing department has been disastrous since... pfffft, ever?
 
Nvidia is currently matching or surpassing AMD with a much smaller transistor count; it's just that they don't want to completely obliterate them. AMD is not really in a great position if Nvidia actually decides to go all out, which puts Nvidia in an extraordinarily comfortable position from all angles (not just mindshare, but actual product and engineering capability).

Navi 48 is a smaller die than GB203; not hugely so, but smaller.

GB203 is 378 mm²
Navi 48 is 357 mm²

AMD also use much cheaper GDDR6 memory ICs.

Nvidia could still "completely obliterate AMD" because they bring in 5X more money from AI; Nvidia could give their gaming GPUs away and still not lose money.

And? What's your point? Should AMD stop trying?
 
@Martini1991 I don't think AMD and Nvidia are in cahoots; it's not how Jensen is. Look at what's going on at the moment: Nvidia are trying to bully the largest tech YouTubers into conforming to their review diktats, and they are doing it because Jensen is worried he might lose a few percentage points of market share to AMD.

AMD's overpricing is entirely AMD.
 
Not really, but either way it's still too little (and very late). Worse than that, if you really wanted to enjoy FSR 4 in the titles where you'd need it most, you'd have to rely on OptiScaler, and even now the piggybacking FSR 4 does needs whitelisting for the driver toggle. To me that's simply unacceptable: why deal with that headache when I can just go Nvidia and have DLSS with no such issues? Obviously Nvidia does this to a certain extent for the transformer model, but that's a different matter: a slight increase in quality (transformer vs CNN) versus usable vs nigh-unusable (FSR 4 vs 3).

What's most indicative that they're not serious about this is how lackluster the support has been in Cyberpunk (though the same applies to other mega-popular titles), whether we're talking late adoption (we had to wait many months for FSR 2, which itself took many months to launch in the first place), late updating, or indeed still waiting: for FSR 3.1/4, and for proper denoising. And don't tell me "oh, but it's Nvidia sponsored", because there's a 0% chance AMD went up to CDPR with a bag of money, said "let's update the game for support", and CDPR turned them down. If they don't see the marketing and brand value of competing in the most benchmarked GPU title and a 30+ million seller, then what do they see value in? But then, AMD's marketing department has been disastrous since... pfffft, ever?

It shouldn't be the case that game developers only add vendor-specific features if they get a bag of money. You're blaming the wrong people for the wrong reasons. It's also not the GPU vendor who ultimately pays for that; it's you and me, through the price of the GPU.
If AMD start throwing money at game developers, then their GPUs are going to be more expensive.



AMD eventually throwing money CDPR's way for proper FSR support; better late than never, I suppose.


Off the top of my head...

Years ahead of Nvidia with Adrenalin.

Integrated GPU/CPU overclocking.

Integrated OSD.

ReBAR.

Open-source universal upscaling for everyone.

FG for everyone.

Driver-level FG support.
 

It's really easy to check whether what AMD say is true or not; unfortunately, it is...


 
Nvidia are using vastly more expensive GDDR7, so Nvidia would be in a worse position if AMD had gone all out and used GDDR7 too. The RX 9070 XT hits memory bandwidth limitations in certain scenarios.
Is VRAM bandwidth a bottleneck on any of the lower-mid-range cards?

You should be livid at what Nvidia is paying your loyalty back with. This has nothing to do with AMD: Nvidia have stopped caring, and you should be holding them to a higher standard instead of blaming AMD.
The original observation was that you can't really compare the competitors any more; statements like "AMD lands within 15% of Nvidia's offerings" don't correctly capture the situation, because Nvidia is giving them a big handicap. All you have to do is look at the massive void between the 5090 and 5080, and then lower down the stack. I don't know how you translated that into brand loyalty.

Navi 48 is a smaller die than GB203; not hugely so, but smaller.

GB203 is 378 mm²
Navi 48 is 357 mm²

AMD also use much cheaper GDDR6 memory ICs.

Nvidia could still "completely obliterate AMD" because they bring in 5X more money from AI; Nvidia could give their gaming GPUs away and still not lose money.

And? What's your point? Should AMD stop trying?

That makes the Nvidia architecture even more performant; the 5070 Ti is basically a rejected 5080 with disabled SMs, simpler circuitry with far fewer transistors.
Further, Nvidia can obliterate them without any future earnings from AI; they could obliterate them right now, in this cycle, if they decided to do so.
And the point was not about whether or not AMD should stop trying. It's just that the competitors can no longer be properly compared across tiers; such comparisons don't reflect Nvidia's competitive moat. Nvidia's xx70 is not the same as AMD's xx70.
 
Is VRAM bandwidth a bottleneck on any of the lower-mid-range cards?


The original observation was that you can't really compare the competitors any more; statements like "AMD lands within 15% of Nvidia's offerings" don't correctly capture the situation, because Nvidia is giving them a big handicap. All you have to do is look at the massive void between the 5090 and 5080, and then lower down the stack. I don't know how you translated that into brand loyalty.



That makes the Nvidia architecture even more performant; the 5070 Ti is basically a rejected 5080 with disabled SMs, simpler circuitry with far fewer transistors.
Further, Nvidia can obliterate them without any future earnings from AI; they could obliterate them right now, in this cycle, if they decided to do so.
And the point was not about whether or not AMD should stop trying. It's just that the competitors can no longer be properly compared across tiers; such comparisons don't reflect Nvidia's competitive moat. Nvidia's xx70 is not the same as AMD's xx70.

AMD are a CPU company, and ever since Ryzen succeeded they have put more resources into CPUs. They only care about gaming graphics because of the long-term graphics contracts funded by Sony and Microsoft. With no console contracts, AMD would have moved entirely over to APUs and built commercial cards only. This is why AMD only builds a fairly small number of dGPUs: enough to justify the console contracts.

If they moved over entirely to CPUs they would make more money, and they are massively capacity constrained - Intel still sells more CPUs than them. Thank Sony and Microsoft for AMD still doing the little they are doing now.

The RX 9070 XT is an RX 7800 XT replacement and the RX 9070 an RX 7700 XT replacement, in terms of memory bandwidth, overall die size, etc. - similar die size to an RX 6700 XT/RX 6750 XT. The fact is AMD never intended to sell these cards for so much, which is why they renamed them. It was the same with the RX 5700 XT, which was originally branded RX 690. Nvidia got greedy in both scenarios, and AMD quite happily "undercut" Nvidia whilst increasing their own prices.

The fact is, despite your appreciation for Blackwell, it's RDNA3-level stagnation. This indicates Nvidia didn't push many resources into this generation of gaming cards and did the bare minimum too. They have quite clearly dropped QA/QC standards to maximise what they can get from their allocated wafers, because Nvidia has bigger fish to fry than gamers. This is why the RTX 5000 series has had all these issues with missing ROPs, etc.

AMD is getting far more out of lower memory bandwidth than Nvidia is achieving so far. If AMD truly wanted to compete with the RTX 5080 they would have used GDDR7 on these cards like Nvidia does. But they chose economy GDDR6, meaning their "top card" has less memory bandwidth than an RTX 5070.

The fact is AMD is not even really trying to compete, despite all their PR spin. They have acknowledged UDNA will be a multi-purpose design, AFAIK - that tells us they are throwing fewer resources at dedicated gaming dGPUs and want to harmonise both lines.

But Nvidia has stopped caring too - most of their sales are in prebuilt systems. So unless AMD and Intel increase production, Nvidia wins even if it loses. Sadly, gamers are at the bottom of the pile now and will get only the scraps.
 
Nvidia is pricing high because the repeated excuse-making for RTX 3000 V2 and RTX 4000-level pricing and lack of VRAM worked, so they kept on doing it. ATI did their best, and almost got within striking distance of Nvidia a few times, but longer term they were still making less money, and once AMD bought them the writing was on the wall as soon as the remaining ATI projects were finished. AMD cut Radeon R&D to spend on developing Zen in the years before 2017.

AMD are a CPU company, and ever since Ryzen succeeded they have put more resources into CPUs. They only care about gaming graphics because of the long-term graphics contracts funded by Sony and Microsoft. With no console contracts, AMD would have moved entirely over to APUs and built commercial cards only. This is why AMD only builds a fairly small number of dGPUs: enough to justify the console contracts.

If they moved over entirely to CPUs they would make more money, and they are massively capacity constrained - Intel still sells more CPUs than them. Thank Sony and Microsoft for AMD still doing the little they are doing now.

The RX 9070 XT is an RX 7800 XT replacement and the RX 9070 an RX 7700 XT replacement, in terms of memory bandwidth, overall die size, etc. - similar die size to an RX 6700 XT/RX 6750 XT. The fact is AMD never intended to sell these cards for so much, which is why they renamed them. It was the same with the RX 5700 XT, which was originally branded RX 690. Nvidia got greedy in both scenarios, and AMD quite happily "undercut" Nvidia whilst increasing their own prices.

The fact is, despite your appreciation for Blackwell, it's RDNA3-level stagnation. This indicates Nvidia didn't push many resources into this generation of gaming cards and did the bare minimum too. They have quite clearly dropped QA/QC standards to maximise what they can get from their allocated wafers, because Nvidia has bigger fish to fry than gamers. This is why the RTX 5000 series has had all these issues with missing ROPs, etc.

AMD is getting far more out of lower memory bandwidth than Nvidia is achieving so far. If AMD truly wanted to compete with the RTX 5080 they would have used GDDR7 on these cards like Nvidia does. But they chose economy GDDR6, meaning their "top card" has less memory bandwidth than an RTX 5070.

The fact is AMD is not even really trying to compete, despite all their PR spin. They have acknowledged UDNA will be a multi-purpose design, AFAIK - that tells us they are throwing fewer resources at dedicated gaming dGPUs and want to harmonise both lines.

But Nvidia has stopped caring too - most of their sales are in prebuilt systems. So unless AMD and Intel increase production, Nvidia wins even if it loses. Sadly, gamers are at the bottom of the pile now and will get only the scraps.
That's not the point I am trying to make.
I am sure the 9070 will suddenly start to look like a 9060 when Nvidia announces the inevitable Supers.
I am just saying that Nvidia is too far ahead of the game right now, and comparisons that paint the competitors as being in the same ballpark are probably not a correct interpretation of the current market situation.
The only thing in your comment that resonates with my observation is that the market is currently supply-constrained.
Given this backdrop, I am not too optimistic about AMD's new AI-based feature set until it's proven in the market, and I think I have an informed bias towards Nvidia winning the performance and features battle by a huge margin, so I wouldn't consider them comparable in the slightest. It's not that I want Nvidia to win; that's just the current scenario. I hope it changes in the near future.
 
That makes the Nvidia architecture even more performant; the 5070 Ti is basically a rejected 5080 with disabled SMs, simpler circuitry with far fewer transistors.
Further, Nvidia can obliterate them without any future earnings from AI; they could obliterate them right now, in this cycle, if they decided to do so.
And the point was not about whether or not AMD should stop trying. It's just that the competitors can no longer be properly compared across tiers; such comparisons don't reflect Nvidia's competitive moat. Nvidia's xx70 is not the same as AMD's xx70.

GB203 is the 5080, which is 15% faster than the 9070 XT.
 
GB203 is the 5080.
Nvidia can call it anything they like, but the reality is that there's a big gap between GB203 and GB202, so the big point I am making here is that the tiers are no longer comparable between the competitors: there's a big competitive moat and Nvidia is winning by a huge margin.

Edit: I am referring to the 5070 Ti that you mentioned in your original comment, not the vanilla 5070.
 
Nvidia can call it anything they like, but the reality is that there's a big gap between GB203 and GB202, so the big point I am making here is that the tiers are no longer comparable between the competitors: there's a big competitive moat and Nvidia is winning by a huge margin.

Edit: I am referring to the 5070 Ti that you mentioned in your original comment, not the vanilla 5070.

Yes, and the performance is similar with a similar die size; AMD also achieve that with cheaper memory. At this point I don't know what your point is. Is Nvidia's market share bigger? Yes, but that's not what you're referring to. You're trying to say that Nvidia have significantly higher performance for a similar bill-of-materials cost, and no they don't, not even close.
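As a rough illustration of that point, here's a sketch of performance per unit of die area using only the figures quoted earlier in the thread (the ~15% gap and the two die sizes); a real comparison would also need price, power and workload mix:

```python
# Back-of-the-envelope perf-per-mm² using the numbers quoted in this thread
gb203_area, navi48_area = 378.0, 357.0   # die sizes in mm² (quoted above)
gb203_perf, navi48_perf = 1.15, 1.00     # RTX 5080 ~15% faster than RX 9070 XT (quoted above)

ratio = (gb203_perf / gb203_area) / (navi48_perf / navi48_area)
print(f"GB203 performance per mm² relative to Navi 48: {ratio:.2f}x")
# -> ~1.09x, i.e. roughly 9% more performance per mm² on these figures
```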
 