
*** The AMD RDNA 4 Rumour Mill ***


If we use the most recent set of graphs from the 5070 Ti review, it will look something like this:

[Chart: average FPS, 3840x2160, 9070 XT]
[Chart: relative RT performance, 3840x2160, 9070 XT]
 
If anyone is concerned about the massive drop in relative RT performance between the Sapphire 7900 GRE review (Feb 2024) and the MSI 5070 Ti Ventus review (Feb 2025), TechPowerUp changed the RT suite of games as follows:

Removed:
A Plague Tale Requiem
F1 23
Far Cry 6
Spider-Man Remastered

Added:
F1 24
Resident Evil 4
Silent Hill 2

In both reviews:
Alan Wake 2
Cyberpunk 2077
Doom Eternal
Elden Ring
Hogwarts Legacy
Ratchet & Clank

For the games common to both reviews, you can look at the individual results and see that the actual FPS for the 7900 GRE is about the same, although I've only compared a few. I don't think the AMD cards are being intentionally nerfed - just my opinion.
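To illustrate how a suite change alone can move a "relative performance" figure, here's a rough Python sketch. All FPS values are made up for illustration - they are not TechPowerUp's data - and the geometric-mean aggregation is an assumption about how such charts are typically computed. Swapping one light-RT title for a heavy-RT one shifts the relative score even though each card's per-game FPS is unchanged.

```python
# Sketch: how a "relative performance" number is derived, and why swapping
# games in the suite moves it even when per-game FPS is unchanged.
# All FPS figures below are invented for illustration only.

from statistics import geometric_mean

# Hypothetical RT FPS: card "a" stands in for the 7900 GRE, card "b" for
# the reference card the chart is normalised to.
fps = {
    "Cyberpunk 2077": {"a": 40, "b": 60},
    "Alan Wake 2":    {"a": 30, "b": 50},
    "Far Cry 6":      {"a": 90, "b": 95},  # light RT, favours card "a"
    "Silent Hill 2":  {"a": 35, "b": 58},  # heavy RT, favours card "b"
}

def relative(suite):
    # Per-game ratio of card a to card b, aggregated with a geometric
    # mean (a common way review sites combine relative performance).
    return geometric_mean(fps[g]["a"] / fps[g]["b"] for g in suite)

old_suite = ["Cyberpunk 2077", "Alan Wake 2", "Far Cry 6"]
new_suite = ["Cyberpunk 2077", "Alan Wake 2", "Silent Hill 2"]

print(f"old suite: {relative(old_suite):.1%}")  # higher: includes light-RT title
print(f"new suite: {relative(new_suite):.1%}")  # lower: heavy-RT title swapped in
```

Same per-game FPS for card "a" in both runs; only the basket changed, and the relative score drops.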
 
That absolutely makes sense. Adding modern games and removing outdated games from the basket shows more realistic results.
 
Since it's looking more and more like I'll be going the 9070 XT route (rather than a 5080 or 5070 Ti), I'm left with the question: do I buy at release, or wait 3+ months for AMD to drop the price significantly?

At least this question is easy to answer with Nvidia: they're not going to drop the price in the next two years, let alone the next few months! Plus, right now, you can't get an Nvidia GPU for anything like MSRP.

Depends on want vs need.

I want, not need. I wouldn't mind playing some games I've had on the backburner like CP77, but I'm also not desperate to play them next week, so I'll be waiting. If I can get one for around 600-700 I'll consider it; if they're more than that I'll wait.
 
But then why are they comparing it to the RTX 5070 Ti at $750 and saying the RX 9070 XT has to be $500?
No one is saying it HAS to be $500. What people are saying is that in order to pry people over from Nvidia then they need to be aggressive on pricing, which is what $500 would be.

Do I want it to be $500? OFC! But why is everyone just ignoring the RTX 5070?
Because we don't have the numbers on 5070 so no use in speculating on both cards. Plus 5070 Ti is the match-up people are really looking at as they're more similar in vram/performance.

Overall it would demolish it, and if any games are optimised for RDNA 4 RT (think PS5 Pro games), then it would look even worse.
Unfortunately it doesn't work that way. Games are further optimised when ported to PC, but for Nvidia, and they depend a lot more on driver optimisations and support (where AMD has been doing the bare minimum - I say that as a pained Radeon owner). Across the board, for games that were ported but used to be exclusive to PS5, you can see that Nvidia still easily wins when RT is enabled. Just look at the latest, Spider-Man 2, which was made with RT-reflections-only in mind on PS5 (and thus for RDNA): a 7800 XT still gets trounced by a 4070 (not even a Super!) at 1080p.

You can see an example of what I mean here: if it disadvantages Nvidia then it doesn't get done, even if it's faster for AMD and was thus used on console:
[Image: benchmark comparison]


The most representative game that's demanding continues to be Cyberpunk, because for both raster & RT (NOT PT) all vendors (including Intel) have done extensive driver work and have collaborated with CDPR to varying degrees, and that's reflected in performance and features, and also between generations of each vendor (RDNA 2 vs RDNA 3, Turing vs Ampere vs Lovelace etc.).
So if we take that, and assume the leaked numbers are true (likelier here, because the game has a built-in benchmark everyone uses, so no need to wonder about scene differences), then the 5070 Ti would be 20% faster than a 9070 XT, which is about the difference between the 4070 Ti Super and 4070 Super in this game too. So perhaps you're right that we should compare it to the 5070, because the 9070 XT is clearly a tier below the Ti - but then guess what the price will have to be? ;)
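One thing worth keeping straight when throwing percentages around: "X is 20% faster than Y" and "Y is 20% slower than X" are not the same claim. A quick Python check, with placeholder FPS values standing in for the leaked Cyberpunk 2077 numbers (illustrative, not real benchmark data):

```python
# Sanity check on the percentage framing. FPS values are placeholders,
# not the actual leaked results.

ti_fps = 120.0   # hypothetical 5070 Ti result
xt_fps = 100.0   # hypothetical 9070 XT result

# "5070 Ti is 20% faster than the 9070 XT"
faster = ti_fps / xt_fps - 1
print(f"Ti over XT: {faster:+.0%}")   # +20%

# The same gap read the other way is only ~17% slower, not 20%.
slower = xt_fps / ti_fps - 1
print(f"XT vs Ti:   {slower:+.0%}")   # -17%
```

That asymmetry is why "20% faster" and "16% slower" figures in this thread can both describe roughly the same gap.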
 
The AMD leak shows the 9070 XT trading blows with a 5070 Ti in raster and marginally slower in RT. Or, if you want to cherry-pick the most extreme low RT score, it's about 16% slower in heavy RT (CP2077). Incidentally, that's about on par with a 4070 Ti Super or 3090 Ti in heavy RT, in a game optimised for Nvidia, and means the 9070 XT has closed the gap significantly in heavy RT. I've removed the FC6 anomaly from the RT averages as it heavily favours AMD here. These are also AMD's own numbers, so they could be a best-case scenario, but (and it's a big but) these are not marketing slides, which tend to be BS.
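To show why excluding the FC6 outlier matters, here's a small Python sketch. The per-game deficits below are invented stand-ins (not AMD's leaked figures), but the mechanism is the same: one title that heavily favours one vendor drags the whole average.

```python
# Sketch: how one outlier title moves an RT average.
# Deficits are hypothetical "% vs 5070 Ti" values, illustrative only.

from statistics import mean

rt_deficit = {               # negative = 9070 XT slower
    "Cyberpunk 2077": -16,   # heaviest RT, the worst case cited above
    "Alan Wake 2":    -10,
    "Silent Hill 2":   -8,
    "Far Cry 6":       +6,   # very light RT, famously favours AMD
}

with_fc6    = mean(rt_deficit.values())
without_fc6 = mean(v for g, v in rt_deficit.items() if g != "Far Cry 6")

print(f"average incl. FC6: {with_fc6:.1f}%")   # the anomaly drags the average up
print(f"average excl. FC6: {without_fc6:.1f}%")
```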

Let’s see what the 9070/XT pricing and availability look like before we get an idea of its direct performance competition.

Ideally the 9070 XT is priced to compete with (or slightly above) the 5070. It will destroy it in raster, be better in RT overall, and even heavy RT games won’t favour the 5070.

At this point we have an idea of performance, and it looks very good to be fair. Now pricing and availability are key.
 
I joked at the time but hindsight tells me AMD held off their CES announcement because they got wind the 5000 series sucked from top to bottom. I’m sorry but the 5090 is a terrible GPU and everything below it is utterly terrible for price/perf.

I think ironically Nvidia have allowed AMD to price less aggressively than they intended to.
 
I suspect AMD had the same *issue with TSMC as Nvidia had and instead of launching with limited stock and throwing in broken dies to make up the numbers they decided to delay.

*I've not investigated whether TSMC actually had any publicly confirmed production issues in 2024, so I could be wrong.
 

Which production issue did Nvidia have?
 
No one is saying it HAS to be $500. What people are saying is that in order to pry people over from Nvidia then they need to be aggressive on pricing, which is what $500 would be.

Because we don't have the numbers on 5070 so no use in speculating on both cards. Plus 5070 Ti is the match-up people are really looking at as they're more similar in vram/performance.

This is what you said, as you've clearly forgotten it:
for me it's going to take the 9070 XT to be $500 to be tempted over a $750 5070 Ti. Don't think there's a chance that happens.

But basically you said the RX 9070 XT had to be priced below an RTX 5070, assuming the performance figures are legit. So you are saying the RTX 5070 and RTX 5070 Ti are fine at those RRPs.




Unfortunately it doesn't work that way. Games are further optimised when ported to PC, but for Nvidia, as well as they depend a lot more on driver optimisations and support (where AMD has been doing the bare minimum - I say that as a pained Radeon owner). Across the board for all games that got ported but used to be exclusive to PS5 you can see that Nvidia still easily wins when RT is enabled. Just look at the latest, Spider-Man 2, which was made with RT reflections-only in mind on PS5 (and thus for RDNA), a 7800 XT still gets trounced by a 4070 (not even Super!) at 1080p.

You can see here an example of what I mean, if it disadvantages Nvidia then it doesn't get done, even if it's faster for AMD and was thus used on console:
[Image: benchmark comparison]


The most representative game that's demanding continues to be Cyberpunk, because for both raster & RT (NOT PT) all vendors (including Intel) have done extensive driver work and have collaborated with CDPR to varying degrees, and that's reflected in performance and features, and also between generations of each vendor (RDNA 2 vs RDNA 3, Turing vs Ampere vs Lovelace etc.).


I only mentioned Cyberpunk 2077 as even DF admitted that CDPR was extensively optimising that game. So in most other games with RT, which use lighter RT because they need to run fine on an RTX 4060 Ti or RTX 4070, the difference will be smaller.
So if we take that and assume the leaked numbers are true (which is likelier, because it has a benchmark everyone uses, so no need to wonder about scene differences), then 5070 Ti would be 20% faster than a 9070 XT, which is about the difference between 4070 Ti Super & 4070 Super in the game too. So perhaps you're right we should compare it to 5070 because the 9070 XT is clearly a tier below the Ti, but then guess what the price will have to be? ;)

Glad we cleared that up and I already computed the numbers based on Cyberpunk 2077.

So you basically admitted that for 20% extra performance in one game you will pay 50% more. Also, thank you for admitting that you would still buy a more expensive RTX 5070 even though:
1.) It isn't any faster than an RX 9070 XT in Cyberpunk 2077, by your own estimations
2.) That means in most other games it won't be quicker either, assuming those numbers hold true
3.) The RX 9070 XT will be faster in rasterised performance
4.) The RX 9070 XT will have more VRAM

Even if AMD were to match the RTX 5070 Ti in RT and rasterised performance and price it at $400, I doubt you would buy it. My best advice to you is to put in an RTX 5070 order next week and stop looking at this thread! :cry:
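The value argument in this exchange boils down to simple arithmetic. A quick Python sketch using the prices discussed in the thread and the thread's own rough 20% estimate (an assumption from the leaks, not measured data):

```python
# The "20% more performance for 50% more money" point, as perf-per-dollar.
# Prices are the MSRPs discussed in the thread; the relative-performance
# figures are the thread's rough estimates, not measurements.

cards = {
    #            price($)  relative perf (9070 XT = 1.00, assumed)
    "9070 XT":  (500, 1.00),
    "5070 Ti":  (750, 1.20),  # "20% faster" per the leaked CP2077 numbers
}

for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 1000:.2f} perf per $1000")

# 1.5x the cost for 1.2x the speed means worse value per dollar:
price_ratio = 750 / 500
perf_ratio  = 1.20
print(f"value ratio: {perf_ratio / price_ratio:.2f}")  # below 1.0 = worse perf/$
```

By this arithmetic the $750 card delivers about 80% of the cheaper card's performance per dollar, which is the core of the disagreement above.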

 
Honestly, I hadn't thought about it. I got as far as "Nvidia QA screwed up". Nvidia did say it was a manufacturing issue, which could be pointing to TSMC, but that feels like ******** bucket-passing if that's what they meant.
There are really three possibilities:
  1. The chips came out defective from the fab with missing/non-functional ROPs
  2. The chips were intentionally cut to have fewer ROPs
  3. Both 1 and 2 occurred, i.e. the chips were defective, then they were cut so they would function without issue
I find #1 less likely because it would mean there would be some variance in the number of non-functional ROPs, and from what I understand, all the 5090s and 5080s that have this issue so far are missing the same number of ROPs. It's much more likely these chips were therefore cut to have fewer ROPs.

Any which way, the issue originated at TSMC, then was "missed" by nV in validation, then was "missed" by (multiple, it seems now) AIBs in QC. My theory is nV caught these in validation and let them go to AIBs, and then the AIBs intentionally disregarded the QC results (either on their own or under instructions from nV). The fact that there are reports of these issues across *both* the 5080 and 5090, and across multiple AIBs, points to intent - i.e. someone knew about the issue - rather than the much less probable endemic failure of validation and QC at multiple steps.
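The variance argument above can be illustrated with a toy Python simulation: if ROPs failed randomly at the fab, affected cards would show a spread of missing-ROP counts, whereas a deliberate cut-down produces one fixed count. The ROP total and failure rate below are invented for illustration (176 is the public full ROP count for a 5090-class die, but nothing here is real defect data).

```python
# Toy illustration: random fab defects give a *spread* of missing-ROP
# counts; a deliberate cut gives one fixed count. Numbers are invented.

import random

random.seed(0)
TOTAL_ROPS = 176            # full die ROP count, 5090-class (public spec)

def random_defects(n_cards, p_fail=0.01):
    # Each ROP independently fails with probability p_fail.
    return [sum(random.random() < p_fail for _ in range(TOTAL_ROPS))
            for _ in range(n_cards)]

def deliberate_cut(n_cards, cut=8):
    # Every affected card is fused down by the same fixed amount.
    return [cut] * n_cards

rand = random_defects(1000)
cut  = deliberate_cut(1000)

print("random defects, distinct missing-ROP counts:", len(set(rand)))  # several
print("deliberate cut, distinct missing-ROP counts:", len(set(cut)))   # exactly 1
```

So "every affected card is missing the same number of ROPs" is strong evidence for possibility #2 (or #3) over pure random defects.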
 