
*** The AMD RDNA 4 Rumour Mill ***

Next gen consoles will be out in 2-3 years, so anything less than a 5090 is probably going to be in trouble in 5 years, especially if you have low vram

I would not be buying a GPU in 2025 with the view that it's going to run games well in 5 years

So stick with what I have as it’s playing everything now at reasonable settings?

Then upgrade for new consoles to whatever is out then?
 
It basically means 4K 240Hz will need DSC (Display Stream Compression).

You know, for all those people using a $549 - $599 GPU with a $1500 - $3000 monitor. I think the vast majority are going to be fine :)

Incidentally, expect this to become a thing now in some Nvidia-biased reviews. AMD's DP 2.1 has been doing 54Gbps since RDNA 3. Nvidia supported up to 48Gbps on the 4000 series, so DP bandwidth was not "important". The 5000 series now supports the full 80Gbps, so expect Nvidia marketing to make this a vitally important point all of a sudden.
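A rough back-of-envelope in Python shows why 4K 240Hz falls either side of those link rates (the 1.07 blanking factor and the effective per-link rates are approximations, not exact VESA timings):

```python
# Back-of-envelope: uncompressed bandwidth for 4K 240 Hz vs effective
# link rates. Blanking factor and link efficiencies are approximate.
def required_gbps(h=3840, v=2160, hz=240, bpc=10, blanking=1.07):
    return h * v * hz * 3 * bpc * blanking / 1e9  # 3 colour channels

links_gbps = {
    "DP 1.4a HBR3 (32.4 link, ~25.9 eff.)": 25.9,
    "HDMI 2.1 FRL (48 link, ~42.7 eff.)": 42.7,
    "DP 2.1 UHBR13.5 (54 link, ~52.4 eff.)": 52.4,
    "DP 2.1 UHBR20 (80 link, ~77.4 eff.)": 77.4,
}

need = required_gbps()
print(f"4K 240Hz 10-bit: ~{need:.0f} Gbps uncompressed")
for name, cap in links_gbps.items():
    print(f"{name}: {'fits without DSC' if cap >= need else 'needs DSC'}")
```

That works out to roughly 64 Gbps uncompressed, so only the full 80Gbps link clears it without DSC, which is the point being made above.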

This happens regularly, where so-called "priorities" flip entirely depending on whether you are team green or team red.

Conversely, you will now notice that excessive power draw is/isn't important or worth mentioning depending upon your new “priority”.
It is always quite funny how things flip-flop between team green and team red. I've thought this on a number of occasions.
Last gen, when Nvidia had DP 1.4a and AMD had DP 2.1, it was super important even though most monitors were 1.4a. Now Nvidia have the better DP, it's flipped as to who finds it a big deal.
In previous gens, when Nvidia had better upscaling and RT, one side hated them and didn't want to use them while the other side thought they were fantastic. Now AMD have made such big strides and look to have narrowed the gap, the side that hated them keeps going on about them (apparently because 2 games force them on and are unplayable on existing AMD hardware) while the other side has gone pretty quiet, only occasionally mentioning the new transformer model.
As you say, both sides are a little quieter about power draw these days. Although I did enjoy it last gen when power draw was hugely important when it came to CPUs but barely worth mentioning when it came to GPUs, as nobody really cares about power draw.

Thankfully both sides are using PCIe 5.0, otherwise I'm sure that would've been a huge bottleneck!

To be honest that is the one thing that concerns me, my motherboard is only PCIe 4.0 (I think there's a 5.0 m.2 slot, whoop-de-*******-do!) and I'm a little concerned that that may be an issue, if not this gen then maybe next.
 
Well as someone who regularly buys both Nvidia and AMD (I have a 4080 and a 7900 XT) I set my own criteria of what’s important. I also change my mind as games evolve.

6 years ago I barely gave a crap about RT and DLSS as they were so poor and even the best, most expensive GPUs tanked hard despite upscaling. I always felt it was the way we were going, but that it would take years to get there. Yet say that on the RT thread and you were labelled a "hater". Same for DLSS and FSR: I hated the first iterations but now use them where possible, even if not needed.

So now AMD have closed the gap significantly on RT just as more games utilise it to more meaningful levels. Does this mean they got it right by not pushing it so hard? Or paradoxically were they the reason it was being held back?
 
This happens regularly, where so-called "priorities" flip entirely depending on whether you are team green or team red.

It's really comical, when you bullet these things out over time, how fickle the championed features are and how they move about. :cry:

So now AMD have closed the gap significantly on RT just as more games utilise it to more meaningful levels. Does this mean they got it right by not pushing it so hard? Or paradoxically were they the reason it was being held back?

Excellent. Or now that AMD are thereabouts, is it suddenly still not good enough and another reason to just avoid them?

To be honest that is the one thing that concerns me, my motherboard is only PCIe 4.0 (I think there's a 5.0 m.2 slot, whoop-de-*******-do!) and I'm a little concerned that that may be an issue, if not this gen then maybe next.

You seem to be overthinking things on many occasions. At some point I will leap from AM4 to AM5, and it will have always been part of the plan. I rarely overhaul the system in one go; one of the things PC is all about is that you can upgrade over time and still get great performance from component upgrades.
 
I was away for the conference/price reveal and aftermath, but I was at least reading the posts on my phone; a lot of good points raised.

My 2 cents, which probably reflects the same mentality as many others:
Well done AMD for showing native performance in comparisons, none of this upscaling/framegen nonsense. They also did alright by not putting the 9070 XT MSRP above $600; it's good, but not quite enough to definitively knock Nvidia down a peg. That said, I've seen some folks here saying they will sell their Nvidia GPUs to move over to one of these, so maybe it's good enough, with how terrible Nvidia is this gen, to turn some green into red. Unfortunately AMD did mess up big time with the pricing of the non-XT; it should have been $500 max.

I have a feeling there won't be quite as many of the 9070s, given that the non-XTs are the rejects of the full die which didn't pass the binning process to become an XT. I'd hope they sell poorly and prices settle at just under £500, kinda like how prices slowly dropped for the 7700 XT and 7900 XT when AMD priced those too close last time. Otherwise it seems like a poor choice to buy one when the XT can be had for £50 more. That sort of price difference only makes sense at lower prices; the higher we go, the bigger the difference needs to be to prevent one model encroaching on another.

The stock sounds good, but honestly I still don't feel a few thousand here at OCUK is enough. Most of us are only interested in the more affordable models or just the top-end Red Devils/Nitros of the bunch. I have a feeling all the MSRP cards will be sold out, and that pricing leak a few pages back has gone back to killing the hype for me. There are also the rumours that AMD will be putting prices up after the initial launch, due to their deals with retailers.

I was looking at either a Sapphire Pulse or XFX Mercury (given that there are no nice reference models this time), but the latter is almost £100 over MSRP... not worth it. I'll try to nab a Pulse on launch day, but it's looking unlikely, since these GPUs look like the only ones available to buy anytime soon. I've got a feeling I'll come home from work to find they're all sold out :(
If I don't manage to get one, I'll just not bother until there's more stock at MSRP or below (which may well never happen). Not like I play many games anyway. Monster Hunter will probably just sit in my Steam library and become part of the usual backlog of shame, fated to forever gather dust.

Is there not a Special Edition Red Devil? That will probably be £800. The normal one may be nearer £700.

I'm all about the Pulse anyway. Or an ASRock.

Overclockers have it. The limited edition, which I reckon will be at least £900. As nice as the Red Devils are, they are the top model and so out of my budget. Plus they need more slots, more power draw, a higher recommended PSU wattage, etc.
 

Overclockers have it. The limited edition, which I reckon will be at least £900. As nice as the Red Devils are, they are the top model and so out of my budget. Plus they need more slots, more power draw, a higher recommended PSU wattage, etc.

Great points. It's not worth getting the 9070 XT above MSRP; e.g. if the Red Devil is £900 you might as well pony up another £79 to get the 5080 FE (which is still compelling for the better feature set; AMD are close, but still not there in terms of path tracing).

Thanks for everyone's opinions on what I should do. I am going to resist the temptation to upgrade and ride my limited edition MBA 7800 XT (tweaked for another 10-14% performance, so only about 20% behind a stock 7900 XT) till the next console gen! The only trouble is I'll have to read about the great experience everyone is having with their 9070 XTs and suppress my FOMO every time; but that's the same in life when I see a better car/phone: I don't upgrade when that happens.
 
Well as someone who regularly buys both Nvidia and AMD (I have a 4080 and a 7900 XT) I set my own criteria of what's important. I also change my mind as games evolve.

So now AMD have closed the gap significantly on RT just as more games utilise it to more meaningful levels. Does this mean they got it right by not pushing it so hard? Or paradoxically were they the reason it was being held back?
In fairness, you were the one that started the priority-flip topic; I just pointed out a few more examples or expanded on what you said. Surely if you're allowed to change your mind, so is everyone else?
You seemed to get offended when I said the same things you were saying.

I still don't use upscaling or RT.

I guess you could argue that it was Nvidia that got RT to the point it's at, and that if both had gone with AMD's tactic then RT might not be where it is today (which I'd be fine with). Maybe AMD got it right, but I don't see why having better RT in previous generations would've been a bad thing. I'm sure if AMD didn't have better RT this generation but waited until UDNA, there would be arguments that AMD got the timing just right then.
 
To be honest that is the one thing that concerns me, my motherboard is only PCIe 4.0 (I think there's a 5.0 m.2 slot, whoop-de-*******-do!) and I'm a little concerned that that may be an issue, if not this gen then maybe next.
It's a 1-4% difference on a 5090 between PCIe 3.0 and PCIe 5.0, so I wouldn't worry too much.
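For anyone wanting the theoretical numbers behind that, a quick sketch (the per-lane figures are the usual nominal post-encoding rates; real-world throughput is lower again, and games rarely saturate even 3.0 x16, which is why the measured difference is so small):

```python
# Approximate one-direction PCIe x16 bandwidth per generation,
# after encoding overhead (128b/130b for 3.0 onwards). Nominal figures.
per_lane_gb_s = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969, "PCIe 5.0": 3.938}

for gen, lane in per_lane_gb_s.items():
    print(f"{gen} x16: ~{lane * 16:.1f} GB/s")

# PCIe 3.0 x16: ~15.8 GB/s
# PCIe 4.0 x16: ~31.5 GB/s
# PCIe 5.0 x16: ~63.0 GB/s
```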

 
Right, to gauge the opinions here: if (big if) I upgrade from the 7800 XT, will getting the 9070 XT mean it's a 5-year card at max settings at 1440p?

EDIT: I already feel the 7800 XT would last till the next console gen without many compromises at 1440p, as everything I play (AAA titles, COD) is on max and gets respectable fps. So basically, am I FOMOing into the 9070 series unnecessarily? (It would cost me a £117 loss on what I paid for my current card.)
Max settings = PT, so no. But if the cost of upgrading is only £117 then that's a no-brainer, as the 9070 XT will have way better performance, especially in RT, and access to FSR 4.

Does this mean they got it right by not pushing it so hard? Or paradoxically were they the reason it was being held back?
Of course they didn't get it right, otherwise their market share wouldn't be continuously declining.
I don't think RT was really held back; it's just that most devs can't even implement basic AA to save their life, let alone make the effort for something so much more involved. If it were just up to them it would never happen outside of the rare exception (4A etc.). So budget, talent, and timing all played a part, hence Nvidia was almost alone in pushing it hard, because they had the money and motive (RTX). Radeon graphics is still in lockstep with whatever the console makers ask of them, so what we have on desktop is just what trickles out from those initiatives: meagre RT capabilities (and even weaker ML) but overall decent raster cards with good perf per area (aka cheap and simple to make). It's not a coincidence that RDNA 4 is better at RT/ML just as they make a console (PS5 Pro) whose upgrades are exactly in those areas.
 
I would imagine next gen consoles will have something pretty similar to a 9070 XT in them... they have to sell to the masses, not the 0.1% who have the money to splash on a 5090.

I'm pretty sure next gen consoles, due in 2027, would wait for 3nm chips considering the likely power/TDP improvements. Also, wasn't it leaked that Sony is switching to nV for the PS6?

Edit: it actually looks like the PS6 will use AMD UDNA after all (so 3nm). nV wasn't even in the running apparently; it was between AMD and Intel.
 
I agree RT has been held back by consoles, but I don't think it was lower RT performance that reduced AMD's dGPU market share; it was **** poor launches and pricing. If the 7900 XTX had been £800 and the 7900 XT £650, or even £700, on release they would have sold a lot better. If the 7800 XT had not been 7 months late to the party, etc.

It was AMD marketing and pricing that caused the problems. If you believe it was the lack of RT features that killed AMD's market share, that's fair enough; I'm not saying it had zero impact. But the ever-decreasing market share has been a problem since before the 2000 series, when RT wasn't even a thing.

AMD market share in 2010 was about 40%. In 2017 it was about 25%. That's a drop of about 2 points per year on average, all before RT was even on the gamer GPU horizon. Do the maths on that trend and they were set to be around 10% right around… 2024.

Ain’t statistical analysis fun ;)
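The maths being gestured at, as a quick sketch (using the poster's rough 40%/25% figures as given):

```python
# The linear extrapolation being described (illustrative only; two data
# points hardly make a robust trend, hence the wink).
share = {2010: 40.0, 2017: 25.0}  # poster's approximate figures
slope = (share[2017] - share[2010]) / (2017 - 2010)  # ~ -2.14 pts/year

def predicted(year):
    return share[2017] + slope * (year - 2017)

print(f"slope: {slope:.2f} points/year")
print(f"predicted 2024 share: {predicted(2024):.1f}%")
# slope: -2.14 points/year
# predicted 2024 share: 10.0%
```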
 
£117? You only live once.

I've been looking at what AIB cards are available. It doesn't look like a wide selection and all are ugly. Still gutted there's no MBA reference :(
 