
*** The AMD RDNA 4 Rumour Mill ***

On a different note, will the upcoming 8700 XT = 7800 XT, and likewise 8800 XT = 7900 GRE? If so, there's no incentive for 6000 or 7000 series owners to upgrade at all.

True, but it's similar for some of us on Nvidia cards, where the dilution of the SKUs yields the same limited choices: 3090 > 4070 Ti (not worth it), 3080 > 4070 (not worth it), etc.
 
Forgive me for only skimming the last few pages, but is there anywhere I can go for a quick summary of where we're at with rumours?

I'm in the market for a 7900 XTX tier card and trying to understand if there's something on the horizon that fits the brief under the 8800 badge, or whether that's totally off the cards. Also interested in any likely timeframe, but I think I've seen as early as end of Jan mentioned? Again, any pointers to a source of info welcome.

Happy to go and read for myself if there's anything semi believable out there... Cheers!
 
Yeah we should be introduced to RDNA 4 at CES 2025 in January. All we know is that Su has said, "In addition to a strong increase in gaming performance, RDNA 4 delivers significantly higher ray tracing performance, and adds new AI capabilities. We are on track to launch the first RDNA 4 GPUs in early 2025."
 
I'm in the market for a 7900 XTX tier card and trying to understand if there's something on the horizon that fits the brief under the 8800 badge, or whether that's totally off the cards.

I would limit expectations to around 7900 XT performance (which is OK); it would be a stretch for it to match or exceed the 7900 XTX in raster.

If you want that level of performance, AMD have stated they're not producing high end this gen. Unless you're not interested in and won't be using RT in any way, in which case just pick up a discounted 7900 XTX.
 
Thanks both. Hmm, bit of a predicament for me then. Sounds like there are benefits to be gained going new gen, but really something around the missing 8900 XT would be better placed to cover all bases. I'm aiming for something that'll put out 5120x1440 at high refresh rates for a good while (coming from an RX 480, I want to really crank up the performance).

Sure, 7900 XT level will do just that, but it would be nice to match the current top end. If I do want ray tracing (TBD) at that level of performance, does that push me towards the current 4080 Super? They do at least come in a size that'd fit my small case.
 
My 2p worth: I'd wait. CES 2025 is only about six weeks away.
 
I'm looking forward to seeing what AMD come up with. The rumour that they won't be competing at the high end is interesting; hopefully that just means they won't compete with the 5090? As others have said, a card with something around or better than 7900 XT raster, improved RT, and a sensible amount of VRAM would be ideal.

I'm on a 3080 12GB and would rather not take a bath on a £1.2k 5080 with just 16GB of VRAM.
 
Oh ok, I thought I was enjoying CP2077 with RT, guess I'll turn it off then.
There's a thread on these forums, and 98% of the posts (all the ones that aren't mrk) say they don't like RT and that raster looks just as good, or something much like that. I think HUB did a video and said only 3 games looked better with RT. I think there was also a thread with a poll in it, and RT was the thing the fewest people cared about when buying a new graphics card.

That's what I based my comment on; maybe I should've phrased it as "the majority of people, at least on this forum, just turn it off". Still, given the general hatred for how RT looks and all that, it would seem an odd area for AMD to invest time or money into.
 
You forgot to mention the enormous Nexus wall of text that nobody reads. :p
 
I would love to try path tracing on AMD cards without tanking FPS. Hoping RX 9000/UDNA makes this dream a reality.
Keep in mind that for already released titles it won't matter even if they catch up in hardware. There's a gross amount of Nvidia-specific optimisation done for PT. You can tell because even the Intel Arc cards, which have great RT hardware, underperform horrifically relative to where they should be, even compared to Ampere (Portal RTX was a prime example of just how obscene it can get, but even in more "neutral" titles like CP2077 it's still bad). It's not unlike how RT looked initially for RDNA 2 (though it's far worse now for PT); it took a year-plus for RT performance to shape up after games exclusively shipped with Turing/Ampere support and optimisations (and of course some titles are still subpar, having got little or no patching).

There's no doubt that NV GPUs are strictly better than AMD's at RT, but people overlook just how much NV-specific tuning game devs do simply because NV are the market leaders, which puts them much further ahead than the hardware differences alone would. We saw it recently with Alan Wake 2: the devs mentioned choosing a method (not RT related) that was subpar for AMD but worked better for NV, precisely because NV is 80% of the market.

It's just the unfortunate reality of owning an AMD GPU.
 

The flip side now, with the PS5 Pro going for more RT effects, is that we might end up with some games' RT being optimised for the PS5 Pro and things going the other way. It would be funny if that happens; I expect people will then moan about AMD sabotaging Nvidia, like what happened with Starfield.
 
I'm looking forward to seeing what AMD come up with. The rumour that they won't be competing at the high end is interesting; hopefully that just means they won't compete with the 5090? As others have said, a card with something around or better than 7900 XT raster, improved RT, and a sensible amount of VRAM would be ideal.

I'm on a 3080 12GB and would rather not take a bath on a £1.2k 5080 with just 16GB of VRAM.

1p worth: I can't see them competing with the 5080 either. AMD confirmed it's aiming for mid to low tier, so I'm hoping they at least compete with the 5070, which should be at or close to 4080 performance.
 
From what I've read there's no reason to upgrade right now if you're on the 6000 or 7000 series; 8000 is mainly to bump up AMD's market share. 9000 will apparently see the return of the high end.

Sounds exactly like the start of this GPU gen, where none of the new GPUs were worth it at launch, with the older-gen parts still knocking around offering better price-to-performance.
I'm not sure AMD can even bump market share, as they don't seem to really be trying on the GPU front. Even taking into consideration that there's no high end next gen, that doesn't have to mean they aren't capable of releasing decent GPUs at decent prices. I was listening to a HWUB discussion on upcoming GPUs, and they mentioned that AMD were really going for it and hungry to win with Ryzen over the past decade, but they don't show that same level of desire to win back market share on the GPU side of things.

Things can go two ways and we've seen AMD do both before:

1) Rebrand/re-release, or release new parts at fairly similar prices (i.e. little improvement to price-to-performance) such that it still feels like a re-release, e.g. the 7800 XT being barely better than the 6800 XT for almost the same price.

2) Just price the parts well. Mid-range next gen should perform like top-range this gen at mid-range pricing, so price 7900 XT(X) performance similar to the 7700 XT/7800 XT (or ideally much better). AMD have done this before: back in the day they had the 390X at the top end, which was followed up by the similar-performing RX 480 for far less money, actually at mainstream sub-£200 pricing. Even though AMD didn't have any high end to compete with in the RX 400 series, the RX 480 was still a win due to its massive improvement to price-to-performance. I knew a fair few folks around that time buying RX 480s instead of whatever Nvidia had, especially those on a budget GPU-wise.

The former has been happening a lot recently from both AMD and Nvidia, though admittedly AMD has seemingly been doing it more. It won't gain them much market share if they do that with the upcoming releases.

The latter is what AMD would do if they really cared about regaining market share: just price in a way that brings today's top-end performance to mainstream price points. Performance-wise, we don't know what AMD's next gen will top out at, but if it's similar to the previous time they had no high end (RX 400), then ideally they should be giving us 7900 XTX performance at 7700 XT price.

My low expectations make me think we'll get 7900 XT performance at 7800 XT price, if we're even lucky enough to get that. But even that would at least be an improvement over what we've had recently. Heck, at this point I'd even take a £300 card performing like a 7800 XT if that's the best they give us.
 
There's a thread on these forums, and 98% of the posts (all the ones that aren't mrk) say they don't like RT and that raster looks just as good, or something much like that. I think HUB did a video and said only 3 games looked better with RT. I think there was also a thread with a poll in it, and RT was the thing the fewest people cared about when buying a new graphics card.

That's what I based my comment on; maybe I should've phrased it as "the majority of people, at least on this forum, just turn it off". Still, given the general hatred for how RT looks and all that, it would seem an odd area for AMD to invest time or money into.
I do think RT looks nice; however, I don't actively play any games where it makes a massive difference to the visuals.

RT will look good if the game is made with it from the ground up and actually has gameplay scenes that take advantage of it.

I don't know if it has it, but RE Village would be one of the best games for RT, IMO, far more so than Cyberpunk. A lot of the Cyberpunk RT examples we get force-fed here are puddles and rain, which granted look nice but don't add enough, whereas atmosphere, which RE Village is full of, is where RT would be great.

If I'm honest, horror games are probably where RT works best, not action or open-world games.
 