
*** The AMD RDNA 4 Rumour Mill ***

Rumour is AMD will be bowing out of the high end with its next-generation GPUs, and not for the first time: see the RX 480 and the RX 5700 XT.

Although I find MLID's take on a lot of things annoying, he is the primary rumour source and not always wrong, and I do find that ^^^^ rumour credible.
I don't know what AMD's thinking is here, but if they want the graphics section of their business to remain profitable they can't go spaffing huge amounts of money on R&D that they are never going to get back with a 15% market share.

Anyway, more on Tom's annoying take on things: at about the 10 or 11 minute mark in this video Tom explains that RDNA 4 will be around 7900 XTX, but then qualifies that with a 20% range of *I don't know*, and then further qualifies that by saying the range could stretch out even more. Right, OK Tom, so you know about as much as I do and you're just using sweeping range qualifiers to cover your arse.

And then he says we would be getting a 30% increase in performance at the $500 price range, and that this is a good thing. No Tom, that's not good. If I'm looking to upgrade my $500 GPU for the latest $500 GPU, I'm not impressed by +30%; I get half of that just overclocking my existing $500 GPU. It's pathetic, it's no incentive. What are you talking about?

 
Pretty much what I took from it. "We don't know, this is my best guess". Makes sense they may be having a pause on the high end until RDNA 5. That won't affect many buyers. I'd rather see another 5700XT, decent price to performance. On a 6950XT I'll probably hold out until I see a decent 50%+ upgrade in the £500 region.
 
RedGaming was reporting this six months back. Apparently because of this there will be a short window between RDNA 4 and 5. Guess we will have to wait and see.
 
Pretty much what I took from it. "We don't know, this is my best guess". Makes sense they may be having a pause on the high end until RDNA 5. That won't affect many buyers. I'd rather see another 5700XT, decent price to performance. On a 6950XT I'll probably hold out until I see a decent 50%+ upgrade in the £500 region.
Same here, that's what I'm waiting for: XTX/4080 Super-like performance (a bit higher ideally) for around £500-£600.
 
I always see MLID getting a lot of heat for his rumours; I'm not sure what people are expecting. It's rumour and speculation, and if you want facts, wait for reviews.

Approx 30% is one tier of performance, so a pretty good increase for the £500 level. Also, any significant improvement to ray tracing is good; narrowing this gap will increase competition.

RDNA 4 seems incremental, but that's not a bad thing considering NVIDIA is not predicted to do anything else this year (other than a 5090 for a record price).
 
I could swear a thread like this already existed... or perhaps I'm getting mixed up with countless similar threads that came before it?

RedGaming was reporting this six months back. Apparently because of this there will be a short window between RDNA 4 and 5. Guess we will have to wait and see.

I don't think such a short window between them is likely, unless RDNA 4 and 5 target different market segments and slot in around each other, e.g. RDNA 5 is high-end only.

Approx 30% is one tier of performance, so a pretty good increase for the £500 level. Also, any significant improvement to ray tracing is good; narrowing this gap will increase competition.

RDNA 4 seems incremental, but that's not a bad thing considering NVIDIA is not predicted to do anything else this year (other than a 5090 for a record price).

It's sad to see CPUs and GPUs flip from where we were a decade ago: we now accept such small generational improvements on GPUs (even though rebrands/re-releases were still a thing back then), while there are decent options, value and competition on the CPU front.

But yes, performance tiers and pricing will be decided by Nvidia as usual; AMD themselves have shown no interest in doing otherwise.
4070 Ti/7900 XT performance for £600 is most likely, even though in the good old days it would have been the previous gen's flagship performance at mainstream prices.

At this rate, my best bet is to nab a second-hand 7900 XT(X)/4080/4070 Ti for £500 after these new gens of GPU launch at the start of next year, to have a GPU to put into my GPU-less new build. I don't see new releases offering that much better value, especially seeing Nvidia cut corners and downgrade further on the 4080 Super vs the 4080. AMD's poor decisions, like 7700 XT pricing (amongst other things), give me no faith in them either going forward.
 
we dictate pricing

Either there are a lot of folks out there willing to pay more for less performance, or the potential customers clearly don't have a say in pricing. :rolleyes: I don't think anyone has had anything positive to say about GPU prices this gen. Going by that we should get price drops across the board, but we don't. Plenty of folks (myself included) have just stopped bothering to upgrade/buy GPUs in recent years.

Otherwise: "dear Lisa Su/Jensen, 7900XTX/4080 performance for under £400... make it happen!" :cry:
 
I'd personally expect around 7900 XT, especially at launch (AMD, duh); any more will be a bonus. A proper decent hardware upscaler is what they're missing, and this chip may *finally* bring one.
XeSS has already demonstrated that you can have DLSS-tier upscaling without dedicated hardware (and arguably Nvidia itself did that even earlier with DLSS 1.9). A good XeSS implementation is on par with a good DLSS one, even running in DP4a mode on the shader cores of non-Intel GPUs. It doesn't boost performance quite as much as FSR/DLSS do on non-Intel cards (because it's stealing shader resources), but in terms of image quality it delivers.

FSR looks the way it does because of decisions that AMD have made in its development, preferring wide compatibility and maximum performance over addressing image quality concerns. They can change course on that whenever they like; it doesn't require a new generation of hardware to "fix". Nvidia use the tensor cores to run DLSS simply because they're already there and would otherwise be doing nothing when running video games, but they're not some magic bullet that makes DLSS look better than FSR - that's entirely down to the software side of things.
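For anyone wondering what "DP4a mode" actually means: it's a packed int8 dot-product operation (four 8-bit multiplies accumulated into a 32-bit integer) that most recent GPUs can run on their ordinary shader cores, which is what lets XeSS run its network without XMX/tensor hardware. A rough Python emulation, purely illustrative (real GPUs do this in a single instruction, e.g. CUDA's __dp4a intrinsic):

```python
# Illustrative emulation of a DP4a operation: a dot product of four
# signed 8-bit lanes packed into 32-bit ints, added to an accumulator.
def dp4a(a: int, b: int, acc: int) -> int:
    def lanes(x: int) -> list[int]:
        # Unpack four bytes and sign-extend each one.
        return [((x >> (8 * i)) & 0xFF) - (0x100 if (x >> (8 * i)) & 0x80 else 0)
                for i in range(4)]
    return acc + sum(ai * bi for ai, bi in zip(lanes(a), lanes(b)))

a = (4 << 24) | (3 << 16) | (2 << 8) | 1  # lanes [1, 2, 3, 4]
b = 0x01010101                            # lanes [1, 1, 1, 1]
print(dp4a(a, b, 0))                      # 1 + 2 + 3 + 4 = 10
```

Upscaler networks are basically huge piles of these dot products, so a wide int8 instruction on the shader cores is enough; it just competes with the game's own shader work, which is the performance hit mentioned above.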
 
XeSS has already demonstrated that you can have DLSS-tier upscaling without dedicated hardware (and arguably Nvidia itself did that even earlier with DLSS 1.9).
DLSS 1.9 was pretty bad, somewhere between FSR 1 and FSR 2. XeSS with the DP4a path might look better than FSR 2.2, but in most games I've tested it in, its performance hit on most GPUs makes it not worth using. The point of these technologies is to increase performance, so if my 6600 XT looks worse with XeSS "Quality" than at native 1440p but gives only like a 5% performance uplift, I'm not bothering to turn it on. I'd rather get a 30% uplift with FSR 2.2 even if it looks a little worse.

The best thing I've used on my GPU is Epic's TSR in UE5. The performance impact is less than XeSS's and it looks as good; according to some recent Digital Foundry content, it might actually look even better. The XeSS that actually runs on Intel's own hardware is a different story.
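Those numbers roughly pencil out if you treat the upscale pass as a fixed millisecond cost on top of the (cheaper) lower-resolution render. A back-of-envelope sketch in Python; every number here is made up, just chosen to land near the ~5% and ~30% figures above:

```python
# Hypothetical frame-time model: work that scales with resolution,
# work that doesn't, plus the cost of the upscale pass itself.
fixed_ms = 8.0            # assumed resolution-independent work per frame
scaled_ms = 8.7           # assumed resolution-dependent work at native 1440p
native_fps = 1000 / (fixed_ms + scaled_ms)  # ~60 fps

render_scale = 0.67 ** 2  # "Quality" modes render at ~67% per axis
for name, pass_ms in [("FSR 2.2 (cheap pass)", 0.8),
                      ("XeSS DP4a (heavy pass)", 4.0)]:
    fps = 1000 / (fixed_ms + scaled_ms * render_scale + pass_ms)
    print(f"{name}: {fps:.0f} fps ({100 * (fps / native_fps - 1):.0f}% uplift)")
```

The heavier the pass, the more of the saved render time it eats, and on weaker cards the DP4a pass is exactly that kind of heavy.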
 
4070 Ti/7900 XT performance for £600 is most likely, even though in the good old days it would have been the previous gen's flagship performance at mainstream prices.
Is that really that bad? Technically that's a 50% performance-per-dollar increase generation on generation: if you got 90 FPS on a 7900 XT at $900, that's 0.1 FPS per dollar; get that same 90 FPS at $600 and it's 0.15 FPS per dollar. That's an above-normal uplift for one generation. That being said, the 7900 XT really should have been $800, since it's 18% slower than a 7900 XTX and so should be 20% cheaper at minimum. But even against a hypothetical launch at 4070 Ti-like prices of $800, it's still a 33% FPS-per-dollar increase. That's the same percentage increase in performance per dollar that the HD 5830 to HD 6870 jump delivered 14 years ago, and about the same as the RTX 2060 to RTX 3060.
People are comparing current sale prices of one-year-old GPUs to brand-new hardware that's not even released yet. And if it's 7900 XT performance, it might actually come in at $550 or lower.
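Spelling those sums out, with the same assumed 90 FPS figure:

```python
fps = 90                 # assumed 7900 XT-class performance
launch = fps / 900       # 7900 XT at its $900 launch: 0.10 fps per dollar
rumour = fps / 600       # same performance at a rumoured $600: 0.15 fps per dollar
print(f"{rumour / launch - 1:.0%} better fps per dollar")  # 50%

fair = fps / 800         # versus a hypothetical $800 "fair" 7900 XT price
print(f"{rumour / fair - 1:.0%} better fps per dollar")    # 33%
```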
 
Is that really that bad? Technically that's a 50% performance-per-dollar increase generation on generation: if you got 90 FPS on a 7900 XT at $900, that's 0.1 FPS per dollar; get that same 90 FPS at $600 and it's 0.15 FPS per dollar. That's an above-normal uplift for one generation. That being said, the 7900 XT really should have been $800, since it's 18% slower than a 7900 XTX and so should be 20% cheaper at minimum. But even against a hypothetical launch at 4070 Ti-like prices of $800, it's still a 33% FPS-per-dollar increase. That's the same percentage increase in performance per dollar that the HD 5830 to HD 6870 jump delivered 14 years ago, and about the same as the RTX 2060 to RTX 3060.
People are comparing current sale prices of one-year-old GPUs to brand-new hardware that's not even released yet. And if it's 7900 XT performance, it might actually come in at $550 or lower.
It all hinges on whether you think the current "sale prices" are genuine, or whether the original MSRPs were artificially inflated. I'd say the consensus is they were inflated as an attempt to reframe pricing. Crypto and CV19 got them drunk on profits and they don't want to let those go. To be fair, older gamers with disposable income have reinforced this. When I started out the average age was lower, credit was hard to come by, and it was harder, but not impossible, to get more money out of gamers (a term that didn't exist back then, and neither did RGB) :cry:
 
No one is interested in AMD's GPUs; we dictate pricing, certainly not AMD.

Customers only dictate the price when the manufacturer depends on its clients. Both AMD and nVIDIA make their money from other businesses; gaming GPUs are just a side thing for now. Heck, since AMD has a weak market share, it should be pushing the prices and quality of its products like crazy in order to win that share back, when in fact they're quite happy shadowing nVIDIA.
 
So I didn't actually watch/listen to the video in the OP till today.

According to the video the rumour states the following:

Navi 48
Performance guess is between 7900XT and 7900XTX
300-350mm^2 die size
256-bit memory controller, 20Gbps GDDR6 predicted

Navi 44
Performance guess ranges between 7600XT and 7800XT
<210mm^2 die size
128-bit memory controller

Going by those rumours, Navi 48 will have a die size similar to the 7800 XT or the 4070 Ti non-Super, and Navi 44 will have a die size similar to the 4060 Ti or 7600 XT. That means bugger all these days, especially for pricing. My guess is that Navi 44 will be (or should be) the 8600 XT and should cost no more than £300 (ideally £250). If it does, then I'll be satisfied that it's a decent card, bringing actual performance gains to the mainstream price bracket. The other one sounds like it would be called the 8800 XT, that £600 7900 XT I'm expecting.
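As a side note, the rumoured memory configs are easy to sanity-check for raw bandwidth. Quick Python sums; the 20 Gbps figure comes from the video, applying it to Navi 44 is my own assumption, and this ignores Infinity Cache, which raises effective bandwidth:

```python
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    # Total bandwidth = bus width in bytes * data rate per pin.
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 20.0))  # Navi 48: 640 GB/s (7900 XT: 800 GB/s)
print(bandwidth_gb_s(128, 20.0))  # Navi 44: 320 GB/s, if it gets the same memory
```

So Navi 48 would sit a fair bit below the 7900 XT on raw bandwidth while aiming at its performance, presumably leaning on cache to make up the difference.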

Hardware Unboxed mentioned in their Q&A yesterday that AMD doesn't even try any more in the GPU space. They only pulled Ryzen off in the CPU space cos they were actually trying hard and being aggressive, and Intel made multiple mistakes. Nvidia's only real mistakes are pricing and poor naming/segmentation for the sake of milking customers.
 
Hardware Unboxed mentioned in their Q&A yesterday that AMD doesn't even try any more in the GPU space. They only pulled Ryzen off in the CPU space cos they were actually trying hard and being aggressive, and Intel made multiple mistakes. Nvidia's only real mistakes are pricing and poor naming/segmentation for the sake of milking customers.

Apparently Hardware Unboxed forgot about the 6000 series, a massive step up from what they had previously: the 6900 XT and the 3090 traded blows in quite a few games, same for the 6950 XT and the 3090 Ti. Nvidia were faster in ray tracing, but that was to be expected as they had a head start on it.

And if they weren't trying any more, why bother developing a gaming GPU built from chiplets? Would it not be easier to design a GPU the traditional way rather than go with the chiplet approach? They're the only manufacturer currently with that tech in a gaming GPU, and it's quite clearly still in its infancy, much the same as the first iteration of Ryzen: a good product, not a great one, but it got a lot better over time with subsequent releases.
 