AMD RDNA 4 thread

Lol, MLID said Navi 41 was cancelled and RDNA4 would mostly compete with the RTX 5000 midrange, then a few days later this guy makes a video saying RDNA4 is going to beat the RTX 5090 and launch early, before Nvidia can.

He's talking about RDNA5, not 4. (2025)
 
Ironically, what's holding back MCM and multi-GPU is that developers aren't taking advantage of DX12's explicit multi-adapter.
I seriously wish AOTS had more success, because it's a technical marvel. What other game made use of your iGPU to help your dGPU?
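
For anyone curious, the DX12 feature AOTS leaned on (explicit multi-adapter) starts with simply enumerating every GPU in the box via DXGI, iGPU included. A minimal sketch using the standard DXGI API; nothing here is AOTS-specific:

```cpp
// Minimal DXGI adapter enumeration - step one of DX12 explicit
// multi-adapter, where an engine can drive the dGPU and iGPU together.
// Standard DXGI API; link against dxgi.lib.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // Software adapters (WARP) are flagged; a multi-adapter renderer
        // would skip them and create one D3D12 device per hardware GPU.
        wprintf(L"Adapter %u: %s%s\n", i, desc.Description,
                (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) ? L" (software)" : L"");
    }
    return 0;
}
```

From there an engine would create a separate D3D12 device per adapter and split the frame between them, which is exactly the part almost nobody shipped.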
 
RDNA4 RX 8000 GPUs appear in Linux kernel patches. The two GPUs that appear to have received software support are Navi 44 and Navi 48.

These sorts of leaks don't provide much info on specs etc.; all they tell us is that AMD is testing two new GPUs.


Nice.... let's park those and wait for Nvidia to launch theirs...... and then wait some more.
 

According to Red Gaming Tech, RDNA4 will feature GDDR7 and a PCI Express 5.0 interface. Navi 44 has 20 WGPs and Navi 48 has 32 WGPs.

I looked at the TechPowerUp GPU database for RDNA3 WGP counts; my guess is Navi 44 with 20 WGPs will be the RX 8500 and Navi 48 with 32 WGPs will be the RX 8600.
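
If that rumour holds, back-of-envelope shader counts follow from RDNA's usual layout of 2 CUs per WGP and 64 stream processors per CU (an assumption carried over from RDNA3; RDNA4 could rearrange it):

```cpp
// Back-of-envelope shader counts from the leaked WGP figures.
// Assumes RDNA's usual 2 CUs per WGP and 64 stream processors per CU;
// RDNA4 could change this layout.
#include <cstdio>

int main() {
    const int cusPerWgp = 2;
    const int spsPerCu  = 64;
    const int wgps[]    = {20, 32};              // Navi 44, Navi 48 per the rumour
    const char* names[] = {"Navi 44", "Navi 48"};
    for (int i = 0; i < 2; ++i) {
        int cus = wgps[i] * cusPerWgp;
        int sps = cus * spsPerCu;
        printf("%s: %d WGPs -> %d CUs -> %d stream processors\n",
               names[i], wgps[i], cus, sps);
    }
    return 0;
}
```

For reference, full Navi 32 (the RX 7800 XT's chip) is 30 WGPs / 60 CUs, so on paper Navi 48 would sit a notch above today's midrange, for whatever a rumour is worth.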
 
Anyone care to speculate what difference this chiplet approach will make if and when they get it working? Does it have the potential to give Nvidia as much trouble as Intel have on the CPU side, or is it a case of you'll win on some aspects and lose on others?
 
Anyone care to speculate what difference this chiplet approach will make if and when they get it working? Does it have the potential to give Nvidia as much trouble as Intel have on the CPU side, or is it a case of you'll win on some aspects and lose on others?
The current Radeon 7000 series are chiplets... and working.
Die shrinks are about to end and are becoming more expensive; normally you shrink the die and double the transistor count to improve performance.
Soon this ends, and innovation and other solutions (chiplets etc.) take over, but the big, easy improvements are gone.

Two things will happen:
midrange will be cheaper to make, and hopefully that also means better prices;
high end (aka 4K/8K) will cost more and see smaller performance increases each generation than before.
 
Anyone care to speculate what difference this chiplet approach will make if and when they get it working? Does it have the potential to give Nvidia as much trouble as Intel have on the CPU side, or is it a case of you'll win on some aspects and lose on others?
Theoretically it allows you to build some absolutely monstrous GPUs if you can get it all working together, versus a monolithic die.

Think back to when Zen chiplets initially came out. Intel was struggling to compete with the core counts and the MT performance in the workstation and server space. Even now Intel caps out at 60 cores, while AMD are at 192.

If AMD can get it all to work together, they could produce an absolute mammoth of a GPU in terms of size. Not sure how well it would function in games, but for compute stuff it would be incredible.
 
Anyone care to speculate what difference this chiplet approach will make if and when they get it working? Does it have the potential to give Nvidia as much trouble as Intel have on the CPU side, or is it a case of you'll win on some aspects and lose on others?
It's highly likely Nvidia will be working on chiplets too, to be fair. There's much speculation on it already.
 
Man, this takes me back. I still have fond memories of my 8600GT and my HD3850. The 4870 was a belter. Also my Core 2 Duo E6300 that I OC'ed from 1.86GHz to 3.6GHz, rock solid for years. I'm 41 today. Shed a tear for me :(
HD4870 was a great card, and so was HD7970 a few years later!
 
HD4870 was a great card, and so was HD7970 a few years later!
It certainly was. But the HD6000* series was a joke, and the HD7000 series was too expensive, and that's when I bowed out. Good memories though. I remember for COD 4 I had an HD2400(!) and thought it looked fine. Got the 8600GT and by accident enabled shadows on the first level. Mind blown lol. Pity we can't have those moments anymore. *I had an HD5850 as well. Dirt 2 and 3 @60fps was epic. Off topic, sorry. Had a few.
 
It certainly was. But the HD6000* series was a joke, and the HD7000 series was too expensive, and that's when I bowed out. Good memories though. I remember for COD 4 I had an HD2400(!) and thought it looked fine. Got the 8600GT and by accident enabled shadows on the first level. Mind blown lol. Pity we can't have those moments anymore. *I had an HD5850 as well. Dirt 2 and 3 @60fps was epic. Off topic, sorry. Had a few.
Had an HD5850, solid GPU. Had two 7950s in CrossFire... that was a mistake, especially when I could only pair them with an FX-8350 (although at 5.1GHz) because my i7 920 died. The room was a bloody furnace and performance was lacking a lot :P but that wasn't the GPUs' fault hehe. My girlfriend always stayed away from my cave due to the heat :D.
 
I just really hope they improve the high idle power usage. With 3 screens my 7900XTX idles at 105W; I wasted way too long trying different refresh rates, monitor configs and drivers that supposedly help, but I never found a way to reduce it.
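
For what it's worth, on Linux you can at least watch what the board reports drawing through amdgpu's hwmon interface; a minimal sketch, with the caveat that the card0/hwmon0 indices (and on newer kernels the file name, power1_input vs power1_average) vary per system:

```cpp
// Read amdgpu's reported average board power from sysfs (Linux).
// Value is in microwatts. The card0/hwmon0 indices are illustrative;
// newer kernels may expose power1_input instead of power1_average.
#include <fstream>
#include <iostream>

int main() {
    std::ifstream f("/sys/class/drm/card0/device/hwmon/hwmon0/power1_average");
    long long microwatts = 0;
    if (!(f >> microwatts)) {
        std::cerr << "couldn't read sensor (path varies per system)\n";
        return 1;
    }
    std::cout << microwatts / 1e6 << " W\n";
    return 0;
}
```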
 