
*** The AMD RDNA 4 Rumour Mill ***

Rumour is AMD will be bowing out of the high end with its next-generation GPUs, not for the first time: see the RX 480 and RX 5700 XT.

Although I find MLID's take on a lot of things annoying, he is the primary rumour source and not always wrong, and I do find that ^^^^ rumour credible.
I don't know what AMD's thinking is here, but if they want the graphics section of their business to remain profitable they can't go spaffing huge amounts of money on R&D that they are never going to recoup with a 15% market share.

Anyway, more on Tom's annoying take on things: at about the 10 or 11 minute mark in this video Tom explains that RDNA 4 will be around 7900 XTX performance, but then qualifies that with a 20% range of *I don't know*, and then further qualifies it by saying that range could stretch out even more. Right, OK Tom, so you know about as much as I do and you're just using sweeping range qualifiers to cover your arse.

And then he says we would be getting a 30% increase in performance at the $500 price range, and that this is a good thing. No Tom, that's not good. If I'm looking to upgrade my $500 GPU for the latest $500 GPU I'm not impressed by +30%; I get half of that just overclocking my existing $500 GPU. It's pathetic, it's no incentive. What are you talking about?

 
The 7800 XT brought the same performance as the 6800 XT for less money, as it should have done being a new generation. The 8800 XT is said to do the same again, which I would expect.

I've got a 4090.

It's actually a replacement for the RX 6800; that was $580 MSRP, and the 7800 XT is $500 and 20% faster.

It's about a 35% cost-per-frame improvement, which is not bad. It's a good card to boot; I really like mine.
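A quick back-of-envelope check on that figure (my own arithmetic, not official numbers; whether it lands around 35% or nearer 40% depends on whether you count dollars-per-frame or frames-per-dollar):

```python
# Cost-per-frame check: RX 6800 at $580 MSRP as the baseline,
# RX 7800 XT at $500 and ~20% faster (figures from the post above).
def cost_per_frame(price, relative_perf):
    # Lower is better: dollars paid per unit of performance.
    return price / relative_perf

rx6800 = cost_per_frame(580, 1.00)
rx7800xt = cost_per_frame(500, 1.20)

improvement = rx6800 / rx7800xt - 1  # extra frames per dollar
print(f"~{improvement:.0%} more performance per dollar")  # ~39%
```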
 

7800 XT: 60 CUs, 2.43 GHz, 256-bit, 624 GB/s, 263 W, $499

8700 XT: 56 CUs, 3.05 GHz, 256-bit, 640 GB/s, 190 W, $399
8800 XT: 64 CUs, 3.20 GHz, 256-bit, 768 GB/s, 225 W, $499


All built on TSMC 4NP, which I believe is the same node as the current Nvidia 4000 series.
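For what it's worth, naively scaling CUs × clock from those rumoured specs gives a rough idea of the shader throughput step (illustrative only; real performance never scales perfectly linearly with either number):

```python
# Relative shader throughput = CUs x clock, normalised to the 7800 XT.
# Spec numbers are the rumoured figures quoted above.
cards = {
    "7800 XT": (60, 2.43),  # (CUs, GHz)
    "8700 XT": (56, 3.05),
    "8800 XT": (64, 3.20),
}
base = cards["7800 XT"][0] * cards["7800 XT"][1]
for name, (cus, ghz) in cards.items():
    print(f"{name}: {cus * ghz / base:.2f}x")  # 1.00x, 1.17x, 1.40x
```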
 
I think it will still sell well and also give Intel a big headache.

And they could give Nvidia a headache as well if they launch early. Since it's on 4nm they should be able to launch soon if they want to, whereas Nvidia wants 3nm, so they have to wait for yields and supply to mature.

Yeah, Intel are selling today what is essentially an RTX 2080, at best.

At £270 for the A770 it's already too much money for that, and if Intel don't put out a GPU that is at least competitive with the current generation soon, it could be a problem for their dGPU hopes, because we will probably see another new generation from AMD at least, if not also Nvidia, long before this year is out.
 
Interestingly, looking at the PS5 Pro leaks, if we take them at face value then that mix of RDNA would be on par with Ampere in RT performance, if not slightly better. Looking at the old DF video of RT, 6800 XT vs 3080, we see that a 2-4x RT increase would have made the 6800 XT either on par with or faster than the 3080. Unfortunately I don't think such an optimistic outlook is really possible for desktop; if we look at the RT off vs on % improvements gen on gen, things don't look that great for either team (though NV has less ground to make up and still closes more of the gap each gen). Most performance increases, even in RT, come from bigger and more powerful GPUs rather than RT-specific innovations, at least excluding PT (which is mostly **** anyway due to low ray count and crappy denoisers; possibly starts being relevant by 2028, but not now).




RX 7800 XT

In Port Royal I score 10,569.

The FE RTX 3080 scores 11,449, so it's 8% faster than an RX 7800 XT.

Dig a little deeper...

Now I don't know why Guru3D switched to FPS instead of scores with the 3070 Ti; it's idiotic, as it becomes a meaningless number to people looking at it.
But it's 40 FPS compared with the 53 of the 3080, so the 3080 is 32% faster; that would make the RX 7800 XT about 23% faster. In fact they have it at 48 FPS, which is 20% faster, though 1 FPS here can be a couple of %, so it depends how it's rounded, another reason why switching to FPS was stupid: it's imprecise.
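To put a number on that imprecision (my own illustration): if each whole-number FPS figure could really be anywhere within half a frame of the printed value, the derived percentage gap carries a few points of uncertainty.

```python
# Guru3D's rounded figures: 48 FPS (7800 XT) vs 40 FPS (3070 Ti).
# Each printed value could hide anything within +/- 0.5 FPS.
reported, baseline = 48, 40
low  = (reported - 0.5) / (baseline + 0.5) - 1  # worst case for the 7800 XT
high = (reported + 0.5) / (baseline - 0.5) - 1  # best case
print(f"true gap is somewhere between +{low:.1%} and +{high:.1%}")
```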

I think that's a decent showing for V2 RT cores vs V2 RT cores.

And look at the RX 6900 XT: it scores 45 FPS, 3 FPS lower than the RX 7800 XT. In raster the RX 7800 XT is only about 5% faster than the RX 6800 XT, which scores 41, so the RX 7800 XT is 17% faster in RT.

Another thing to consider is that the RX 7800 XT actually only has the same number of RT cores as the RX 6800, 60, and it's 37% faster than that card.

The 7900 GRE has the same 80 RT cores as the 6900 XT; it scores 55 FPS, 15% higher than the RX 7800 XT and 22% higher than the RX 6900 XT.

There isn't actually a huge difference in clock speeds RDNA 2 vs RDNA 3. Stock, my RX 7800 XT runs at between 2.4 and 2.5 GHz, compared with about 2.2 GHz for the RX 6800, so for about 10% higher clocks I'm getting 37% higher performance, an IPC increase of about 25%. You can see the same thing with the RX 6900 XT vs the RX 7900 GRE, as they actually run at about the same clock speed.
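That per-clock estimate is just the performance ratio divided by the clock ratio; sketched out with the rough numbers above:

```python
# Separate clock speed from per-clock (IPC) gains, RX 7800 XT vs RX 6800.
perf_ratio  = 1.37  # ~37% higher Port Royal performance (see above)
clock_ratio = 1.10  # ~10% higher clocks (~2.45 GHz vs ~2.2 GHz)

per_clock_gain = perf_ratio / clock_ratio - 1
print(f"~{per_clock_gain:.0%} more work per clock")  # ~25%
```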

There is a chunk of extra performance in the cores RDNA 3 vs RDNA 2; a sub-£500 GPU is 20% faster than the admittedly overpriced £600 RTX 3070 Ti of Nvidia's previous gen.
How about the 4070? Well, that scores 52 FPS vs my 48 FPS, so it's 8% faster.

OK, all of that aside, if one can say the RT performance of the RTX 3080 and RTX 4070 is not bad, pretty usable, then I think the same can be said for the RX 7800 XT.
How does it fare in my personal experience? It replaced my 2070 Super, and it's more than 2x as fast in RT, in Port Royal and in games like Control, etc. It's more than just usable.

And look, the RX 6900 XT has 33% more RT cores than my RX 7800 XT, and yet mine is ~6% faster in RT (48 vs 45 FPS). AMD have improved RT per core, and by quite a chunk.
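The per-core gap is easy to put a number on from the Port Royal figures above (crude, but it makes the point):

```python
# FPS per RT core in Port Royal, using the scores quoted above.
fps_per_core_6900xt = 45 / 80  # RDNA 2: 80 RT cores
fps_per_core_7800xt = 48 / 60  # RDNA 3: 60 RT cores

gain = fps_per_core_7800xt / fps_per_core_6900xt - 1
print(f"~{gain:.0%} more RT throughput per core")  # ~42%
```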
 
To put it another way, if the RX 7800 XT were RDNA 4, not RDNA 3, it would have the same RT throughput as the RTX 4090.

That's not to say it would have the same performance in Cyberpunk RT as the 4090; rasterisation still plays a big part in that game, even if it is currently the game that most replaces rasterisation with RT. What it does mean is that it has that much more horsepower for RT, so unless the RTX 5000 series gets a similar 2x RT bump in performance, Nvidia would find it impossible to choke RDNA 4 by just cranking the RT in Cyberpunk higher and higher; all that would do is choke off Nvidia's own performance more than AMD's the higher they push it.
 
Interesting if a PS5 Pro has similar RT performance to a 4080 :o

Yeah.

The RX 7700 XT has 54 CUs at 2.2 GHz, 35 TFLOPS FP32, 54 BVH4 RT cores, 330 TOPS AI
The RX 7800 XT has 60 CUs at 2.5 GHz, 38 TFLOPS FP32, 60 BVH4 RT cores, 400 TOPS AI

The PS5 Pro: 60 CUs at 2 GHz, 34 TFLOPS FP32, 60 BVH8 RT cores, 300 TOPS AI.

It's basically an RX 7700 XT but with 2x the RT performance.
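For reference, those FP32 figures fall out of the usual RDNA 3 peak-throughput formula: CUs × 64 lanes × 2 (FMA) × 2 (dual-issue) × clock. Checking it against the 7800 XT, using the 2.43 GHz boost clock quoted earlier in the thread (the 38 above is just this rounded up):

```python
# Peak FP32 TFLOPS for an RDNA 3 part:
# CUs x 64 lanes x 2 (FMA) x 2 (dual-issue FP32) x GHz.
def rdna3_tflops(cus, ghz):
    return cus * 64 * 2 * 2 * ghz / 1000

print(f"{rdna3_tflops(60, 2.43):.1f} TFLOPS")  # ~37.3 for the 7800 XT
```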
 
People are forgetting that the leaked specs showed Sony basically created their own hardware and software on top of the RDNA hardware to accelerate RT. As DF alluded to, their thinking is that RDNA wasn't up to the level of RT Sony wanted, so they went and did it themselves, whilst also inventing PSSR as a better upscaler.

Whether it has 4080 RT "performance" or not, who knows. It won't be 4080 performance at 1:1 resolution, since the console will still be using DRS, but DRS, when implemented well, won't be noticeably different from a fixed render resolution anyway, at least on PC; consoles are a bit different.

I want to find this patent and read it myself, without this idiot babbling on and on, making up utter nonsense and not keeping it on screen for you to read for more than 2 seconds. He says a thousand things without saying anything at all, a typical YouTuber looking for clicks but actually knowing nothing about the subject.

Of course these idiots never post a link to their source....

"i was sent the direct link" Right? SO WHERE IS THAT ______ LINK?????? you smooth brained ____!

 
Also... Digital Foundry are just as bad. None of these YouTubers do any actual research; they all make crap up that suits their channel's agenda, even if they don't believe it themselves. If you watched DF content over the last few years you would think Sony created RDNA 1, 2 and 3 for AMD. There is a mountain of AMD patents showing that these features DF attribute to Sony were actually developed by AMD; they design and make the bloody things.

I wonder if it'll struggle to get a 2x improvement with RT though; the 7700 XT is a fair bit less powerful than a 4080, so then having faster RT seems a bit disproportionate.

As i said here....

To put it another way, if the RX 7800 XT were RDNA 4, not RDNA 3, it would have the same RT throughput as the RTX 4090.

That's not to say it would have the same performance in Cyberpunk RT as the 4090; rasterisation still plays a big part in that game, even if it is currently the game that most replaces rasterisation with RT. What it does mean is that it has that much more horsepower for RT, so unless the RTX 5000 series gets a similar 2x RT bump in performance, Nvidia would find it impossible to choke RDNA 4 by just cranking the RT in Cyberpunk higher and higher; all that would do is choke off Nvidia's own performance more than AMD's the higher they push it.

A 60 CU RDNA 4 with a BVH8 RT engine will not keep up with a 4080, as the 4080 still has roughly 2x the rasterisation performance, and even a game like Cyberpunk is still primarily rasterised.
It just means it can handle RT that much better. So, with the RTX 5000 series being an unknown, if we stick to RTX 4000: RDNA 4 could never fall behind RTX 4000 in RT relative to equal rasterisation performance; in fact RDNA 4 would have much more muscle to spare.

My own smooth brain can't put my thinking in layman's terms any better than that.
 
What I will say is this: AMD's RT functions are not fixed. Both Intel and Nvidia use fixed-function dedicated cores, so unless you change the core there isn't a lot you can do for further development.

AMD could develop a completely new RT engine for their existing cores, such as moving from BVH4 to BVH8.
 
We'll see. I'm looking to simplify my gaming (or make it more complex, I don't know) by having a cheaper PC setup coupled with a console, best of both worlds. Whether this will be possible, I have no idea. But the days of spending nearly £1000 on a GPU are gone.

It depends on what one needs. I play at 1440p. Helldivers 2 is the newest game I have and I love it; I play it far too much, to the point where I'm neglecting my testing responsibilities for Star Citizen test builds, and the one in testing now is the most important one to date :o
It runs at around 100 FPS at native resolution with all the graphics settings maxed out, the GPU overclocked, a £480 GPU. Would a £1000 4080 get me higher FPS on my screen? Yes, of course it would. But if I turned off the FPS counter would I notice? Would it feel much different?

Probably not... it's excessive, it's e-peen, no tangible benefit. :)
 
Dude, you gotta stop this; it's getting childish and inflammatory now. There's nothing wrong with a bit of performance 'overhead' or higher fps if the screen is capable (each to their own), and as for cards, I've had all the higher-end ones of late, so I have the experience to know.

You're right, but for twice the money? If in 2 years' time I buy the latest £500 GPU I'll probably still come out on top.
 
Oh, I agree, there's certainly a sweet spot when it comes to graphics card tiers etc., but if people want to spend more for that extra 20-30% of perf and the extra features then let them. I only went over to a 4080 from an XTX (excellent cards tbh) because I was disappointed with how FSR was performing; once AMD get a hardware solution up and running I'll certainly be interested in them once more!

That's true about FSR, I get it. I can see the need for upscaling tech when you're running RT; it definitely helps there, and FSR just isn't good enough.
 
Agree entirely, but you are kidding yourself if you think the current 'acceptable' mid-range price of £500 doesn't get upgraded to £700 in two years' time for equivalent performance levels. By then £500 will get you a 9500 XT 16GB or a 6050 4GB ;)

Well, $999 last gen is now $500; the 7800 XT is a 6900 XT in performance terms. Yes, that was always halo priced, but it's still quicker than a 6800 XT, which was $650. That's just one gen to the next, as it should be.
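Running the numbers on that gen-on-gen step (my own arithmetic, using the ~5% raster gap over the 6800 XT mentioned earlier in the thread):

```python
# Dollars per unit of performance: 6800 XT ($650, baseline)
# vs 7800 XT ($500, ~5% faster in raster).
old = 650 / 1.00
new = 500 / 1.05

print(f"~{old / new - 1:.1%} more performance per dollar")
```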

The value is still there; Moore's law is not "dead" if you don't buy Nvidia.
 
The 7900 GRE now looks like a very solid GPU for £550.


Don't bother with the ASRock Challenger one, even if it is only £510. I had a 5700 XT ASRock Challenger and the cooler was like a cheap cooler from 2010; it could not keep the card properly cooled even with the fans running flat out, and this one looks to me like it has the same problem, so avoid it.


I don't know about this one for £530; the cooler looks far more substantial.


My advice is, again, avoid the ASRock Challenger; you will regret saving £40.
Just go for the Sapphire Pulse. I have the Sapphire Pulse RX 7800 XT; it has a smaller dual-fan cooler, and it's cool and silent. Even overclocked to 300 watts it's still reasonably cool and quiet, an excellent cooler and fans.
 