
RDNA 3 rumours Q3/4 2022

They may do, but to stay within the 50%+ perf-per-watt claim they can't overdo it. Just look at the 4090 and how much more efficiently it runs when undervolted. Instead of cranking up the 7900XT to compete with the 4090, they should have settled for leaving that job to the 7900XTX, or whatever their halo card ends up being.

Somewhere in this thread someone, Igor I think it was, published the RX 7900XT board design. It had 2× 8-pin, that's 300 watts, or 375 if you include the PCIe slot power, which really you shouldn't, at least not to its full extent, as these high-end GPUs tend not to draw much power from the PCIe slot.
 

I was wrong, it is actually 3× 8-pin, 450 watts.

At least it won't melt.

Edit: this is the 24GB design, so the 7900XTX.

[Image: leaked 24GB board design showing 3× 8-pin power inputs]
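For anyone sanity-checking the connector maths, here's a rough sketch using the nominal PCIe spec limits (150 W per 8-pin, 75 W from the slot); the helper and its name are just made up for illustration:

```python
# Rough sketch of the connector maths: nominal PCIe limits are 150 W per
# 8-pin connector and 75 W from the x16 slot (high-end cards typically draw
# far less than that from the slot). Purely illustrative.

EIGHT_PIN_W = 150
SLOT_W = 75

def board_power_budget(num_8pin: int, include_slot: bool = False) -> int:
    """Nominal maximum board power for a given connector layout, in watts."""
    return num_8pin * EIGHT_PIN_W + (SLOT_W if include_slot else 0)

print(board_power_budget(2))        # 300 W  - 2x 8-pin
print(board_power_budget(2, True))  # 375 W  - 2x 8-pin plus the slot
print(board_power_budget(3))        # 450 W  - 3x 8-pin, the 24GB board
```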
 

It entirely depends on what power they designed for. The 6600 XT and 6900 XT have similar perf/W despite the latter having nearly double the TBP, because they were designed to hit their TBPs in a saner part of the v/f curve.

If 12k shaders + ROPs etc. is enough units to stay in the sane part of the v/f curve at 400W, it will be efficient at that power usage.
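To illustrate the v/f curve point, here's a toy model that assumes dynamic power scales roughly with frequency times voltage squared; the clock and voltage figures are invented, not real RDNA3 numbers:

```python
# Toy model of why the operating point on the v/f curve dominates perf/W.
# Dynamic power scales roughly with frequency * voltage^2; the clock and
# voltage figures below are invented purely for illustration.

def relative_power(freq_ghz: float, voltage_v: float) -> float:
    """Relative dynamic power in arbitrary units (~ f * V^2)."""
    return freq_ghz * voltage_v ** 2

# (clock GHz, voltage V): hypothetical points on a v/f curve
points = {
    "sane":   (2.3, 0.90),   # TBP set in the flatter part of the curve
    "pushed": (2.7, 1.10),   # chasing the last few hundred MHz
}

for name, (f, v) in points.items():
    perf = f                              # assume perf scales with clock
    power = relative_power(f, v)
    print(f"{name:7s} perf={perf:.2f}  power={power:.2f}  perf/W={perf / power:.2f}")

# The pushed point is ~17% faster but draws ~75% more power, so perf/W
# falls by roughly a third - the same reason an undervolted 4090 looks
# so much more efficient.
```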
 
What will be interesting is RDNA3 performance at lower resolutions: 1080p, even 1440p.

A quick recap: Nvidia use software thread scheduling, meaning the driver uses your CPU to queue and schedule work for the GPU, and that steals CPU cycles from your game.
AMD use hardware thread scheduling, a bit like having a mini CPU on the GPU itself to do the job that, with Nvidia, your actual CPU does.
As a result you will find yourself CPU-bottlenecked much sooner with Nvidia than you would with AMD.
You can see this to some extent already between RDNA2 and Ampere. AMD's GPUs are not better at lower resolutions, or worse at higher resolutions, at least not in the way you might think; it's just that they are less CPU-bound at lower resolutions, so it appears that Nvidia are better at 4K.
You can also see it, much more pronounced, with Intel, who I suspect don't do any thread scheduling at all, probably because these are AMD- and Nvidia-specific hacks to achieve higher performance efficiency.


The result, when you push it to extremes where the problem becomes obvious: notice the RX 5700XT blowing the RTX 3090 out of the water? That's because, in this deliberately contrived scenario, Nvidia's poor CPU efficiency is bottlenecking the 3090 to such an extent that it lets the 5700XT, with AMD's much more efficient design, blow right past it.

Now have a look at the bottlenecking at 1440p compared with 4K for the 4090.



[Benchmark charts: RX 5700XT vs RTX 3090 in the CPU-bound scenario, and 4090 scaling at 1440p vs 4K]
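To make the scheduling argument a bit more concrete, here's a toy model where frame rate is capped by whichever is slower, the CPU side (game work plus driver scheduling) or the GPU render time; all the millisecond figures are made up for illustration, not measured:

```python
# Toy model of the scheduling argument above: delivered frame rate is capped
# by whichever is slower, the CPU side (game work + driver scheduling) or the
# GPU render time. All millisecond figures are invented, not measured.

def fps(game_cpu_ms: float, driver_cpu_ms: float, gpu_ms: float) -> float:
    frame_ms = max(game_cpu_ms + driver_cpu_ms, gpu_ms)
    return 1000.0 / frame_ms

GAME_CPU_MS = 4.0   # game logic + draw-call submission per frame

for res, gpu_ms in [("1080p", 3.0), ("1440p", 5.0), ("4K", 11.0)]:
    sw = fps(GAME_CPU_MS, driver_cpu_ms=2.5, gpu_ms=gpu_ms)  # scheduling done on the CPU
    hw = fps(GAME_CPU_MS, driver_cpu_ms=0.5, gpu_ms=gpu_ms)  # scheduling done on the GPU
    print(f"{res:6s}  software-sched {sw:4.0f} fps   hardware-sched {hw:4.0f} fps")

# At 4K both end up GPU-bound and identical; at 1080p and 1440p the extra CPU
# cost of software scheduling becomes the limit, so the gap opens up - which
# is why the difference only shows at lower resolutions.
```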
 
The 4090 is a card made for heavy RT scenarios and/or high resolutions, not relatively low-resolution rasterization. It was the same with the previous gen as well.

[Chart: Metro Exodus RT, 1920×1080]

[Chart: Metro Exodus RT, 2560×1440]

[Chart: Cyberpunk 2077 RT, 2560×1440]



1440p may be up for discussion, but how many people are playing 1080p games with such cards?
 
You would be surprised. I remember at the last gen's launch saying 4K was what the top cards were aimed at, and it was quite scoffed at then. The upgrade in displays seems to have been a knock-on effect (of people buying Ampere), so maybe all those 1080p units have now been replaced.
Well, you can always render the image at 1440p or 4K and then downsample it to 1080p to get better IQ, so either way it's a plus.
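That's basically supersampling; here's a minimal numpy sketch of the principle, assuming a simple 2×2 box filter (real VSR/DSR use better filters and support non-integer ratios):

```python
import numpy as np

# Minimal sketch of the downsample-for-IQ (supersampling) idea: render at 4K,
# then average each 2x2 block down to one 1080p pixel. A plain box filter is
# used only to show the principle; real VSR/DSR use better filters and
# support non-integer ratios.

def box_downsample_2x(frame: np.ndarray) -> np.ndarray:
    """frame: (H, W, 3) image with even H and W; returns an (H/2, W/2, 3) image."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)   # stand-in for a rendered 4K frame
frame_1080p = box_downsample_2x(frame_4k)
print(frame_1080p.shape)                   # (1080, 1920, 3)
```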
 
Asking out of ignorance here, not stirring the pot: if AMD have a card that competes with the 4090, what incentive do they have to price it lower? Knowing how 'loyal' people are to brands, wouldn't they just try and milk the early adopters first?

I read a lot of hopium that they are about to release a 4090-class card and charge hundreds less; that just seems unlikely, is all.
 

If it's competitive it won't be cheaper by much, maybe £50 or so.
 