
RDNA 3 rumours Q3/4 2022

So Igor's Lab has sketched out the rumoured board for the 7900 XT

Key points:
450 W max.
24 GB of VRAM max. (Gunning for the creatives, I see; let's hope ROCm and HIP catch up - see the sketch after this list.)
1× HDMI, 3× DisplayPort, no USB-C in sight
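On the creative side, here is a minimal sketch of how you might confirm that a ROCm-enabled PyTorch build actually sees the card and its 24 GB of VRAM (this assumes a ROCm/HIP build of PyTorch, which exposes AMD GPUs through the torch.cuda API; the device index is just an example):

```python
# Minimal sketch: check that a ROCm/HIP build of PyTorch sees the GPU and its VRAM.
# Assumes a ROCm-enabled PyTorch install; AMD GPUs appear via the torch.cuda API there.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)  # device 0 as an example
    vram_gb = props.total_memory / 1024**3
    print(f"Device: {props.name}, VRAM: {vram_gb:.1f} GB")
else:
    print("No GPU visible to this PyTorch build.")
```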



Surely it's 525W including the 75W the PCI-E slot can provide?
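For reference, a quick sketch of that arithmetic - reading the rumoured 450 W as connector-delivered power on top of the 75 W a PCIe x16 slot is specified to supply (the split is my assumption; the rumour only quotes the 450 W figure):

```python
# Back-of-the-envelope total board power, assuming the rumoured 450 W is what the
# auxiliary connectors deliver and the PCIe x16 slot adds its specified 75 W.
PCIE_SLOT_W = 75
AUX_CONNECTOR_W = 450  # rumoured figure, treated here as connector power only

print(f"Total available board power: {PCIE_SLOT_W + AUX_CONNECTOR_W} W")  # 525 W
```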
 
Yet a 500-watt 3090 Ti is OK?
No, they should have done the same thing for that too. They can say +##% performance per watt all they want, but it doesn't make much difference when the performance AND the watts are both increasing by a large proportion.
Maybe we'll plateau soon and people won't need the extra performance? Probably the wrong audience here, hah.
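To put rough numbers on that (all hypothetical, not real RDNA 3 or Ada figures): a healthy perf-per-watt uplift still means more absolute watts whenever the performance target grows faster than the efficiency gain, as in this sketch:

```python
# Hypothetical illustration of why "+50% performance per watt" can still mean
# higher board power: the performance target grows faster than the efficiency gain.
old_power_w = 300.0        # last-gen board power (made-up number)
perf_per_watt_gain = 1.5   # the advertised "+50% perf/W" (made-up number)
target_perf_gain = 2.0     # the generational performance jump being chased

new_power_w = old_power_w * target_perf_gain / perf_per_watt_gain
print(f"Board power needed for the target: {new_power_w:.0f} W")  # 400 W - up, not down
```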
 
The problem with efficiency for AMD is that it's actually limited to non-RT games, so in reality it's not as much of an efficiency win as it would initially seem. Then if we also start adding up differences with DLSS, they fall even further behind. So fundamentally it's not really a win, because it's still only the case in certain scenarios, and that's a much harder sell than if it were across the board - for an attribute that gamers in general still care little about, let's be honest.

The way AMD is positioning with RDNA 3 is such that it will hurt Nvidia where they care the most: the pocketbook, because AMD can afford to coast with RDNA 3 on higher margins & have Nvidia eat into theirs with those big beefy dies & systems. Unfortunately this doesn't really do much of anything for us as GPU buyers. So we have to wait for RDNA 4 (or whatever they call the new generation then) for them to actually step up, land some hard punches and maybe show something RT-wise, feature-wise, etc. Then again, will they actually commit to buying all those wafers for desktop graphics? That's the sad part, I think: CPUs will always get priority, so maybe it's Intel, with all their current failures, that still represents a bigger hope that things might change long-term. Who knows..
 
We don't know what RT performance is like yet though, do we? Or is this just an assumption based on the lack of fanfare for it so far?
 
If I get a GPU this time (if I can justify it), it will be AMD, as long as they are competitive to the point that they are very close to Nvidia's performance in each tier (except the 4090+ tier; I would never consider such an expensive card, so that tier is irrelevant to me). The much lower power consumption and heat production will be a huge bonus given what's going on at the moment.

I know Gamers Nexus did a video and praised the FE cooler, but my 3080 FE, with the CPU being cooled by a Noctua NH-D15S with two fans, really heats up my case (my NVMe SSD, X570 chipset etc.) and my room; much more than my GTX 1080 ever did. This is a case with good airflow too (Phanteks P600S).

Maybe Gamers Nexus should have tested with a bigger CPU air cooler because the FE design doesn't work that well for me.

Of course the 3080 heats up your room more than the 1080. It uses more power; it's simple thermodynamics. The object of a cooler is to remove heat from the card, and that heat goes into your room/case.

The current AMD cards, like the 6800XT, would also heat up your room/case more than your 1080 did.

And lastly, your case cooling: just change your setup so that you have an exhaust fan at the top of the case; the Ampere FE cards work best with that configuration. But if you are comparing case temps to what they were with your 1080, then any of the high-end cards from either company would heat up your case and room much more than your 1080 did. You would need some extra case cooling to deal with more power-hungry cards.
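As a rough illustration of the "heat goes into your room" point: essentially all of a card's board power ends up as heat, so the warming scales with the wattage. The board-power figures below are approximate reference-card values used only for the comparison:

```python
# Rough sketch: a GPU's board power ends up almost entirely as heat in the case/room.
# Board powers are approximate reference values, purely for illustration.
WATT_TO_BTU_PER_HOUR = 3.412

cards = {"GTX 1080": 180, "RTX 3080": 320, "rumoured 450 W card": 450}

for name, watts in cards.items():
    btu_per_hour = watts * WATT_TO_BTU_PER_HOUR
    kwh_per_3h_session = watts * 3 / 1000
    print(f"{name}: ~{btu_per_hour:.0f} BTU/h, ~{kwh_per_3h_session:.2f} kWh over a 3-hour session")
```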
 
I meant more than I expected given the power difference, and enough that I wouldn't want to put anything significantly more power-hungry with the same cooler design in my case, given the huge CPU cooler. I did briefly have a faulty 2080 Ti FE that I refunded, and that was much closer to the 1080 than the 3080 in terms of heating up my components. I also had an aftermarket Vega 64 which didn't cause problems either. Both consumed significantly more power than the 1080. Of course I acknowledge that any power-hungry card would heat up my room to this extent, and I already have three Noctua fans in the front, one in the rear and another in the top rear. With this setup I have the 3080 under control, but I am not sure I can handle a 400+ W card comfortably with the same FE cooler design.
 
The problem with efficiency for AMD is that it's actually limited to non-RT games, so in reality it's not as much of an efficiency win as it would initially seem. Then if we also start adding up differences with DLSS, they fall even further behind. So fundamentally it's not really a win, because it's still only the case in certain scenarios, and that's a much harder sell than if it were across the board - for an attribute that gamers in general still care little about, let's be honest.

The way AMD is positioning with RDNA 3 is such that it will hurt Nvidia where they care the most: the pocketbook, because AMD can afford to coast with RDNA 3 on higher margins & have Nvidia eat into theirs with those big beefy dies & systems. Unfortunately this doesn't really do much of anything for us as GPU buyers. So we have to wait for RDNA 4 (or whatever they call the new generation then) for them to actually step up, land some hard punches and maybe show something RT-wise, feature-wise, etc. Then again, will they actually commit to buying all those wafers for desktop graphics? That's the sad part, I think: CPUs will always get priority, so maybe it's Intel, with all their current failures, that still represents a bigger hope that things might change long-term. Who knows..

Exactly what I have been saying for a while now: in most ray tracing scenarios you have a 3070 matching a 6900 XT and a 3080 comfortably beating a 6900 XT, especially with Ampere's undervolting potential. AFAIC, ray tracing is the only area where AMD need to catch up on Nvidia now, but I don't think we'll see that until RDNA 4 or even 5.
 
Yeah, I feel like, despite not having used it myself, with Nvidia on their latest iteration and Intel also having a viable product for RT, they need to pull their finger out, tbh.
 
No, they should have done the same thing for that too. They can say +##% performance per watt all they want, but it doesn't make much difference when the performance AND the watts are both increasing by a large proportion.
Maybe we'll plateau soon and people won't need the extra performance? Probably the wrong audience here, hah.
The RTX 3090 Ti is a current-gen card, not next-gen... and you are OK with all of this?
 
Exactly what I have been saying for a while now: in most ray tracing scenarios you have a 3070 matching a 6900 XT and a 3080 comfortably beating a 6900 XT, especially with Ampere's undervolting potential. AFAIC, ray tracing is the only area where AMD need to catch up on Nvidia now, but I don't think we'll see that until RDNA 4 or even 5.
You have no idea what RDNA 3 ray tracing is like, but you have written it off already. Bet you'll also say miners will cause the shortages next time as well...
 

Hence the key word here:

AFAIC, ray tracing is the only area where AMD need to catch up on Nvidia now, but I don't think we'll see that until RDNA 4 or even 5.

Do you not find it a bit odd that the only thing we know about RDNA 3 ray tracing so far is AMD's comment that "it will be more advanced than RDNA 2"? If they were going to significantly improve it, why not say something like "it will be better than Ampere", the same way Intel have? Do you not agree that is a poor choice of words and doesn't show they have much confidence in it?

Then go based on history:

- How long did AMD take to catch up on DX11?
- How long did they take to catch up on raster?
- How long did they take to catch up on OpenGL? And they still aren't there yet.

Of course, they could come out and completely smash 40xx RT, and I hope so, as right now, if you care about ray tracing, we only have the one choice.
 
Troll fanboy gonna fanboy.

Butthurt boy grinding that axe again :cry:

They've not said a lot about anything in RDNA 3 yet; there's no real reason to single out RT in the list of things they haven't talked about.

See this bit:

Do you not find it a bit odd that the only thing we know about RDNA 3 ray tracing so far is AMD's comment that "it will be more advanced than RDNA 2"? If they were going to significantly improve it, why not say something like "it will be better than Ampere", the same way Intel have? Do you not agree that is a poor choice of words and doesn't show they have much confidence in it?

Then go based on history:

- How long did AMD take to catch up on DX11?
- How long did they take to catch up on raster?
- How long did they take to catch up on OpenGL? And they still aren't there yet.

Of course, they could come out and completely smash 40xx RT, and I hope so, as right now, if you care about ray tracing, we only have the one choice.

Like I said, I'm not stating that this will happen; I'm simply saying that, based on history and their wording, it doesn't exactly fill me with confidence.
 
I understand your reasoning - in the absence of any other information, of course most people will assume that the situation is the same. But let's see what they talk about when they do start talking about RDNA 3 more - if they downplay RT then I think assumptions of not being market leaders could be correct, but I don't think their strategy of not hyping RDNA 3 just yet says anything specific about RT.
 