
The RTX 3080 TI isn't going to be- EDIT - No We Were All Wrong!

Soldato
OP
Joined
30 Jun 2019
Posts
7,876
Where did you read that?

Even if it ends up being 200w, that would still leave plenty of scaling room for other RDNA 2 GPUs.

Another possibility is that the console RDNA 2 GPUs will be more power efficient than the desktop GPUs, due to cut down features or more efficient / simpler design.
 
Last edited:
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
I did some quick and dirty percentage calculations based on the Series X SOC image, and it seems that the GPU part of the image is only about 47.4% of the size of the whole SOC image, which would equate to approx 171 mm² (out of a total 360.4mm²). That assumes the 'SOC Fabric Coherency G6 MCs' aren't counted as part of the GPU itself; I also haven't included the GDDR6 memory or the 'HW Acel' ray tracing hardware.

If someone with Photoshop can do a better job, please go ahead. I just used MS Paint and a calculator to work out the total number of pixels.

If 171 mm² is about right, then it's quite a small GPU compared to the 251 mm² die size of the RX 5700 XT. I'm unsure what conclusions to draw from that!
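
For anyone who wants to redo the MS Paint exercise, the arithmetic is just a pixel-count ratio scaled by the published die size. The pixel counts below are placeholder values, not measurements from the actual die shot - swap in your own.

```python
# Rough GPU area estimate from a labelled SoC die shot.
# The pixel counts are placeholders - measure your own from the image.
TOTAL_DIE_MM2 = 360.4        # published Series X SoC die size

gpu_pixels = 47_400          # pixels covered by the GPU blocks (example value)
total_pixels = 100_000       # pixels covered by the whole SoC (example value)

gpu_fraction = gpu_pixels / total_pixels
gpu_area_mm2 = gpu_fraction * TOTAL_DIE_MM2
print(f"GPU is ~{gpu_fraction:.1%} of the die, ~{gpu_area_mm2:.0f} mm2")
# With these example numbers: ~47.4% of the die, ~171 mm2
```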

EDIT - Maybe seems too small. I wonder if some parts of the SOC overlap other parts :confused:
 
Last edited:
Soldato
Joined
26 Sep 2010
Posts
7,178
Location
Stoke-on-Trent
I'm unsure what conclusions to draw from that!
A GPU smaller than the 5700 XT with 25% more and better CUs offers around 2080 Super performance under 2GHz and 200W? There's only 1 conclusion, surely?

However, to paraphrase the latest Moore's Law Is Dead video, 1 of 3 things is going on:
  1. RDNA 2 is astounding and will really bring competition back to Nvidia in a huge way, perhaps even to levels that will make Nvidia sweat
  2. Microsoft and Sony are outright lying about their respective performance
  3. Microsoft and Sony are not lying, and if discrete RDNA 2 does not mirror console performance then AMD are incompetent at building cards. Great tech designers, but useless implementers. Allegedly Sony engineers have already said a few times they're better at designing Radeon GPUs than Radeon itself.
Take it for what it's worth.
 
Associate
Joined
20 Oct 2011
Posts
665
A GPU smaller than the 5700 XT with 25% more and better CUs offers around 2080 Super performance under 2GHz and 200W? There's only 1 conclusion, surely?

However, to paraphrase the latest Moore's Law Is Dead video, 1 of 3 things is going on:
  1. RDNA 2 is astounding and will really bring competition back to Nvidia in a huge way, perhaps even to levels that will make Nvidia sweat
  2. Microsoft and Sony are outright lying about their respective performance
  3. Microsoft and Sony are not lying, and if discrete RDNA 2 does not mirror console performance then AMD are incompetent at building cards. Great tech designers, but useless implementers. Allegedly Sony engineers have already said a few times they're better at designing Radeon GPUs than Radeon itself.
Take it for what it's worth.

But I would have thought the consoles have an advantage, with everything being on board - fewer limiting factors and better access to faster RAM.
 
Associate
Joined
5 Sep 2014
Posts
289
Consoles have a big disadvantage of having to fit into a relatively small box: it can't be too loud, the cooling solution can't be very expensive, and the CPU is part of the TDP (not separate like on desktop) - a lot of handicaps!
 
Associate
Joined
20 Oct 2011
Posts
665
Consoles have a big disadvantage of having to fit into a relatively small box: it can't be too loud, the cooling solution can't be very expensive, and the CPU is part of the TDP (not separate like on desktop) - a lot of handicaps!

There must be some big advantages to the architecture of the hardware, considering the GPU in a PS4 can easily outperform a similar GPU on a PC.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
Well, it seems like good news if the Series X GPU really is that small. But even with the Radeon VII 7nm Vega GPU, AMD had no problems with die size (just 331 mm²).

I wonder if the die size limits for AMD are about the same as with Nvidia? So, around 800mm².

It looks like the only limiting factor for RDNA 2 desktop GPUs will be the TDP. Presumably, they could eventually build a single GPU with a massive amount of transistors (say 40 billion) if the design was power efficient enough.
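
As a rough sanity check on the "40 billion transistors" idea, here's some napkin maths using the RX 5700 XT's published figures (10.3 billion transistors on 251 mm²) as a density reference. It's only an illustration - RDNA 2's actual transistor density will likely differ.

```python
# Back-of-envelope transistor budgets, assuming RX 5700 XT density.
XT_TRANSISTORS = 10.3e9   # RX 5700 XT, published figure
XT_AREA_MM2 = 251         # RX 5700 XT die size, published figure

density = XT_TRANSISTORS / XT_AREA_MM2   # ~41 million transistors per mm2

for die_mm2 in (360, 500, 800):
    print(f"{die_mm2} mm2 die -> ~{density * die_mm2 / 1e9:.0f} billion transistors")

# At this density, 40 billion transistors would need ~975 mm2 of die area,
# so reaching that number would also depend on a density improvement.
print(f"Area needed for 40B transistors: ~{40e9 / density:.0f} mm2")
```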
 
Soldato
Joined
6 Feb 2019
Posts
17,794
Well, it seems like good news if the Series X GPU really is that small. But even with the Radeon VII 7nm Vega GPU, AMD had no problems with die size (just 331 mm²).

I wonder if the die size limits for AMD are about the same as with Nvidia? So, around 800mm².

It looks like the only limiting factor for RDNA 2 desktop GPUs will be the TDP. Presumably, they could eventually build a single GPU with a massive amount of transistors (say 40 billion) if the design was power efficient enough.

AMD has to stick to smaller dies to cut costs; making large 600-800mm² dies like Nvidia would eat up a lot of margin.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
Makes sense - a bigger die means more transistors crammed in. I suspect they will stick to around 500mm² as a maximum, as they have never made a larger die than that.

And RDNA 3 will probably follow similar rules, but with multiple GPUs per graphics card.

Regarding the TDP of desktop GPUs, I wonder how much power the ray tracing hardware will require for RDNA 2? If it's something like 50w, that could be pretty crippling for the performance of desktop GPUs. I'd guess 20w.

Apparently, the max TDP is around 375w for a graphics card with two 8-pin connectors, so maybe an RDNA 2 GPU with over 300w is possible.
Dual GPU HD 6990 here:
https://www.techpowerup.com/gpu-specs/radeon-hd-6990.c275
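
The 375w figure comes from the PCIe power budget - roughly 75w from the slot plus 150w per 8-pin connector, as the quick sum below shows.

```python
# Approximate board power budget for a card with two 8-pin connectors.
SLOT_W = 75        # PCIe x16 slot
EIGHT_PIN_W = 150  # per 8-pin PEG connector

print(f"Max board power: ~{SLOT_W + 2 * EIGHT_PIN_W} W")   # ~375 W
```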

It kinda makes me doubt the need for a new PSU connector type for Nvidia GPUs, especially since a lot of half-decent power supplies (like the Seasonic Focus series) have 4 x 6+2-pin PCIe connectors.

I think the only reason not to go over 300w TDP is GPU temperature. The 5700 XT typically hits ~76 degrees Celsius under load @ 225 watts, which isn't exactly cool.

If temperature scales with total power, a similar RDNA v1 GPU with a TDP of 300w could hit 101 degrees (76 + 33.3%). Apparently the 5700 XT throttles its performance if it reaches 110 degrees, i.e. pretty damn hot.

So, I think an RDNA 2 GPU with a TDP of 300w would have similarly high temperatures, perhaps between 90-100 degrees. If increased to a 320w TDP, it could reach 108 degrees (76 + 42.2%). I'd say that's too high, and could reduce the GPU's lifespan and potentially increase product return rates.
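
For what it's worth, those estimates assume temperature scales linearly with board power on the same cooler - a big simplification, but it reproduces the numbers above.

```python
# Linear temperature-vs-power scaling (simplifying assumption, not how
# real cards behave) using the RX 5700 XT as the reference point.
REF_TEMP_C = 76    # typical RX 5700 XT load temperature
REF_POWER_W = 225  # RX 5700 XT board power

for power_w in (300, 320):
    print(f"{power_w} W -> ~{REF_TEMP_C * power_w / REF_POWER_W:.0f} C")
# 300 W -> ~101 C, 320 W -> ~108 C (reported throttle point is 110 C)
```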

Who knows though, perhaps they've made some improvements to heat dissipation / cooling with RDNA 2 GPUs.
 
Last edited:
Soldato
Joined
26 Sep 2010
Posts
7,178
Location
Stoke-on-Trent
That meme died on AMD's first 7nm products; the massive die shrink helped their CPUs and GPUs to lose the heat
Not the GPUs. Radeon VII was still a bit warm and there was no end of complaining - both warranted and rabid foaming - about the 5700 XT. As recently as a few months ago I've seen posts in forums about undervolting Navi to bring thermals down.
 
Associate
Joined
12 Jul 2020
Posts
288
The power requirements for the 3090 sound insane, and so does the pricing. I'm hoping the 3080 has the nice middle ground of wattage, performance and pricing.
 
Associate
Joined
12 Jul 2020
Posts
288
Depending on how it all shapes up (and what AMD does), I can see around the 3070 being the card for me.
AMD probably won't match Nvidia in pure performance, but if I could get something at a far better price, then I'll be happy. I'm worried the Nvidia cards will be too expensive for what they are.
 
Caporegime
Joined
8 Nov 2008
Posts
29,034
AMD probably won't match Nvidia in pure performance, but if I could get something at a far better price, then I'll be happy. I'm worried the Nvidia cards will be too expensive for what they are.

Agreed. The only problem is that if I get an AMD card, then as my monitor is G-SYNC only, I'll be without that glorious smoothness. I'm wondering what it would be like to go without that benefit for a while...
 
Soldato
Joined
21 Jul 2005
Posts
20,148
Location
Officially least sunny location -Ronskistats
AMD probably won't match Nvidia in pure performance, but if I could get something at a far better price, then I'll be happy. I'm worried the Nvidia cards will be too expensive for what they are.

It's a fallacy though, isn't it: when AMD do bring a card equally as fast on performance - at a better price - people still sway to Nvidia (if they weren't completely brand loyal) due to the mindshare. The latter point has been answered for many a generation, let's be honest here.
 