
The RTX 3080 TI isn't going to be- EDIT - No We Were All Wrong!

Soldato
Joined
28 May 2007
Posts
10,073
As good as the new consoles look on paper, there's the old saying that 'paper burns'. Just to play devil's advocate: if all games coming out have DLSS 2.0/3.0 support baked in, didn't the 2060 see gains of around 60%? Doesn't that mean 2000-series cards may end up leapfrogging the high-end RDNA2 cards at a significantly lower cost?

It's probably not going to be a big thing, as there is now DirectML which, being part of DirectX, should be the go-to machine learning technique. It's also on the new Xbox, and Sony have a patent for machine learning too. Most big AAA games on PC are on console as well, so I can see DLSS having to take a back seat, being an NV-only tech. My real question is whether DirectML and its Sony counterpart will be up to the standard of DLSS 2.0/3.0.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
What do you reckon about the high-end desktop RDNA 2 GPU having a TDP of 300 W - could it reach temps of 90-100 degrees?

I've based that on two assumptions:
1. That the temperature per watt will be similar to the 5700 XT.
2. That an increase from 225 W to 300 W results in a proportional increase in temps.

So, if that's true, then the main constraint on performance for the RDNA 2 GPU is temperature, since we know over 300 W is possible, even with just 2 x 8-pin PCIe connectors. A quick back-of-the-envelope sketch of that in Python is below - note the 5700 XT baseline (~225 W, ~75 degrees under load) is my own assumption, not measured data:
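# Rough linear temperature-vs-power scaling (assumption 2 above).
# Baseline is assumed: 5700 XT reference at ~225 W and ~75 C under load.
def scaled_temp(base_temp_c, base_watts, new_watts):
    return base_temp_c * (new_watts / base_watts)

print(round(scaled_temp(75, 225, 300)))  # ~100 C, in line with the 90-100 guess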
 
Last edited:
Associate
Joined
29 Aug 2013
Posts
1,176
What do you reckon about the high-end desktop RDNA 2 GPU having a TDP of 300 W - could it reach temps of 90-100 degrees?

I've based that on two assumptions:
1. That the temperature per watt will be similar to the 5700 XT.
2. That an increase from 225 W to 300 W results in a proportional increase in temps.

So, if that's true, then the main constraint on performance for the RDNA 2 GPU is temperature, since we know over 300 W is possible, even with just 2 x 8-pin PCIe connectors.

Depends if they go with the blower cooler again on reference cards. Hopefully they change it this time around.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
AMD Radeon's VP has said that they are scrapping the blower design for RDNA 2 reference graphics cards, in favour of 'a design with two or three fans'.

Link: https://www.guru3d.com/news-story/confirmed-rdna-2-reference-cards-do-not-get-a-blower-cooler.html

It certainly sounds like it will need a very good air cooler, especially since third-party designs won't be available at launch. My PowerColor R9 390 has a three-fan design and it's always worked really well, so I'm glad the design has been standardised. Still, that might only improve temps by 5-10 degrees at load, so I still think heat will be the main limit on RDNA 2 performance.
 
Last edited:
Associate
Joined
7 Apr 2017
Posts
1,762
It's probably not going to be a big thing, as there is now DirectML which, being part of DirectX, should be the go-to machine learning technique. It's also on the new Xbox, and Sony have a patent for machine learning too. Most big AAA games on PC are on console as well, so I can see DLSS having to take a back seat, being an NV-only tech. My real question is whether DirectML and its Sony counterpart will be up to the standard of DLSS 2.0/3.0.

That seems like a real shame, as it looks like a genuine game changer and the FPS benefits are huge. Not using that extra hardware on the cards is such a waste...
 
Soldato
Joined
28 May 2007
Posts
10,073
What do you reckon about the high-end desktop RDNA 2 GPU having a TDP of 300 W - could it reach temps of 90-100 degrees?

I've based that on two assumptions:
1. That the temperature per watt will be similar to the 5700 XT.
2. That an increase from 225 W to 300 W results in a proportional increase in temps.

So, if that's true, then the main constraint on performance for the RDNA 2 GPU is temperature, since we know over 300 W is possible, even with just 2 x 8-pin PCIe connectors.
That seems like a real shame, as it looks like a genuine game changer and the FPS benefits are huge. Not using that extra hardware on the cards is such a waste...

Well, I assume Nvidia cards could still use the extra hardware with DirectML if it's similar, and the boosts should be similar if it's any good. Nvidia won't drop DLSS anyhow, and they have big pockets, so I still think it will be used. I just don't think it will be a massive advantage if AMD have something similar to use, which should in theory have more uptake due to both vendors being able to use it.
 
Associate
Joined
7 Apr 2017
Posts
1,762
Well, I assume Nvidia cards could still use the extra hardware with DirectML if it's similar, and the boosts should be similar if it's any good. Nvidia won't drop DLSS anyhow, and they have big pockets, so I still think it will be used. I just don't think it will be a massive advantage if AMD have something similar to use, which should in theory have more uptake due to both vendors being able to use it.

Well, looking at the rumoured pricing (which I'm sure Nvidia leaked intentionally to test market reaction), AMD need to bring it from a performance perspective.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
Looks like my earlier predictions were mostly accurate - that the clock rate for the top-end 3000-series RTX card would be up to 1700 MHz, meaning a total of around 17 TFlops of processing power.

What I actually said:
"the base model RTX 3080 TI will have a very similar boost clock to the current gen. Titan RTX (1770 MHz)"

The shader unit count is apparently a bit higher, at 5248, with a clock speed of 1695 MHz, so a total of 17.79 TFlops. It looks like it's getting slightly more than the usual 768-shader-unit increase over the last generation's 2080 TI.

Based on the rumoured clock rate, I've revised my earlier predicted pixel and texture rates for Nvidia's flagship GPU:

Pixel rate:
120 ROPs x 1695 MHz = 203.4 GPixel/s

Texture rate:
324 TMUs x 1695 MHz = 549.2 GTexel/s

In terms of theoretical performance, it works out at 30.6-49.5% higher than the RTX 2080 TI.

The specs of the RTX 3080 are '4352 cores clocked at 1710MHz', so a total of 14.88 TFlops - a bit higher than the RTX 2080 TI's 13.45 TFlops. Looks like an appealing, but expensive, GPU. For anyone who wants to check the arithmetic, here's a quick Python sketch of how those figures fall out (the 120 ROP and 324 TMU counts are still my own predictions, not leaked specs):
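# FP32 TFLOPS = shader units x 2 ops per clock (FMA) x clock in GHz
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1e6

# Fill rates = units x clock; MHz in, GPixel/s or GTexel/s out
def fill_rate(units, clock_mhz):
    return units * clock_mhz / 1000

print(tflops(5248, 1695))    # ~17.79 TFlops (leaked top-card spec)
print(tflops(4352, 1710))    # ~14.88 TFlops (leaked RTX 3080 spec)
print(tflops(4352, 1545))    # ~13.45 TFlops (RTX 2080 TI, for comparison)
print(fill_rate(120, 1695))  # ~203.4 GPixel/s (assumed 120 ROPs)
print(fill_rate(324, 1695))  # ~549.2 GTexel/s (assumed 324 TMUs)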

All the above only matters though if the info found on this website turns out to be true:
https://www.kitguru.net/components/...090-and-3080-specifications-have-been-leaked/
 
Last edited:
Soldato
Joined
6 Feb 2019
Posts
17,795
Looks like my earlier predictions were mostly accurate - that the clock rate for the top-end 3000-series RTX card would be up to 1700 MHz, meaning a total of around 17 TFlops of processing power.

What I actually said:
"the base model RTX 3080 TI will have a very similar boost clock to the current gen. Titan RTX (1770 MHz)"

The shader unit count is apparently a bit higher, at 5248, with a clock speed of 1695 MHz, so a total of 17.79 TFlops. It looks like it's getting slightly more than the usual 768-shader-unit increase over the last generation's 2080 TI.

The specs of the RTX 3080 are '4352 cores clocked at 1710MHz', so a total of 14.88 TFlops - a bit higher than the RTX 2080 TI's 13.45 TFlops.

It'll be interesting to see how much processing power it has once an overclock is put on the card - the other day the Nvidia record was set by overclocking a 2080 TI to 2900 MHz - the output was 28 TFlops.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
It's interesting, because they've needed 50 W more for the RTX 3090 to achieve approx. 17 TFlops than the 300 W I thought they'd need (i.e. three quarters of the Ampere A100's power). Power efficiency seems to be one of the main problems for Nvidia, and it seems likely that heat will be AMD's main constraint.

EDIT - Having said that, the load temps of the RTX 2080 TI are already ~74 degrees Celsius. If heat scales up roughly with TDP, the RTX 3090 could run super hot too at 350 W. I suppose it depends how much the cooling has improved since the last gen. Applying the same rough proportional scaling as before, with the 2080 TI as the baseline (its ~260 W board power is my assumption):
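# Linear temp-vs-power guess: 2080 TI FE at ~260 W, ~74 C under load
def scaled_temp(base_temp_c, base_watts, new_watts):
    return base_temp_c * (new_watts / base_watts)

print(round(scaled_temp(74, 260, 350)))  # ~100 C if the cooler didn't improve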

Apparently, Nvidia has redesigned the cooler and graphics card layout for the RTX 3090 (maybe others too). There's a bit of info here:
https://www.theverge.com/2020/8/26/21402321/nvidia-rtx-3090-design-12-pin-connector-cooling
 
Last edited:
Soldato
Joined
15 Oct 2019
Posts
11,832
Location
Uk
My real question is whether DirectML and its Sony counterpart will be up to the standard of DLSS 2.0/3.0.
Probably not, but then if it's available in all games, whereas DLSS is only in 5-10% of them, it could boost AMD's performance in all titles while Nvidia only benefit in a handful.
 
Soldato
Joined
6 Feb 2019
Posts
17,795
It's interesting, because they've needed 50 W more for the RTX 3090 to achieve approx. 17 TFlops than the 300 W I thought they'd need (i.e. three quarters of the Ampere A100's power). Power efficiency seems to be one of the main problems for Nvidia, and it seems likely that heat will be AMD's main constraint.

EDIT - Having said that, the load temps of the RTX 2080 TI are already ~74 degrees Celsius. If heat scales up roughly with TDP, the RTX 3090 could run super hot too at 350 W. I suppose it depends how much the cooling has improved since the last gen.

My ASUS Strix 2080 TI is overclocked, draws 330 W, and doesn't go over 65 degrees.

Nvidia's Founders Edition coolers might suck, but AIB air coolers are definitely able to handle 350 W.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
I wonder if the RTX 3080 will be closer to £700 or £800 in the UK... Either way, too rich for my blood. Still, it's an improvement over the $1,199 launch price of the (probably) slightly worse-performing RTX 2080 TI.

I'm not sure I'll be able to afford the flagship RDNA 2 GPU either, if they price it between £550 and £600. I wonder if they could release a version of the flagship GPU with lower-spec ray tracing hardware, to cut production costs?

I think the RTX 3080 will have a very similar core config to the RTX 2080 TI (the shader count and the number of TMUs and ROPs will be similar, probably slightly higher). It uses more power too, which I think increases the likelihood of it performing slightly better in general, unless the extra 70 W is needed for improved ray tracing hardware.
 
Last edited:
Soldato
Joined
15 Oct 2019
Posts
11,832
Location
Uk
My ASUS Strix 2080 TI is overclocked, draws 330 W, and doesn't go over 65 degrees.

Nvidia's Founders Edition coolers might suck, but AIB air coolers are definitely able to handle 350 W.
The 350 W figure is probably just stock, with an OC taking them well over 400 W.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
That's a good point, and it would certainly explain why they might opt for a power connector capable of delivering more watts.

If they can clock it to 2000 MHz, you'd be looking at a 20.99 TFlop GPU. That figure follows from the same formula as before, assuming the leaked 5248-shader count holds:
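# FP32 TFLOPS at a hypothetical 2000 MHz overclock
shaders = 5248    # leaked top-card shader count
clock_mhz = 2000  # hypothetical OC
print(shaders * 2 * clock_mhz / 1e6)  # ~20.99 TFlops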
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
Yeah, fair enough, it says this:

NVIDIA CUDA® Cores 10496

on the official specs here:
https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/

I'm quite surprised that the number of shader units is more than double that of the RTX 2080 TI.

Assuming it's correct, 8K gaming might actually be possible.

NV did a good job hiding the specs for the RTX 3000 series; I don't think anyone knew the top card would have more than 2x the shader count of the last gen. Plugging the confirmed core count into the same formula shows why that matters (the ~1695 MHz boost clock is carried over from the earlier leak, so treat it as an assumption):
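# FP32 throughput with the official RTX 3090 core count
print(10496 * 2 * 1695 / 1e6)  # ~35.58 TFlops, vs 13.45 for the RTX 2080 TI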
 
Last edited: