RDNA 3 rumours Q3/4 2022

From what we can see of the 4080 leaks, the 12GB is on par with a 3090 and on average about 15% faster than a 3080 12GB, or 20% faster than the 3080 10GB. The 4080 16GB is ~50% faster than a 3080 10GB and 72% more expensive.

3080 10GB was $699 MSRP (availability aside)
4080 12GB is $899 MSRP

So an almost 30% increase in MSRP for 2 extra GB of VRAM and a ~20% increase in performance.
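A quick back-of-envelope check of that claim, as a minimal Python sketch (the MSRPs and uplift figures are the leaked numbers above, not measurements):

```python
# Price/perf sanity check for the leaked 4080 12GB numbers.
msrp_3080_10gb = 699          # USD launch MSRP
msrp_4080_12gb = 899          # USD announced MSRP

price_increase = msrp_4080_12gb / msrp_3080_10gb - 1
print(f"MSRP increase: {price_increase:.1%}")              # ~28.6%

perf_increase = 0.20          # leaked average uplift over the 3080 10GB
perf_per_dollar = (1 + perf_increase) / (1 + price_increase)
print(f"Perf/$ vs 3080 10GB: {perf_per_dollar:.2f}")       # ~0.93, i.e. worse value

# Same check for the 4080 16GB (~50% faster, 72% pricier per the leaks)
print(f"4080 16GB perf/$ vs 3080 10GB: {1.50 / 1.72:.2f}") # ~0.87
```

In other words, taking the leaks at face value, both 4080s deliver less performance per dollar than the card they replace.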

It's not even a 20% improvement: looking at aggregate benchmarks, the 3080 has 53% of the performance of the 4090 at 4K.

The 4080 12GB has only 47% of the CUDA cores of the 4090, and they run at the same clock, so you'd expect it to have 47% of its performance.

There's no way the 4080 12GB is going to be 20% faster than a 3080. At best they are equal, but more likely the 3080 is going to beat it by a few percent.
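For what it's worth, here's that scaling argument as a minimal sketch. It assumes performance scales linearly with CUDA core count at equal clocks, which is the premise above rather than a guarantee (architectural changes and the larger L2 could push the real number up):

```python
# Naive linear scaling from CUDA core counts, per the argument above.
rel_cores_4080_12gb = 0.47    # 4080 12GB has ~47% of the 4090's CUDA cores
rel_perf_3080_4k    = 0.53    # 3080 sits at ~53% of the 4090 in 4K aggregates

# Same clocks assumed, so expected relative performance == relative cores.
expected_vs_3080 = rel_cores_4080_12gb / rel_perf_3080_4k - 1
print(f"Expected 4080 12GB vs 3080: {expected_vs_3080:+.1%}")  # ~ -11%
```

Under pure linear scaling the 3080 comes out roughly 11% ahead, which is why the quoted 20% uplift looks implausible.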
 
Comically, before the Ada reveal, the "only 10% faster" rumour was mocked big time. Yet you could have had that 4080 two years ago with double the VRAM. Not sure why anyone with a top-tier Ampere card would even consider the 4080s.

Not rocket science.

- 3090 10-15% faster for an extra £750

Bargain.....

That's why most 3080-and-above owners won't look at the 4080 12GB; the 4080 16GB, on the other hand... Also, the 4080 12GB will perform significantly better in future RT games, especially where SER is used, bringing the difference to more than "10%", not to mention when throwing in DLSS 3/frame generation. Just as we witnessed with Ampere/3080 vs RDNA 2/6800 XT, this will be more beneficial across many games than the extra VRAM.

So yeah bit of a difference.....
 
It's not even a 20% improvement: looking at aggregate benchmarks, the 3080 has 53% of the performance of the 4090 at 4K.

The 4080 12GB has only 47% of the CUDA cores of the 4090, and they run at the same clock, so you'd expect it to have 47% of its performance.

There's no way the 4080 12GB is going to be 20% faster than a 3080. At best they are equal, but more likely the 3080 is going to beat it by a few percent.

This, those Nvidia slides are BS.
 
Nvidia have gone all in on DLSS, it seems.
The same slide also says the 4090 is 70% faster than the 3090Ti.

Most reviewers have it at 45% to 60%.

We need to stop taking vendor slides as gospel.

This has always been the case, and AMD are guilty of this as well. TPU's 4K relative performance shows the 4090 is 45% faster than the 3090 Ti, yet the Nvidia slides would have you believe it is 70% faster, from a likely cherry-picked selection. Their deltas between the 3070 > 3080 > 3090 Ti are about right, but then again they have to be, to lend credibility to their 4090 performance claims. It's an old marketing trick: sprinkle your lies with some truth dust and it lends them credibility. Hey, those 3070 vs 3080 numbers are legit, so the 4090 numbers must be true.

But what this does mean is that the $899 4080 12GB is very poor value for money compared to the 3080's MSRP. It only looks like a value proposition if people see it as 3090 performance for a lot less money.
 
[Image: amd.jpg]

The current leaks/rumors so far.

Well, if true, I did well going for a 3080 Ti at £575, as $799 will mean around £900 :cry:

AMD need to do better than that with pricing. It is their opportunity to claw back A LOT of market share.
 
Nvidia have gone all in on DLSS, it seems.


This has always been the case, and AMD are guilty of this as well. TPU's 4K relative performance shows the 4090 is 45% faster than the 3090 Ti, yet the Nvidia slides would have you believe it is 70% faster, from a likely cherry-picked selection. Their deltas between the 3070 > 3080 > 3090 Ti are about right, but then again they have to be, to lend credibility to their 4090 performance claims. It's an old marketing trick: sprinkle your lies with some truth dust and it lends them credibility. Hey, those 3070 vs 3080 numbers are legit, so the 4090 numbers must be true.

But what this does mean is that the $899 4080 12GB is very poor value for money compared to the 3080's MSRP. It only looks like a value proposition if people see it as 3090 performance for a lot less money.

Think about where Nvidia see themselves when they make a new ##80-class card that's no better than the one it replaced, and jack the price up 30% to make the current ##80-class card, which is still retailing on US sites right now for $800, look like good value.

Nvidia really are taking us for ####s, that's what they think of us.
 
It's not even a 20% improvement: looking at aggregate benchmarks, the 3080 has 53% of the performance of the 4090 at 4K.

The 4080 12GB has only 47% of the CUDA cores of the 4090, and they run at the same clock, so you'd expect it to have 47% of its performance.

There's no way the 4080 12GB is going to be 20% faster than a 3080. At best they are equal, but more likely the 3080 is going to beat it by a few percent.

Oh absolutely, it's astonishing how people can't comprehend data. It can stare them right in the face and they just ignore it. I was just demonstrating that Nvidia's own promotional marketing has the 4080 12GB barely faster than a 3080 10GB. If Nvidia are pricing this mid-range tat at those prices, then they are either banking on brand loyalty (aka stupidity), or they know/think (delete as appropriate) AMD's 7x00 is so bad that Nvidia can get away with these prices.
 
It's not even a 20% improvement: looking at aggregate benchmarks, the 3080 has 53% of the performance of the 4090 at 4K.

The 4080 12GB has only 47% of the CUDA cores of the 4090, and they run at the same clock, so you'd expect it to have 47% of its performance.

There's no way the 4080 12GB is going to be 20% faster than a 3080. At best they are equal, but more likely the 3080 is going to beat it by a few percent.

4080 12G vs 3080
+21.7% / +11.5% / +24%
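Averaging those three slide deltas, a naive mean of Nvidia's own claimed uplifts:

```python
# The three 4080 12GB vs 3080 deltas from Nvidia's slide.
deltas = [0.217, 0.115, 0.240]
print(f"Mean claimed uplift: {sum(deltas) / len(deltas):.1%}")  # ~19.1%
```

So even Nvidia's own marketing averages out to under 20%.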
 

4080 12G vs 3080
+21.7% / +11.5% / +24%

The real head scratcher is that those are Nvidia’s own marketing numbers and they are always inflated.

So are we likely to see the 3080 replacement offer no tangible benefits other than an extra 2GB VRAM and DLSS 3.0 in a handful of games?
 
The real head scratcher is that those are Nvidia’s own marketing numbers and they are always inflated.

So are we likely to see the 3080 replacement offer no tangible benefits other than an extra 2GB VRAM and DLSS 3.0 in a handful of games?
Probably. They will lean on RT and DLSS 3 heavily, plus the MSRP for the 3080 was never really accurate in the first place, so I wouldn't be surprised if it turns out like this. Though I reckon the 7800 XT will outshine it, so maybe a year in we'll see a better variant for the same money, à la the Turing Supers.
 
The same slide also says the 4090 is 70% faster than the 3090Ti.

Most reviewers have it at 45% to 60%.

We need to stop taking vendor slides as gospel.
It also shows the 3090 Ti as 30% faster than a 3080 10GB when on average it's only 25%, so I'd say the 20% is a best-case scenario, and I bet in some games it'll be as little as 5-10%.
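Sketching that correction in Python (my extrapolation, assuming the slide overstates the 4080 12GB by the same factor it overstates the 3090 Ti):

```python
# Deflate Nvidia's claimed 4080 12GB uplift by the slide's measured bias.
claimed_3090ti  = 1.30   # slide: 3090 Ti vs 3080 10GB
measured_3090ti = 1.25   # typical review average
bias = measured_3090ti / claimed_3090ti            # ~0.96

claimed_4080_12gb = 1.20                           # slide: 4080 12GB vs 3080
print(f"Deflated uplift: {claimed_4080_12gb * bias - 1:.1%}")  # ~15.4%
```

That lands around 15% on average, which fits the "best case 20%, sometimes 5-10%" reading.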
 
Think about where Nvidia see themselves when they make a new ##80-class card that's no better than the one it replaced, and jack the price up 30% to make the current ##80-class card, which is still retailing on US sites right now for $800, look like good value.

Nvidia really are taking us for ####s, that's what they think of us.
Tbf, they've been doing this at the low end of the market for a number of years: releasing cards that offer slightly better performance but cost more, so they work out to the same speed and price as the previous gen's tier above. But now it seems like they want to take it to a new extreme and not even give you the same performance for the same price as was previously available.
 
Where did you get that number from? In CPUs, cache uses a majority of all transistors. In Ada, cache takes almost a third of the whole chip, but it's more densely packed with transistors than the rest of the logic.
I used the 6T-per-bit assumption, so for 96 MB that works out to about 4.6 billion. The percentage contribution in CPUs is higher because of the base effect, as CPUs are puny compared to GPUs.
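The arithmetic, as a minimal sketch (decimal megabytes and bare 6T cells assumed; as noted below, real SRAM arrays also carry tag/decode/sense logic on top). The 76-billion figure is the AD102 total quoted later in the thread:

```python
# Rough transistor count for Ada's 96 MB L2, counting only 6T SRAM cells.
cache_bits  = 96e6 * 8             # 96 MB (decimal) in bits
transistors = cache_bits * 6       # classic 6T SRAM cell
print(f"~{transistors / 1e9:.1f}B transistors")            # ~4.6B
print(f"~{transistors / 76e9:.0%} of AD102's ~76B total")  # ~6%
```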
From what we can see of the 4080 leaks, the 12GB is on par with a 3090 and on average about 15% faster than a 3080 12GB, or 20% faster than the 3080 10GB. The 4080 16GB is ~50% faster than a 3080 10GB and 72% more expensive.

3080 10GB was $699 MSRP (availability aside)
4080 12GB is $899 MSRP

So an almost 30% increase in MSRP for 2 extra GB of VRAM and a ~20% increase in performance.

Frankly, at that utterly turgid price/perf, AMD don't need to beat the 76-billion-transistor 4090, because the overpriced muck that Nvidia are shovelling as their lower tiers is all they need to beat. And from the leaks given so far, it is a very low bar indeed.

If I wanted 3090 performance I could get a used one for a lot less than a new 4080 12GB. I think I can live without DLSS 3.0, thanks.

So can we please dispense with the idea that the only thing that matters is RTX 4090 vs 7900 XT, because most of us are far more interested in the mid tier, and it is clear Nvidia are fleecing us at the price/perf on offer. I have a sneaking suspicion AMD will offer only marginally better price/perf.
AMD is rumoured to follow in Nvidia's footsteps as far as product segmentation is concerned: the 7800 will be using a step-down die and N31 is reserved exclusively for the 7900, so maybe we'll land in the same or a marginally better price/perf bracket, as you suggested.
 
Considering that the 4000 series is bottlenecked at 1440p because Nvidia doesn't have a hardware scheduler and uses the CPU for that, AMD could handily beat Nvidia at 1440p and lower resolutions. If the marketing team is smart, they will push the fact that they're better at those resolutions, since more people are running those resolutions than 4K.


4080 12G vs 3080
+21.7% / +11.5% / +24%
The more information that comes out about die sizes and performance of the 4000 series, the stupider it looks for AMD to follow Nvidia's pricing lead. I guess we will soon know how stupid AMD is or isn't.
 
Agreed, hence my remark that I suspect AMD will be only marginally better. I think they will be quicker than Nvidia's mid tier; they literally can't fail to be, considering how poor that 4080 12GB looks.
 
I used the 6T-per-bit assumption, so for 96 MB that works out to about 4.6 billion. The percentage contribution in CPUs is higher because of the base effect, as CPUs are puny compared to GPUs.
Fair enough. 6T per bit: I found that info online as well, though it's about SRAM L2 cache in CPUs and only covers the bits themselves, without counting all the other logic needed to make the memory actually work. I couldn't find an actual die shot of the 4090 to see how much space it truly takes there; I could only see block schematics, and that's not the true size of things.
 
We want them to price lower, but my guess is their shareholders will want premium-brand prices. The good news is that as stock gets harder to shift throughout 2023, you can just wait for them to drop prices if you think you need one of the SKUs.

The thing is, I have a PS5 and a Steam Deck I can use, I am not playing any demanding PC games, and I haven't turned on my Pimax 8KX in months. I am in no rush for a major GPU upgrade at the moment.
 
Oh absolutely, it's astonishing how people can't comprehend data. It can stare them right in the face and they just ignore it. I was just demonstrating that Nvidia's own promotional marketing has the 4080 12GB barely faster than a 3080 10GB. If Nvidia are pricing this mid-range tat at those prices, then they are either banking on brand loyalty (aka stupidity), or they know/think (delete as appropriate) AMD's 7x00 is so bad that Nvidia can get away with these prices.
When you check all kinds of reviews (not just one, and not just average scores), you can see that in some games, even at 4K, the 4090 is CPU-limited. It could go faster, but in some games current CPUs aren't enough anymore; that will likely change when games put even more effects in and even more strain on the GPU. This seems to be the first time a GPU has been CPU-limited at 4K. But it also means benchmark scores don't tell us the full story about how fast the 4090 really is at 4K, especially the averages. It also means the 4080 could be faster in SOME games than people realise, as in its case a CPU limit shouldn't be a thing. However, we'll know for sure when they arrive, and it's not all games: in a few titles the 4090 was on the level of a 3090 or even dropped below it at 4K (like in CS:GO), which to me looks like immature drivers for now.
 