Not much excitement for RDNA3, compared to the RTX 4000 series

Right, it won't have DLSS 3. :)

Every time someone posts "BuT AmdEE No DLsS" I'm going to post this....

[image: nWfRTLR.png]
You'll have the 4000 series owners complaining, mind :cry: :cry:
 
I mean, it helps to be realistic. I don't care about the branding, just what they price the Founders Editions at.

It's not going to be £400-£500 for a card that has better performance than the RTX3080 (as much as I wish it could be), even if RDNA3 is brilliant.

Well, if it isn't, it will be terrible. At 1080p and QHD, an RTX3080 is only around 16% to 23% faster than an RTX3070, and the RTX3070 used a salvaged GA104.

So you were already paying over £500 for a graphics card with a full GA104!

Better performance would be a 17% to 24% uplift in the £400~£500 area. If we can't get at least a 40% uplift in the $200~$300, $300~$400 and $400~$500 brackets, it's rubbish, because then a game which runs at 40FPS can get close to 60FPS, etc. At 20% you won't even get 50FPS.
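To put rough numbers on that uplift argument (just a back-of-envelope sketch in Python; the 40FPS baseline is only an illustrative figure):

# Back-of-envelope frame-rate sums for the uplift argument above (illustrative only).
base_fps = 40                    # example baseline frame rate from the post
for uplift in (0.20, 0.40):      # 20% vs 40% generational uplift
    print(f"{uplift:.0%} uplift: {base_fps * (1 + uplift):.0f} FPS")
# Output: 20% uplift: 48 FPS (still short of 50), 40% uplift: 56 FPS (close to 60)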

So there you have the issue: they push the smaller dies up in price, to where an RTX3070 used to sit, i.e. an RTX4060 with "AD106" (which has a die size more like a 107 series dGPU) at £450~£500 which barely beats an RTX3070. But if it was priced to beat its actual ancestor (an RTX3050) you would be getting a generational uplift like going from an RTX3090 to an RTX4090. This is what Nvidia and AMD have been doing bit by bit per generation. You saw it last generation with the RTX3060, 6600 and 6600XT. They were priced in line with older graphics cards which used bigger GPUs, so the performance jump looked a bit rubbish!

People then wonder why GFX in games has stagnated and blame consoles. It's not only the fault of consoles, but of the whole mainstream area stagnating. This is why raytracing in games will ultimately be limited to a few add-on effects and a few tech demo games. The performance improvements of mainstream dGPUs over the last 6 years haven't kept pace relative to the higher-end ones.

For many here who buy higher-end dGPUs it's not as noticeable, but it is for people who don't! :(
 
People then wonder why GFX in games has stagnated and blame consoles.
I would beg to differ. I've been looking up 4090 numbers, and I think RTX 50 will make 4K mainstream - I think a big shift is coming.
Also, depending on how the MCM approach works out for AMD, MAYBE prices will come under control.
This is like stuff for future YouTube documentaries, though I fully understand that prices have been spoiling the moment.
 
Oh well, lol. This is why people should ignore upscaling features when buying cards now.

I hope most games let you choose between DLSS 2 and 3.
DLSS 2 is already a hell of a performance boost; I'm sceptical they will be able to improve performance much more.
 
I would beg to differ. I've been looking up 4090 numbers, and I think RTX 50 will make 4K mainstream - I think a big shift is coming.
Also, depending on how the MCM approach works out for AMD, MAYBE prices will come under control.
This is like stuff for future YouTube documentaries, though I fully understand that prices have been spoiling the moment.

But the problem again is the high end. If the existing tiers were kept it might happen, but the previous 104 series dGPUs had 60% to 70% of the transistors of the top 102 series dGPUs.

The AD104 only has 48% of the transistors of the AD102, under half the shaders and half the memory bandwidth.

It's very close to what you would expect from a 106 series dGPU. The AD103 is nowhere near where a 104 series dGPU would be, i.e. 60% to 70% of the transistor count of the top chip.
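Rough ratios from the publicly quoted transistor counts (approximate figures I'm assuming here, so treat the exact percentages as a sketch rather than official numbers):

# Rough tier ratios from approximate public transistor counts (billions).
counts_bn = {"AD102": 76.3, "AD103": 45.9, "AD104": 35.8,
             "GA102": 28.3, "GA104": 17.4}
for chip in ("AD103", "AD104"):
    print(f"{chip}: {counts_bn[chip] / counts_bn['AD102']:.0%} of AD102")
print(f"GA104: {counts_bn['GA104'] / counts_bn['GA102']:.0%} of GA102")
# Roughly: AD103 ~60% of AD102, AD104 ~47% of AD102, GA104 ~61% of GA102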

Since it is being sold as an RTX4080 12GB (or maybe an RTX4070), it probably is half an RTX4090.

RTX3070 or RTX3070TI level performance if it is clocked very high? Even if it matched an RTX3080, it would be around 20% to 25% faster than an RTX3070. Now think how much it was priced at: £900. Even if it was rebranded as an RTX4070 12GB, it would be over £500. The chip itself will be impressive, but Nvidia does not want to pass this on to the consumer. If it was sold as an RTX3060 replacement it would be a very big upgrade. But Nvidia doesn't want to do this! :(
 
RTX3070 or RTX3070TI level performance if it is clocked very high? Even if it matched an RTX3080, it would be around 20% to 25% faster than an RTX3070. Now think how much it was priced at: £900. Even if it was rebranded as an RTX4070 12GB, it would be over £500. If it was sold as an RTX3060 replacement it would be a very big upgrade. But Nvidia doesn't want to do this! :(

I think you seem a bit obsessed. The impression I got was that the RTX4080 12GB has a bit better performance in non-RT games and better RT performance than the RTX 3080. Considering it will now likely be marketed as the RTX 4070, that seems OK to me. The cooling is likely to be improved too, and there will be a decent amount of VRAM (4GB more than the RTX 3070).

Only issue is pricing, it needs to come down a lot.

I don't think there's any special sauce in the RTX 4000 series (similar architecture to Ampere, better RT); what you get is a die shrink and better cooling.
 
It would just be the most obvious conclusion to draw, seeing as Nvidia have had a head start; they may well be ahead, but it could be by a negligible amount for all we know. Though AMD have surprised us before: their raster performance being on par with Nvidia's in the 6 series, compared to what it was in the 5 series, was a massive leap. Before launch people were saying that if they could equal the 2080 Ti it would be a big jump, but they far exceeded that. Maybe they can do it again on RT performance, who knows?

Not really in the market for a top-end card at the minute, so I really couldn't give a flying **** one way or the other. Might get a 3060 or something to aid in video rendering, as my gaming interest has basically vanished. :(
Don’t blame you mate.

I only gained interest in gaming again recently with 3D, triple monitors and sims. Before that I think I would have stuck with a 2080 for this generation. I think I get more fun from my Steam Deck!
 
People said DLSS 1 was "better than native". That didn't stop when it was proven not to be, and of course they continued pretending science fiction is real through every new iteration of it. Even some tech journalists got on board with that marketing, because they don't understand that you cannot create something from nothing: the absence of something is not the basis for anything.
I remember the press for DLSS being quite bad tbh mate. Maybe we follow different people on YouTube.

I remember DLSS and raytracing generally being crap for ages. I couldn't tell the difference at the start but things have slowly gotten better.
 
I still think an RTX 3060 TI is enough, especially for 1080p.

It's a shame these cards are still being sold at £400+

Tbh if buying right now, definitely get the FE, as these are still being sold on the website at MSRP. Shame there is still the 1-per-household rule! Best bet is to get someone else to buy one for you if someone in your household already bought one; circumventing this rule otherwise now seems to be impossible.

RT is purely nice to have.

Or the equally decent RX 6700 XT.
 
I think you seem a bit obsessed. The impression I got was that the RTX4080 12GB has a bit better performance in non-RT games and better RT performance than the RTX 3080. Considering it will now be marketed as the RTX 4070, that seems OK to me. The cooling is likely to be improved too.

Only issue is pricing, it needs to come down a lot.

People keep giving Nvidia the benefit of the doubt, and why should you after Turing, the whole upwards price rebranding with the Ampere TI series, selling stock directly to miners, etc.? The fact is they are chancers, and if people don't criticise this they will keep doing it. It's criticism which made them withdraw the RTX4080 12GB.

This generation is:
AD102>AD103>AD104

Previous generations:

102/100>104>106

The 104 is where the 106 used to be. That is also the problem - the AD104 is a tiny dGPU selling for way too much money. It has less than half the transistors of an AD102 (48%), under half the shaders and less bandwidth.

The GA104 had 66% of the transistors of the GA102, the same as the GP104 (GTX1070/GTX1080) against the GP102 (GTX1080TI). It's literally an RTX4060 being sold as an RTX4080 12GB, etc. Literally everyone I know buys according to price. These changes of GPU tiers are not helping at all.

So you say it is fine as an RTX4070, and that it "matches an RTX3080". Say that happens. Will this be priced at £450? That would be only a 20~25% uplift at £450.

Say Nvidia makes it £550. So you pay 20% more for 20% to 25% more performance - a stagnation. But the RTX4080 16GB is over £1000, so it probably will be more than £600.
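As a rough perf-per-pound sanity check (a sketch only - the £450 baseline and the £550/£900 prices are just the guesses from this post, not confirmed figures):

# Rough perf-per-pound check for the scenarios above (all prices are guesses from the post).
baseline_price, baseline_perf = 450, 1.00          # roughly RTX3070-class money
for price, perf_gain in ((550, 0.20), (550, 0.25), (900, 0.25)):
    value_change = (baseline_perf + perf_gain) / (price / baseline_price) - 1
    print(f"£{price}, +{perf_gain:.0%} perf -> {value_change:+.0%} perf-per-pound")
# £550, +20% -> about -2% (a stagnation), £550, +25% -> about +2%, £900, +25% -> about -38%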

This is what is happening at the mainstream and entry-level tiers. Smaller and smaller dGPUs are being pushed up the stack, meaning the progression generation to generation is not great for similar money.
 
But the problem again is the high end. If the existing tiers were kept it might happen, but the previous 104 series dGPUs had 60% to 70% of the transistors of the top 102 series dGPUs.

The AD104 only has 48% of the transistors of the AD102, under half the shaders and half the memory bandwidth.

It's very close to what you would expect from a 106 series dGPU. The AD103 is nowhere near where a 104 series dGPU would be, i.e. 60% to 70% of the transistor count of the top chip.

Since it is being sold as an RTX4080 12GB (or maybe an RTX4070), it probably is half an RTX4090.

RTX3070 or RTX3070TI level performance if it is clocked very high? Even if it matched an RTX3080, it would be around 20% to 25% faster than an RTX3070. Now think how much it was priced at: £900. Even if it was rebranded as an RTX4070 12GB, it would be over £500. The chip itself will be impressive, but Nvidia does not want to pass this on to the consumer. If it was sold as an RTX3060 replacement it would be a very big upgrade. But Nvidia doesn't want to do this! :(


The 4080 16GB is half the 4090; the 4080 12GB is less than half.

For some reason the 4090 proportionally performs better than its core count would suggest. I'm not sure why this is; it's not the clock speed, as the lower end cards have the same or faster clocks. I believe it's the amount of cache the 4090 has.
 
The 4080 16GB is half the 4090; the 4080 12GB is less than half.

For some reason the 4090 proportionally performs better than its core count would suggest. I'm not sure why this is; it's not the clock speed, as the lower end cards have the same or faster clocks. I believe it's the amount of cache the 4090 has.
You are not wrong on the cache at all. The massive cache the 4090 has puts it far out ahead of the lower stack.

It is really hard to guess or gauge the lower stack's true position with Nvidia not really releasing any information.

But I guess the core counts and memory bus are the best starting point.

Anyway, let's see what that 4080 12GB gets renamed to. I've got a feeling it will become a 4070 Ti or something a bit "higher cost".

I wouldn't be surprised if they just renamed the 4080 16GB as a 4080 Ti or Super and then came out with the 4080 12GB as the 4080.

We know how greedy Nvidia is, and how ludicrous they are too.
 
The 4080 16GB is half the 4090; the 4080 12GB is less than half.

For some reason the 4090 proportionally performs better than its core count would suggest. I'm not sure why this is; it's not the clock speed, as the lower end cards have the same or faster clocks. I believe it's the amount of cache the 4090 has.
It could be that - the lower AMD RDNA2 cards have less cache relative to shader count than the higher-end ones too. The problem though is not the GPUs themselves, it is how Nvidia is marketing them. The AD104 has only 48% of the transistors of an AD102, and as you said it's less than half the shaders, memory bandwidth, etc. too. Even if it has higher clockspeeds, how was it worth £900?? How is it even worth £600? So we essentially get RTX3070TI to RTX3080 level performance with slightly more VRAM, better RT performance and similar rasterised performance (but it might have problems at higher resolutions).

Imagine if the AD104 was sold as even a £399 RTX4060TI 12GB, as many have suggested - that would be a decent bump in performance all around. I can't see Nvidia doing it unless AMD has something decent and wants to price it well (looking at Zen4 I am not so sure myself).

I think these companies have become spooked by their revenues falling a huge amount, so they want to increase prices and margins to compensate. What they seem not to consider is that the pandemic was a rare event for tech, i.e. a situation where a lot of people and companies had to upgrade their PCs for work/entertainment, and mining hit at the same time. Now they want even higher margins, and to sell off their overproduction intended for miners at RRP.
 
I would beg to differ. I've been looking up 4090 numbers, and I think RTX 50 will make 4K mainstream - I think a big shift is coming.
Also, depending on how the MCM approach works out for AMD, MAYBE prices will come under control.
This is like stuff for future YouTube documentaries, though I fully understand that prices have been spoiling the moment.
AMD will price their stuff as high as they can. After all, nVIDIA did have a better card than AMD in terms of price/performance (at MSRP at least) - the 3060 Ti.

The Ryzen 5800X could have been sold at around $101 at a profit. Instead they've chosen to charge almost 4.5x as much. :)
 
AMD will price their stuff as high as they can. After all, nVIDIA did have a better card than AMD in terms of price/performance (at MSRP at least) - the 3060 Ti.

The Ryzen 5800X could have been sold at around $101 at a profit. Instead they've chosen to charge almost 4.5x as much. :)

It didn't help that AMD didn't sell any reference models in the UK either, so cards such as the RX6700XT were poorly priced.
 