
NVIDIA 4000 Series

Soldato
Joined
14 Aug 2009
Posts
2,931
That's assuming RT or AI-based tech is the future. :)

I'm not saying it's not, but I'm also not saying it is. I don't know, because I can't look into the future and I'm not knowledgeable enough to make an educated guess on it.
Even in rasterization they're pretty close (7900 XTX and 4080). The 7900 XT(X) will only have an advantage due to its VRAM, and only as a result of some truly crappy coding. Other than that, the silicon is too weak to make good use of that VRAM.
 
Soldato
Joined
15 Oct 2019
Posts
12,021
Location
Uk
Even in rasterization they're pretty close (7900 XTX and 4080). The 7900 XT(X) will only have an advantage due to its VRAM, and only as a result of some truly crappy coding. Other than that, the silicon is too weak to make good use of that VRAM.
16GB on the 4080 is about right in terms of the card's performance; the main downside of the 16GB configuration is the 256-bit bus rather than the capacity.
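As a very rough sketch of why the bus matters more than the capacity, memory bandwidth is just bus width times the per-pin data rate; the data rates below (22.4 Gbps GDDR6X on the 4080, 21 Gbps on the 4070 Ti, 20 Gbps GDDR6 on the 7900 XT) are the commonly quoted reference specs, so treat them as assumptions rather than measured figures:

```python
# Back-of-envelope bandwidth: GB/s = bus width (bits) / 8 * data rate (Gbps).
# Bus widths and data rates are the commonly quoted reference specs (assumptions).
cards = {
    "RTX 4080 (256-bit @ 22.4 Gbps)": (256, 22.4),
    "RTX 4070 Ti (192-bit @ 21 Gbps)": (192, 21.0),
    "RX 7900 XT (320-bit @ 20 Gbps)": (320, 20.0),
}

for name, (bus_bits, gbps) in cards.items():
    bandwidth_gb_s = bus_bits / 8 * gbps
    print(f"{name}: {bandwidth_gb_s:.0f} GB/s")
# -> roughly 717, 504 and 800 GB/s respectively
```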
 
Last edited:
Soldato
Joined
14 Aug 2009
Posts
2,931
16GB on the 4080 is about right in terms of the card's performance; the main downside of the 16GB configuration is the 256-bit bus rather than the capacity.
Is that 256-bit bus truly a problem in normal gaming, considering the architecture, caches, etc.? It doesn't look like AMD is doing a lot better with more than that.
 
Associate
Joined
22 Jan 2006
Posts
1,423
Coming from a 2080 Ti, my 4080 (FE) that arrived this week is a hell of a piece of kit. It runs a good 25°C cooler in the same case whilst giving 3-4 times the performance in some games. Expensive, sure, but definitely no buyer's remorse here.

Do you really notice the difference?

I was running a 3700X with a 2080 Ti until this week. I upgraded the 3700X to a 5800X3D, and I was considering a 4090 to go with it. Was the 4080 that much of a boost?
 
Soldato
Joined
15 Oct 2019
Posts
12,021
Location
Uk
Is that 256-bit bus truly a problem in normal gaming, considering the architecture, caches, etc.? It doesn't look like AMD is doing a lot better with more than that.
Not as much of a problem as the 70-class's 192-bit bus, but it'll still be impacting performance at 4K.

AMD's problem is that the RDNA 3 architecture just isn't very good, and you can see that when comparing the 84 CU 7900 XT to the 80 CU 6950 XT.
 
Soldato
Joined
14 Aug 2009
Posts
2,931
Not as much of a problem as the 70-class's 192-bit bus, but it'll still be impacting performance at 4K.

AMD's problem is that the RDNA 3 architecture just isn't very good, and you can see that when comparing the 84 CU 7900 XT to the 80 CU 6950 XT.

In theory, I know, but a quick search didn't turn up anything to prove it. Overclocking brought relatively modest gains.

Do you really notice the difference?

I was running a 3700X with a 2080 Ti until this week. I upgraded the 3700X to a 5800X3D, and I was considering a 4090 to go with it. Was the 4080 that much of a boost?
On a 5800X3D, going from a 2080 to a 4080, there is a huge difference. If you go with the 4090, it will probably be at least the same if not greater. It depends on what games you play, the resolution and your desired FPS. Have a quick look on YouTube, at reviews, etc., and see how it performs in your favourite game(s).
 
Associate
Joined
23 Aug 2010
Posts
237
Do you really notice the difference?

I was running a 3700X with a 2080 Ti until this week. I upgraded the 3700X to a 5800X3D, and I was considering a 4090 to go with it. Was the 4080 that much of a boost?
CP2077 went from 52 FPS on the benchmark run to 200 FPS, per its own numbers. It's paired with a 13700K in both cases. That was with settings that allowed the 2080 Ti to actually work reasonably well. Turning off DLSS and turning everything else I could up to ultra/psycho, it still ran at 100 FPS on the same bench.

So yes, I would say there is a noticeable difference in my case.
 
Last edited:
Soldato
Joined
11 Sep 2007
Posts
5,740
Location
from the internet
Not as much of a problem as the 70-class's 192-bit bus, but it'll still be impacting performance at 4K.

AMD's problem is that the RDNA 3 architecture just isn't very good, and you can see that when comparing the 84 CU 7900 XT to the 80 CU 6950 XT.

I mean, it's not that bad: it gets about 5% more CUs and performs maybe 10-15% better. Really, it just needs to be a bigger part.
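To put numbers on that (a quick sketch using the CU counts quoted here and the rough 10-15% figure as assumptions), the gain left over once you account for the extra CUs is only around 5-10%:

```python
# Rough per-CU scaling using the figures quoted above (assumptions, not benchmarks):
# 7900 XT: 84 CUs, 6950 XT: 80 CUs, overall uplift assumed at 10-15%.
cu_ratio = 84 / 80  # ~1.05, i.e. 5% more CUs
for overall_gain in (1.10, 1.15):
    per_cu_gain = overall_gain / cu_ratio  # what's left after the extra CUs
    print(f"overall +{overall_gain - 1:.0%} -> per-CU +{per_cu_gain - 1:.1%}")
# overall +10% -> per-CU +4.8%; overall +15% -> per-CU +9.5%
```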

The main thing is that AMD obviously didn't set a particularly ambitious target for how much silicon they wanted to spend on the GCD, which has led to it being more of an incremental improvement over Navi 21 in general. The XTX is a little faster, but it's far from mind-blowing.

For contrast, AD102 is bigger than the entire Navi 31 package, and a lot of Navi 31 is taken up by MCDs produced on a much less dense node.

I mostly think it's a bit of a teething issue with the MCM process that led them to be a bit more conservative this gen; hopefully they will learn a lot from this generation about how to better scale it for performance.
 
Last edited:
Soldato
Joined
15 Oct 2019
Posts
12,021
Location
Uk
I mean, it's not that bad: it gets about 5% more CUs and performs maybe 10-15% better. Really, it just needs to be a bigger part.

The main thing is that AMD obviously didn't set a particularly ambitious target for how much silicon they wanted to spend on the GCD, which has led to it being more of an incremental improvement over Navi 21 in general. The XTX is a little faster, but it's far from mind-blowing.

For contrast, AD102 is bigger than the entire Navi 31 package, and a lot of Navi 31 is taken up by MCDs produced on a much less dense node.

I mostly think it's a bit of a teething issue with the MCM process that led them to be a bit more conservative this gen; hopefully they will learn a lot from this generation about how to better scale it for performance.
I hope so, as AMD's top chip just about competing with Nvidia's 2nd-tier die is part of the reason prices are so bad this time around.
 
Associate
Joined
22 Nov 2020
Posts
1,472
I think the 7900 XT dropping to around £700 represents a better deal, as it's a 30% uplift over the 6800 XT for 12% more money.
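Taking those two figures at face value (they're estimates, not measured data), the quick maths on performance per pound looks like this:

```python
# Performance-per-pound sketch using the figures quoted above (assumptions):
# ~30% more performance for ~12% more money vs the 6800 XT.
perf_uplift = 1.30
price_uplift = 1.12
value_gain = perf_uplift / price_uplift  # relative performance per pound
print(f"~{value_gain - 1:.0%} more performance per pound")  # ~16%
```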

Not sure why folks are still flocking to Nvidia; value for money isn't part of Jensen's plan.
 
Last edited:
Associate
Joined
22 Nov 2020
Posts
1,472
tbh it’s your posts which are the troll posts.

All I’ve done is present evidence to back up my claims on where I believe each card’s performance sits in relation to a 3080.
Look, I concede you may be right about the 3080. The 3080 is the better buy if someone took the opportunity, but this year, if you had no alternative at £700ish, then it's the 4070 Ti or the 7900 XT. I think the AMD card is the better buy due to the VRAM, but I can see RT and DLSS 3.0 attracting people to the 4070 Ti.

Speculating about the future, the 5070 might have 4090 performance, but would Jensen drop the VRAM to 12GB?
 
Last edited:
Soldato
Joined
15 Oct 2019
Posts
12,021
Location
Uk
Look, I concede you may be right about the 3080. The 3080 is the better buy if someone took the opportunity, but this year, if you had no alternative at £700ish, then it's the 4070 Ti or the 7900 XT. I think the AMD card is the better buy due to the VRAM, but I can see RT and DLSS 3.0 attracting people to the 4070 Ti.

Speculating about the future, the 5070 might have 4090 performance, but would Jensen drop the VRAM to 12GB?
Personally I would go for a used 3080 and trouser the difference, but if I had to get one of the new £700-800 GPUs then I'd go for the 7900 XT over the 4070 Ti. The 4070 Ti lacks VRAM, bandwidth and raster performance for its price tag; the 7900 XT does better on all three, but it still has drawbacks like a weaker feature set, worse RT and some frametime consistency issues, so while it's better, it's not by a huge amount.

I don't think the 5070 Ti will have 4090 performance, as it would need to be around 70% faster than a 4070 Ti, which I can't see happening unless Nvidia moves it back to a larger die. It will need to have at least 16GB though if Jensen plans to sell any, as I suspect 12GB will be struggling by then.
 