4090 vs 7900 XTX re-test, who has aged like fine wine?

Soldato
Joined
19 Dec 2010
Posts
12,038
That does not work, as markets don't work like that.
AMD sells plenty of cards; in some markets they have a 50% share vs Nvidia.
You simply bought into the marketing myth from Nvidia.
vs Intel they sell 8 CPUs to Intel's 1.... lol

And the people who talk about these cards usually don't even have one.
90% of the market is cards that cost $500 or so.
The remaining 10% are a blend of higher-cost cards, which isn't the majority of users and buyers.

AMD, simply put, has a better gaming experience overall.



Ok, I will bite. Please explain how the markets work and list the discrete GPU markets where AMD has 50% market share?
 
Soldato
Joined
7 Dec 2010
Posts
8,272
Location
Leeds
Maybe, but there is also this too:

The driver overhead issues were fixed a while back.


 
Soldato
Joined
9 Nov 2009
Posts
24,879
Location
Planet Earth
The driver overhead issues were fixed a while back.


The Core i9-12900K is still a high-end CPU. This needs testing on a range of CPUs - I didn't see any retests so far.
 
Last edited:
Soldato
Joined
7 Dec 2010
Posts
8,272
Location
Leeds
Not really well done - I didn't see a test of the RTX3070 with the Core i5 11400F, only the RTX3050. Why can't people just test these things normally?
 
Soldato
Joined
9 Nov 2009
Posts
24,879
Location
Planet Earth

Again, where is the list of processors tested from low end to high end showing this? I find it weird about the previous video - the guy could have tested the 11400F rig with the RTX3070 too.

The TPU review has a single 5800X, which TPU used until the RTX4000 series launch. Then TPU found the 5800X CPU-limited the RTX4090, etc., so they had to get a faster CPU.
 
Last edited:
Soldato
Joined
7 Dec 2010
Posts
8,272
Location
Leeds
Again, where is the list of processors tested from low end to high end showing this? I find it weird about the previous video - the guy could have tested the 11400F rig with the RTX3070 too.

The TPU review has a single 5800X, which TPU used until the RTX4000 series launch. Then TPU found the 5800X CPU-limited the RTX4090, etc., so they had to get a faster CPU.

Yes, it was not tested after the drama that HUB created, and even they didn't do another video after the driver overhead issue was fixed by Nvidia in the 522.25 drivers and onwards.

No idea - I need to keep searching for better testing videos or reviews of the driver. Still searching here, but the fixes came out in driver 522.25.
 
Caporegime
Joined
8 Jan 2004
Posts
32,099
Location
Rutland
So if Nvidia are "out fine wine'ing" AMD, does this imply that the 4090 was released before it was ready?!

Anyway, the real problem is that the suite has changed so much that the benchmarks are not comparable.
For raster, only A Plague Tale: Requiem, Cyberpunk 2077, and Forza Horizon 5 were in both suites, and the scene for Cyberpunk 2077 is not the same (going from PCGH's Red Light Alley in 2023 to PCGH's Dog Days in 2024).

So for directly comparable raster we only have those two, and at 4K min FPS we got (2023 first, 2024 second):
APT: Requiem, for the 4090: 64.0 to 71.0, so nearly 11% better.
APT: Requiem, for the 7900 XTX: 49.0 to 51.0, so about 4%.
FH5, for the 4090: 117.0 to 118.0, so about 1% better.
FH5, for the 7900 XTX: 83.0 to 95.0, so about 14%.
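
If anyone wants to sanity-check those percentages, here's a minimal sketch in Python using just the min-FPS figures quoted above:

```python
# Recompute the 4K min-FPS uplifts quoted above (2023 suite -> 2024 suite).
results = {
    "APT: Requiem, RTX 4090": (64.0, 71.0),
    "APT: Requiem, 7900 XTX": (49.0, 51.0),
    "FH5, RTX 4090": (117.0, 118.0),
    "FH5, 7900 XTX": (83.0, 95.0),
}

for name, (fps_2023, fps_2024) in results.items():
    uplift = (fps_2024 / fps_2023 - 1.0) * 100.0
    print(f"{name}: {fps_2023:g} -> {fps_2024:g} min FPS ({uplift:+.1f}%)")
```

That gives roughly +10.9%, +4.1%, +0.9% and +14.5%, which matches the rounded figures above.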

For RT scenes: well, I gave up after comparing the only common games (CB2077, DL2, and Metro Exodus), since not only did CB2077 have a different scene, Dying Light 2 uses the same 'scene' but different settings, and Metro Exodus uses the same scene but again with changed settings.

CPUs? Stated is a 12900K vs a 13900KS (implying P cores only for both).

And the CPU was changed too, so not really a scientific comparison.

What we can sort of say is that in new games the 4090 is doing really well, as most of the difference between the suites is newer games.
So many confounding factors, what a waste of time.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,909
Location
Greater London
That does not work, as markets don't work like that.
AMD sells plenty of cards; in some markets they have a 50% share vs Nvidia.
You simply bought into the marketing myth from Nvidia.
vs Intel they sell 8 CPUs to Intel's 1.... lol

And the people who talk about these cards usually don't even have one.
90% of the market is cards that cost $500 or so.
The remaining 10% are a blend of higher-cost cards, which isn't the majority of users and buyers.

AMD, simply put, has a better gaming experience overall.

Posts like this really do highlight the need for a u wot m8 smiley :p:cry:
 
Last edited:
Soldato
Joined
9 Nov 2009
Posts
24,879
Location
Planet Earth
Yes, it was not tested after the drama that HUB created, and even they didn't do another video after the driver overhead issue was fixed by Nvidia in the 522.25 drivers and onwards.

No idea - I need to keep searching for better testing videos or reviews of the driver. Still searching here, but the fixes came out in driver 522.25.
An Intel Core i7-12700K with 32 GB of DDR5-5200 and a GeForce RTX 3070 was used in the HotHardware link above too, in their updated testing at the bottom of the article.

It's sloppy that these tech sites don't do follow-up videos. Maybe it's worth suggesting to someone in their Discord that they could do this with newer cards.

Some channels did testing in the last 12 months, and it still seemed AMD had an edge in certain CPU-limited scenarios:


But even then it's a mishmash of hardware, etc. It was the same with the AMD DX11 overhead, which was certainly there, but I'm not sure if it was ever fixed.
 
Last edited:
Soldato
Joined
6 Feb 2010
Posts
14,595
I think at that point, though, the differences tend to get more specialised, i.e. ray tracing. If you don't care about RT and can get a 7900XTX at a decent price it's a solid card... outside of AMD drivers crashing every 5 minutes :p
Drivers aside, I think part of the reason is that Nvidia is ALWAYS more stingy with the VRAM on their cards (except the top card) at a similar or slightly higher price point than AMD's offering, hence why their cards always tend to age worse over time (the 3070 8GB is a good example of this).

Nvidia managed to create this illusion that their products are better across the board simply because they have the fastest card at the top, while on lower-end and mainstream cards they ALWAYS cut corners on the spec, as if they are afraid of their users "eating too well", or are having nightmares about their users being able to use their cards for more than 2 years without upgrading (despite those users already forking out hundreds of pounds) :cry:

Even considering RT, people buying an Nvidia card would have to get one with at least 16GB of VRAM (meaning at least a 4080 16GB) to avoid potential stuttering, so for anyone with a budget below that who is getting an Nvidia card with 12GB of VRAM or less, the RT advantage becomes a moot point.
 
Soldato
Joined
7 Dec 2010
Posts
8,272
Location
Leeds
It's sloppy that these tech sites don't do follow-up videos. Maybe it's worth suggesting to someone in their Discord that they could do this with newer cards.

Some channels did testing in the last 12 months, and it still seemed AMD had an edge:
It seems it wasn't even HUB that noticed the issue but PCGAMESHARDWARE.DE; they may have an updated test there.


Release 520 Driver for Windows, Version 522.25, RN-08399-522.25_v01, page 12

Chapter 3. Changes and Fixed Issues in Version 522.25

The following sections list the important changes and the most common issues resolved in this version. This list is only a subset of the total number of changes made in this driver version. The NVIDIA bug number is provided for reference.

3.1 Fixed Issues in Version 522.25 WHQL

[PCGAMESHARDWARE.DE][Teardown] Very low performance on Turing/Ampere GPUs compared to AMD Radeon [3653400]

Tiny Tina's Wonderlands displays texture corruption after extended gameplay on NVIDIA GPUs [3777340]

UE5.1 crashes when enabling path tracing on some drivers [3731151]

Page 12 of the release notes lists the above.
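
For anyone wanting to check whether the driver they are running already includes those 522.25 fixes, here's a minimal sketch (assumes nvidia-smi is on the PATH and reports the usual major.minor version string):

```python
# Compare the installed NVIDIA driver version against 522.25, the release the
# notes above say fixed the Teardown/driver-overhead issue.
import subprocess

installed = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    text=True,
).strip().splitlines()[0]  # first GPU is enough; all GPUs share one driver

def version_tuple(v: str) -> tuple:
    return tuple(int(part) for part in v.split("."))

print(f"Installed driver: {installed}")
print("Includes the 522.25 fixes:", version_tuple(installed) >= version_tuple("522.25"))
```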
 
Soldato
Joined
9 Nov 2009
Posts
24,879
Location
Planet Earth
It seems it wasn't even HUB that noticed the issue but PCGAMESHARDWARE.DE; they may have an updated test there.


Release 520 Driver for Windows, Version 522.25, RN-08399-522.25_v01, page 12

Chapter 3. Changes and Fixed Issues in Version 522.25

The following sections list the important changes and the most common issues resolved in this version. This list is only a subset of the total number of changes made in this driver version. The NVIDIA bug number is provided for reference.

3.1 Fixed Issues in Version 522.25 WHQL

[PCGAMESHARDWARE.DE][Teardown] Very low performance on Turing/Ampere GPUs compared to AMD Radeon [3653400]

Tiny Tina's Wonderlands displays texture corruption after extended gameplay on NVIDIA GPUs [3777340]

UE5.1 crashes when enabling path tracing on some drivers [3731151]

Page 12 of the release notes lists the above.

The problem is that there was testing last year where Nvidia cards still had issues on slower CPUs (those two videos). There should be no scenario where an RX6700 can be quicker than an RTX3070, or an RX480 quicker than an RTX3060. In GPU-limited scenarios the Nvidia cards seem to perform as expected.

It was the same with the AMD DX11 overhead - there was an issue there, but it was never fully resolved whether it still exists.

Maybe someone on the HUB Discord can ask them to retest again.
 
Last edited:
Soldato
Joined
19 Feb 2007
Posts
14,421
Location
ArcCorp
Posts like this really do highlight the need for a u wot m8 smiley :p:cry:

My mental reaction to reading it at first was this :D

[reaction GIF]



Drivers aside, I think part of the reason is that Nvidia is ALWAYS more stingy with the VRAM on their cards (except the top card) at a similar or slightly higher price point than AMD's offering, hence why their cards always tend to age worse over time (the 3070 8GB is a good example of this).

Nvidia managed to create this illusion that their products are better across the board simply because they have the fastest card at the top, while on lower-end and mainstream cards they ALWAYS cut corners on the spec, as if they are afraid of their users "eating too well", or are having nightmares about their users being able to use their cards for more than 2 years without upgrading (despite those users already forking out hundreds of pounds) :cry:

Even considering RT, people buying an Nvidia card would have to get one with at least 16GB of VRAM (meaning at least a 4080 16GB) to avoid potential stuttering, so for anyone with a budget below that who is getting an Nvidia card with 12GB of VRAM or less, the RT advantage becomes a moot point.

Given how VRAM is going to be higher capacity per chip going forward, I bet Nvidia are scheming up ways to limit memory XD
 
Last edited:
Caporegime
Joined
8 Jan 2004
Posts
32,099
Location
Rutland
Given how VRAM is going to be higher capacity per chip going forward, I bet Nvidia are scheming up ways to limit memory XD
They'll sell it at half capacity and charge a monthly subscription to double your VRAM. You heard it here first. DLSS license sold separately. RTX premium upgrade by the hour.
 
Last edited: