
NVIDIA 4000 Series

I have my TUF 4090 at +110 on the core, +1250 on the memory and an 80% power limit in Afterburner.

I can get most games and various benchmarks to run at +190 or a bit higher on the core, but if any crash (particularly F1 22) then I guess it isn't stable.

Perhaps not the best-clocking GPU out there.

It gets about 2940 MHz on the core and usually sits around there or just under in games with the power limit.

I get better-than-stock performance in games with those settings, it shaves off around 100 W at maximum (though games don't usually hit that) and it runs a few degrees cooler.

It just stops the card pulling more power for little reason.

I settled on 85% and +250/+800 in the end. I'll probably tweak it, as I don't really need the +250, let's face it.
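For anyone who'd rather script that sort of power cap instead of dragging the Afterburner slider, here's a minimal sketch using the nvidia-ml-py (pynvml) bindings. Assumptions worth flagging: it only mirrors the power-limit part of the settings above (the +core/+memory offsets go through vendor tools like Afterburner rather than plain NVML), it needs admin/root rights, and it assumes the card is GPU index 0.

# pip install nvidia-ml-py   (official NVML bindings, imported as pynvml)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the 4090 is GPU 0

# Default board power limit (the "100%" point), reported in milliwatts
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)

# Apply an 80% cap, roughly what the Afterburner power-limit slider does
target_mw = int(default_mw * 0.80)
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # needs admin/root

# Read back where the core clock is sitting under the new cap
core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
print(f"Power limit now {target_mw / 1000:.0f} W, core clock {core_mhz} MHz")

pynvml.nvmlShutdown()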
 
So they're pulling an AMD now with the secret sauce.

They did that a few drivers ago too, a few months back: they enabled a bit of on-die hardware on the 3000 series to give a fairly substantial perf uplift. The GSP, or "GPU System Processor", deals specifically with the driver to free up performance, and it had been disabled up until around the release of the 4000 series.

The perf uplift was quite substantial, 25% in some instances, so there are likely parts of Ada that are not being used either.
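If anyone wants to see whether the GSP firmware is actually reported as running on their card, there's a quick way to look. Just a sketch, assuming a reasonably recent driver with nvidia-smi on the PATH; the field is typically N/A or absent when the GSP isn't in use.

# Check whether the driver reports GSP firmware as loaded (sketch only;
# assumes nvidia-smi is on the PATH and the driver is new enough to list it).
import subprocess

report = subprocess.run(["nvidia-smi", "-q"], capture_output=True, text=True).stdout
gsp = [line.strip() for line in report.splitlines() if "GSP Firmware" in line]
print("\n".join(gsp) if gsp else "No GSP firmware field reported by this driver")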
 
I had no idea this even happened lol, interesting. I'll have a read of the link.
 

The enabling of the GSP is heavy speculation, but I read on some Twitter leaker's page before this release that "Nvidia will be enabling the as of yet disabled GSP to give up to a 25% perf uplift", and then the driver releases and hey presto, a 25% perf uplift.

Nvidia has said zero about it so far, but the GSP has been known about for quite some time.
 
So it's not official yet?

I mean, I did try Cyberpunk again not too long ago with the most up-to-date driver at the time, but the increase was minimal I would say, maybe 3-5 fps more at 1440p ultrawide with ultra settings (DLSS Balanced). It was around 65 fps in crowded areas on my 3090.
 

Nope, not official, and I highly doubt Nvidia would ever publicly comment on it, as they would receive a fair bit of backlash for holding performance back by keeping a bit of on-die hardware with fairly decent perf gains disabled for so long.

Right now it's speculation, but it all adds up. On the 3090 Ti I saw some hefty gains, especially at 4K, of up to 10%, which is massive for a simple driver update, hence why many, including myself, pointed to the GSP (which deals with driver overhead) finally being enabled.
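Just to put those numbers side by side, a rough back-of-the-envelope on the figures quoted in this thread (the 65 fps Cyberpunk baseline, the 3-5 fps gain, and the 10%/25% uplift claims), nothing more than that:

# Rough scale check on the figures mentioned above.
baseline_fps = 65              # crowded-area Cyberpunk figure on the 3090
for gain_fps in (3, 5):        # "maybe 3-5 fps more"
    print(f"+{gain_fps} fps on {baseline_fps} fps = {gain_fps / baseline_fps:.1%}")

for claim in (0.10, 0.25):     # "up to 10%" (3090 Ti at 4K) and "25% in some instances"
    print(f"{claim:.0%} uplift would take {baseline_fps} fps to ~{baseline_fps * (1 + claim):.0f} fps")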
 
Based on my experience with AMD, it could be the case for certain games and/or some scenes. Cyberpunk could be working close to 100% already, so perhaps there's little performance left on the table.

If the AI driver thing is true, then it could be another nail in AMD's coffin. Hopefully they too can put together something similar.
 
AMD is having its own driver issues with RDNA3, with one of the major ones being that dual-issue FP32 currently can't be used by almost all software on the market because the AMD compiler doesn't handle it well. So there are things AMD can do with their drivers to give games and software access to more of their cards' performance.

Nvidia may be trying to get ahead of the game, knowing that AMD has more performance in the tank and they need to counter it. Nvidia isn't one for giving away free performance; they have a reason for everything they do.
 
I'm going to be testing the two systems I have to see which is a better fit for the 4080. Purely for gaming at 1440p, maybe 4K once I decide on a new OLED.

Intel i5-12400 (upgrade to what, a 13600?)

AMD 5800X (upgrade to a 5800X3D?)


Annoyingly the 4080 only fits in one of my cases, so I don't want to keep shifting things around constantly (Meshify C and a 400D).

It's in the Intel system at the moment with 32GB of 3200 RAM on a Z690 NZXT N5 board, and things are performing very well overall. I want to try putting the X570 Tomahawk into the 400D with the 4080 and my 5800X this weekend and compare. I am assuming it will perform slightly better?

But to make sure I'm getting the most out of the 4080, I don't know if at 1440p I'm choking it slightly with my CPUs and whether the 5800X3D or the Intel equivalent would help (there's a quick way to check, sketched after this post). The downside of the Z690 I have is that it's also not DDR5, so in reality either board would be somewhat end of life. DDR4 is totally fine atm as I'm basically gaming only.

To further complicate things, I can just BARELY fit the 4080 in the Meshify if I move the fans around…
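On the question of whether the CPUs are choking the 4080 at 1440p, one quick sanity check is to log GPU utilisation while a game runs: if it sits well below ~95% for long stretches, the CPU (or the engine) is likely the limiter. A minimal sketch, assuming the nvidia-ml-py (pynvml) package is installed and the 4080 is GPU 0:

# Log GPU utilisation and core clock once a second while a game is running.
# Sketch only: assumes nvidia-ml-py is installed and the 4080 is device 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    for _ in range(60):  # sample for roughly a minute
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        print(f"GPU {util.gpu:3d}%  core {core} MHz")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()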
 
Well I jumped in. Snagged a Gigabyte 4080 Gaming OC used in mint condition boxed for £900.

My 3090 has been handed down to my 13 year old lad, who of course goes “ooo look at the fps I’m getting in Fortnite”

Must admit the prices are painful at the moment, I was gonna get a 4090 till I saw the used 4080 and haggled. Next Gen pricing is going to be very interesting, as I’m assuming sales volumes will be down this Gen.

Just waiting on 7950x3d so I can upgrade my 5950x.
 

Nice one. I'm so happy to be back in the Nvidia camp. I've not had one flickering display, black screen with driver timeout, or any other nonsense like I got with the AMD card. Never again (probably).
I also prefer the much less bloated overlay. In fact, I can't think of anything I don't prefer over the AMD stuff.
 