NVIDIA 4000 Series

It's been discussed to death already on these forums and various popular review sites. I just wanted to test for myself to see the difference.

If you want official benchmarks, just use Google and look for them. I'll link you one quickly if you're too lazy > https://www.techpowerup.com/review/rtx-4090-53-games-ryzen-7-5800x-vs-core-i9-12900k/2.html

The link above compares the performance difference between a Ryzen 7 5800X and a 12900K. The difference is even larger when you compare 13th gen to 11th gen, or to people such as @Jay-G25 still on 10th gen.

Comparing 7950X to 13900k (at lower resolutions) > [image: a7WMpGX.jpeg]


Frequency and IPC really matter for the 4090. Sure, it's still an upgrade even if you have an old potato of a CPU, though IMO it's very silly to spend £1700+ and pair it with a potato. A 3070, 3080, 3090, 4070 Ti etc. would be much more sensible, as you can't drive the 4090 well enough for it to shine vs much cheaper cards.
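To make the bottleneck argument above concrete, here's a minimal sketch (my illustration, with made-up FPS numbers, not figures from any of the linked reviews): the frame rate you actually get is roughly the lower of what the CPU can feed and what the GPU can render, so a faster CPU only shows up once the GPU isn't already the limit.

```python
# Minimal bottleneck sketch - the FPS ceilings below are invented placeholders,
# not measurements from the linked reviews.

def delivered_fps(cpu_ceiling: float, gpu_ceiling: float) -> float:
    """The slower of the two pipelines sets the frame rate you see."""
    return min(cpu_ceiling, gpu_ceiling)

# Hypothetical 4090 that could render 160 FPS at 4K in some title:
print(delivered_fps(cpu_ceiling=110, gpu_ceiling=160))  # 110 -> CPU-bound, card held back
print(delivered_fps(cpu_ceiling=170, gpu_ceiling=160))  # 160 -> GPU-bound, card fully used
```

At 1080p or 720p the GPU ceiling rises sharply while the CPU ceiling barely moves, which is why the low-resolution charts exaggerate the CPU gap.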
But you specifically said "7950x, 13900k or 5800x3d".

At 4K there is no meaningful difference between a 13900K, 12900K, 12600K-12700K or any modern CPU.

Here's 5800X3D vs 12900K @ 4K on a 4090

 

Not sure if you're trolling, or just didn't read my post and follow the links.

Here's 4K testing between 5800X and 12900k

[image: 4rQbyOP.png]

In which universe are the above numbers, to use your words, "no meaningful difference"?

Not sure why I bother, some folks just want to believe their own nonsense and ignore reality.

That's enough GPU forum for me for today, enjoy the 4090 on potato CPUs!
 
At 720p? Who the hell is playing at 720p :cry: You say to let the 4090 shine, but what resolution and framerate are you on about that allows it to shine? I play on two displays, a 4K/120Hz and a 1440p/240Hz, and with my Maris Piper of a CPU I'm capped at the refresh rate of the monitor in the games I play, with the CPU and GPU underutilised most of the time unless I turn off v-sync and let it go over. It plays smooth as silk, but according to you that's not enough to drive a 4090. We have a Shadow of the Tomb Raider benchmark thread on this forum, and your superior 13900K should be able to wipe the floor with my score at 4K on a 10900K (apparently not good enough for 4K according to you)...
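For what it's worth, the refresh-rate point can be sketched the same way (again my illustration, invented numbers): with v-sync on, anything above the panel's refresh rate is discarded, so two CPUs that both clear the cap look identical in normal play.

```python
# V-sync cap sketch - the uncapped FPS figures are invented, not measured.

def on_screen_fps(uncapped_fps: float, refresh_hz: float, vsync: bool = True) -> float:
    """With v-sync on, the display refresh rate is a hard ceiling."""
    return min(uncapped_fps, refresh_hz) if vsync else uncapped_fps

REFRESH = 120  # the 4K/120Hz panel described above

print(on_screen_fps(150, REFRESH))  # hypothetical older-CPU system -> 120
print(on_screen_fps(190, REFRESH))  # hypothetical newer-CPU system -> 120 as well
```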

 
Thanks, I love playing at 720p, so really handy for me.

Suffice to say my 12700K @ 5GHz is fine for a few generations with my 4090.

VR in MSFS with the G2 is looking great and performing well.
 
You literally said "IMO only consider 4090 if you have a 13900k, 5800X3D or 7950X, else you're wasting the true potential of the card in gaming, even at 4K resolution".

Then you show a 12900K being substantially faster than a 5800X, which was never mentioned.

According to you, my potato 12900K should be upgraded and isn't worthy of a 4090, despite the fact I can max out my framerate on my 4K/138Hz monitor in most games.
:D
 
What sort of clock speed are you hitting at 520W, out of interest?

It wasn't the best overclocker. I think it was +170MHz on the core and +1000MHz on the memory.

Tbh it was just for scores in 3DMark, and tbf it did get good results (even a legendary number 1 in the world paired with a 5800X at the time lol). I'll be undervolting when I get my FE though... but I'll overclock first just to see.
 
Absolutely loving my Strix 4090, very happy with my purchase. 0 issues. Settled on a 70% power target which makes it run at 350W; I hardly lose any performance, around 2-5% depending on the game. The difference 100W makes is quite insane - the card is much quieter and runs cooler.

Did some testing with my 4090 on other systems. Saw big differences in games with a 10900k, 11900k and 12900k compared to my 13900k.

IMO only consider 4090 if you have a 13900k, 5800X3D or 7950X, else you're wasting the true potential of the card in gaming, even at 4K resolution.
Good to know 350W is quiet and cool, my 7900XTX should be nice and quiet :D
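As a side note, if anyone wants to reproduce that 70% power target outside of Afterburner, here's a rough sketch. The 500W stock limit is my assumption for a Strix 4090 (70% of it lands on the 350W quoted above); nvidia-smi's power-limit option is the generic NVIDIA way to cap board power, needs admin rights, and your card's actual default limit may differ.

```python
# Power-target sketch - the 500W stock limit is an assumption for a Strix 4090;
# check your own card's default with `nvidia-smi -q -d POWER` before using this.
import subprocess

STOCK_LIMIT_W = 500      # assumed Strix 4090 default board power
TARGET_FRACTION = 0.70   # the 70% target quoted above

target_w = int(STOCK_LIMIT_W * TARGET_FRACTION)
print(f"Power cap: {target_w} W")  # -> 350 W

# Apply the cap (requires admin/root; Afterburner's power slider does the same job).
subprocess.run(["nvidia-smi", "-pl", str(target_w)], check=True)
```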
 
If it's from OCUK they normally send you an email from around now till 7pm saying it's been dispatched. Good choice on the Zotac imo, and if it's anything like mine was it will have 0 coil whine, unlike my FE which does have a small bit :cry:
Thanks, I hope the Zotac is good. It wasn't my first choice, as I bought an overpriced Aorus Master which had a faulty LCD display and got a refund. So the Zotac worked out £390 cheaper, which is a good result in the end :)
 
In which universe are the above numbers, to use your words, "no meaningful difference"?

Shows a chart that displays a 6.4% average difference.

Settled on a 70% power target which makes it run at 350W; I hardly lose any performance, around 2-5% depending on the game.

Yet you said losing up to 5% is "hardly any performance" - jebus, make up your mind, do you want the fastest or not? :cry: :cry: :cry:
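For anyone wondering how a single "average difference" number falls out of a chart like that, here's the arithmetic as a quick sketch - the per-game FPS pairs are placeholders I made up, not the chart's actual data; only the calculation is the point.

```python
# Placeholder (slower-CPU FPS, faster-CPU FPS) pairs - NOT the reviewed results.
results = {
    "Game A": (118, 127),
    "Game B": (96, 101),
    "Game C": (143, 152),
}

per_game = [(fast - slow) / slow * 100 for slow, fast in results.values()]
average = sum(per_game) / len(per_game)
print(f"average difference: {average:.1f}%")

# The same arithmetic applies to the power-limit trade-off above: giving up
# 2-5% FPS for roughly 30% less board power is the whole appeal of the 70% target.
```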
 
Zotac is AFAIK the only manufacturer to offer a 5-year warranty on their cards now that EVGA no longer makes GPUs. Hopefully EVGA will be back.
 