The Ada Lovelace RTX 4090 Owners Thread

What was the uplift?
Game dependent; some things were definitely being strangled by the old setup. As JediFragger suggested, 20%+ with some outliers at 50%.

Noticeable improvement in lows and general smoothness. GPU utilization is up across the board. As an example, God of War with everything maxed and DLSS Quality was about 90 FPS on the 10700k and about 130 on the 13700k. Playing with the graphics settings in Watch Dogs: Legion now actually makes a difference in FPS.

Had the 10700k running at 4.9 GHz all-core with a custom loop, and power usage and temps while gaming are very similar with the 13700k at stock settings.
Hit 280 W and 95°C while rendering, so I'll leave that to the 4090. Render setup times are greatly improved during CPU-intensive things like texture decompression, in edge cases up to 10x. Happy camper overall.
 
Hey guys, any guide on how to OC these cards with MSI Afterburner?
I have two curves - an undervolt @ 2802 MHz/990 mV with +500 MHz on the VRAM and an 80% power limit. Then I have an OC at 2902 MHz/1025 mV with +1100 MHz on the VRAM (100% power - I only have 3x 8-pins plugged in).
 
I meant a guide on how to do it, haven't OC'd my cards in a long time.
 
Yup, it's fantastic what a great CPU can do for a system!
It is, and it also seems some of the things I assumed to be true aren't any longer.
The CPU can make a difference to games even at 4K; rendering times can improve slightly with a better CPU even though the GPU claimed to be running flat out before; and, at least with FH5, it loads a lot quicker off a modern NVMe drive than the SATA SSD I had it on before. I've seen plenty of articles saying drive speed didn't make much difference; it seems that with newer CPUs the drive can become the bottleneck.
 
I'm a bit late to the 4090 club, bought the only one that would fit in my case of choice...

[image]


If it turns out to be a bit of a hotbox (highly likely) then I have Plan B waiting in the background:

[image]
 
First encode at above 4K using AV1 tonight, 5160x2160 60 fps - recorded some System Shock gameplay for the tube. Impressive to see a 94-100 fps encode rate using CQ 26 in HandBrake. Seems to have settled at 74 fps towards the end of the encode, which is about expected.
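For anyone who'd rather script this than click through HandBrake, below is a rough sketch of a similar constant-quality AV1 encode using ffmpeg's av1_nvenc encoder driven from Python. The filenames are made up and the exact flags are my assumptions, so check ffmpeg -h encoder=av1_nvenc against your own build before trusting it.

```python
# Rough sketch: constant-quality AV1 encode on Ada's NVENC via ffmpeg.
# Filenames and flag values are illustrative only, not a tested recipe.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "system_shock_5160x2160_60fps.mkv",  # hypothetical capture file
    "-c:v", "av1_nvenc",        # 40-series hardware AV1 encoder
    "-rc", "vbr", "-cq", "26",  # constant-quality target, roughly HandBrake's CQ 26
    "-b:v", "0",                # let the CQ value drive bitrate
    "-preset", "p5",            # middle-of-the-road speed/quality preset
    "-c:a", "copy",             # pass the audio through untouched
    "system_shock_av1.mkv",
]
subprocess.run(cmd, check=True)
```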

My card is set to a 70% power limit 24/7 btw; it barely made a difference to FPS in games lol, but power consumption is a lot lower - it now draws up to 3080 Ti power (or generally less).
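If anyone is curious what a percentage power limit works out to in watts, here is a rough sketch using the NVML Python bindings (the nvidia-ml-py package, imported as pynvml). The 0.70 factor just mirrors the slider, actually applying the limit needs admin/root rights, and the whole thing is illustrative rather than a finished tool.

```python
# Sketch: read the card's default power limit and current draw via NVML,
# and work out what a 70% slider setting would mean in watts.
# Assumes the nvidia-ml-py package; NVML reports values in milliwatts.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)
target_mw = int(default_mw * 0.70)          # mirror a 70% power-limit slider
draw_mw = pynvml.nvmlDeviceGetPowerUsage(gpu)

print(f"default limit: {default_mw / 1000:.0f} W")
print(f"70% of that:   {target_mw / 1000:.0f} W")
print(f"current draw:  {draw_mw / 1000:.0f} W")

# Applying the limit needs elevated privileges; uncomment to try it:
# pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

pynvml.nvmlShutdown()
```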
 
I've just gone from a 10700k with DDR4-3200 CL18 to a 13700k with DDR5-6600 CL32 and I'm really surprised at the uplift in performance at 4K.

Yeap, the difference is huge. Even 12900k to 13900k is a noticeable improvement at 4K, especially with RT on.

@Bencher and other forum members will not accept this fact, so prepare for many to dispute your findings.
 
I've asked you repeatedly to show me but you keep avoiding it so....

The only difference between a 12900k and a 13900k with a 4090 can be seen below 1080p. It would take you five minutes to post a video at 4K and show me your results, but you don't. And here are mine, Cyberpunk running maxed out. Unless you have a special version of the 4090 that can get 150 to 200 fps at 4K, you are full of it about there being any bottleneck.


I had both the 13900k and the 12900k; the difference at 4K was literally 0. I need to drop to 1366x768 to have the card drop below 99% usage :)
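If anyone wants to sanity-check the "below 99% usage" bit on their own system, below is a rough sketch that polls utilization with the pynvml bindings while a game is running. The one-second interval and the 95% flag threshold are my own assumptions, nothing official.

```python
# Sketch: log GPU utilization while gaming; sustained dips well below ~99%
# usually point at a CPU (or engine) limit rather than the 4090 itself.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu  # percent busy
        note = "  <-- GPU waiting on the CPU?" if util < 95 else ""
        print(f"GPU utilization: {util:3d}%{note}")
        time.sleep(1.0)  # poll once a second; adjust to taste
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```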

@Dave2150 We are not going to see any proof from you, are we? :cry: At this point I believe you know you are wrong; you are just trolling. No other explanation.
 
Too busy playing D4 to cater to your whims, as even when you're proven wrong, you just move the goalposts.

The 13900k is simply a faster CPU than the 12900k, all around. At 4K the difference is less, though there are some games with RT where the 13900k pulls ahead, as the 4090 is so powerful.

Also, your story of returning your 13900k for a 12900k due to the power consumption difference sounds like nonsense. If you were concerned about power consumption, you'd probably not have a 4090 and would have a very efficient AMD Zen 4 CPU instead.
 
Nobody said that the 13900k isn't faster. It's around 10-15% faster at completely CPU-bound settings - that is, usually, 720p. At 4K the difference is literally 0%, and you obviously know that; that's why you are dodging what would be a three-minute exercise. I was running 7600 C34 manually tuned RAM on the 13900k, mind, not your run-of-the-mill XMP.

I don't know wtf you are talking about, but the 4090 is a very efficient card - probably the most efficient there is. Zen 4 CPUs are not at all efficient for my workloads, since they consume 4-5 times more power than my 12900k and my 13900k when I'm actually working.

Until you post your results at 4K with your 3D showing me performance and power consumption, I'll just call it wishful thinking. I can bet a paycheck there will be a 0% difference in both performance and power draw, since my 12900k is already sitting at or below 50 W while gaming. But again, the ball is in your court; prove me wrong and I'll consider upgrading - after all, I don't want my 4090 to be bottlenecked :D:D:D


E.g., in the in-game benchmark at 1080p with DLSS Ultra Performance, the 12900k was getting 164 fps compared to 188 on the 13900k.
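For what it's worth, those two figures line up with the 10-15% CPU-bound gap mentioned above - quick check using only the fps quoted in this post:

```python
# Quick check of the CPU-bound uplift implied by the fps figures quoted above.
fps_12900k = 164  # 1080p, DLSS Ultra Performance, in-game benchmark (as quoted)
fps_13900k = 188

uplift_pct = (fps_13900k / fps_12900k - 1) * 100
print(f"13900k over 12900k: {uplift_pct:.1f}%")  # ~14.6%, within the claimed 10-15% range
```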
 
I can't see how 13900K vs 12900K equals a big difference at resolutions above 1440p. The 4090 is indeed very powerful, but no CPU on the market now can fully realise a 4090 at CPU-bound resolutions (<1440p), so I can't see there being a healthy uplift at 4K in this context from 13th gen vs 12th gen.

I've yet to see any reputable benchmarks in reviews showing a big uplift from 12900K to 13900K with a 4090 at 4K either, unless there's something I'm missing here...

A 10th gen to /any/ 12th gen is orders of magnitude better, let alone 10th gen to 13th gen - no refuting that level of generational boost.
 