
3800x vs 9900k



9900K stock with 2666MHz (its default RAM). Average FPS: 109.9
3800X stock with IF 1900 / 3600 RAM overclocked to 3800 with tightened timings (DDR4-3600 is what AMD recommends for reviews). Average FPS: 148
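
For what it's worth, a quick arithmetic sketch (my own, not from the post above) of the gap between those two averages:

```python
# Quick sketch of the gap between the two averages quoted above.
intel_avg_fps = 109.9  # 9900K stock, DDR4-2666
amd_avg_fps = 148.0    # 3800X stock, DDR4-3600 tuned to 3800 with tightened timings

uplift_pct = (amd_avg_fps / intel_avg_fps - 1) * 100
print(f"3800X result is {uplift_pct:.1f}% higher")  # ~34.7%
```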




Wow, a 9900K on a bottlenecked GTX 1080 is slower than an AMD 3800X with a faster RTX 2080, which is said to be 30%+ faster.

What are you saying?
 
Wow, a 9900K on a bottlenecked GTX 1080 is slower than an AMD 3800X with a faster RTX 2080, which is said to be 30%+ faster.

What are you saying?

Did you miss the next post, where it's faster than a 9900K with a 2080 Ti as well? A 9900K at 5GHz all-core with 3600 CL16 RAM. A 15fps lead against a 2080 Ti, or 12%.
 
What I'm saying is that the one with the 9900K says GTX 1080, but the one with the AMD 3800X says RTX 2080.
The RTX 2080 is 30% faster than the GTX 1080, so there is a bottleneck on the GTX 1080 card.
So if you put a GTX 1080 in the AMD system and an RTX 2080 in the Intel system, the Intel one would be faster, so what does that prove?

But I didn't see the RTX 2080 Ti bit!
 
What I'm saying is that the one with the 9900K says GTX 1080, but the one with the AMD 3800X says RTX 2080.
The RTX 2080 is 30% faster than the GTX 1080, so there is a bottleneck on the GTX 1080 card.
So if you put a GTX 1080 in the AMD system and an RTX 2080 in the Intel system, the Intel one would be faster, so what does that prove?

But I didn't see the RTX 2080 Ti bit!

It depends on whether they are running the exact same settings and methodology, though. If you look at a few different articles for the Shadow of the Tomb Raider benchmark there is a spread of results: other sites, for instance, have 1080p highest-settings results ranging from 120 to 160 FPS for the 9900K with a 2080 Ti, and others have 137 with a 2070, etc.

You often can't just take a result from a review article and compare it to another run done elsewhere, even with ostensibly the same quality level and resolution.
 
It depends on whether they are running the exact same settings and methodology, though. If you look at a few different articles for the Shadow of the Tomb Raider benchmark there is a spread of results: other sites, for instance, have 1080p highest-settings results ranging from 120 to 160 FPS for the 9900K with a 2080 Ti, and others have 137 with a 2070, etc.

You often can't just take a result from a review article and compare it to another run done elsewhere, even with ostensibly the same quality level and resolution.

Exactly this. This is why synthetics and dedicated benchmarks are more usable for comparing across systems, due to their linear scaling, than workloads such as games, which can be affected by a number of external factors (drivers, game version, control panel settings, AA settings, etc.). In Tomb Raider you can set Highest and then adjust AA on a different screen, and that alone changes the results. I have SOTTR and have used it a lot for testing my GPU. The game is fun too.
 
Exactly this. This is why synthetics and dedicated benchmarks are more usable for comparing across systems, due to their linear scaling, than workloads such as games, which can be affected by a number of external factors (drivers, game version, control panel settings, AA settings, etc.). In Tomb Raider you can set Highest and then adjust AA on a different screen, and that alone changes the results. I have SOTTR and have used it a lot for testing my GPU. The game is fun too.

The settings are the same but the graphics card is different. The 3800X system is faster; this is the 3800X with a 2080 vs the 9900K with a 2080 Ti, both overclocked.

The only exception so far to the 3800X being faster is WoT enCore, a game optimised by their friends at Intel.

I am carefully picking like-for-like benchmarks, or reasonable comparisons. It's hard; it's like the reviews are making sure you can't fact-check them. Lots of custom benches only they can run.
 
Gears of War 5

Core i9 9900K
Z390 (ASRock Taichi Ultimate)
32 GB DDR4 3200 MHz CL16
NVMe M.2 SSD WD Black
https://www.guru3d.com/articles_pages/gears_of_war_5_pc_graphics_performance_benchmark_review,3.html


1080p settings as per the website.



2080 Ti: 135 FPS
Titan Xp: 120 FPS
2080 Super: 119 FPS
2080: 111 FPS

3800x and 2080.



Average FPS: 123.5
GPU bound: 99.90%
CPU bound: 0.10%

This is an AMD sponsored title.

See this video, which shows a 9900K that is 10.09% CPU bound: 24GB RAM / RTX 2080 OC 8GB / i9-9900K 5.0GHz. I love the RAM amount.
https://youtu.be/tUZzOchs8PY?t=216
 
Looks like the 3800X under water with IF 1900 and 3600 RAM OC'ed to 3800 CL16 with tightened timings is faster than a 9900K in games.
 
Looks like the 3800X under water with IF 1900 and 3600 RAM OC'ed to 3800 CL16 with tightened timings is faster than a 9900K in games.

Seems a bit of a jump to me if you're claiming you are getting a ~33% increase in performance in Shadow of the Tomb Raider just from water cooling and tuning the RAM/IF.
 
Seems a bit of a jump to me if you're claiming you are getting a ~33% increase in performance in Shadow of the Tomb Raider just from water cooling and tuning the RAM/IF.

What else would you think it was? It's what I am getting with the ABBA BIOS, stock cores (no PBO or anything else) and just the RAM overclocked. Note that the RAM latency problem is greatly reduced: the worst latency I have seen can be around 80ns, but with this OC it's as low as 60ns in UserBench.

I 'think' memory latency is the big issue with chiplet-based CPUs. The more you reduce latency, the better. I believe Intel can get as low as 45ns; my 4930K is at 61ns.

https://uk.crucial.com/gbr/en/memory-performance-speed-latency

true latency (ns) = clock cycle time (ns) x number of clock cycles (CL)
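
As a rough sketch of how that formula plays out, here are my own illustrative numbers (the CL19 kit is an assumption, not from this thread; note this is only the CAS component, not the full 60-80ns system latency UserBench reports):

```python
# Sketch of the Crucial "true latency" formula quoted above:
# true latency (ns) = clock cycle time (ns) x CAS latency (CL).
# DDR transfers twice per clock, so one cycle is 2000 / (MT/s) nanoseconds.

def true_latency_ns(transfer_rate_mts: float, cas_latency: int) -> float:
    clock_cycle_ns = 2000.0 / transfer_rate_mts
    return clock_cycle_ns * cas_latency

# Illustrative examples (the DDR4-2666 CL19 kit is assumed for comparison):
print(f"DDR4-3600 CL16: {true_latency_ns(3600, 16):.2f} ns")  # ~8.89 ns
print(f"DDR4-2666 CL19: {true_latency_ns(2666, 19):.2f} ns")  # ~14.25 ns
```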

The effect can be seen on Intel CPUs as well.

https://www.reddit.com/r/intel/comments/82jaqg/ram_latency_effect_on_gaming_performance_on/


Here, on an Intel CPU, there is a 27% increase in AotS going from 68.2ns to 46.5ns. So why do you find it hard to believe? It's exactly what you should expect.



Metro Redux with latest RAM timings.

Average 204fps.

Metro Exodus with latest RAM timings.


  • Average Framerate (99th percentile): 65.30
  • Max. Framerate (99th percentile): 90.33
  • Min. Framerate (99th percentile): 38.14
 
At stock, sure, but this is Overclockers; stock is something normal people do when they buy a PC in-store at Rumbelows.

The 9900K is the honey badger of this generation of chips, corrupting your soul with all sorts of juicy promises of 5.4GHz all-core and 4900MHz RAM clocks. It will be a fleeting affair until the 5.5GHz bins arrive next year.

The 3900X is the dependable Marks and Spencer decision: great at lots of stuff, you might even want to marry it and keep it for a number of years. Once you get depressed looking at the honey badger's gaming scores, you can always keep running Cinebench to reassure yourself of a purchase well made.

I know you're semi-joking, but overclocking adds very, very little to 9900K average gaming performance; 4.7GHz to 5GHz all-core adds like 1.5% according to TPU. Not worth it at all considering the massive heat and power.

This is another myth of this whole Intel marketing 'gaming king' thing. Use a large sample of games and overclock the chip and see how tiny the average performance increase is.

Remember how high the stock all-core boost on the 9900K already is (4.7GHz), so an OC to 5.1GHz (a realistic max if you're lucky) is merely ~8.5% more frequency, and as we know, frequency, let alone CPU frequency, does not scale 1:1 with actual gaming performance.
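
As a back-of-the-envelope sketch of that point (my own arithmetic; the 4.7/5.0/5.1GHz clocks and the ~1.5% figure are the ones quoted above):

```python
# How small the frequency uplift from a 9900K all-core overclock really is.

def pct_uplift(base_ghz: float, oc_ghz: float) -> float:
    return (oc_ghz / base_ghz - 1) * 100

stock = 4.7  # 9900K stock all-core boost, GHz
print(f"OC to 5.0 GHz: +{pct_uplift(stock, 5.0):.1f}% frequency")  # ~6.4%
print(f"OC to 5.1 GHz: +{pct_uplift(stock, 5.1):.1f}% frequency")  # ~8.5%
# TPU measured only ~1.5% average FPS gain for 4.7 -> 5.0 GHz, i.e. gaming
# performance scales well below 1:1 with CPU frequency.
```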
 
I know you're semi-joking, but overclocking adds very, very little to 9900K average gaming performance; 4.7GHz to 5GHz all-core adds like 1.5% according to TPU. Not worth it at all considering the massive heat and power.

This is another myth of this whole Intel marketing 'gaming king' thing. Use a large sample of games and overclock the chip and see how tiny the average performance increase is.

Remember how high the stock all-core boost on the 9900K already is (4.7GHz), so an OC to 5.1GHz (a realistic max if you're lucky) is merely ~8.5% more frequency, and as we know, frequency, let alone CPU frequency, does not scale 1:1 with actual gaming performance.

Some games like frequency. Maybe it's 2%, as you say, over a big range, but individually some games gain as much as 10% from the CPU overclock, and some just 1%. Every bit helps when it's raising your minimum frames. I don't understand the heat comment; a 5GHz 9900K doesn't have much heat or power draw in games. We already disproved that myth several times on these forums.

And where the CPU overclock really helps is ageing. On older Intel CPUs like the old 4-cores, the difference in 2019 between playable and unplayable performance for many people is their overclock. The overclocks may not have done much when the chips were released, but since games now CPU-bottleneck those 4-core chips, the overclocks are making a big difference for playability.
 