
Cinebench R23 13700KF 6100MHz first out-of-the-box test

Interested in this CPU. My 12700KF hits 92°C on the highest core, clocked at 5GHz on all cores, in Cinebench, using an H150i 360 AIO. But I watched the Hardware Unboxed video showing this CPU hitting 99°C out of the box in Cinebench? Does it run that hot?
 
I have two systems in use, a 12900K and a 13900K. The 13900K at 6.1GHz feels snappier than my 12900K @ 5.5GHz. Have you tried both, or are you just assuming?

Feels like you just wish your 12th gen to be on "equal footing", with it also having "supreme single thread performance" - when in reality there is a difference.

Hope you'd agree that something as common as web browsing satisfies your criteria of "day to day use"? Let's look at some browser benchmarks from Guru3D:

[Browser benchmark charts from Guru3D]

Ryzen 7000 and 13th gen leading the pack. The above is stock - so the gap will be even greater, considering the ease with which you can take a 13th gen i5/i7/i9 to 6GHz and beyond. The extra cache combined with the big frequency bump is responsible.

Would I recommend someone upgrade from 12th to 13th gen (or to the Ryzen 7000 series)? No, not unless they're an enthusiast with money to burn. Though it's silly to pretend they all have the same single thread performance.

Unless you are someone out of a comic and use the sobriquet 'The Flash', or your real father is Jor-El, you will not be able to tell the difference between 444 ms and 357 ms when using a web browser! So you stating you can tell the difference is almost the same level of nonsense as when you were stating how great 11th gen would be.

Interested in this CPU. My 12700KF hits 92°C on the highest core, clocked at 5GHz on all cores, in Cinebench, using an H150i 360 AIO. But I watched the Hardware Unboxed video showing this CPU hitting 99°C out of the box in Cinebench? Does it run that hot?
Yes, generally 13th gen runs hotter than 12th gen.
 
A known issue on AM5 motherboards, and only with the RTX 4090. I also knew of the problem before I bought the gear; BIOSes were updated to fix the RTX 4090, from what I read.
No problems with any other video card.

For my issue with the ASRock motherboard: I was unable to get a POST with the RTX 4090. HDMI/DisplayPorts did not work. The system was not stable on the rare occasion I did get it to POST with the RTX 4090. When it was working, a restart would send the system back to no POST/black screen/hang. No DX12, no HDR working. Tried 3 different HDR monitors and 1 non-HDR.

After reformatting, removing the RTX 4090 and putting in the RTX 3090 to test whether it works, and around 8 hours of troubleshooting, I made some headway.

I was able to get HDMI working with the RTX 4090 by doing this.
Plug the HDMI cable into the iGPU on the motherboard and reformat again, through HDMI only, on the LG OLED.
Then install the Nvidia drivers.
Then install the RTX 4090; it would POST and run on HDMI, but no DisplayPorts on the second monitor on the system.

To get DisplayPorts working again: another install through the iGPU over HDMI, but on a different monitor attached to the system.
Then install the Nvidia drivers, then install the RTX 4090, and the DisplayPorts worked.

So about 10-11 hours of total troubleshooting to get a working system.

Now, when working, some games would run awesome, such as Cyberpunk, minus driver bugs on the RTX 4090.
Other games such as Spider-Man Remastered, not so good; the RTX 3080/RTX 3090 outperformed the RTX 4090.

Said screw it and bought another Intel system, so in two weeks I went from a 12900K to an awesome 7700X to a 12600K, and next week most likely a 13900K lol.

Shadow of the Tomb Raider: no DX12 HDR, the first problem after many hours of troubleshooting. Games like Spider-Man got half the FPS with the RTX 4090.


The AMD 7700X with the RTX 4090, when working, was fine in Cyberpunk.


Intel in Cyberpunk with the RTX 4090 was a couple of FPS less than the AMD 7700X, but all good.
Thanks for the insight. I ended up ordering the 13700K; it should be here tomorrow :)

I am still waiting on the ATX 3.0 PSU, and it looks like the ETA on that is the 4th of Nov, so I can't even test it out until then. I'll be sure to try out your OC settings with UV.
 
you will not be able to tell the difference between 444 ms and 357 ms when using a web browser!
Talking rubbish.
Benchmarks exist to show the relative performance of CPUs. As it happens, the browser benchmarks are a good indicator of how fast and snappy a CPU feels in day-to-day tasks (not just browsers).
 
Talking rubbish.
Benchmarks exist to show the relative performance of CPUs. As it happens, the browser benchmarks are a good indicator of how fast and snappy a CPU feels in day-to-day tasks (not just browsers).
Nonsense! Get yourself a decent hand stopwatch and try to get ~0.087 seconds on the display. That is the difference we are talking about. (I've done it, so I know!)

Anybody who says they can easily tell that difference when it comes to using a browser, or pretty much anything else, clearly has superhuman powers or moves as fast as a fly.
 
This is more stupid than the regular "nobody games at 720p on an RTX 4090 or can see 600fps in CS:GO".

In your example, 357ms and 444ms are the run times of an arbitrary piece of code.
What it tells you is that one CPU is 20% faster at small random tasks. And that's why we run benchmarks: to see which is faster or completes more work.

You will notice programs and files opening faster, game levels loading quicker, and yes, even heavy web pages rendering faster.
Or rather, you will notice it if you go back to the slower CPU after getting used to the quick one.
 
This is more stupid than the regular "nobody games at 720p on an RTX 4090 or can see 600fps in CS:GO".

In your example, 357ms and 444ms are the run times of an arbitrary piece of code.
What it tells you is that one CPU is 20% faster at small random tasks. And that's why we run benchmarks: to see which is faster or completes more work.

You will notice programs and files opening faster, game levels loading quicker, and yes, even heavy web pages rendering faster.
Or rather, you will notice it if you go back to the slower CPU after getting used to the quick one.
Do some actual tests before replying with non sequitur anecdotal points.

Think about it properly for a second. (no pun intended)

If something takes 10 minutes to complete, I can easily notice a 20% improvement, as it will then take 8 minutes instead of 10. But when something takes only a fraction of a second to complete, a 20% improvement will be almost unnoticeable to human perception.

Put it this way: imagine a test where you had two browsers open side by side on two computers, clicked a link in both at the same time, and then watched to see how quickly the pages opened. If one page opened 0.087 seconds after the other, to our human perception they would open at pretty much the same time. You would need software to highlight any difference, because it is so small.

It's a similar point to when you have a game that runs at 500fps, get a 20% improvement so it now runs at 600fps, and then try to convince people that you can notice the difference!
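To put rough numbers on that argument: the 444 ms / 357 ms figures are the benchmark results quoted earlier in the thread, while the ten-minute job and the 500 fps game are purely illustrative assumptions. A minimal sketch of the relative-versus-absolute arithmetic:

```python
# Rough arithmetic behind the "relative vs absolute difference" argument.
# 444 ms / 357 ms are the browser-benchmark figures quoted in the thread;
# the 10-minute job and the 500 fps game are illustrative assumptions.

def report(label, slow, fast, unit):
    pct_faster = (slow - fast) / slow * 100   # relative improvement
    print(f"{label}: {pct_faster:.0f}% less time, absolute gain {slow - fast:.2f} {unit}")

report("10-minute job", 600, 480, "s")                              # 20%, 120 s saved -> obvious
report("Browser benchmark", 444, 357, "ms")                         # ~20%, 87 ms saved -> hard to perceive
report("500 vs 600 fps frame time", 1000 / 500, 1000 / 600, "ms")   # ~17%, ~0.33 ms per frame
```

The relative gain is roughly the same in all three cases; only the absolute time saved differs.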
 
This is more stupid than the regular "nobody games at 720p on an RTX 4090 or can see 600fps in CS:GO".
Whoever thinks that's stupid... lacks common sense.

Nobody argues that people game at 720p. The argument is that you are going to keep your computer, assuming you are primarily gaming, until it no longer offers adequate gaming performance / bottlenecks your brand new shiny GPU. That point is going to come sooner for a slower CPU, obviously, and in order to find out which one is slower, you look at 720p benchmarks.

It's insanely easy to demonstrate the point, but nobody ever answers the question because it shows their folly. Say you wanted a gaming CPU with a budget of 300€. CPU A costs 300€ and gets 100 fps at 4K, and CPU B costs 300€ and also gets 100 fps at 4K. Which one would you buy between these two? Unless you look at 720p results, you have a 50% chance of making the wrong choice.
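A minimal sketch of that reasoning, with made-up frame-rate caps purely for illustration: at any resolution the observed frame rate is roughly the lower of the CPU's and the GPU's limit, which is why a GPU-bound 4K test can hide the CPU difference.

```python
# Illustrative only: the fps caps below are invented, not benchmark data.
# The observed frame rate is roughly the lower of what the CPU and the GPU
# can each deliver, so a GPU-bound 4K test hides the CPU difference.

def observed_fps(cpu_cap, gpu_cap):
    return min(cpu_cap, gpu_cap)

cpu_a, cpu_b = 120, 180          # hypothetical CPU-limited frame rates
gpu_4k, gpu_720p = 100, 400      # hypothetical GPU limits at each resolution

print(observed_fps(cpu_a, gpu_4k), observed_fps(cpu_b, gpu_4k))      # 100 100 -> a tie at 4K
print(observed_fps(cpu_a, gpu_720p), observed_fps(cpu_b, gpu_720p))  # 120 180 -> gap shows at 720p
```

Dropping to 720p removes the GPU cap, so the CPU difference that matters for future GPU upgrades becomes visible.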
 
Thanks, how are people cooling it then?
A decent 360 AIO (or a top-end air cooler), and probably one of these if you are one of the people affected by it...

 
You have posted very select graphs that, when looked at in isolation, do fit your narrative. What do they actually mean in real day-to-day use? Nothing, as the margins aren't big enough. You say you can tell the difference, that 13th gen feels snappier than 12th gen, but those graphs do not represent a value that would be meaningfully observed by a user outside of looking at a graph. So I refuse to believe that you actually notice a sizeable difference like you say between those two chips in regular day-to-day use; the spec bump simply isn't big enough for that to be true. Unless there is a meaningful test out there that outright shows results that would be noticeable to someone in day-to-day stuff, I'll stand by my thoughts. Otherwise it sounds more like placebo to me in this context.


This is total speculation. I have already said a number of times in the rumours thread that I will be skipping 13th gen, as it doesn't seem to offer a sizeable leap from 12th gen. Besides that, the edge I once had with 12th gen when I got it was for Lightroom exporting, but now that Lightroom fully utilises the GPU for export processing, this is less of a demand on the CPU. No games max out a 12th gen yet at 1440p either, unless you have a 4090, and even then you're pushing frames far beyond any half-decent gaming monitor's refresh rate, so it seems fairly meaningless other than for scoreboard points.

Gone are the days when I would upgrade just because it was cool to have the latest. Meaningful upgrades worth their money are what matters most to most of us now. If that means skipping a gen or two between upgrades, then that's cool. We are in a time where even mid-range chips are already so good for everyday stuff.

Edit: the term I was looking for to sum this up is diminishing returns.
I am with you 100%. There's no way to notice the difference. My Haswell system, for example, feels just as fast as my 5.5GHz 9600KF for most tasks.

In fact I would say that a DDR4 12900KF system with better timings will be noticeably faster than a DDR5 13900K system.
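One simplified way to see how tuned DDR4 can stay competitive is first-word latency: roughly the CAS latency multiplied by the cycle time, which is 2000 / (transfer rate in MT/s) nanoseconds. The kits below are hypothetical examples, not anyone's actual memory, and latency is only one part of the picture:

```python
# Hypothetical kits, chosen only to illustrate the CAS-latency arithmetic.
def first_word_latency_ns(cas_latency_cycles, transfer_rate_mt_s):
    # The memory clock is half the transfer rate, so one cycle lasts
    # 2000 / MT/s nanoseconds.
    return cas_latency_cycles * 2000 / transfer_rate_mt_s

print(first_word_latency_ns(15, 4000))   # tuned DDR4-4000 CL15   -> 7.5 ns
print(first_word_latency_ns(36, 6000))   # typical DDR5-6000 CL36 -> 12.0 ns
```

DDR5 still wins on raw bandwidth, so which system feels faster depends on the workload.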
 
Well, I just started overclocking and testing the new CPU. It takes a while at each clock to confirm stability: a two-hour Handbrake encode and stuff like that. Now I need to game for a few days, then move on to 5600MHz. I am thinking up to 5800MHz, but I'm also happy with the default settings at 5300MHz for PC gaming.

13700KF at 5500MHz, 1-hour Cinebench R23 test, voltage and temperatures in HWiNFO64: 84°C max.
Simple BIOS settings for 5500MHz:
Offset voltage = -0.0800
LLC 4 on the Asus motherboard, auto voltages on everything else.
Can you share with me what settings in the BIOS you are changing to get 5.5GHz? Sorry, I'm really new to overclocking. I know how to do adaptive voltage with a negative offset. I changed mine to -0.08 and I'm getting exactly the same temps (84°C), but I'm not sure how to make it 5.5GHz.

I'm using a Z790 Hero with a 13700K.

Thanks!
 
Can you share with me what settings in the BIOS you are changing to get 5.5GHz? Sorry, I'm really new to overclocking. I know how to do adaptive voltage with a negative offset. I changed mine to -0.08 and I'm getting exactly the same temps (84°C), but I'm not sure how to make it 5.5GHz.

I'm using a Z790 Hero with a 13700K.

Thanks!
Change the multiplier. On an air cooler that will be pretty hot unless you can achieve it at a low voltage.
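For context, the core frequency is simply the multiplier times the base clock, and BCLK is normally 100MHz unless you change it, so 5.5GHz corresponds to a multiplier of 55. A small sketch of that arithmetic, assuming the stock 100MHz BCLK:

```python
# Core frequency = multiplier x BCLK; BCLK is normally 100 MHz unless changed.
def core_frequency_ghz(multiplier, bclk_mhz=100):
    return multiplier * bclk_mhz / 1000

print(core_frequency_ghz(55))   # 5.5 GHz, the target asked about above
print(core_frequency_ghz(61))   # 6.1 GHz, as in the thread title
```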
 