
Cinebench R23, 13700KF @ 6100MHz, first out-of-the-box test

No I did not; the only difference was that the AMD 7700X got a couple more FPS in the games I tested at 4K, which was a nothing burger really. The 7700X remains my favorite CPU ever, but the AMD motherboard did not work well with the RTX 4090, so I got rid of it.
What motherboard model was it? What was the issue you were facing?

I'm considering whether I should go 13700K or 7700X.
 
Known issue on AM5 motherboards, and only with the RTX 4090; I also knew of the problem before I bought the gear. From what I read, BIOS updates were released to fix the RTX 4090.
No problems with any other video card.

As for my issue with the ASRock motherboard: I was unable to get a POST with the RTX 4090. HDMI/DisplayPort did not work, and the system was not stable on the rare occasion I did get it to POST with the RTX 4090. If it was working and I restarted, the system would go back to no POST/black screen/hang. No DX12, no HDR working. I tried 3 different HDR monitors and 1 non-HDR.

After reformatting, removing the RTX 4090, putting in the RTX 3090 to test whether it worked, and around 8 hours of troubleshooting, I made some headway.

I was able to get HDMI working with the RTX 4090 by doing this:
Plug the HDMI cable into the iGPU output on the motherboard and reformat again through HDMI only, on the LG OLED.
Then install the Nvidia drivers.
Then install the RTX 4090; it would POST and run on HDMI, but no DisplayPort output on the second monitor on the system.

To get DisplayPort working again, I did another install through the iGPU over HDMI, but on a different monitor attached to the system.
Then I installed the Nvidia drivers, then installed the RTX 4090, and DisplayPort worked.

So about 10-11 hours of troubleshooting in total to get a working system.
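For anyone retracing this iGPU-then-dGPU reinstall sequence, it helps to confirm which adapters and driver versions Windows actually sees after each step. A minimal, read-only sketch, assuming Windows with Python available (it just shells out to PowerShell's built-in WMI query; it is not from the original post):

```python
# Minimal sketch: list the video adapters Windows currently enumerates,
# so you can confirm whether the iGPU, the RTX 4090, or both are active
# after each driver/reformat step. Read-only; assumes Windows + Python.
import subprocess

def list_gpus():
    # Win32_VideoController is the standard WMI class for display adapters.
    cmd = [
        "powershell", "-NoProfile", "-Command",
        "Get-CimInstance Win32_VideoController | "
        "Select-Object Name, DriverVersion, Status | Format-Table -AutoSize",
    ]
    out = subprocess.run(cmd, capture_output=True, text=True)
    print(out.stdout)

if __name__ == "__main__":
    list_gpus()
```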

Now, when it was working, some games ran awesome, such as Cyberpunk, minus driver bugs on the RTX 4090.
Other games such as Spider-Man Remastered, not so good: the RTX 3080/RTX 3090 outperformed the RTX 4090.

Said screw it and bought another Intel system, so in two weeks I went from a 12900K to the awesome 7700X to a 12600K, and next week most likely a 13900K lol.

Shadow of the Tomb Raider: no DX12 HDR, the first problem after many hours of troubleshooting. Games like Spider-Man got half the FPS with the RTX 4090.


AMD 7700X with RTX 4090, when working, was fine in Cyberpunk


Intel with the RTX 4090 in Cyberpunk: a couple FPS less than the AMD 7700X, but all good
 
I mean, so is any modern day CPU with supreme single threaded performance, basically all 12th gen+

13th gen is a nice jump ahead of 12th gen in ST speed: 6.0GHz+ on a single core, with more cache/higher IPC making up the rest of the improvement.

A 12700K runs at 5GHz stock, for example. Some run a single-core OC at 5.3GHz, which is nowhere near 13th gen's 6GHz, easily attainable on a single core.
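To put a rough number on the clock gap alone (illustrative only, before any IPC or cache gains), a quick back-of-the-envelope calculation using the figures quoted above:

```python
# Back-of-the-envelope clock comparison (illustrative numbers only):
# frequency alone, ignoring IPC/cache differences between generations.
gen12_oc_ghz = 5.3   # single-core OC mentioned for a 12700K
gen13_ghz = 6.0      # single-core clock mentioned for 13th gen

gain = (gen13_ghz - gen12_oc_ghz) / gen12_oc_ghz
print(f"Frequency-only uplift: {gain:.1%}")   # ~13.2%
```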
 
Did you get rid of the AM5 board before trying a BIOS update to fix the 4090 issues, or did the update not fix it for you?

I’ve got a 4090, just ordered an AM5 motherboard and don’t want to have the same issues!
 
13th gen is a nice jump ahead of 12th gen in ST speed: 6.0GHz+ on a single core, with more cache/higher IPC making up the rest of the improvement.
Yes, but that's not what the comment reply was referring to. Of course it's a jump, but it's not going to be a noticeable jump outside of benchmark scores. In gaming you need an RTX 4090 minimum to be CPU limited at 1440p, and then we're talking framerates that go beyond what any decent monitor today has a refresh rate for. So for all intents and purposes, who is going to notice the difference in normal day-to-day use outside of actual productivity work? That's what I was referring to.
 
The comment you quoted said nothing about it only being noticeable in benchmarks?
Omg that single core speed is mad. That thing should be mega snappy in actual use :eek:

Also, for the i7 model, it's IMO a bigger ST performance jump from 12th to 13th gen than it was from 11th to 12th gen.
 
"Snappy in actual use" - I take that as referring to day to day use, for which my reply is 100% accurate. I don't call benchmarking actual use, because you don't measure how "snappy" a benchmark is. Likewise a cinebench run can never be "snappy" - Day to day use within the OS though, that can and is snappy on almost all the mentioned modern CPUs, but you're not going to notice which is a 12th gen, which is Zen 4, which is 13th gen etc in that type of use alone, because they're all so fast and snappy in that type of use.
 
Any chance you can run HandBrake Nightly and test out AV1? You can download it here (click on HandBrake-x86_64-Win_GUI-EXE), but you have to log in. Just put any H.264 video file you might have on hand into it; you only need to actually transcode 4 minutes, then take a picture of the statistics in the queue (see last pic).

Settings to use:
[HandBrake settings screenshots]
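If the GUI screenshots are awkward to reproduce, roughly the same test can be scripted and timed against HandBrakeCLI instead. This is a hedged sketch, not the exact settings from the screenshots: the encoder name svt_av1 and the -q/--stop-at options are taken from HandBrakeCLI's documented options but should be double-checked against HandBrakeCLI --help on the nightly build, and the input/output paths are placeholders.

```python
# Sketch: time a short AV1 (SVT-AV1) transcode with HandBrakeCLI.
# Assumes HandBrakeCLI from the nightly build is on PATH; paths are placeholders.
# Flag names are from HandBrakeCLI's documented options -- verify with --help.
import subprocess
import time

INPUT = "sample_h264.mp4"   # any H.264 clip you have on hand
OUTPUT = "sample_av1.mkv"

cmd = [
    "HandBrakeCLI",
    "-i", INPUT,
    "-o", OUTPUT,
    "-e", "svt_av1",              # software AV1 encoder
    "-q", "30",                   # constant quality target
    "--stop-at", "seconds:240",   # only transcode the first 4 minutes
]

start = time.time()
subprocess.run(cmd, check=True)
print(f"Encode finished in {time.time() - start:.1f} s")
```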
 
Did you get rid of the AM5 board before trying a BIOS update to fix the 4090 issues, or did the update not fix it for you?

I’ve got a 4090, just ordered an AM5 motherboard and don’t want to have the same issues!
Simple answer is yes. I knew about the problem on Oct 9, 2022. I was convinced, and not even concerned, that ASRock would have a BIOS released for the RTX 4090 problem before I bought the RTX 4090. I troubleshot for a couple of days and got everything working except performance in some PC games.

So the real answer: ASRock has lost a customer for life. The RTX 4090 was planned for years; they should have released a BIOS a day before the RTX 4090 launch, not a week later.
I even filled out an ASRock support ticket and got a canned response.

Overall it is 100% my fault for buying the RTX 4090 when I knew there was a chance of problems.
 
OK, fair enough, thanks for the detail.

I've ordered the Asus Extreme board and checked the BIOS page, which lists 4090 support, but I guess I won't know what the system will be like until it's up and running.

Seems mad that AMD needed a BIOS update so late in the process for support.
What could be so different? I plugged my 4090 FE into my old Intel X99 board that hasn't had a BIOS update for years and it's running fine! CPU bound, but no actual issues.
 
That is the part I do not get. On AM5 the RTX 4090 should not have had problems at all.
You will be fine on the ASUS board, and the AM5 platform is a real beast for my use case, PC gaming. I will be buying AM5 again.
 
"Snappy in actual use" - I take that as referring to day to day use, for which my reply is 100% accurate. I don't call benchmarking actual use, because you don't measure how "snappy" a benchmark is. Likewise a cinebench run can never be "snappy" - Day to day use within the OS though, that can and is snappy on almost all the mentioned modern CPUs, but you're not going to notice which is a 12th gen, which is Zen 4, which is 13th gen etc in that type of use alone, because they're all so fast and snappy in that type of use.

I have two systems in use, a 12900K and a 13900K. The 13900K at 6.1 feels snappier than my 12900K @ 5.5. Have you tried both, or are you just assuming?

Feels like you just wish your 12th gen to be on "equal footing", with it also having "supreme single thread performance", when in reality there is a difference.

Hope you'd agree that something as common as web browsing satisfies your criteria of "day to day use"? Let's look at some browser benchmarks from Guru3D:

[Guru3D browser benchmark charts]

Ryzen 7000 and 13th gen leading the pack. The above is stock, so the gap will be even greater, considering the ease with which you can take a 13th gen i5/i7/i9 to 6GHz and beyond. The extra cache combined with the big frequency bump is responsible.

Would I recommend someone upgrade from 12th to 13th gen (or to the Ryzen 7000 series)? No, not unless they're an enthusiast with money to burn. Though it's silly to pretend they all have the same single thread performance.
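For what it's worth, the kind of gap being argued about can be made concrete with made-up, Speedometer-style numbers; the scores below are hypothetical placeholders, not Guru3D's figures, and only show how the percentage margin between two CPUs is worked out.

```python
# Hypothetical, Speedometer-style scores (NOT the Guru3D results) just to
# show how the margin between two CPUs works out as a percentage.
score_12900k = 300.0   # placeholder score
score_13900k = 345.0   # placeholder score

margin = (score_13900k - score_12900k) / score_12900k
print(f"Relative browser-benchmark margin: {margin:.0%}")   # 15% with these numbers
```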
 
You have posted very select graphs that, when looked at in isolation, do fit your narrative. What do they actually mean in real, day-to-day use? Nothing, as the margins aren't big enough. You say you can tell the difference, that the 13th gen feels snappier than the 12th, but those graphs do not represent a value that would be meaningfully observed by a user outside of looking at a graph. So I refuse to believe that you actually notice a sizeable difference like you say between those two chips in regular day-to-day use; the spec bump simply isn't big enough to state that as true. Unless there is a meaningful test out there that outright shows results that would be noticeable to someone for day-to-day stuff, I'll stand by my thoughts. To me it sounds more like placebo in this context.

Feels like you just wish your 12th gen to be on "equal footing", with it also having "supreme single thread performance", when in reality there is a difference.
This is total speculation. I have already said a number of times in the rumours thread that I will be skipping 13th gen, as it doesn't seem to offer a sizeable leap from 12th gen. Besides that, the edge I once had with 12th gen when I got it was for Lightroom exporting, but now that Lightroom fully utilises the GPU for export processing, this puts less demand on the CPU. No games max out a 12th gen yet at 1440p either, unless you have a 4090, and even then you're pushing frames far beyond any half-decent gaming monitor's refresh rate, so it seems fairly meaningless other than for scoreboard points.

Gone are the days when I would upgrade just because it was cool to have the latest. Meaningful upgrades worth their money are what matters most to most of us now. If that means skipping a gen or two between upgrades, then that's cool. We are in a time where even mid-range chips are already so good for everyday stuff.

Edit: the term I was looking for to sum this up is diminishing returns.
 
Are you really arguing whether web browsing is something actually done day to day? I could link to rendering (where there are huge gains due to extra cores) or gaming, where there are notable gains. Instead I chose something that everyone does every day: web browsing. Look at the difference between 12th and 13th gen; it's notable, a generational jump.

Ignore the data if you like, buddy. I think you're just determined to believe that your 12th gen CPU has the same "supreme single thread performance" (using your own words) as Ryzen 7000 or 13th gen, when it clearly doesn't.

I'll bow out at this point. I've presented data to back up my claims and have actually used both 12th and 13th gen CPUs, so I'm not just assuming and wanting/needing to believe.
 
The margins in the graphs you posted aren't big enough to categorically say "yes, people will notice the difference from a 12900K to a 13900K". That is what you have said, and it is what I don't believe to be accurate. Show me otherwise with a more meaningful set of comparisons, where the difference captured is a big enough margin to say that yes, that gap is large enough to be noticed by the human brain.
 
You're just moving the goalposts. The whole debate began because of this post of yours:

I mean, so is any modern day CPU with supreme single threaded performance, basically all 12th gen+

Enjoy your supreme performance :)
 
I really, really doubt you can tell the difference between 5.5 and 6.1GHz. Are you using the high performance power plan in Windows? Because that makes the biggest difference in responsiveness for me; putting that setting on eco mode introduces a noticeable delay on the desktop.
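For checking that, here is a small sketch that reads, and optionally switches, the active Windows power plan using the built-in powercfg tool; SCHEME_MIN is the standard alias for the High performance plan. It assumes Windows with Python available and is only meant as a convenience, not a tuning recommendation.

```python
# Sketch: show the active Windows power plan and optionally switch to
# High performance. Uses the built-in powercfg tool; SCHEME_MIN is the
# standard alias for the High performance plan.
import subprocess

def active_plan():
    out = subprocess.run(["powercfg", "/getactivescheme"],
                         capture_output=True, text=True)
    return out.stdout.strip()

def set_high_performance():
    subprocess.run(["powercfg", "/setactive", "SCHEME_MIN"], check=True)

if __name__ == "__main__":
    print("Before:", active_plan())
    set_high_performance()
    print("After: ", active_plan())
```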
 
Is he really trying to say you will see a difference in web browsing? LOL. That benchmark doesn't translate into a real-use difference you will notice; not sure how far down you'd need to go before you'd start noticing it.
 