
5800X Boost set to 5050MHz all cores with AMD Curve Optimizer

Associate
Joined
4 Oct 2018
Posts
30
This is not right. Definitely a problem with waterblock contact.
Forgot to remove the plastic protector from the base of the cooler, maybe?
Dude, I've remounted the X62 & the Arctic Freezer 280 each like 4 times... the plastic is definitely not there! :D
Using MX-4 paste. Tried smearing with a gloved hand, the pea method, 3-dot method, X method... it's always the same result. It's not the AIO or contact with the cold plate. This chip is just freaking hot... :/
 
Associate
OP
Joined
27 Apr 2014
Posts
857
Thanks pal, having a few niggling issues though, hope you can help...

Everything is showing up but I can't get these options to work:

min/avg/max not showing anything?
Core clocks are also not showing, just black/blank.
Game time played is also blank, not showing anything.

Apart from that everything else works. I tried selecting and adding but still nothing shows up under those options. I'm probably doing it wrong, any ideas? lol

Sure I can help. You have to have MSI Afterburner/RTSS (4.6.3 Beta 4) up to date for them to work. I just made the quick start-up guide to get you up and running, but some legwork can be done with the links I provided; you can also make your own graphics/animations/graphs/bars and put them in.

Tags for changing stuff with MSI/RTSS
Time - <BTIME>
Framerate Max - <G=Framerate Max,-2,-2,1,0,300,28>
Framerate Average - <G=Framerate Avg,-2,-2,1,0,300,28>
Framerate Min - Min\n<FRMIN> (you can remove the \n)

Can be used in MSI configs also

[## Tags, variables ##]
# <C> ------- color:
# <Cn=yyxxxxxx> -> <Cn>
# n ----------------- index number
# yy ---------------- alpha channel in hex
# xxxxxx ------------ color in hex
# <B=x,y> --- background
# <S> ------- text size in %:
# <Sn=50> ----------- superscript;
# <Sn=-50> ---------- subscript.
# <A> ------- text alignment:
# <An=10> ----------- 10 symbols, left;
# <An=-10> ---------- 10 symbols, right.
# <P=x,y> --- position in pixels:
# x ----------------- horizontal;
# y ----------------- vertical.
# <APP> ----- used API ("VULKAN","OGL","D3D*")
# <FR> ------ FrameRate
# <FRMIN>, <FRAVG>, <FRMAX>,
# <FR10L>, <FR01L>
# <FT> ------ FrameTime
# <OBJ> ----- ?
#
# \n -------- line break
# \b -------- new column
# \t -------- indent
# \r -------- ?
#
# %CPU%, %FullCPU%, %RAM%, %GPU%, %FullGPU%, %Driver%, %Time%
# %FRAMETIME%, %FRAMERATE%,
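As a concrete illustration, a minimal overlay line built from the tags above might look like this. The colour value, position, and layout are invented for the example; only the tag syntax comes from the reference list, so treat it as a sketch rather than a known-good config:

```
# Hypothetical overlay layout (values are examples, not a shipped config)
<C0=FF00FF00><P=10,10><C0>FPS <FR>\n
Min <FRMIN>\bAvg <FRAVG>\bMax <FRMAX>\n
<G=Framerate Avg,-2,-2,1,0,300,28>
```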

I suspect min FPS would be affected even at 4K.
I just find 1440p to be my resolution, as a 32-inch screen allows immersion.

I have not found this to be a problem at all; I have tested this ad nauseam, from 1080p and up. I cannot say 100% certain for AMD 5000 CPUs, because I only tested CPU scaling on the 2600X/3600X/3600XT/3800X/3800XT.
Best advice I can give: with whatever RAM one has, get the tightest settings possible and all will be fine at lower resolutions like 1920x1080. At higher resolutions, not so much. Ryzen FCLK is also a myth everyone throws around.



5.2GHz is very impressive. Does it actually result in more performance in games or benchmarks from what you can tell?
I play PC games at 4K. I should say I have not tested at 720p, where CPU clock speed will make a difference in some games, but that is not real life.
No performance gains at all in PC gaming that I have noticed. I have been telling people to test it themselves: run the CPU at 4300MHz/4350MHz/4400MHz/4500MHz/4600MHz etc. and at default out-of-the-box clocks.
By testing I mean at least 3 runs of the same test, and average out the numbers.
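The "at least 3 runs, average the numbers" approach can be sketched as below. The FPS values and the stock/tuned labels are invented for illustration:

```python
from statistics import mean, stdev

def summarize(runs):
    """Average repeated benchmark runs and note the run-to-run spread."""
    return {"avg": mean(runs), "spread": stdev(runs) if len(runs) > 1 else 0.0}

# Hypothetical FPS results: three runs at default clocks vs three runs tuned.
stock = summarize([122.8, 123.4, 122.6])
tuned = summarize([124.9, 125.2, 124.6])

delta_pct = (tuned["avg"] - stock["avg"]) / stock["avg"] * 100
print(f"stock {stock['avg']:.1f} FPS vs tuned {tuned['avg']:.1f} FPS ({delta_pct:+.1f}%)")
```

A delta this small (well under 2%) is typically within run-to-run noise, which is why averaging multiple runs matters before drawing any conclusion.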

Now, what I find hard to comprehend: when someone looks at a chart of FPS benchmark numbers and sees a difference of 123FPS vs 125FPS, somehow they translate that into a huge difference.
Now with benchmarks like Cinebench you can get higher numbers, but it means nothing in real-life usage, unless you are a 3D artist that uses Cinema 4D. Then setting a stable all-core overclock would be the way to go.

Also, I know there is no performance difference at the higher clocks even over default; I personally just like the higher numbers.

As for 5200MHz on my 5800X, I have not nailed it on all cores at the same time yet. I have also seen 5300MHz in HWiNFO64, but this is not possible and I do not believe the numbers at all, unless I use BCLK overclocking; then I would believe the 5300MHz, which I will show in this thread when I start BCLK overclocking. OK, I went way overboard there, I will stop now.


Well, I've literally now been reading back posts for 15 mins and can't find what you're referring to! I did try a fixed vcore of 1.28 about a week ago but it was unstable... I tried 1.32, which still wasn't stable, but the temps were almost already where they are without a fixed vcore, so...
I would appreciate it if you could just tell me the info again or point me towards the exact posts discussing this. Thx

Yes, I can try to help. My real advice would be to return the AMD CPU and get another one. AIO water cooling should not have problems with temperatures at all, and should top out around 80°C.
AMD's own recommendations:

For now, to get your AMD Ryzen 5800X working with desirable temperatures:

You can set a CPU override voltage to keep boost clocks, but the trick is to have enough voltage so you do not get clock stretching. That CPU override voltage would be around 1.325V on a 5800X after voltage droop, however you set your CPU.
If you just want temperatures in the 50°C-60°C range and do not care much about clock stretching, then set the CPU override voltage to 1.25V, check out your boost clocks, and test to see if it helps out some.
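One way to spot clock stretching is to compare the effective clock (as HWiNFO-style tools report it) against the requested clock. A minimal sketch; the 0.97 threshold and the sample readings are arbitrary assumptions, not an AMD specification:

```python
def is_stretching(reported_mhz, effective_mhz, tolerance=0.97):
    """Flag clock stretching: effective clock falling well below the reported clock."""
    return (effective_mhz / reported_mhz) < tolerance

# Hypothetical readings: a healthy core vs a stretching one.
print(is_stretching(4850, 4820))  # ratio ~0.99 -> False
print(is_stretching(4850, 4500))  # ratio ~0.93 -> True
```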

Go to BIOS. You have these settings in your BIOS:
Press F6.
Set all to optimized defaults and restart back into BIOS.
Do not touch anything but override voltage, and put in your desired voltage. If you use a 1.22 CPU override voltage you will get clock stretching, but it is OK.
Only the areas in red: 1/2/3/4
 
Last edited:
Associate
Joined
3 Jul 2013
Posts
56
I'm really not sure what I'm doing here. This OCing is unlike anything I've seen before. (Been Intel since the Q6600.)
Running a 5900X.

- I have my Auto OC set to +125 (+150 is unstable)
- Curve set to -5 all cores (-10 is unstable)
- CCD0 seems to boost between 5025 and 5075 / CCD1 barely touches 4950

I've tried +150 Auto OC @ -5 CCD0 and +5 CCD1, to no avail.

I'm not entirely sure what I can do to get a stable +150 Auto OC, or if it even does anything.

Should I bin off my second CCD and just tune the first? If so, how can I do that? It used to be easy to figure out if you had a dud CPU, but I have no idea whether my 5900X is good or bad. Judging by the Auto OC of others (+200), it seems bad.

Temps are fine; the maximum temp I'm seeing in HWMonitor is around 74°C, so I have room to boost voltages. But nobody is boosting voltages (I have no idea what boosting voltages does, since lowering them seems to give me better clocks).

FCLK is set to 1900MHz, DRAM voltage @ 1.4V, no other voltage alterations.
 
Associate
Joined
19 Sep 2020
Posts
271

The idea is AMD are going for 80°C on temps, pushing the clocks as much as possible up to 80°C, so you do have quite a bit of room left. What do you have the PBO limit set to? Setting it to Motherboard will raise your PPT, thus allowing you to hit the magic 80°C.

If your max temp is anything under 80°C under load, you are basically restricting performance.
 
Associate
Joined
3 Jul 2013
Posts
56

I've just done a small amount of testing. In-game boost clocks are lower with any combination of PBO/Auto OC/Curve Optimiser vs stock.

For example, with -5 all cores > +125 PBO, I get decent single-core boosts up to 5075MHz. However, when I launch CoD:MW and play a game, I get around 4.5-4.65GHz multicore. TDP does not change between PBO limit settings, sticks around 105-110 watts, temps around 70-72°C.

Completely stock, I regularly see boost clocks of 4.7GHz multicore during CoD:MW gameplay. Wattage is also 110W and temps are also around 70-72°C.

Either this is HWMonitor being buggy or PBO makes my CPU worse, not better. For reference, FPS seemed to be within 5fps between both settings, and it's hard to test on a multiplayer game.

Either I'm doing it wrong, or gaming performance is best when these CPUs are left alone.
 
Associate
Joined
4 Oct 2018
Posts
30
Dude, this is an MSI BIOS; I haven't got a "CPU override voltage" option. I have an ASUS X570-E Gaming board.
I have a manual voltage option. I went ahead and set it to 1.328 and the temps are about the same as before... and I'm getting a pathetic 3.9GHz all-core boost, R20 score of 4356... this CPU is just hot... I don't think I'm gonna RMA it, because even though it idles in the 50s it's not harmful, and I have it capped at 85°C... still getting an all-core boost of 4.4-4.5 at 85°C and 3-4 core boosts of 4.7-4.8... It's just a donkey of a chip for temps, but it performs as expected of a 5800X.
 
Last edited:
Associate
OP
Joined
27 Apr 2014
Posts
857
I know what motherboard you have.
 
Associate
OP
Joined
27 Apr 2014
Posts
857
Did you read the Google doc for the 5800X? I wonder if my name is on it... yeah, it is. Why do you think I do not know about this? Why are you trying to shame me into some sort of guilt trip? Do you think I stated anywhere that I am one of the lucky ones? (More than once in this thread alone.) I am trying to help you, and you're giving me "take this video and watch all the 5000+MHz glory you will never have".
 
Associate
Joined
1 Feb 2006
Posts
1,104
Location
Worcestershire
Mine still isn't stable with anything over +25 on boost, hey ho! I've tried every variation of settings and PBO curve, but it won't have it.

I've settled on leaving it with PBO on and -10 Curve Optimizer with +0 boost. Gives me a slightly higher multi-core boost of 4600MHz, and single stays at 4850MHz. That'll do; I'm done messing for a couple of % performance gains. Guess we don't all have chips that can do this, hey :)
 
Associate
OP
Joined
27 Apr 2014
Posts
857
AMD 5000 series are the fastest CPUs out there. No need to do anything to the CPU unless you're like me and like seeing the bigger number. There is no other reason to use AMD Curve Optimizer.

5800X Curve Optimizer set to 5175MHz, 75% CPU fan speed, Rank 2 DDR4 CL20 3600
 
Last edited:
Associate
Joined
17 Sep 2009
Posts
61
I've been having a bit more of a play around with PBO and Curve Optimiser using the 5900X on the X570 Tomahawk. I don't have any screenshots, and nothing particularly incredible as of yet (R20 638 single core, ~8600 multi thread). I have, however, learned quite a few different things, some of which might just be peculiar to my setup. Some of these things I don't understand WHY they are the case, and having used Intel since the Athlon 1800+, it's all new and I'm just beginning to get a handle on them :)

One thing I have discovered is that the PBO power settings, at least for me, seem to influence the ability to apply negative Curve Optimiser settings to the fastest cores and also how much they boost. This seems to deviate from some of AMD's recommendations in their presentation. I get the best single-thread results on Auto settings, with -15 possible on the top 2 cores, netting me 638 in R20, up from 630, sitting at about 60°C.

However, this is not a good setting for multi-core results, as EDC and PPT immediately max out under R20 multi, restricting boost frequency to about 4100 all cores. I don't think this is as much of an issue on the 5800X, as it's not hitting these overall limits. Switching to Motherboard power settings in BIOS apparently (according to Ryzen Master) uncaps PPT to 500W, TDC to 210A, and EDC to 280A. However, I'm not sure if these are the real limits imposed by the motherboard, particularly in terms of EDC. This also seems to support a negative curve on the 10 slower cores down to -20, and better frequencies in R20, around 4350. However, for whatever reason, single-thread results aren't as good (632), and the negative offset reduces results above -10, with instability occurring over -12.

Lastly, manual settings of EDC around 205 seem to support -15 on the 2 fastest cores and -20 on the 10 slowest. However, single-thread still isn't as good, and the cores are not boosting over 5000 anywhere near as consistently as on Auto. I got my best R20 multi result here (just over 8600) at the cost of quite a lot of extra heat, going from the low 80s up to about 86°C.

One thing that freaked me out when I first built this is that, because it was new, I was using review benchmarks as a yardstick for performance, and reviewers were getting ~8400 on 'default' settings, with no mention of PBO, on a 5900X in R20, whereas I was getting about 7600 when I first built this. I've come to the conclusion that there was some tweaked power configuration on the setups AMD allowed reviewers to use (which I don't think included the MSI Tomahawk), because there's no way the default out-of-the-box power settings allow that multi result over 12 cores; there just aren't enough amps available, irrespective of temperatures. Now I understand things like EDC a lot more, and the influence it has on results.
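A rough sketch of working out which PBO limit is binding during a run, in the spirit of the EDC/PPT observations above. The cap values (PPT 142W / TDC 95A / EDC 140A, commonly quoted as stock for 105W-TDP parts) and the measured readings are illustrative assumptions, not taken from any specific board:

```python
def binding_limits(measured, limits, margin=0.98):
    """Return the names of the limits the measured values are sitting at."""
    return [name for name, cap in limits.items() if measured[name] >= cap * margin]

limits   = {"PPT_W": 142, "TDC_A": 95, "EDC_A": 140}  # assumed stock caps
measured = {"PPT_W": 141, "TDC_A": 80, "EDC_A": 140}  # e.g. during an R20 multi run

print(binding_limits(measured, limits))  # -> ['PPT_W', 'EDC_A']
```

If PPT or EDC shows up here while temperature is still well under 80°C, raising those caps (e.g. Motherboard limits) is what lets the boost algorithm use the thermal headroom.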
 
Soldato
Joined
26 Jan 2004
Posts
6,276
Location
Scotland

Something's not quite right with your memory's write speeds. I'm sure they're meant to be not far off the copy speed or read speed, but they look quite low in comparison.

You see what I mean; this is mine, but I defaulted the memory and ran it a little slower than its defaults, tweaked just to show a comparison.

I lowered my memory to 3800MHz @ 18-18-18. You can see read is normal, write is slightly lower, then copy is slightly lower again; yours has low write but higher copy.

BVCnh75.png
 
Last edited:
Associate
OP
Joined
27 Apr 2014
Posts
857
Another EDIT: I just saw someone claim this an hour ago in a thread, that his
"My L3 has doubled in speed as well, from 2702 to 3003."
on an ASUS motherboard, so maybe I do not know crap. This is on a 5600X.
(44) ASUS ROG X570 Crosshair VIII Overclocking & Discussion Thread | Page 195 | Overclock.net

The new BIOS with AGESA 1.1.8.0 has doubled L3 cache numbers now; about time they were read correctly. Back to normal for L3 cache.


You just do not know how memory works on AMD Ryzen, and that is 100% OK.
On a single CCX it will be half the speed of 2 CCX on AMD Ryzen 3000, and 60%-70% or so on AMD 5000 with 2 CCX.

EDIT the crap out of this so far :p: I guess I should also say AMD Ryzen first gen like the 1800X would be about the same as your 5950X on the numbers, but I am just going off the top of my head here. My numbers are not exact to the % point, just a guess. I will go look up the exact numbers if needed.

Then L3 cache will be half again; it is perfectly normal, and how AMD Ryzen memory works.

2600X and 3800X and 5800X, and compare to your 5950X; hope that helps. The 2600X has the best memory so far. Switching back to 2600X. Anyway, I cannot stress this enough: RAM does not matter that much at all in PC gaming or applications; tight timings are good enough.
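The write-speed behaviour described above can be sketched as a simple ratio check. The bandwidth figures are invented examples, and the roughly-half write/read ratio on single-CCD parts is the rule of thumb from this post, not a measured spec:

```python
def write_read_ratio(read_mbs, write_mbs):
    """AIDA64-style write/read bandwidth ratio."""
    return write_mbs / read_mbs

dual_ccd   = write_read_ratio(55000, 52000)  # e.g. a 5900X/5950X-like result
single_ccd = write_read_ratio(51000, 28000)  # e.g. a 5800X-like result

print(f"dual-CCD {dual_ccd:.2f}, single-CCD {single_ccd:.2f}")
```

So a 5800X showing write at roughly half its read speed is expected behaviour, not a broken memory setup.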

 
Last edited:
Soldato
Joined
6 May 2009
Posts
19,909
5800X, it's perfectly normal. Though I can't see @gerardfraser's full AIDA64 stats; the screenshot is cut off above.


5800X PBO Auto. DDR4 3800, CAS 14
lLqY6mg.png

edit: beaten to it
 
Soldato
Joined
6 May 2009
Posts
19,909
^^

Your RAM looks pretty slow for the speed/timings you're using.
Talking to me? If so, what do you base this on? More info please.


Yeah, I am a slow typer @Guest2. Added 2600X/3600X/5800X.

If you're talking to me, it is not clear.
Either you have not looked at the screenshots, or with that thought you have no clue at all.
Should I break it down for you, or do you want to double-check the screenshots?

Ah yes, I can see them now. They look spot on to me in comparison to mine :)
Slightly higher read/write and lower latency due to DDR3800/IF1900 CAS14, whereas you're benefiting on cache speeds from your great CPU clocks.

Some people on here do talk twaddle
 
Last edited: