
AMD 3000 users, how often do you hit boost clocks?

Just felt I wanted to utilise the cores, because the games I run so far don't do much with them: GPU is at 100% and CPU is less than 6%, and possibly not many cores are being utilised. Plus I have loads of videos in stupid formats to convert as well.
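For the video backlog, a batch ffmpeg job is the sort of thing that will happily load every core. A rough sketch of what I mean, in Python; the folder names and encoder flags are just assumptions, not a recommendation:

```python
# Rough sketch: batch-convert oddball videos to H.264 MP4 with ffmpeg.
# "videos_in"/"videos_out" and the encoder settings are placeholder assumptions.
import pathlib
import subprocess

SRC = pathlib.Path("videos_in")
DST = pathlib.Path("videos_out")
DST.mkdir(exist_ok=True)

for clip in SRC.glob("*.*"):
    out = DST / (clip.stem + ".mp4")
    # libx264 scales across many cores; -crf 20 is a sane quality default
    subprocess.run(
        ["ffmpeg", "-i", str(clip), "-c:v", "libx264",
         "-preset", "slow", "-crf", "20", "-c:a", "aac", str(out)],
        check=True,
    )
```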

I noticed the low CPU % playing AC Unity and Alien Isolation too. I know they are old titles, but with the settings cranked up I thought CPU use would still be much higher.
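If you want to check whether a game really is only loading a core or two, a few lines of Python with psutil will show it; the five-second sample window below is arbitrary:

```python
# Sample per-core CPU load while the game is running (pip install psutil).
import psutil

per_core = psutil.cpu_percent(interval=5.0, percpu=True)  # % per logical core
for core, pct in enumerate(per_core):
    print(f"core {core:2d}: {pct:5.1f}%")
print(f"overall: {sum(per_core) / len(per_core):5.1f}%")
```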
 
Coming from a Q9550 at 3.6 GHz, where CPU usage was 100% almost everywhere, seeing 6% usage now feels like something is wrong with my PC haha :P. I know what is wrong: it needs a new GPU...
 
:D Good situation to be in. Only a £500+ card should get it sweating.
 
Plural, as in you have two? #notjealous
One of the worst expenses of my life. I bought them for my new PC to run CrossFire back when they were full price, and lo and behold, AMD and Nvidia stopped supporting multi-GPU setups. I'd never had a CrossFire setup before, and honestly it's not worth it. Witcher 3 runs at 75ish FPS at 4K maxed out, but there is some weird thing happening while you move the camera; I can't describe it, and it takes away from the experience.

Crysis 3 runs at 60ish with no anti-aliasing, but horrible tearing makes it unplayable.

I have had two perfectly good Vega 56 Nitros gathering dust because I went and bought them first and only finished my setup recently. I thought my old PC would run them, but that was old CrossFire and it wouldn't see the second card.

I'll possibly sell these cards, or at least one. A single Vega is brilliant and runs Witcher 3 at 4K at 45ish to 50 FPS if I drop the anti-aliasing. Plus the new monitor is ultrawide 1440, so it should be easier on the Vega. I just want to play the good games on my 4K TV.

Lesson learnt: never go multi-GPU.
 
Damn, that sucks, although I thought the higher-tier stuff would be better. I ran a 7770 setup a long time ago before getting a 7990, which was a beast. CrossFire has never been a super experience because too many games neglect it (most gamers won't run that setup). I would get another Vega 56 for sure, because I sometimes mine on the cards overnight or when I'm not using the resources.

It would be handier if CrossFire at least yielded another 50% performance, but it's not great when the new drivers being released aren't bothered about mGPU.
 
My 3950X boosts to 4.725 GHz on one core, but only for a few seconds before it sits around 4.575 GHz. I've run HWiNFO while playing BFV and all 16 cores boost to 4.23 GHz max, with the average across all cores sitting at 4.250 GHz. I'd say I could probably squeeze a tiny bit more out of it, but I know volts would have to be thrown at it lol.

I've a 360 rad connected to a 240 rad, so I don't think temps will be a problem: games never go above 48°C ish, and with all cores maxed in Cinebench R20 it never goes above 60°C. I can't get the PC stable with a 1900 MHz Infinity Fabric overclock though; I had a 3600X chip that did 1900 no problem.
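For anyone wanting to watch the same thing without HWiNFO, something like this psutil loop shows per-core clocks; per-core frequency readings only seem reliable on Linux, so treat that as an assumption:

```python
# Log max/average core clocks once a second for ten seconds
# (pip install psutil; per-core readings are Linux-only in my experience).
import time
import psutil

for _ in range(10):
    freqs = psutil.cpu_freq(percpu=True)  # one entry per core
    mhz = [f.current for f in freqs]
    print(f"max {max(mhz):7.0f} MHz   avg {sum(mhz) / len(mhz):7.0f} MHz")
    time.sleep(1.0)
```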
 
I'm guessing they just bin the chiplets and not the I/O dies, as I've seen a lot of the top chips not doing 1900, yet my lowly 3600 manages it.
 
What is your stable IF clock? I'm also going to try this week to get my IF higher than 1800.
 
I can get 1867 MHz but I have to put 1.125 V through the SoC to even get that, so I've just settled on 1800 and tightened the timings on the 32 GB 8Pack 3600 RAM. Great RAM, btw. I think a 3950X with a 1900 IF clock will be rare.
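For anyone following along, the IF numbers in this thread fall straight out of the DDR arithmetic when the fabric runs 1:1 with the memory clock; DDR transfers twice per clock, so the memory clock (and a 1:1 FCLK) is half the DDR rate:

```python
# DDR4-3600 -> 1800 MHz, DDR4-3733 -> 1866 MHz, DDR4-3800 -> 1900 MHz at 1:1.
for ddr_rate in (3600, 3733, 3800):
    fclk = ddr_rate // 2  # DDR is double data rate, so clock = rate / 2
    print(f"DDR4-{ddr_rate} -> MCLK/FCLK {fclk} MHz at 1:1")
```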
 
My 3900X struggles to hit 4.5 GHz, let alone 4.6, on AGESA 1.0.0.4B. Boost was slightly more consistent on 1.0.0.3 ABBA (in terms of hitting 4.5 GHz or above) and Cinebench R20 single core performance was slightly higher (up to 2%), although R15 single core was about the same. Multicore performance is slightly higher on 1.0.0.4B, though.

However, I can force my best core to hit 4.6 GHz+ by running the AIDA64 memory latency test or the CrystalDiskMark SSD benchmark (which results in a couple of cores hitting 4.575 GHz too), so technically it is hitting the rated boost clock. Most of the time my 3900X is a 4.4-4.45 GHz chip, though.

This is all at default settings (no PBO) on a Noctua NH-D15S cooler with the XMP 3600 MHz profile enabled (my scores were worse at the JEDEC 2666 MHz RAM speed). I have also tested on two different motherboards (an MSI X570 ACE and a Gigabyte X570 Master) with close to identical results.

My Cinebench R20 scores with 1.0.0.3 ABBA:
Single core: about 520.
Multicore: about 7150.

With 1.0.0.4B:
Single core: about 510.
Multicore: about 7300.
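A quick back-of-envelope on those numbers (just the percentage deltas):

```python
# Percentage change from 1.0.0.3 ABBA to 1.0.0.4B using the scores above.
scores = {"single": (520, 510), "multi": (7150, 7300)}
for test, (abba, b104) in scores.items():
    print(f"{test}: {100 * (b104 - abba) / abba:+.1f}% on 1.0.0.4B")
# single: -1.9%, multi: +2.1% -- consistent with the "up to 2%" note above.
```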
 
That doesn't seem too bad; at least it's rocking along at 4.4 GHz most of the time, which I imagine is pretty good for gaming.
 
That's a lot better than mine will do. My Cinebench R20 score is 7050 at stock settings on the latest AGESA, with 3733 MHz RAM / 1866 MHz IF.
 