
i9-11900K: ABT on or off?

Associate
Joined
23 Nov 2005
Posts
718
Location
Kingdom Of Fife
Hi All,

New system is great and stable (unlike the Ryzen 5900X/Dark Hero build I had in the past!)

I use the system primarily for gaming. I have a custom EK loop with EK 360 and 240 radiators, carrying three and two 140mm fans respectively.

My system copes with the added power consumption of ABT: all-core boost of 5.1GHz, and single-core boost of 5.3GHz in lighter workloads. When gaming, average temps are in the low 70s, with peaks in the 80s...

Turn ABT off and it settles in the high 50s, with peaks in the mid 60s.

I know that ABT won't void the warranty the way overclocking does. Can any 11th Gen owners offer suggestions? The extra heat and power consumption don't appear to be worth it, as framerates at 1440p don't seem to change at all. Perhaps it's different in synthetic benchmarks?
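For illustration, framerate is gated by whichever of the CPU or GPU takes longer per frame, which is why a CPU-side boost may not show at 1440p. A quick sketch of the gating effect (the frame times below are hypothetical, not measurements):

Code:
# Toy model: each frame waits on the slower of the CPU and GPU.
# All frame times here are hypothetical, not measurements.

def fps(cpu_ms, gpu_ms):
    # The slower component gates the frame rate.
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 9.0                 # hypothetical GPU frame time at 1440p (GPU-bound)
cpu_ms = 6.0                 # hypothetical CPU frame time without the boost

cpu_ms_abt = cpu_ms / 1.06   # ~6% faster CPU work with a higher all-core boost

print(f"{fps(cpu_ms, gpu_ms):.1f} fps")      # 111.1
print(f"{fps(cpu_ms_abt, gpu_ms):.1f} fps")  # 111.1 - unchanged, GPU still gates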
 
Soldato
Joined
6 Feb 2019
Posts
17,464
Overclocking voids warranty? Lol

But ditto on the system stability; that's the reason I want to switch my 5950X to a 12900K.
 
Associate
OP
Joined
23 Nov 2005
Posts
718
Location
Kingdom Of Fife
Overclocking voids warranty? Lol

But ditto on the system stability; that's the reason I want to switch my 5950X to a 12900K.
I know what you mean mate, been overclocking since my i7 920 days! I think it has more to do with ABT being a new feature on 11th Gen that boosts automatically while applying power and voltage limits, meaning it stays stable, though it can get toasty. On Ryzen I found the tech impressive, but it almost felt like I was beta testing hardware for AMD! I just don't think they have nailed it yet; Intel platforms, for me anyway, are far better to work with...
 
Soldato
Joined
31 May 2009
Posts
21,257
Hi All,

New system is great and stable (unlike the Ryzen 5900X/Dark Hero build I had in the past!)

I use the system primarily for gaming. I have a custom EK loop with EK 360 and 240 radiators, carrying three and two 140mm fans respectively.

My system copes with the added power consumption of ABT: all-core boost of 5.1GHz, and single-core boost of 5.3GHz in lighter workloads. When gaming, average temps are in the low 70s, with peaks in the 80s...

Turn ABT off and it settles in the high 50s, with peaks in the mid 60s.

I know that ABT won't void the warranty the way overclocking does. Can any 11th Gen owners offer suggestions? The extra heat and power consumption don't appear to be worth it, as framerates at 1440p don't seem to change at all. Perhaps it's different in synthetic benchmarks?

Out of interest, without ABT what boost clocks can you manage?
 
Soldato
Joined
31 May 2009
Posts
21,257
Settles at 4.8GHz all cores.... Massively cooler though. With ABT on and running RealBench, temps can nudge the mid 90s and system draw is 800W! Still stable for 8 hours though! Gotta love Intel...

800 watts, wow, that'll keep you warm in winter!
What's your R23 multi score with that 5.1 all-core, btw?
 
Caporegime
Joined
17 Mar 2012
Posts
47,379
Location
ARC-L1, Stanton System
Ahh, I thought the all-core clock of 5.1 was the base clock.

All-core boost is 4.8GHz on the 11900K.

Was that just a CPU test, or was the GPU running too?

Well, we know the 3090 is also inefficient at about 400 watts continuous power draw, but taking out losses in the PSU and the rest of the system, that is still about 300 watts for the CPU, an 8-core at best equivalent to a 5800X. It's nuts.

At full load, GPU and CPU, my system draws about 350 to 400 watts.
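Back-of-envelope for that split; the PSU efficiency and rest-of-system draw below are assumptions, not measurements:

Code:
# Rough power split for an ~800W wall reading under RealBench.
wall_w = 800            # measured at the wall (from the thread)
psu_eff = 0.90          # assumption: decent PSU efficiency at this load
gpu_w = 400             # ~continuous 3090 draw (from the thread)
rest_w = 50             # assumption: board, RAM, drives, fans, pump

dc_w = wall_w * psu_eff          # power the PSU actually delivers
cpu_w = dc_w - gpu_w - rest_w    # implied CPU draw

print(dc_w, cpu_w)   # 720.0 270.0 - in the region of the ~300W estimate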
 
Soldato
Joined
28 May 2007
Posts
18,190
All-core boost is 4.8GHz on the 11900K.

Well, we know the 3090 is also inefficient at about 400 watts continuous power draw, but taking out losses in the PSU and the rest of the system, that is still about 300 watts for the CPU, an 8-core at best equivalent to a 5800X. It's nuts.

At full load, GPU and CPU, my system draws about 350 to 400 watts.

4.8GHz if cooling and the motherboard allow, IIRC. The performance is opportunistic.

But yeah, just get a 5900X and RX 6900 XT. Bonkers performance for much less power use. 800 watts is a lot to dissipate into a room.
 
Caporegime
Joined
17 Mar 2012
Posts
47,379
Location
ARC-L1, Stanton System
4.8GHz if cooling and the motherboard allow, IIRC. The performance is opportunistic.

But yeah, just get a 5900X and RX 6900 XT. Bonkers performance for much less power use. 800 watts is a lot to dissipate into a room.

That's not a big ask on higher-end boards and good AIOs.

It's actually easier to cool an 11900K despite its high power draw, because it's a large monolithic die, about an inch square, so it has lots of surface area to transfer the heat, whereas the Zen 3 chiplets are no bigger than your little fingernail.
I'm hitting 80°C in Cinebench, but my rad, even the tubes out of the pump, is at room temperature.

@Jonnygrunge What PSU do you have? If temps are not exceeding 90°C I wouldn't worry about the CPU; I think they're rated to a TjMax of 100°C or 105°C.

I'd worry more about the GPU, because it's been known to blow even pretty decent PSUs, on account of all GPUs having momentary power spikes, and the 3090 can do this pretty spectacularly. Good PSUs are designed with that in mind, but if you already have a continuous load near the rated output, a large spike from that 3090 could be what makes it go *pop*.
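To put the surface-area point in numbers: the die areas below are approximate public figures and the power figures are rough assumptions, so treat this as a sketch.

Code:
# Heat flux: watts per mm^2 the cooler has to pull out of the silicon.
# Die areas are approximate; package power figures are assumptions.

rkl_area = 276    # mm^2, Rocket Lake (11900K) monolithic die, approx.
ccd_area = 81     # mm^2, one Zen 3 chiplet (CCD), approx.

rkl_power = 270   # W, assumed 11900K draw with ABT under heavy load
ccd_power = 110   # W, assumed share of package power in one Zen 3 CCD

print(round(rkl_power / rkl_area, 2))   # 0.98 W/mm^2
print(round(ccd_power / ccd_area, 2))   # 1.36 W/mm^2
# Higher total draw, but lower heat flux on the big monolithic die.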
 
Associate
OP
Joined
23 Nov 2005
Posts
718
Location
Kingdom Of Fife
To sum up: with ABT on, all cores boost to 5.1GHz in my system with the custom dual-rad EK loop; with ABT off, all cores boost to 4.8GHz. I quickly benched a fairly demanding game at 1440p, Ghost Recon Breakpoint, running the in-game benchmark with ABT both on and off. Interesting results, and CPU temps were 18-20°C cooler with ABT off! Full system power draw was around 520W.

ABT on:

1DC8tkv.jpg

ABT off:

hfMaYL0.jpg
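Quick numbers on that trade-off, using only the clock and temperature deltas quoted in this thread (the FPS figures live in the screenshots above):

Code:
# ABT trade-off from the figures in this thread.
abt_on_ghz = 5.1
abt_off_ghz = 4.8
temp_delta_c = (18, 20)   # CPU ran 18-20C cooler with ABT off

uplift_pct = (abt_on_ghz / abt_off_ghz - 1) * 100
print(f"{uplift_pct:.2f}%")   # 6.25% - the all-core clock ceiling gain
# ...bought at the cost of 18-20C higher CPU temps while gaming.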
 
Associate
OP
Joined
23 Nov 2005
Posts
718
Location
Kingdom Of Fife
That's not a big ask on higher-end boards and good AIOs.

It's actually easier to cool an 11900K despite its high power draw, because it's a large monolithic die, about an inch square, so it has lots of surface area to transfer the heat, whereas the Zen 3 chiplets are no bigger than your little fingernail.
I'm hitting 80°C in Cinebench, but my rad, even the tubes out of the pump, is at room temperature.

@Jonnygrunge What PSU do you have? If temps are not exceeding 90°C I wouldn't worry about the CPU; I think they're rated to a TjMax of 100°C or 105°C.

I'd worry more about the GPU, because it's been known to blow even pretty decent PSUs, on account of all GPUs having momentary power spikes, and the 3090 can do this pretty spectacularly. Good PSUs are designed with that in mind, but if you already have a continuous load near the rated output, a large spike from that 3090 could be what makes it go *pop*.

ASUS/Seasonic ROG THOR 850W
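A quick headroom check against that rating; note the 850W figure is rated DC output while the 800W was read at the wall, and the efficiency and spike figures below are assumptions:

Code:
# PSU headroom: 850W is rated DC output; 800W was measured at the wall.
psu_rating_w = 850
wall_w = 800
psu_eff = 0.90               # assumption

dc_load_w = wall_w * psu_eff            # steady DC load on the PSU
headroom_w = psu_rating_w - dc_load_w   # what's left for transients

spike_w = 200                # assumption: momentary 3090 transient over average
print(headroom_w)            # 130.0
print(dc_load_w + spike_w)   # 920.0 - over the 850W rating, hence the *pop* risk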
 
Associate
OP
Joined
23 Nov 2005
Posts
718
Location
Kingdom Of Fife
4.8Ghz if cooling and motherboard allows IIRC. Performance is opportunistic.

But, yeah just get 5900X and RX6900XT. Bonkers performance for much less power use. 800 watts is a lot to dissipate into a room.

I had a 5900X on a Dark Hero for just under 6 months. I tried various tuning methods, from manual overclocking to CTR and Hydra. I have to say it's an impressive CPU for multithreaded applications, but I feel that platform is not quite there yet compared with Intel as regards overall system stability. I was getting far too many sporadic reboots, with no BSOD, caused by WHEA errors pointing at the CPU cores! My Intel setup has never skipped a beat.

I have also had AMD VGA cards in the past; again, I felt they were plagued with driver issues, and Nvidia for me is the best option. I moved back to green for a 1080 Ti years back, coming from a pair of R9 280X OC TOP cards!

I had not been with AMD CPU-wise since my Athlon X2 6400+ days; I moved to an i7 920/X58 setup around 2008 and never really looked back...

Can I also stress (excuse the pun!) that the 800W was pulled when running AVX instructions in RealBench, not in your average gaming session. The 6900 XT is way behind Ampere in my opinion: no DLSS, and ray tracing performance is poor for the money; it's more on par with Turing in that sense.
 
Last edited:
Soldato
Joined
28 May 2007
Posts
18,190
I had a 5900X on a Dark Hero for just under 6 months. I tried various tuning methods, from manual overclocking to CTR and Hydra. I have to say it's an impressive CPU for multithreaded applications, but I feel that platform is not quite there yet compared with Intel as regards overall system stability. I was getting far too many sporadic reboots, with no BSOD, caused by WHEA errors pointing at the CPU cores! My Intel setup has never skipped a beat.

I have also had AMD VGA cards in the past; again, I felt they were plagued with driver issues, and Nvidia for me is the best option. I moved back to green for a 1080 Ti years back, coming from a pair of R9 280X OC TOP cards!

I had not been with AMD CPU-wise since my Athlon X2 6400+ days; I moved to an i7 920/X58 setup around 2008 and never really looked back...

Can I also stress (excuse the pun!) that the 800W was pulled when running AVX instructions in RealBench, not in your average gaming session. The 6900 XT is way behind Ampere in my opinion: no DLSS, and ray tracing performance is poor for the money; it's more on par with Turing in that sense.

Pretty much the opposite of my experience. DLSS isn't much of a feature unless Nvidia can get the tech baked in at the API level; support is too minimal, and supersampling AA is also somewhat of a dying technique. It's also pretty much irrelevant on ultra-high-end cards, as the whole point of those is to run at very high resolutions. IMO DLSS will go the way of PhysX, GPP, G-Sync and the other fluff Nvidia has failed to push on the market. Ray tracing is somewhat useful, but again, to see real traction it has to see mass adoption within the industry. Nvidia has caused me many more headaches over the years than AMD: failure rates have been higher on Nvidia and compatibility issues more numerous. Not enough to put me off them, though, and I hope Nvidia sorts out the Ampere problems with a respin of its silicon. Clearly Nvidia missed the mark this time round and AMD have now edged past.
 
Last edited: