Seeing in HWiNFO64 that my CPU was hitting 100% of the EDC limit, I assumed the clock speed was being held back and that raising the limit would buy me more performance. Thus begins the journey.
Setting EDC higher than the default 140 A didn't have the effect I expected, so I decided to run some tests. I tested EDC from 75 A to 200 A in 25 A increments, plus the default 140 A. TDC and PPT were left at their default values (95 A and 142 W). The Curve Optimizer was set to -15 on all cores. The test system is a 5950X in an Asus CH8DH with 64 GB of RAM @ 3600 CAS 18.
I used the CPU-Z benchmark, as it's quick and I'm lazy. Ten runs at each value of EDC looked like this...
I offset the two series slightly on the x-axis so the points don't overlap and become hard to read. I added quadratic lines of best fit, though the drop-off on the left looks too sharp to be quadratic.
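If anyone wants to reproduce the chart, here's a minimal sketch of the plotting approach: jitter each series on the x-axis and overlay quadratic least-squares fits. The scores in the arrays are placeholders shaped to illustrate the technique, not my measured values.

```python
import numpy as np
import matplotlib.pyplot as plt

# EDC limits tested (A): 75-200 in 25 A steps, plus the stock 140 A.
edc = np.array([75, 100, 125, 140, 150, 175, 200])

# Placeholder mean CPU-Z scores -- NOT the measured results, just
# illustrative values trending down as EDC rises.
single = np.array([684, 683, 681, 679, 678, 676, 674])
multi = np.array([12400, 12350, 12280, 12230, 12200, 12120, 12050])

offset = 2  # small x-offset so the two series don't overlap visually

def quad_fit(x, y):
    """Quadratic least-squares fit, returned as a smooth curve."""
    coeffs = np.polyfit(x, y, deg=2)
    xs = np.linspace(x.min(), x.max(), 200)
    return xs, np.polyval(coeffs, xs)

fig, ax1 = plt.subplots()
ax2 = ax1.twinx()  # second y-axis, since the two scores differ in scale

ax1.scatter(edc - offset, single, color="tab:blue", label="single-thread")
ax2.scatter(edc + offset, multi, color="tab:orange", label="multi-thread")

for ax, y, c in [(ax1, single, "tab:blue"), (ax2, multi, "tab:orange")]:
    xs, ys = quad_fit(edc, y)
    ax.plot(xs, ys, color=c, linestyle="--")

ax1.set_xlabel("EDC limit (A)")
ax1.set_ylabel("CPU-Z single-thread score")
ax2.set_ylabel("CPU-Z multi-thread score")
plt.show()
```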
Counterintuitively, raising EDC actually reduced both single- and multi-threaded performance. My hypothesis: the highest current peaks occur when the CPU is least efficient, so the extra headroom gains only a little performance, while the added heat reduces sustained clock speed. If I can overcome my laziness I might test that.
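One way to test that would be to log sensors with HWiNFO64 and check whether samples spent at the EDC limit coincide with lower effective clocks. A rough sketch of the analysis, assuming a CSV sensor log; the column names here are hypothetical, so rename them to match whatever your HWiNFO export actually uses:

```python
import pandas as pd

# Load an HWiNFO64 sensor log exported as CSV. The column names below
# are assumptions -- adjust them to your actual log headers.
log = pd.read_csv("hwinfo_log.csv", encoding="latin-1")

edc_col = "CPU EDC [A]"                        # assumed column name
clk_col = "Core Effective Clocks (avg) [MHz]"  # assumed column name

edc_limit = 140  # whatever EDC limit was set in BIOS for this run

# Flag samples where the CPU sits at (or within 1% of) the EDC limit.
at_limit = log[edc_col] >= 0.99 * edc_limit

print(f"Samples at EDC limit: {at_limit.mean():.1%}")
print(f"Avg clock at limit:   {log.loc[at_limit, clk_col].mean():.0f} MHz")
print(f"Avg clock otherwise:  {log.loc[~at_limit, clk_col].mean():.0f} MHz")
```

If the heat hypothesis holds, the average effective clock during at-limit stretches should sag noticeably relative to the rest of the run.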