Piledriver - Official AMD FX4300, FX6300 and FX8350 review thread

For Phenom II X4s I did, but they never used to show sub-ambient temperatures at idle like Thubans etc. (was kinda meh about doing it with Thubans, and I haven't given any Bulldozer temperature advice because people can't grasp they don't have 40°C of headroom left).
They're still guesstimates; why wouldn't you want a proper sensor with proper limits?


The thing with all sensors is: are any of them true to life?

AMD's sensors are thought to be pretty unreliable at low temperatures, but pretty accurate at higher temperatures.

It all depends on how the BIOS and then the software interprets the data coming from the sensors and how that is then calibrated; it may be accurate at one given temperature and yet completely inaccurate at another.
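
To make that concrete, here's a minimal sketch (made-up calibration numbers, not AMD's actual tables) of how a monitoring tool might map a raw sensor value to a reported temperature. If the calibration points are only tuned for the upper range, readings at the low end can drift a long way off while the high end stays close.

    # Hypothetical piecewise-linear calibration of a raw sensor value.
    # The numbers below are illustrative only.
    CAL_TABLE = [
        (200, 10.0),
        (400, 35.0),
        (600, 62.0),
        (800, 90.0),
    ]

    def raw_to_celsius(raw):
        """Linearly interpolate between the nearest calibration points."""
        if raw <= CAL_TABLE[0][0]:
            return CAL_TABLE[0][1]
        if raw >= CAL_TABLE[-1][0]:
            return CAL_TABLE[-1][1]
        for (r0, t0), (r1, t1) in zip(CAL_TABLE, CAL_TABLE[1:]):
            if r0 <= raw <= r1:
                return t0 + (t1 - t0) * (raw - r0) / (r1 - r0)

    print(raw_to_celsius(300))  # low end: most likely to be off if poorly calibrated
    print(raw_to_celsius(700))  # high end: where AMD's sensors are said to be closer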

All that really matters is what number it reads and what number it will reach before it starts having heat issues.
 
I'm not sure how they can be reliable at higher temperatures when the Phenom IIs used to crap out at ~60°C by "core temp". I wouldn't consider that high.

No sensor may be 100% accurate, but they're a damn sight more accurate than what we're seeing with FX CPUs.
My GPU and CPU readings seem perfectly plausible considering their cooling and clocks, same with them under load.
 
A 200-watt AMD or Nvidia GPU running flat out with a reading of 50 to 60°C is just as implausible, yet a lot of them do.
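
As a rough plausibility check (illustrative numbers, not measurements): steady-state temperature is roughly ambient plus power times the total thermal resistance of the cooling path, so a 200 W card reading 55°C in a 25°C room would imply about 0.15°C/W for the whole cooler, which is exceptionally good for air.

    # Rough plausibility check with illustrative numbers.
    # Steady state: T_reported ~= T_ambient + power * thermal_resistance.
    def implied_thermal_resistance(t_reported_c, t_ambient_c, power_watts):
        """Thermal resistance (degC per watt) the reading would imply."""
        return (t_reported_c - t_ambient_c) / power_watts

    # A 200 W GPU reading 55 C in a 25 C room:
    theta = implied_thermal_resistance(55, 25, 200)
    print(f"Implied resistance: {theta:.2f} C/W")  # ~0.15 C/W -- very low for an air cooler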

What you do need to know is that the (CPU temp) is never consistent from one rig to the next, as it comes from a sensor on the motherboard and so is motherboard dependent.

It's why AMD's own software reads from the cores; it's also why one should always advise on the (core temp) in HWMonitor, never the (CPU temp).
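
A quick sketch of that advice, using hypothetical stub functions rather than any real HWMonitor API (wire them to whatever your monitoring tool exposes): trust the in-die core sensor and treat the motherboard's socket sensor only as a board-dependent cross-check.

    def read_core_temp():
        # Placeholder: per-core sensor inside the CPU (what AMD's own software reads).
        raise NotImplementedError("hook this up to your monitoring tool")

    def read_socket_temp():
        # Placeholder: the motherboard's "CPU temp" socket sensor, varies board to board.
        raise NotImplementedError("hook this up to your monitoring tool")

    def temperature_to_advise_on():
        # Prefer the core reading; the socket reading is only a rough cross-check.
        return read_core_temp()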
 
But the CPU temp on Thuban used to be higher than the core temp; there was quite a difference.

I'd never use AMD core temps to monitor them, not since Thuban, because they were just plain wrong.

The core temp would report sub-ambient; the CPU temp didn't, it was consistently higher.

But we can just stop this now; we're not going to agree 100%. I've not seen many outclock me stable on a Thuban on conventional cooling, so either I know what I'm doing or I just had dumb luck, which I doubt.

Although I find it kind of silly you'd put more faith in a temperature that's reporting 14°C (core temp) over the 30°C+ from CPU temp.
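
One way to see why the core figure fails the smell test: with conventional air or water cooling the silicon can't sit below room temperature, so a sub-ambient reading is a sensor artefact by definition (the ambient value below is just an example).

    # Tiny sanity check: a reading below ambient can't be a real die temperature
    # on conventional cooling. Ambient value is an example, not a measurement.
    def reading_plausible(reported_c, ambient_c=22.0):
        return reported_c >= ambient_c

    print(reading_plausible(14))  # False -> the 14 C "core temp" can't be right
    print(reading_plausible(31))  # True  -> the 30 C+ CPU temp at least passes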
 
There is no cooling under the CPU; motherboard manufacturers use that sensor to monitor how hot the socket is getting, for their own thermal protection.

The cooling is on the other side of the CPU, on the top side (where the CPU cooler and the core sensors are).

But who knows... it's not important to me, I'm not so anal that I need to know the silicon is at the true temperature.
 
I actually found only a +1-2°C difference between the temperature sensor reading from the Asus software and my infra-red temp gun pointed at the core.

So at idle the sensor/software says 30°C whereas the gun reads 27-29°C depending on which edge of the chip you read.
 
The Asus software is the Asus AI probe stuff, isn't it? The one that reported the "CPU temp"? That's what I used to use (or HWMonitor CPU temp/TMPIN1; they monitored the same sensor).
 
Just curious, has anyone here gone from an X4 955 or X6 1090T and above to a comparable one of these Piledrivers?

If so, how is the performance gain? I've not seen any benchmarks pitting the old Phenoms against these new Piledrivers anywhere.
 
They're not in stock anywhere in the UK yet, so no one has as of yet.

Looking at the forums though, a lot of people are going from Phenoms to Piledrivers.
 