piledriver - Official AMD FX4300, FX6300 and FX8350 review thread

Of course it matters, and how is it irrelevant? You can't quote temperatures the chip isn't actually running at.
It says it's running ~8c cooler than ambient at idle.

Yeah sure, it's irrelevant.

It's inconsequential; the general consensus has always been +10c, and that's been the case for 10 years.

On Thuban the thing is stable at 55c / 70c. Whether that's really 55c, or actually 65c or 165c, is irrelevant; 55c on the reading is what you go by. Anything much past that and the chip shuts itself down to protect itself from unassuming Intel n00bs who don't know anything about AMD chips :D
 
I'm not an unassuming Intel noob; I've proven more than once I know more than you about AMD CPUs and platforms.

I know plenty about AMD temps, but when you can see a blatantly sub-ambient temperature, why bother quoting it when it's just plain wrong?
That's what I'm trying to get across.
Why bother posting a temperature that has absolutely no bearing?

Just show a 100-run pass of IBT at maximum RAM usage; if the GFLOPS don't go down then it hasn't throttled, and it's proven its stability.
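
If you want to check that automatically, here's a minimal sketch, assuming you've pasted IBT's per-run GFLOPS column into a plain text file, one value per line (the filename and the 3% tolerance are my own choices, not anything IBT itself provides):

Code:
# Hypothetical helper for the IBT check above: read one GFLOPS value
# per line and flag any run that sags below the early-run baseline.

def check_throttle(gflops, tolerance=0.03):
    """Compare every run against the average of the first five passes;
    a drop beyond the tolerance suggests thermal throttling."""
    head = gflops[:5]
    baseline = sum(head) / len(head)
    drops = [(i + 1, g) for i, g in enumerate(gflops)
             if g < baseline * (1 - tolerance)]
    return baseline, drops

# "ibt_gflops.txt" is a made-up filename; paste IBT's results into it.
with open("ibt_gflops.txt") as f:
    runs = [float(line) for line in f if line.strip()]

baseline, drops = check_throttle(runs)
if drops:
    print(f"Possible throttling: {len(drops)} run(s) fell below the "
          f"{baseline:.1f} GFLOPS baseline, first at run {drops[0][0]}.")
else:
    print(f"Held the {baseline:.1f} GFLOPS baseline for all "
          f"{len(runs)} runs; no sign of throttling.")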

If you're going to start trying to use temperatures, have an accurate reading from a sensor you've placed yourself.
 
Is there a decent performance improvement going from the 8120 to the 8320? I couldn't find any benchmarks anywhere for this specific CPU comparison.

I don't need the upgrade that much, but I'd rather sell my 8120 and put in a bit now to upgrade than have to shell out the whole retail price next year for another CPU when mine becomes dirt cheap on the second-hand market.

8120 vs 8350 thread here for reference.

http://www.overclock.net/t/1320957/benches-fx-8120-4-3ghz-vs-fx-8350-4-3-4-6ghz
 
I'm not an unassuming Intel noob; I've proven more than once I know more than you about AMD CPUs and platforms.

I know plenty about AMD temps, but when you can see a blatantly sub-ambient temperature, why bother quoting it when it's just plain wrong?
That's what I'm trying to get across.
Why bother posting a temperature that has absolutely no bearing?

If you already know this then what are you going on about it for?

At no time have I or anyone else said those temps are accurate to real life; the first thing I did was explain to you what's going on with that.

There is no problem here whatsoever.
 
I'm going on about it because there's no point giving a temperature that has absolutely no bearing whatsoever.

Of course it's a problem; why wouldn't you want an accurate temperature reading from your CPU?

And I, and other people, are constantly telling AMD owners that their CPU isn't running at X or Y temp under stress, so again, of course it's a problem: people don't realise the temperatures they're given aren't accurate.
 
I'm going on about it because there's no point giving a temperature that has absolutely no bearing whatsoever.

Of course it's a problem; why wouldn't you want an accurate temperature reading from your CPU?

What? As opposed to posting an overclock without a temperature?

Do you not ask for temperatures when you're helping someone overclock their CPU or GPU?
What do you do? Tell them to keep going until it goes bang, or set a temperature target for them?

Don't be stupid; of course it has a bearing.
 
How can you set a temperature target when the temperature isn't accurate?
All you can do is try to offset it, but even then that could end badly.

People don't realise their CPUs aren't running at X temperature, so they think they've got Y headroom left in temperatures, when in reality they've only got Z. That's what I'm getting at; it's a problem.
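
To put numbers on X, Y and Z, here's a toy calculation; the 62c ceiling and the 10c offset are purely illustrative assumptions, since the real offset varying by chip is exactly the complaint here:

Code:
# Illustrative arithmetic only; the ceiling and offset are assumed values.
T_MAX = 62.0        # assumed safe ceiling for the real core temperature
reading = 45.0      # X: what the software reports
real_offset = 10.0  # assumed gap between the reading and reality

believed_headroom = T_MAX - reading                # Y: 17c they think they have
actual_headroom = T_MAX - (reading + real_offset)  # Z: 7c they actually have

print(f"Believed headroom: {believed_headroom:.0f}c")
print(f"Actual headroom:   {actual_headroom:.0f}c")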
 
When are they expected in stock? Is it going to be a repeat of Bulldozer, where stock is scarce for several months after release?
 
How can you set a temperature target when the temperature isn't accurate?
All you can do is try to offset it, but even then that could end badly.

Whether it's accurate or not, it is a reading, a reading with known limits from which you calculate targets.

Are you telling me you have never asked someone for an AMD core temperature reading, got the response "47c", and (knowing 55c was OK) replied "clock it up some more, your temps are good"?

Without that, how else would you know?
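
That workflow is easy to sketch against the reading's known limit, using the 55c and 47c figures from this thread (the 5c cushion is my own assumption):

Code:
SAFE_READING_CEILING = 55.0  # known-safe limit on the reading, per this thread
MARGIN = 5.0                 # assumed cushion before you stop pushing

def overclock_advice(core_reading_c):
    """Advise against the reading itself, as described above."""
    if core_reading_c <= SAFE_READING_CEILING - MARGIN:
        return "Clock it up some more, your temps are good."
    if core_reading_c < SAFE_READING_CEILING:
        return "Near the limit; hold here and keep watching it."
    return "Over the known-safe reading; back the clocks and volts off."

print(overclock_advice(47.0))  # the 47c example above: temps are good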
 
Whether it's accurate or not, it is a reading, a reading with known limits from which you calculate targets.

Are you telling me you have never asked someone for an AMD core temperature reading, got the response "47c", and (knowing 55c was OK) replied "clock it up some more, your temps are good"?

Without that, how else would you know?

Too many assumptions.
You're assuming that every CPU has the same type of offset value.

Back in the Phenom II days, I'd always use the CPU temp from HWMonitor, TMPIN1 or whatever it was (on Asus boards it said CPU Temp), and keep it no higher than, say, 60c. This wasn't too bad for Denebs, as they didn't show sub-ambient temperatures; the Thubans however did, and that became a massive problem. The Coolermaster V10 wouldn't engage its TEC because of the reported temperatures, and people were throttling at around 44c on the "core temp".

But that's not to say I don't think it's a problem.

Because more and more I see people thinking they've got more temperature headroom when they don't.

Since the CPU packaging doesn't say "my temperatures are inaccurate", how are people meant to know?
 
Whether it's accurate or not, it is a reading, a reading with known limits from which you calculate targets.

I'd have to agree with this. You have to work with what you've got and the limits laid down. They throttle before you get to anything troublesome anyway, AFAIK. Think about what you're doing and bench smartly, and it shouldn't be an issue.
 
About the AMD core temps: they are calculated from a formula based on the temperature sensor under the socket (the socket temp) and the current load/operations.

It gets more accurate the closer it gets to 46 degrees C (and above). That's why it will read lower than ambient at idle. Apparently the chip's thermal shutdown is based on this temperature.
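
AMD never published the actual formula, so purely as an illustration of that behaviour, here's a toy model in which a load-scaled bias makes the calculated value sit below ambient at idle but track the sensor under load (the 18c bias is an invented number):

Code:
# Toy model only; AMD's real formula is not public.

def calculated_core_temp(socket_temp_c, load_fraction):
    """Subtract an idle bias that shrinks to zero as load rises."""
    idle_bias = 18.0 * (1.0 - load_fraction)  # invented bias for illustration
    return socket_temp_c - idle_bias

print(calculated_core_temp(30.0, 0.0))  # 12.0c: below a 22c ambient at idle
print(calculated_core_temp(52.0, 1.0))  # 52.0c: tracks the sensor under load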
 
Too many assumptions.
You're assuming that every CPU has the same type of offset value.

Back in the Phenom II days, I'd always use the CPU temp from HWMonitor, TMPIN1 or whatever it was (on Asus boards it said CPU Temp), and keep it no higher than, say, 60c. This wasn't too bad for Denebs, as they didn't show sub-ambient temperatures; the Thubans however did, and that became a massive problem. The Coolermaster V10 wouldn't engage its TEC because of the reported temperatures, and people were throttling at around 44c on the "core temp".

But that's not to say I don't think it's a problem.

Because more and more I see people thinking they've got more temperature headroom when they don't.

Since the CPU packaging doesn't say "my temperatures are inaccurate", how are people meant to know?

The chips can calculate how hot they are by themselves; as for reading that, it depends on the software used, just like any chip. If you use AMD's own software, it's right.

Do you give temperature advice or not?
 
Do you give temperature advice or not?

But it's not really the same, is it, when you've got an accurate sensor?
Yes, I do; if you knew how to read, you'd have been able to get the answer from the fact I gave advice during the Phenom II days.

But I don't see the angst over wanting a proper, accurate sensor.

I know about the theories; this whole rather pointless discussion started with:

I wouldn't ever take an AMD temperature reading as evidence of what it's running at.

Which would kind of prove I knew about AMD temperature issues.

And AMD's software was not right; AOD would give the same wrong readings as Core Temp, with the same sub-ambient temperatures, since they were reading from the same wrong sensor.
 
So you'd rather work out temperatures with guesstimates than have a proper working sensor? That's all I'm getting at.

EDIT: Also, AMD's software was not right; it gave the same readings as Core Temp did.
 
For Phenom II X4s I did, but they never used to show sub-ambient temperatures at idle like the Thubans did (I was kinda meh about doing it with Thubans, and I haven't given any Bulldozer temperature advice because people can't grasp that they don't have 40c of headroom left).
It's still guesstimates; why wouldn't you want a proper sensor with proper limits?
 