Intel Preps Core i7 9700K 8-Core 16-Thread Mainstream CPU

Interesting point. How much more power does a hotter chip draw?

It's basic thermodynamics:

TDP = Thermal Design Power, the maximum heat a component is designed to dissipate safely.
In thermodynamics the relevant equation is Q = m·Cp·(T2 − T1)

Where Q is heat (energy)
m is mass
Cp is specific heat capacity
T2 is the final temperature
T1 is the initial temperature

This is the basic engineering thermodynamics computation; for an electrical component (a CPU) the exact equation is different, but the principle is the same.

So, given the above, you can see there is a direct relationship between Q (power dissipated as heat) and temperature: as the temperature increases, the power increases too.
http://www.tomshardware.co.uk/answers/id-2485090/affects-tdp-temperature-power-consumption.html
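
A quick worked example of that sensible-heat equation. All the numbers here are illustrative, not from the quoted post (a 50 g copper heatsink warming by 30 °C):

    # Sensible heat: Q = m * Cp * (T2 - T1)
    # Illustrative numbers only, not from the quoted post.

    def sensible_heat(mass_kg, cp_j_per_kg_k, t1_c, t2_c):
        """Heat energy in joules to take mass_kg from t1_c to t2_c."""
        return mass_kg * cp_j_per_kg_k * (t2_c - t1_c)

    CP_COPPER = 385.0  # J/(kg*K), specific heat capacity of copper

    q = sensible_heat(0.050, CP_COPPER, t1_c=25.0, t2_c=55.0)
    print(f"Q = {q:.0f} J")  # ~578 J; a steady 10 W source delivers that in ~58 s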

Do electronic devices consume more power when the ambient temperature is cold?
Just a casual observation: in winter, my personal devices such as my phone and iPad drain power more quickly than they do in summer.

But this could simply be down to a change in usage habits: staying at home more leads to using your gadgets more often.

At the same time, I have noticed that the back of my iPad heats up a lot more during the winter, which could be caused by greater power usage.

Does anyone have any insights into how ambient temperature dictates power consumption in electronics (maybe even consumer electronics)?

It's the opposite, actually.
Your devices are made up of individual electrical components; while datasheets for whole devices are harder to come by, component datasheets are easy to find.

[Image: power consumption versus temperature plot for a processor]

The plot above shows power consumption over temperature for a processor, one of the major power consumers in your devices. Most components have a positive coefficient between temperature and power consumption. Some have less linear curves, where consumption is at a minimum at some middle value.

But for the most part, hotter devices use more power to operate normally.

As others have commented, this doesn't take into account battery capacity when it gets cold. The reduced battery capacity at low temperature might be a large enough effect to make devices seem to last less time in the cold. But the answer to "do they consume more power when cold?" is no.
https://electronics.stackexchange.c...re-power-when-the-ambient-temperature-is-cold
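
A rough sketch of why hotter silicon draws more power: CMOS leakage current grows roughly exponentially with junction temperature, so at a fixed voltage the static power climbs as the chip heats up. The exponential form and every constant below are assumptions for illustration, not figures from the linked answer:

    import math

    # Toy model: static (leakage) power vs. junction temperature.
    # P_leak(T) = P_REF * exp(K * (T - T_REF)) is a common rule-of-thumb shape;
    # P_REF, T_REF and K are made-up illustrative values, not measured data.

    P_REF = 5.0   # W of leakage at the reference temperature (assumed)
    T_REF = 25.0  # reference junction temperature, deg C (assumed)
    K = 0.02      # per-degree growth factor (assumed; varies by process)

    def leakage_power(temp_c):
        """Estimated static power in watts at a given junction temperature."""
        return P_REF * math.exp(K * (temp_c - T_REF))

    for t in (25, 50, 75, 100):
        print(f"{t:3d} C -> {leakage_power(t):4.1f} W")
    #  25 C ->  5.0 W
    #  50 C ->  8.2 W   (same voltage, hotter chip, more power)
    #  75 C -> 13.6 W
    # 100 C -> 22.4 W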
 
Adored, is that you? I thought this 8400 BS was already put to bed...
On another note, there's a leaked Intel roadmap pic showing no 8-core Coffee Lake nor Ice Lake in 2018.

No, it really wasn't put to bed. Intel have hit a wall with the current designs and they have to supply a lot of markets from a small pool of chips. Not only that, they have to make do until 2021. If Intel don't offer an 8-core chip on Z390, that's even more telling...
 
A hotter chip draws more power from the grid, so, no, using TIM isn't friendly for the environment.

Derp.

This is akin to saying: my dog doesn't like sleeping in its bed, so no, I don't wear black socks on a Monday.

I'm struggling to see some of the logic here. Compared to what, exactly? The mining of rare minerals?
 
No. It is like saying a hotter chip pulls more power though.

Another comment that needs no real response given the context. Gold and indium versus your (mum's) electricity bill :D.

Sorry, but when you boil down everything that's being said - that's what we're talking about.

Anything for a bit of controversy.
 
Another comment that needs no real response given the context. Gold and indium versus your (mum's) electricity bill :D.

Sorry, but when you boil down everything that's being said - that's what we're talking about.

Anything for a bit of controversy.

Well, that's what you might boil things down to. Maybe others can distil a little more from the solution.
 
The voltage is the same, the current increases ;)
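
(Worked numbers for that point, purely illustrative: P = V × I, so at a fixed 1.2 V core voltage, current rising from 50 A to 55 A takes the draw from 60 W to 66 W.)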

First thing you've said that's remotely accurate lol :D

Well, that's what you might boil things down to. Maybe others can distil a little more from the solution.

As far as I'm aware, the ecological implications and reasoning came from Intel directly. So you'd need to put your tinfoil (or indium) hat on and discuss it among yourselves further. When all is said and done, I'm in the same boat as you. Well, almost. You don't even own an Intel CPU, at least not one that's affected - so there is that. But we'd both rather have a CPU that has the better thermal design.
 
First thing you've said that's remotely accurate lol :D



As far as I'm aware, the ecological implications and reasoning came from Intel directly. So you'd need to put your tinfoil (or indium) hat on and discuss it among yourselves further. When all is said and done, I'm in the same boat as you. Well, almost. You don't even own an Intel CPU, at least not one that's affected - so there is that. But we'd both rather have a CPU that has the better thermal design.

As long as I can cool it in a 2-square-foot case, the performance is good and the price makes sense, I couldn't give a monkey's about heat.

I would care if the crappy TIM was holding back performance to stretch out the life cycle of a design, and that came at higher power use for no reason. Pretty sure everyone on this forum should be concerned about that...
 
As far as I'm aware, the ecological implications and reasoning came from Intel directly.

I wouldn't be surprised if they get some government grants or tax incentives because of it.
I wonder if they'll switch the Xeon line to polymer TIM or something else anytime soon.

I think Intel is in a downward spiral right now. Bad decision after bad decision. Soon they'll probably become fabless, and in the more distant future they may shrink to AMD's current size.
Your opinion isn't backed by reality though; have you taken a look at their quarterly reports?
Also this little bit of info from IDC: Skylake-SP Xeons are shipping in higher volume than any previous Xeon line. In HPC they have 94.2% of the market and 100% of new deployments.
These are scary numbers: even with aggressive competition from IBM, AMD, Qualcomm and Cavium all releasing new, very competitive chips, they still managed to increase their Xeon shipments (2.67 million in Q3 2017).
So from that point of view, you have to be living in an alternate world if you think Intel is somehow on a downward spiral. I'm still worried we might end up with a monopoly.

And as to why they're not soldering consumer chips, enthusiasts who will want to delid their CPUs are an insignificant part of the market. It's the sad reality, but you at least have alternatives from AMD if a soldered CPU is a must for you, for whatever reason.
 
As long as I can cool it in a 2-square-foot case, the performance is good and the price makes sense, I couldn't give a monkey's about heat.

I would care if the crappy TIM was holding back performance to stretch out the life cycle of a design, and that came at higher power use for no reason. Pretty sure everyone on this forum should be concerned about that...

Those that are concerned enough will opt to delid. As I say, from where I'm sitting typing to you now, I'd much rather Intel kept using indium, but they won't.

And as to why they're not soldering consumer chips, enthusiasts who will want to delid their CPUs are an insignificant part of the market. It's the sad reality, but you at least have alternatives from AMD if a soldered CPU is a must for you, for whatever reason.

What good is indium when it doesn't overclock worth a damn! That, and Synopsys and their involvement; they have no investment in overclocking, really.

What's your flavour? Pick what's available lol.
 
What good is indium when it doesn't overclock worth a damn!

I suppose it just means you have a CPU operating at the correct temperature, without the need to dismantle what is otherwise an excellent product and risk voiding your warranty. ;)

I'd still like to know how all those Sandy Bridge CPU owners are getting on, especially given how much of an issue micro-fracturing is these days?
 