Another important feature alongside HT is 64-bit. Even video-editing software can't balance the load evenly across all cores when compiled as x86. As with gaming, we've seen that x64 makes little difference to how core loads are distributed; it was used mainly as a gimmick by AMD in Far Cry back in 2004. Like Crysis, that was more of a tech demo, and it's nowhere near as intensive as running large databases, which is what x64 was really designed for.
HT, though, is designed for mainstream use, as most software can make better use of it. It's also worth including an HT-capable CPU if you're building a media centre PC, as the power usage will be lower for the performance you get, perhaps 20-30% better than non-HT CPUs. I'm generalising there, as I'm not sure exactly what the figure is. With Sandy Bridge CPUs drawing only around 90W typical, more cores should mean even lower power consumption for the same work while improving performance, as long as programmers keep improving their code. I've no idea how many cores it's possible to fit on a die; there must be an upper limit. And then what? Will 50 or 60 cores be enough by then, the way five blades are on a razor?
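That "as long as programmers keep improving their code" caveat is really Amdahl's law: extra cores only help in proportion to how much of the work is actually parallelised. A quick sketch (illustrative fractions, not measurements of any real program):

```python
# Amdahl's law: theoretical speedup from n cores when only a
# fraction p of the workload is parallelised.
# The values of p below are made-up examples, not benchmarks.

def amdahl_speedup(p, n):
    """Speedup with parallel fraction p (0..1) on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Code that's only 50% parallel barely doubles, even on 60 cores:
for cores in (4, 8, 60):
    print(cores, round(amdahl_speedup(0.5, cores), 2))

# Well-threaded code (95% parallel) actually benefits:
for cores in (4, 8, 60):
    print(cores, round(amdahl_speedup(0.95, cores), 2))
```

So piling on cores (or logical HT cores) hits diminishing returns fast unless the software keeps up, which is exactly the worry about a 50- or 60-core future.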
There is one good recent example I'm aware of of a piece of software getting it right. To render a 15-minute AVCHD clip, I noted in a magazine that Sony Vegas Movie Studio x86 took 46 minutes on an i7-860 with 8GB RAM but only used 30% of the CPU. Meanwhile, PowerDirector 9 Ultra x64 took 31 minutes using 96% of the CPU. The previous version of PowerDirector was a regular x86 build whose CPU usage varied between 20 and 60% - a massive difference in performance.
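Working through those magazine figures (taken at face value), the utilisation jump doesn't translate one-for-one into render time, which is a hint the job isn't purely CPU-bound:

```python
# Back-of-envelope check on the magazine figures:
#   Vegas Movie Studio (x86): 46 min at ~30% CPU
#   PowerDirector 9 Ultra (x64): 31 min at ~96% CPU

vegas_min, powerdirector_min = 46, 31
speedup = vegas_min / powerdirector_min
print(f"PowerDirector was {speedup:.2f}x faster")

# Utilisation alone would predict a bigger gap:
util_ratio = 96 / 30
print(f"Utilisation ratio: {util_ratio:.1f}x")
```

Roughly a 1.5x render speedup against a 3.2x utilisation ratio, so presumably disk and memory bandwidth soak up some of that extra core time rather than it all going into the encode.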
I'm skimming lightly over the really technical aspects since this is best kept as a light-hearted exchange of opinions.