Q6600 vs Q9450

Well, come on then, you'll have to give me a bit more than that!! :)

My E8400 at 3.4GHz was at most 5% faster than an identically clocked E4300 in specific apps and in a general PC benchmark (Custom PC). Yeah, in Super Pi the E8400 was much faster, but in general use?

EDIT: For a Q9450 at 3.6GHz to match a Q6600 at 4GHz, it needs to be roughly 10% faster clock for clock. That's not what the Q9450 vs Q6700 comparison in the table above shows, is it? It's more like 3%.
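To put a number on that: for the lower-clocked chip to keep up, its clock-for-clock advantage has to cover the clock deficit. A quick back-of-the-envelope check (plain Python; the only inputs are the clock speeds being argued about, and it assumes performance scales linearly with clock, which is a simplification):

# How much faster per clock the slower-clocked chip must be to match the faster one.
def required_clock_for_clock_gain(slow_ghz, fast_ghz):
    return (fast_ghz / slow_ghz - 1) * 100

print(round(required_clock_for_clock_gain(3.6, 4.0), 1))   # Q9450 @ 3.6GHz vs Q6600 @ 4GHz -> 11.1

So "roughly 10%" is, if anything, slightly understating what the Q9450 would need.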
 

http://forums.overclockers.co.uk/showpost.php?p=10368310&postcount=102

graysky asked me to run his new x264 benchmark (thread on XS here). It shows that the Q9450 (2.66GHz) is equivalent to a Q6600 @ 3GHz for encoding DVDs using next-generation codecs.
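For anyone wanting to reproduce that sort of comparison, the idea is just to time an identical encode on each chip and compare the average fps. A rough sketch along those lines, assuming an x264 binary on the PATH and a raw .y4m test clip; the filename, frame count and --crf setting are placeholders, not graysky's actual benchmark settings:

import subprocess, time

CLIP = "testclip.y4m"   # placeholder source clip
FRAMES = 1500           # placeholder: number of frames in that clip

start = time.time()
# Constant-quality encode with the output thrown away; only the elapsed time matters here.
subprocess.run(["x264", "--crf", "22", "-o", "/dev/null", CLIP], check=True)
elapsed = time.time() - start

print(f"average speed: {FRAMES / elapsed:.1f} fps over {elapsed:.0f}s")

Divide the fps by the clock speed each chip was running at and you get a clock-for-clock figure you can compare directly.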
 


Clock for clock, this Penryn is the equivalent of a Q6600 running 200-500MHz faster, and it uses a lot less power, making it easier to cool and overclock.
 
At 3.6GHz a Q9450 is like having a 4GHz Q6600.

The SSE4 instruction set of the Q9450 brings huge gains in multimedia tasks.

It will need less vcore, so in essence it will run cooler.

Not quite. Most people who know what they're talking about are saying that SSE4 is practically useless (admittedly I don't have a clue about that).

Also, most benchies aren't showing a 10% improvement; it's ranging between 2-12%...

Most games are showing ZERO tangible improvement.
 
Also, a CPU benchmark is going to be tailor-made to take advantage of CPU features that probably won't be used by most programs in everyday tasks.
 
Not quite. Most people who know what they're talking about are saying that SSE4 is practically useless (admittedly I don't have a clue about that).

Also, most benchies aren't showing a 10% improvement; it's ranging between 2-12%...

Most games are showing ZERO tangible improvement.

Well, I know what I am talking about.

Adobe Premiere, for example: the latest version, programmed with SSE4 instructions, sees big gains when video editing.

Games will show little gain because current games are GPU limited, not CPU limited.

So your last post is moot.
 
I did see some comparisons between the Q6600 and Q9450 on some foreign site a few days ago, can't remember where (if I find it again, I will post it). At the same GHz the Q9450 was between 10-15% better than the Q6600, and in some types of encoding it was more than 15% better. :)
 
You've gotta ask yourself why you're upgrading.

Unless you're a heavy-duty video editor or into 3D graphics, where you have long render times, you are not gonna see an improvement moving to a quad core from your current nicely overclocked E2160.

However, if you really do need the quad power for very CPU-intensive tasks, then the newer Yorkfield quads are just the ticket: new instructions, extra cache and a more efficient core will lend themselves nicely to your editing tasks, not to mention the lower power consumption (i.e. cheaper bills, more eco!).

The G0 quads are getting long in the tooth now, certainly very normal kit around here, and certainly not worth the money unless your tasks really are measured in 30-minute stretches rather than milliseconds.

Some people do not understand what a quad core is really for and just jump on the bandwagon.

If your main hobby is sitting in your bedroom playing with your chips, then by all means spend £150 on a G0 quad and have fun overclocking it and benching etc., but if you're looking for real value for money I would find a better reason to upgrade than just being bored.
 
Well, there are a number of reasons really.

Firstly (and the main reason really), I do a lot of video encoding. Sometimes with files as large as 6-8GB, in batches. Mostly using TMPGEnc 4 Xpress, which seems to support SSE4 (rough timing sketch after this post).

Secondly, I want to future-proof my PC as much as I can now, because I know I won't have the money to do it again for some time. I've been saving up for this... and re-tiling the bathroom!

Thirdly, it's a hobby, but I don't sit in my bedroom playing with my chips... I do it in the garage, where my wife can't see what new bits are being fitted!
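Since batch encoding is the main workload, the fairest real-world test once the new chip is in is simply timing the same batch on both setups. TMPGEnc 4 Xpress is GUI-driven, so this rough Python sketch uses a made-up command-line encoder ("myencoder") purely as a stand-in for whatever tool can be scripted; the folder and file pattern are placeholders too:

import subprocess, time
from pathlib import Path

SOURCE_DIR = Path("D:/capture")     # placeholder folder holding the 6-8GB source files
ENCODE_CMD = ["myencoder", "-i"]    # placeholder CLI encoder, not a real TMPGEnc invocation

total = 0.0
for src in sorted(SOURCE_DIR.glob("*.mpg")):
    start = time.time()
    subprocess.run(ENCODE_CMD + [str(src)], check=True)
    elapsed = time.time() - start
    total += elapsed
    print(f"{src.name}: {elapsed / 60:.1f} min")

print(f"whole batch: {total / 60:.1f} min")

The same batch with the same settings, timed before and after the upgrade, gives a real-world answer rather than a synthetic benchmark one.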
 
Firstly (and the main reason really), I do a lot of video encoding. Sometimes with files as large as 6-8GB, in batches. Mostly using TMPGEnc 4 Xpress, which seems to support SSE4.

Then you have almost answered your own question: if you do a lot of encoding, then the extra £80-£100 could well be worth it.

For the 90% of people who don't use their computers to encode, the G0 would be better.
 

You come across as a funny so and so sometimes, easy!! :)

Anyway, thanks for the links, but I'm not sure it's as conclusive as your silence hinted at!

The original x264 video encoding link over on XS showed the Yorkfields being 5 to 6% faster clock for clock than the Q6600. I've already said that where an app uses SSE4 the Yorkies will be quicker, no question, but generally I'm not convinced yet about how much faster overall they are than the Kentsfield chips.

I suppose what I'm trying to ask is: is the general performance increase worth the extra outlay? I'm sure once folks on here get them we'll know, and I hope they do perform as well as we hope. They should certainly use less power, which is a good thing.

Is this how urban computer myths start? Before you know it everybody bandies about the 'fact' that a Yorkfield at 3.6GHz is 'like having a Q6600 at 4GHz'. They were saying the same about the E8xxx CPUs, and I know for a fact it wasn't true, at least in my test system. So how are the Yorkfields different, apart from the obvious fact of having two more cores?

I'm going to find out for myself soon enough anyway, as I've got a Q9300 on order.

One thing I do know for sure, though: who's going to be buying a Q6700 when the Q9450 arrives?! The Q6700 never looked the most competitively priced chip, and it looks even worse value now.
 


TBH most people on here with the E8400 upgrade behind them rarely used their PCs for multimedia tasks.

They use them for games.

And we all know modern games are GPU limited rather than CPU limited.

The Q9300 is a bad chip.

The lower multiplier means you will never max the chip.

Whether the extra cost is worth it is up to each person.

The Q6600 is a great chip and I ran mine @ 3.8GHz 24/7.

I would rather have a Q9450 @ 3.6GHz any day of the week; that's why I sold it. :)
 
The Q9300 was all I could get my hands on relatively quickly from our usual supplier.

I've got a mobo that will do 450MHz FSB, so it should be good for 3.2GHz+.
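Quick sums on that, assuming the Q9300's stock 7.5x multiplier (it is 2.5GHz at 333MHz FSB out of the box), in plain Python; the 450MHz figure is just what the board is assumed to manage:

# Core clock = FSB x multiplier; the Q9300's stock multiplier is 7.5x.
fsb_mhz = 450
multiplier = 7.5
print(fsb_mhz * multiplier / 1000, "GHz")   # 3.375 GHz, so "3.2GHz+" is conservative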

Its full-time job will be batch photo processing, so it will be an interesting comparison to see how it does against the Kentsfield setups that already do the same tasks.

I'm not saying the Yorkfields are bad chips, by the way, far from it. I'm sure they will be better; I'm just trying to understand by how much! :)
 
I'm sure it will, depending on the task.

There seems to be less of a performance improvement going from 4MB to 6MB of cache than from 1MB to 2/4MB, though, but again it depends on the software being run.

Super Pi seems to benefit!! :)
 
Read this thread: Yorkfield Q9450 vs Kentsfield Q6600 at 3.6GHz

Look at the temps of the PWMs: 30°C less with the Q9450. Not sure whether CoreTemp is reading the CPU temps right or not, but they do run cooler.

Power consumption and heat, Q9450 @ 3.6GHz: [Prime95 screenshot]

Power consumption and heat, Q6600 @ 3.6GHz: [Prime95 screenshot]
 
Impressive if true!

According to Abit EQ:

Q9450 = cooler PWMs

Q6600 = cooler CPU???

Something seems not quite right there. Maybe Abit EQ can't read the systems properly?

Also, do we know whether CoreTemp works on the Yorkfields?

That shows temps more like what I'd expect, going on results with the Wolfdales vs the previous Core 2s.
 
Well, I know what I am talking about.

Adobe Premiere, for example: the latest version, programmed with SSE4 instructions, sees big gains when video editing.

Games will show little gain because current games are GPU limited, not CPU limited.

So your last post is moot.

Chill out, just trying to get some debate going. I've read on a few video forums that SSE4 is pretty useless in terms of video encoding and it's not likely to be widespread for quite a while.

On the GPU-limited front, I'm talking about low-res stuff to demonstrate the power of the CPU; the GPU isn't really coming into it at all, or only minimally. Anyway, no need for cheek!
 