Quads finally coming into their own?

While I am aware of the strengths of a quad core, it seems you are not aware of their downside... which is that they always use more power than a dual core, so basically they have higher running costs and are less eco-friendly, and for what?

What is the logic in increasing your power consumption to fuel a technology that gets very little use? And let's be honest, even now in late 2008 there is precious little outside the professional media content-creation arena that benefits from the true power of a quad.

The rule of diminishing returns sadly applies to the extra cores for most users (and forum nOObzillas), yet you still pay for them :confused:

Bottom line = power efficiency. Look into it when you have some free time! :)

Lower power consumption is not just a buzzword used for laptops; it is probably the most important thing driving computer hardware design today!

If a dual-core chews through most computing tasks we throw at it and at the same time uses 50% less power... what's not to like? :cool:

I think you need to learn that it's not about MHz or multicore anymore; it's about power consumption, system and memory bandwidth, and years of developers cooking up some really great software that we can use.

In case you haven't tried one yet, Mav, you should play with a Wolfdale. They are really brilliant! :D

I used a 4GHz Wolfdale and wasn't impressed; a Q6600 quad at 3GHz provided snappier system performance and far more processing power. Power efficiency doesn't really come into it if you have the hardware knowledge to begin with.
Take the above example: a 2.4GHz Q6600 did an encoding job much faster than the Wolfdale dual core at 4GHz. Where are the power savings coming from when you have to leave the dual-core machine on for a lot longer to get your encoding done?
But I do understand where you're coming from. Those who don't do heavy encoding or use multi-threaded apps are probably better off spending the extra 30 quid on a slower dual-core CPU and saving on average an extra 10 quid a year on eleccy bills (provided both quad and dual run for the same length of time).
False economy, I say, when you have to run the dual-core CPU for longer than the quad to get the same job done. These days, with the ability to run SpeedStep alongside overclocks without issues (on the right motherboards), the eleccy bills won't mean squat.
It's a trade-off at the end of the day: if you want raw power then you have to pay the price for it. The Q6600 is still the best bang-for-the-buck CPU; no point buying a dual core these days, especially if you have to pay more than the cost of a quad for it.
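
To put that "false economy" point in numbers, here is a minimal sketch of the energy-per-job arithmetic (all wattages and durations below are illustrative assumptions, not measurements):

# Energy to finish a job = draw under load x time the job takes.
# A hungrier quad that finishes sooner can use less total energy
# than a thriftier dual. All figures here are illustrative assumptions.

def job_energy_wh(load_watts: float, hours: float) -> float:
    """Energy used to complete a job, in watt-hours."""
    return load_watts * hours

dual_wh = job_energy_wh(load_watts=90, hours=2.0)   # dual core, slower encode
quad_wh = job_energy_wh(load_watts=140, hours=1.0)  # quad core, ~2x faster

print(f"Dual: {dual_wh:.0f} Wh vs quad: {quad_wh:.0f} Wh")
# Dual: 180 Wh vs quad: 140 Wh -- the hungrier quad wins on total energy.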
 
It does, and it also costs more to run!

I can understand that some peeps may not be into eco computing and would rather just talk about their computer hardware etc., but what I don't get is why a lot of people have no idea about kWh.

Obviously a fair share of forum posters here are still living at home and have no idea what an electricity bill looks like, but for anyone much over 18 there is no reason to want to pay more £££ every month for no real reason!
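
The arithmetic really is simple; a quick sketch (the ~10p/kWh unit price and the 8-hours-a-day usage pattern are assumptions for illustration):

# Watts -> kilowatt-hours -> pounds.
# The 10p/kWh price and 8h/day usage are illustrative assumptions.

def annual_cost_gbp(extra_watts: float, hours_per_day: float,
                    pence_per_kwh: float = 10.0) -> float:
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * pence_per_kwh / 100

# e.g. a CPU that draws 30W more, on for 8 hours a day:
print(f"~£{annual_cost_gbp(30, 8):.2f} per year")  # ~£8.76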

Lol, I'm fighting a losing battle here trying to get this energy-consumption thing across! :o:p
No, not at all. Remember that overclocking uses more power too; even though my E4300 will run at 3.3GHz, I run it at 2.4GHz as it uses much less power.

Also, a quad at idle doesn't use that much; it's only under load that you see the quad using more power...

I do have a Q6600 as well, running at 3.6 under water, but that's for use as a DAW.
 
These days it's costing more outlay to be eco-friendly. If you're paying more to get a dual-core CPU over a quad to save a tiny bob on the eleccy bill, you may as well fork out the cash on DDR3 RAM, which runs at a lower voltage than DDR2, so it runs cooler and sucks up less power. The question is, what price does all this come to?
 
These days it's costing more outlay to be eco-friendly. If you're paying more to get a dual-core CPU over a quad to save a tiny bob on the eleccy bill, you may as well fork out the cash on DDR3 RAM, which runs at a lower voltage than DDR2, so it runs cooler and sucks up less power. The question is, what price does all this come to?
If you really want, you could build a low-power system that uses less than 100W... It would be pretty useless for gaming, but fine for high-def video and many other things...

High-end GFX cards use a silly amount of power compared to cards like the 4670, which offers 8800GT performance but with a 30W max and 3W idle power consumption...
 
I'm new to this forum, but I've been a member on a couple of the bigger stateside ones, and this thread is like déjà vu from last year... lol.
Normally it's started by quad-core owners who see some favourable gaming review and leap up with "see, I told you so!" It is, after all, a glimmer of joy for the owners of a chip that sits 50% idle most of the time while the cheaper Wolfdale chips scream past them at 4GHz+.
I don't know why this happens; it's simply "horses for courses", surely. Liken the Q6600 to a big Scania lorry with a 260hp V8 and the E8400 to a 300hp 4-cylinder Mitsubishi GTO: sure, the Mitsi is going to win most races... until you stick a 16-ton load behind it, and then it's Scania all the way.
All you've got to do, in either case, is decide at the outset what you need it for and spend your money there. I personally don't do much encoding and only dabble in Photoshop and play a few games; for me, having studied just about every review on the web, the E8400 was the best choice (especially now I've turned it, easily, up to 4GHz). But had I done a lot of intensive photo workflow or a lot of movie encoding, I would have just as happily bought a Q6600.
Very few games fully utilise four cores, and those that do tend also to be quite happy with screaming MHz anyway.
This situation doesn't, as far as I can see, seem to be changing any time soon, so I can't blame those who bought Q6600s on the promise of a future gaming nirvana for a bit of sour grapes.
 
I used a 4GHz Wolfdale and wasn't impressed; a Q6600 quad at 3GHz provided snappier system performance and far more processing power.

Snappier system performance? Something must have been wrong with your setup, because I can't tell a difference between my Q9650 @ 4.4GHz and my E8500 @ 4.4GHz in terms of general system performance, or games, or ANYTHING at all... other than benchmarks and encoding (something I very, very rarely do).

So a 3GHz old-school 65nm Q6600, which is significantly slower clock for clock, feeling snappier than a 45nm dual at 4GHz must all have been in your head, or something was wrong with your setup.
 
I built my PC to play SupCom: Forged Alliance.

Quads are much faster than dual cores in that game, end of. No sour grapes required; it's just a simple fact. Having played many LAN games, you can see the difference between the dual and quad systems in the in-game stats, as the AI and game engine are actually shared peer-to-peer in SupCom. :)
 
Glad you know what you want from your quad and are benefiting from it.

But the fact is that not everyone buys a quad for just one game... and there are only a couple of games that actually show real-world SIGNIFICANT performance benefits with a quad. Supreme Commander is one; GTA4 is the other that comes to mind, and that game is dreadful on any system really, so in my book it should just be ignored.

UT3, maybe... but who cares if you are getting 200fps when duals are getting 185? You're never going to tell the difference, so that's not really significant, and the game is old anyway.
 
If you really want, you could build a low-power system that uses less than 100W... It would be pretty useless for gaming, but fine for high-def video and many other things...

Yes - I have a Q6600 box, using a Gigabyte G33 board (P35 with on-board video). Running chess programs with all 4 cores at 100% load, it only pulls 110W, measured using one of those plug-in energy monitors. And yes, it would be useless for games... I could overclock, of course, but then energy consumption would increase.

I can use pretty much all the cores I am given - lots of commercial chess programs are certified to run with 8 cores, and some with more than that. So I have a definite use for all three of my quad boxes. A dual won't cut it.

As one of the previous posters said - there is no single answer for everyone. Decide what you need to use your machine for and, very importantly, how long you intend to keep it, then choose the best CPU for the task.
 
I've got an E6600 @ 3GHz; would it really make a huge amount of difference if I went for a Q9550?

There would be if you're using the CPU power you have now a lot. No point if you only use it at 100% once a week for 20 minutes.

You should be able to clock 3.4/3.8 with a Q9550, and it will run quicker clock for clock than your Q6600.

Don't buy it if it's just a toy to play with and you won't use it. Overclock your Q6600 a bit more if you want to do that :)
 
Snappier system performance? Something must have been wrong with your setup, because I can't tell a difference between my Q9650 @ 4.4GHz and my E8500 @ 4.4GHz in terms of general system performance, or games, or ANYTHING at all... other than benchmarks and encoding (something I very, very rarely do).

So a 3GHz old-school 65nm Q6600, which is significantly slower clock for clock, feeling snappier than a 45nm dual at 4GHz must all have been in your head, or something was wrong with your setup.

It worked well in my experience: I set a video off to encode, then did general web browsing and played Crysis, and the quad-core setup was in a league of its own. The dual-core setup was chugging along, with stutters now and then and a lot less performance than the quad.
 
There would be if you're using the CPU power you have now a lot. No point if you only use it at 100% once a week for 20 minutes.

You should be able to clock 3.4/3.8 with a Q9550, and it will run quicker clock for clock than your Q6600.

Don't buy it if it's just a toy to play with and you won't use it. Overclock your Q6600 a bit more if you want to do that :)

He's got an E6600, not a Q6600? :confused:
 
L4D seems to keep all four of my processors reasonably busy...

Is it really that demanding? I have maxed everything except resolution as much as possible, and it's still pretty damned smooth on an E5200 at 3.4GHz with an 8800GT with 1400MHz DDR. Admittedly that's at 1280 resolution, because I'm monitor-limited for a month or two though.
 
Is it really that demanding? I have maxed everything except resolution as much as possible, and it's still pretty damned smooth on an E5200 at 3.4GHz with an 8800GT with 1400MHz DDR. Admittedly that's at 1280 resolution, because I'm monitor-limited for a month or two though.

I was watching last night... I have everything set to max, and all four cores seemed to be running at about 75-85%...
 
There's a small margin of error, but:

TDP = Stock TDP * (MHz / Stock MHz) * (voltage / stock voltage)^2

So for example:

Q6600 (1.25 VID) @ 3.6GHz on 1.35V: 95W * (3600/2400) * (1.35/1.25)^2 = 166W
Q9550 (1.20 VID) @ 3.6GHz on 1.25V: 95W * (3600/2830) * (1.25/1.2)^2 = 131W

So about a 35W difference. Obviously it depends on the actual voltages used...
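
For anyone who wants to plug in their own numbers, here is that rule of thumb as a quick sketch (same caveat: it's an estimate derived from the TDP, not a measured draw):

# Rule of thumb from above: power scales linearly with clock
# and with the square of core voltage. Rough estimate only.

def scaled_tdp(stock_tdp_w, stock_mhz, mhz, stock_v, v):
    return stock_tdp_w * (mhz / stock_mhz) * (v / stock_v) ** 2

print(f"Q6600 @ 3.6GHz, 1.35V: {scaled_tdp(95, 2400, 3600, 1.25, 1.35):.0f}W")  # ~166W
print(f"Q9550 @ 3.6GHz, 1.25V: {scaled_tdp(95, 2830, 3600, 1.20, 1.25):.0f}W")  # ~131W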


Not sure you can use the TDP that way, as it's a thermal design guide, not an actual consumption figure for that CPU for a given task.

Agree that the power requirement varies with the square of the voltage, and has a more linear relationship with the frequency.

BTW, 35W saved at the CPU level also means less power lost in the PSU (~80% efficient) and the mobo voltage regulators, and less load on the CPU cooler, so that 35-ish watts is more like 42+ watts at system level. The more efficient system is quieter, smaller, and uses less material in its construction when running a decent CPU load.

It all adds up.
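
That PSU point in numbers, as a minimal sketch (the 80% efficiency figure is the one assumed above; VRM and cooler losses are left out):

# Watts saved at the CPU grow at the wall, because the PSU is lossy:
# the PSU must draw dc_watts / efficiency from the mains.

def wall_watts(dc_watts: float, psu_efficiency: float = 0.80) -> float:
    return dc_watts / psu_efficiency

print(f"{wall_watts(35):.1f}W at the wall")  # 43.8W for 35W saved at the CPU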
 
The TDP is typically not the most power the chip could ever draw (such as by a power virus), but rather the maximum power that it would draw when running real applications.
 
Not sure you can use the TDP that way, as it's a thermal design guide, not an actual consumption figure for that CPU for a given task.

Agree that the power requirement varies with the square of the voltage, and has a more linear relationship with the frequency.

BTW, 35W saved at the CPU level also means less power lost in the PSU (~80% efficient) and the mobo voltage regulators, and less load on the CPU cooler, so that 35-ish watts is more like 42+ watts at system level. The more efficient system is quieter, smaller, and uses less material in its construction when running a decent CPU load.

It all adds up.
Hmmm... All I know is my Q6600, which is running at far higher voltage than that (1.465V), rarely even puts my fans into their higher-speed mode. Typically all my fans are running in their slow state :)
 