Is it time for Quad Core?

Obviously you feel strongly about the issue and like to think you are doing something positive.
Yeah, is that ok? :)

I don't think it's right to ram it down other people's throats though, especially on a hardware enthusiasts forum like this one.
I'm not ramming anything down anyone's throats, I'm discussing what I consider to be an important part of the hardware scene, alongside CRYSIS benchmarking, CPU idle temps and VISTA rating scores.

What people consider important is up to them of course; all I am doing is saying "Hey guys, you want cheaper power bills? Then start paying attention to power draw" kinda thing... There aren't many people doing it, so I've got my work cut out for me atm.

It's a discussion and the aim of it is to share opinions and insights etc, it's all good! :cool:
 
Of course the answer to 99% of all "eco" arguments for product X over product Y is that the truly ecologically friendly thing to do would be to drop out of the consumerist rat race, not give in to the marketing-driven compulsion to upgrade, and stick with what you have. A new 8xxx will always be more polluting than an old Q6600 simply by virtue of having been manufactured. Buy second-hand or keep your old gear.

All these arguments to the effect that there are no/few programmes that 'need' a quad core are equally applicable to the current crop of dual cores. What was it you needed it for? To play a game? Ah, yes... Seems to me, eco-pomposity is particularly misplaced on a forum dedicated to PC overclocking and upgrading ;) Bah! Humbug!

Seems to me the answer is whichever one is easier to get hold of? A draw.
 
Seems to me, eco-pomposity is particularly misplaced on a forum dedicated to PC overclocking and upgrading ;)
I completely disagree with you there; it's a very valid discussion for those not too small-minded to see beyond the fluff in their belly button!

Overclocking/tweaking are the highest forms of PC hardware mastery; some people cannot even get a computer they built to boot, let alone run at year-2010 speeds. These same overclocking masters are also gurus to their own flocks, and quite possibly I.T. professionals who can influence the choice of processor for their 300-machine roll-out in 2009.

A 50W, 100W or 200W difference multiplied across hundreds, maybe millions, of machines is where my head is at. That's a really huge difference; a close-down-a-nuclear-power-station-or-several kind of difference.

The people giving hardware advice have to take power costs into consideration, even the humble clocking teen living in his parents' house cracking off the 3DMarks. There are a lot of those, I reckon; add them all together and the numbers soon rack up.

Think Big! :cool:
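To put some rough numbers on that, here's a back-of-envelope sketch in Python. The wattage gap, machine count, usage hours and tariff below are all made-up illustrative figures, not measurements:

```python
# Back-of-envelope fleet savings. All figures are illustrative assumptions.
watt_delta = 50        # assumed extra draw per machine, in watts
machines = 300         # e.g. the 300-machine roll-out mentioned above
hours_per_day = 8
price_per_kwh = 0.12   # assumed tariff in GBP per kWh

kwh_per_year = watt_delta / 1000 * hours_per_day * 365 * machines
print(f"{kwh_per_year:,.0f} kWh/year, ~£{kwh_per_year * price_per_kwh:,.0f}/year")
# -> 43,800 kWh/year, ~£5,256/year
```

Even a modest 50W per box adds up to thousands of pounds a year across a 300-machine fleet.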
 
[TW]Fox said:
I don't know, I've got no idea, I feel a bit like somebody who thinks he knows about cars and wanders into OcUK motors to ask an innocent 'Astra v Focus' question :p

Thought about a 1.6 Golf with 18" Audi reps?
 
You see, a little prod from me to prove your figures and the wattage has increased by nearly 100% from your earlier post! :p

Not really, as I did acknowledge that the 95W and 65W were stock figures and that overclocking increases power usage greatly :) (Been a while since I talked about this, but if I remember correctly, the formula that's used is that power usage goes up linearly with clock speed and with the square of any voltage increase. So it's overvolting that's bad for the climate!)
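For reference, a quick sketch of that rule of thumb; the example clocks and voltages below are made up for illustration, not anyone's actual chip:

```python
# Dynamic power rule of thumb: P scales roughly linearly with frequency
# and with the square of core voltage. Example figures are assumptions.
def scaled_power(p_stock, f_stock, v_stock, f_oc, v_oc):
    return p_stock * (f_oc / f_stock) * (v_oc / v_stock) ** 2

# e.g. a 95W quad taken from 2.4GHz @ 1.25V to 3.0GHz @ 1.40V:
print(f"~{scaled_power(95, 2.4, 1.25, 3.0, 1.40):.0f} W")  # ~149 W
```

So a 25% overclock plus a voltage bump can push a 95W chip up by more than half again, which is why the overvolting term dominates.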


I wouldn't base a debate on something I'd read on a third-party website. You'll only really know how much power your computer is drawing by plugging it into a power meter/brick thingy; they cost about £10-£12 and can display all kinds of useful information on the small LCD.
Admittedly the figures are estimates at best, based on Intel's official figures. But since we're talking about the power consumption of CPUs here, measuring the power drawn by my entire PC isn't going to help us very much :) If a dual ever comes into my possession, I'll find a watt meter and see first-hand what the difference is (not forgetting that PSUs are 80-85% efficient at best, so that needs to be factored in!)
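If anyone does put a watt meter on one, backing out the DC-side difference is simple arithmetic; a sketch with hypothetical meter readings:

```python
# Estimate DC-side power from wall-socket readings, assuming ~82% PSU
# efficiency (somewhere in the 80-85% range). Readings are hypothetical.
def dc_draw(wall_watts, psu_efficiency=0.82):
    return wall_watts * psu_efficiency

idle_wall, load_wall = 120, 210   # hypothetical watt-meter readings
delta = dc_draw(load_wall) - dc_draw(idle_wall)
print(f"~{delta:.0f} W extra DC-side under load")  # ~74 W
```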

(edit) It's also interesting that more and more people are buying laptops as opposed to desktop PCs. Laptops are much more power-efficient, so to an extent this problem is solving itself!
 
Anyway, what the hell am I waffling about?... Oh yeah: you bunch of n00blets who think you know it all, get off your lofty perches and listen to what some of us are telling you. Don't be stubborn or proud; just fess up that it's actually possible to make a mistake from time to time. Admit it, learn from it, and buy yourself a lovely energy-efficient Wolfdale. It will not hold you back in any way, you'll be using less electricity and reducing your power bills: a positive eco-friendly move and cash in the pocket!

Energy Efficiency
Energy Efficiency
Energy Efficiency
Energy Efficiency
Energy Efficiency
Energy Efficiency
Energy Efficiency!!!!!!!!!!!

Real energy efficiency with quads has already been touched on in another thread you hijacked recently, you trolling clown.

http://forums.overclockers.co.uk/showthread.php?p=12820973

The only real way to save any significant money is simply to turn off the PC when you're not using it. You think a CPU idling with SpeedStep at 20W instead of 10W makes a huge difference when the whole PC is using electricity? Consider every component that needs to be on when the PC is working. Let's say a dual core does a task in four hours and a quad in just over two. Where's the wasted energy here? The CPU? LOL.
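To put rough numbers on that (all figures illustrative, not measured):

```python
# Energy to finish a task: the faster chip draws more under load but
# drops back to idle sooner. All figures below are illustrative.
def energy_wh(load_w, load_hours, idle_w, total_hours):
    return load_w * load_hours + idle_w * (total_hours - load_hours)

# Same 4-hour window: the dual works the whole time, the quad finishes early.
dual = energy_wh(load_w=65, load_hours=4.0, idle_w=20, total_hours=4.0)
quad = energy_wh(load_w=95, load_hours=2.2, idle_w=25, total_hours=4.0)
print(f"dual: {dual:.0f} Wh, quad: {quad:.0f} Wh")  # dual: 260 Wh, quad: 254 Wh
```

The quad's higher draw is roughly cancelled out by finishing earlier and idling for the rest of the window, which is exactly the point.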

BTW - you owe everyone here a couple of quid for the total power used while reading and replying to your tedious, ill-thought-out diatribes.
 
It is so tempting to turn this into a 'climate change' debate ;)

My computer power use contributes 0, yes, 0 to climate change.

It is not MY fault that the UK uses coal power instead of nuclear power; I have no effect on that.
Hence, it is not my fault the UK produces lots of CO2 to power my PC.

Go bitch at Gordon Brown about not building nuclear power stations, I never voted for him so I fail to see why I should have to change what I do when the real solution lies with him.
 
My computer power use contributes 0, yes, 0 to climate change.

It is not MY fault that the UK uses coal power instead of nuclear power; I have no effect on that.
Hence, it is not my fault the UK produces lots of CO2 to power my PC.

Go bitch at Gordon Brown about not building nuclear power stations, I never voted for him so I fail to see why I should have to change what I do when the real solution lies with him.

I think you misunderstand me - I think so-called climate change is a load of nonsense. You might want to direct your ire at Big Wayne if anyone :p

I agree we needed to build nuclear power stations years ago but this is probably more suited to SC than the CPU forum.
 
I think you misunderstand me - I think so-called climate change is a load of nonsense. You might want to direct your ire at Big Wayne if anyone :p

I agree we needed to build nuclear power stations years ago but this is probably more suited to SC than the CPU forum.

Yer heh, I was kinda quoting your post, but bitching at Big Wayne. My bad.

The more electricity you use, the more coal Joe has to shovel into the burner. Fact.

No he doesn't... as I said, it is NOT MY fault we don't use nuclear.
 
Real energy efficiency with quads has already been touched on in another thread you hijacked recently, you trolling clown.
Steady on mate, it's just a discussion, no need to get heated :confused:

The only real way to save any significant money is to simply turn off the PC when you're not using it
Check out S3 standby when you have some free time; it works pretty well, although there are still a few niggling USB problems.

It is not MY fault
Look, I'm not apportioning blame. I'm not telling you you're a bad, bad man, so stop running off screaming "It wasn't me, it wasn't me, Timmy made me do it!" :D

What I am saying is: even if you really don't care about your planet, care about your wallet. Most people don't really understand that a quad core has higher running costs, and all I am doing is pointing this fact out, much to the displeasure of a few overclocked 65nm quad-core owners, it seems! :p

Feeling the heat though; good job I had those Teflon panels installed. Seriously though, guys, keep it friendly and keep the discussion going, hopefully without a punch-up! :o
 
My Q6600 is not overclocked and enjoys a lower VID and clock when not under load thanks to SpeedStep. I haven't left it overclocked because I know it would use more power, and so far every game has performed perfectly without overclocking.

It isn't that I would begrudge paying a higher power bill for decent performance gains, but I won't pay more if there are none.
 

I mean a quad @ default speed versus a quad @ overclocked speeds of course, rather than dual versus quad :D

At idle it runs at 1.152V (1.6GHz) according to CPU-Z, and I would imagine it isn't using much juice there. Presumably less than the P4 Northwood 3.0 I recently upgraded from, which used 1.475V.
 
Quite an interesting thread. I own a Q6600 myself, which I've clocked up to 3.8GHz, a speed that I would never envisage running on a day-to-day basis. My system runs at a modest 3.0GHz on stock volts with the various SpeedStep/C1E/EIST features enabled as well. I've only ever run high clock speeds when priming/benching etc.; for gaming and the other tasks I require, I've found 3.0GHz to be a nice compromise between speed and a cooler-running system.
 