
Quads finally coming into their own?

Then there won't be any games that support the 6-8 cores for a good year after their release, tbh.

If companies spent the extra six months and wrote everything as a truly parallel program, rather than coding for a fixed number of cores, then we would not have this problem.

And that six-month investment would mean their engine wouldn't need to be rewritten down the line to add support for more cores.
 
They just need to write the next Unreal Engine so it works well with quads, and the next few years' worth of games will mostly work ;)
 
Quads would be a wiser purchase, especially with Windows 7, multi-core games, apps etc becoming more prevalent. I have a Q6600 (G0) running at 3.5GHz that would wipe the floor with most dual cores anyway - and at around ~£120-150 they are a steal.
 
The only people who should be buying duals are prolific overclockers and upgraders, or people who can't afford quads.
Quads are the future and will see a nice increase in speed in late 2009 when DX11 comes out. Most people, like me, only upgrade every few years, and for us lot a quad is the only sensible choice.
Hey AcidHell2, I kinda understand what you are saying but I don't share your viewpoint. It would seem that you and a few others are perhaps a bit shy of cracking open your case to swap out components, or maybe you just can't be ****** :p

While I am aware of the strengths of a quad core, it seems you are not aware of their downside . . . which is that they always use more power/wattage/electricity than a dual core, so they have higher running costs and are less eco - and for what?

What is the logic in increasing your power consumption to fuel a technology that gets very little use? And let's be honest, even now in late 2008 there is precious little outside the professional media content-creation arena that benefits from the true power of a quad.

The rule of diminishing returns sadly applies to the extra cores for most users and forum nOObzillas, yet you still pay for them :confused:

Bottom line = power efficiency. Look into it when you have some free time! :)

It doesn't matter on OcUK, the dual owners will still tell us otherwise ;) I personally don't see the point in dual processors now, apart from laptops where lower power is such a factor.
Lower power consumption is not just a buzzword used for laptops; lower power consumption is probably the most important thing driving computer hardware design today!

If a dual-core chews through most computer tasks we throw at it and at the same time uses 50% less power . . . what's not to like? :cool:

Those that keep saying a dual core is better for games are just sore that they spent more money on a Wolfdale dual core than it would have cost to buy the 65nm Q6600.
Those guys need to learn that the clock speed race ended a long time ago and it's all about multi-threading for now and the future.
I think you need to learn it's not about MHz or multicore anymore; it's about power consumption, system and memory bandwidth, and years of developers cooking up some really great software that we can use.

I'm not sure why you think anyone would be *sore* for buying a piece of computer equipment. Is that what you do Mav? Do you get sore or miffed when you buy computer hardware? Strange :p

Personally, if I buy a piece of kit and I don't like it, it doesn't perform as I expected or it just doesn't play nice, then you either return it or sell it on, learn from the experience and move on! :cool:

If you haven't tried one yet Mav, then you should play with a Wolfdale, they are really brilliant! :D
 
small margin of error, but:

TDP = Stock TDP * (MHz / Stock MHz) * (voltage / stock voltage)^2

So for example:

Q6600 (1.25 VID) @ 3.6GHz 1.35V = 95W * (3600/2400) * (1.35/1.25)^2 = 166W
Q9550 (1.20 VID) @ 3.6GHz 1.25V = 95W * (3600/2830) * (1.25/1.2)^2 = 131W

So about a 35W difference. Obviously depends on actual voltages used...
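
In case anyone wants to plug their own clocks and volts in, here's that same rule of thumb as a quick Python sketch. The stock clocks and VIDs are just the figures used in the worked example above - it's a rough estimate, not measured power draw.

[code]
def scaled_tdp(stock_tdp_w, stock_mhz, stock_vcore, target_mhz, target_vcore):
    """Power scales roughly linearly with clock and with the square of voltage."""
    return stock_tdp_w * (target_mhz / stock_mhz) * (target_vcore / stock_vcore) ** 2

# Q6600: 95W stock TDP, 2400MHz at 1.25V VID, overclocked to 3600MHz at 1.35V
print(round(scaled_tdp(95, 2400, 1.25, 3600, 1.35)))   # ~166W

# Q9550: 95W stock TDP, 2830MHz at 1.20V VID, overclocked to 3600MHz at 1.25V
print(round(scaled_tdp(95, 2830, 1.20, 3600, 1.25)))   # ~131W
[/code]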
 
I'm very seriously considering going back to a dual core, maybe an E8500 or E8600. I've been running a Q6600 for a while now (successfully clocked @ 3.8GHz). The long-promised gains in games with quads have failed to materialise, and I find myself doing less video encoding, which I got the quad for in the first place. Add to that the fact that the Q6600 is a bit on the power-hungry side and chucks out a lot of heat, plus I wouldn't mind something different/newer to have a play around with.
 
Hmm, so basically I'm saving 28-30W of power having moved from a P35-based board to a P45 board, and I could save another 30W by moving to a 9xxx CPU.

The problem I have with this is that if it costs £100 to upgrade, I'd have to run the CPU solidly for years to actually recoup that cost, which doesn't seem worth it. Energy savings alone cannot be the only motivation...
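
A rough sanity check on that payback, as a sketch: the ~30W saving is taken from the figures above, while the £100 upgrade cost and £0.12/kWh electricity price are assumptions for illustration only.

[code]
# Back-of-envelope payback time for swapping CPUs to save power.
POWER_SAVED_KW = 0.030   # ~30W saving, expressed in kW
PRICE_PER_KWH = 0.12     # assumed electricity price, GBP per kWh
UPGRADE_COST = 100.0     # assumed cost of the CPU swap, GBP

hours_to_recoup = UPGRADE_COST / (POWER_SAVED_KW * PRICE_PER_KWH)
years_at_24_7 = hours_to_recoup / (24 * 365)
print(f"{hours_to_recoup:,.0f} hours of saving, about {years_at_24_7:.1f} years at 24/7")
[/code]

That comes out at roughly 28,000 hours, i.e. over three years of running flat out before the swap pays for itself.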
 
Well that's true also, but then until GTA IV came along there was no game an Intel CPU running over 3GHz could not run perfectly. I mean, looking at your spec gurusan, your CPU is insanely fast and surely completely bottlenecked by the single HD4850?
 
I'd take a 45nm quad over the Wolfdale :)
How come? ePeen? Back-to-back video encoding? Fold, encode and game at the same time, what? :D

However, the Quad costs a lot more.
It does, and it also costs more to run!

I can understand some peeps may not be into eco computing - they would rather talk about their computer hardware etc - but what I don't get is why a lot of people have no idea about kWh.

Obviously a fair share of forum posters here are still living at home and have no idea what an electricity bill looks like, but for anyone much over 18 there is no reason to want to pay more £££ every month for no real gain!

Lol, I'm fighting a losing battle here trying to get this energy consumption thing across! :o:p
 
Well that's true also, but then until GTA IV came along there was no game an Intel CPU running over 3GHz could not run perfectly. I mean, looking at your spec gurusan, your CPU is insanely fast and surely completely bottlenecked by the single HD4850?

Not sure you realize how quick a 4850 at 880/1150 is :) I'm only at 1680x1050... if you get the RV770 above 830-850MHz it really does fly.

That said, I have a 4870 coming tomorrow and will be vmodding it, hoping for well past 900MHz as a daily clock.

Lol, I'm fighting a losing battle here trying to get this energy consumption thing across! :o:p

lol yeah, I've been watching you in your struggle. I do agree with your point of view, but I think it's the fact that a quantifiable difference is not entirely easy to see. Not to mention at idle our puters really don't use too much... mine doesn't at least. My vcore rises under load, as does my GPU voltage... I wish I could make my CPU idle at 0.8V-1V or something like I can with my 4850, perhaps in the next gen of CPUs.
 
It does, and it also costs more to run!

I can understand some peeps may not be into eco computing - they would rather talk about their computer hardware etc - but what I don't get is why a lot of people have no idea about kWh.

Obviously a fair share of forum posters here are still living at home and have no idea what an electricity bill looks like, but for anyone much over 18 there is no reason to want to pay more £££ every month for no real gain!

Lol, I'm fighting a losing battle here trying to get this energy consumption thing across! :o:p

It comes out at something like £20 more a year in energy use, I think.
 
Obviously a fair share of forum posters here are still living at home and have no idea what an electricity bill looks like, but for anyone much over 18 there is no reason to want to pay more £££ every month for no real gain!

Seeing as I own my house and am very much responsible for my own electricity bill, that isn't the case. But the simple fact is I already have a Q6600; there's no point in getting a dual because in my eyes it's a step back, and there's no point in spending another £200+ on a 9xxx chip to save a few quid a year in electricity. Anyhow, the Q6600 means I don't need the heating quite as high ;)
 
Now is that having your PC on 24 hours a day, 7 days a week, 52 weeks a year?

At stock the E8xxx has a TDP of 65W and the Q94xx is 95W, so that's a 30W difference.

30W = 0.03kW

Now, there are 24*365 = 8760 hours in a year, so 0.03kW * 8760 hours gives us 262.8kWh extra in a year.

Electricity is what, £0.10 per kWh, maybe £0.13? Either way, that works out at a maximum of about £34 a year.

Now, CPUs do not have to draw full power 24/7, and in fact they don't... meaning that the actual difference between them is way less.
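
For what it's worth, here's that same sum as a few lines of Python - a sketch using the stock TDPs quoted above and the worst case of full load 24/7, with the same £0.10-£0.13 per kWh price range.

[code]
# Extra yearly running cost of a 95W quad over a 65W dual at stock TDP,
# assuming worst-case 24/7 full load.
EXTRA_KW = (95 - 65) / 1000          # 30W difference, in kW
HOURS_PER_YEAR = 24 * 365            # 8760 hours

extra_kwh = EXTRA_KW * HOURS_PER_YEAR    # 262.8 kWh per year
for price in (0.10, 0.13):
    print(f"At £{price:.2f}/kWh: about £{extra_kwh * price:.2f} extra per year")
[/code]

That prints roughly £26 at £0.10/kWh and £34 at £0.13/kWh, matching the figure above.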
 
I would rather pay an extra £34 a year to have the benefit of 2 extra cores as and when I need them.

It's like the AA: you can do most mechanics yourself, but when you need the extra help it's there on call :D

Big Wayne, I see you post often on leccy bills and running costs, and you will be losing the battle here every time. At first I didn't agree too much, but my parents did moan about the rising cost of our bill - probably didn't help that I was off work for 7 weeks after an operation on my foot, so the PC was on solid for 10 hours a day for that time!

I did the change from Q6600 to E8600 before and was not impressed. I was going to do a proper comparison, but a stick of my RAM died a day in and I had to RMA the kit (guess what make, lol), so my testing was a Q6600 @ 3.6 with 4GB RAM against an E8600 @ 4.5 with 2GB RAM. I have since lost the screenies after a format, but both came out about the same (the RAM probably accounted for a lot; Crysis was unbenchable on my settings on 2GB of RAM).

I personally would go with a quad, hence me moving to a Q9650. I'm at or above the speeds a lot of people would go for on a dual (in general - not extreme/suicide clocks) and on decent volts too. Add in the fact it's the same 45nm, double the cores and double the cache, but literally the same vcore needed (in my case - your experience may differ :p), and quads will always get my thumbs up.
 