
6000+ X2 + 8800GT vs Q6600 + 8800GT

What difference would I notice in games, FPS-wise? Would it be worth the extra cash if not overclocking? This is for a friend who's deciding what to build for his new PC. It would only be a few FPS, wouldn't it? He's only got a 19-inch monitor as well.
 
A real FPS difference would be handy, as I don't want to waste money on a quad if it's only going to be like 3 FPS in COD4, which he will mainly be playing on a 19-inch monitor.
 
Well, as I told you, it depends. If you want an absolute answer, the real-world difference will not be significant NOW.

However, if it's a new build it makes little sense to go AMD now, even if he is not overclocking. The dual-core Intels are better than the old Athlons, and the Phenoms are not yet up to par.
 
If he can possibly stretch to it then get the Q6600; otherwise I'd go for a P35 motherboard with an E6550 or E6750, for a price much nearer the AMD 6000+. That way, an Intel quad-core is a simple drop-in upgrade in the future. As much as I like AMD, the current crop of CPUs isn't quite as good as Intel's.
 
For games there is absolutely no difference with that graphics card unless your mate is gaming at 1024x768. That is pretty much the only resolution at which the CPU makes any impact.

You can't buy a bad processor nowadays (no, not even Phenom is 'bad') that won't run your mate's games to their fullest extent with that graphics card.
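
To illustrate the point with a rough toy model (all numbers here are made up, not benchmarks): the frame rate you see is capped by whichever of the CPU or GPU takes longer per frame, and raising the resolution mainly adds GPU work. A few lines of Python:

[code]
# Toy model, illustrative numbers only: effective FPS is capped by the
# slower of CPU and GPU, and GPU throughput falls roughly with pixel count.
BASE_GPU_FPS = 90                # hypothetical 8800GT figure at 1024x768
BASE_PIXELS = 1024 * 768

for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    gpu_fps = BASE_GPU_FPS * BASE_PIXELS / (w * h)
    for cpu_fps in (70, 120):    # slower vs faster CPU, both hypothetical
        print("%dx%d, cpu %dfps -> %.0ffps" % (w, h, cpu_fps, min(cpu_fps, gpu_fps)))
[/code]

At 1024x768 the faster CPU shows up (70fps vs 90fps); at 1280x1024 and above, both CPUs print the same GPU-bound number.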
 
For games there is absolutely no difference with that graphics card unless your mate is gaming at 1024x768. That is pretty much the only resolution at which the CPU makes any impact. ...

Depending on which games he plays, of course ;) But you did state that it would be FPS games, and there won't be any noticeable difference there. Games in the RTS genre will benefit from the quad (even more so if the game supports quad cores...)

But I completely agree with Dureth.
 
For games there is absolutely no difference with that graphics card unless your mate is gaming at 1024x768. That is pretty much the only resolution at which the CPU makes any impact. ...

Sorry to disagree, but there is a difference. I game at 1280x1024 and have just rebuilt my rig as it is in my siggy. I have spent the last week benching this rig with the CPU and GPU at stock, the CPU at stock with the GPU overclocked to 702/1782/1900, the CPU at 3GHz with the GPU at the same overclock, and the CPU at 3.6GHz, again with the GPU at the same overclock. What I found was that the CPU does have a large effect on FPS in games.

Company of Heroes. Windows XP, DX9, settings completely maxed out and set to Ultra where available.

CPU & GPU @ stock = Min 18 / Max 77 / Avg 73

Cpu @ stock, Gpu @ 702/1782/1900 = Min 21 / Max 76 / Avg 74

Cpu @ 3ghz (8x400), Gpu at 702/1782/1900 = Min 36 / Max 76 / Avg 74.5

Cpu @ 3.6ghz (9x400), Gpu @ 702/1782/1900 = Min 39 / Max 76 / Avg 74.5.

While the average doesn't really increase, look at the min FPS. That's a huge increase. It's clear that the card was being held back by the quad at stock.

World in Conflict. Windows XP, DX9, settings completely maxed out, with 4x AA & AF.

CPU & GPU @ stock = Min 20 / Max 86 / Avg 42

Cpu @ stock, Gpu @ 702/1782/1900 = Min 23 / Max 95 / Avg 45

Cpu @ 3ghz (8x400), Gpu at 702/1782/1900 = Min 29 / Max 95 / Avg 51

Same results at 3.6GHz as at 3GHz.

Not as big a result here, but there is still a decent gain in the minimum and average FPS, enough to make a difference.

I ran the Crysis demo, but no amount of overclocking on either the CPU or GPU made much difference. With everything set to High along with 4x AA and AF, I got more or less the same results: Min 17.97 / Max 37.99 / Avg 31.66.

All of the 3DMark benchies also showed huge gains from overclocking the CPU. I haven't listed them because they are not games and this is all about gaming. So as you can see, having a powerful CPU really does make a difference and basically sets the card free.
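
For anyone wanting to pull the same three numbers out of their own runs: if your benchmarking tool can dump per-frame times (FRAPS can log frame times, for instance), the Min / Max / Avg figures above take only a few lines of Python to reproduce. A minimal sketch, assuming a plain text file with one frame time in milliseconds per line; the filename is made up:

[code]
# Minimal sketch: Min / Max / Avg FPS from a frame-time log, counted per
# one-second interval, which is how benchmark tools usually report them.
# Assumes one frame time in milliseconds per line; "frametimes.txt" is made up.
frame_ms = [float(line) for line in open("frametimes.txt") if line.strip()]

counts = {}                        # frames rendered in each one-second bucket
elapsed = 0.0
for ms in frame_ms:
    counts[int(elapsed // 1000)] = counts.get(int(elapsed // 1000), 0) + 1
    elapsed += ms

per_second = [counts[s] for s in sorted(counts)][:-1]  # drop the partial last second
avg = len(frame_ms) / (elapsed / 1000.0)
print("Min %d / Max %d / Avg %.1f" % (min(per_second), max(per_second), avg))
[/code]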
 
Apologies - I've been looking at benchmarks on Guru3D and FiringSquad... anything above 1024x768 between an FX-62 and a 4GHz Yorkfield was limited to around a 15fps difference.

And does it set the card free? Your maximum frame rate is still limited to around 73-75fps regardless of CPU speed.

It is nice to see that the minimum frame rate improves in DX9, but I'd be curious to see what the effect of running DX10 would be.

I'm not saying the AMDs are better all-round CPUs - quite simply put, they aren't and get walloped by Core2s. However, if gaming at or above 1280x1024, there isn't that much difference to be had.
 
Apologies - I've been looking at benchmarks on Guru3D and FiringSquad... anything above 1024x768 between an FX-62 and a 4GHz Yorkfield was limited to around a 15fps difference.

And does it set the card free? Your maximum frame rate is still limited to around 73-75fps regardless of CPU speed. ...

It really doesn't matter about the max; it's all about the minimum. COH, for example: the minimum with just the card clocked is 21fps, which could lead to some stuttering. Overclock the CPU and it goes up to a whopping 39fps. That's extremely smooth gaming. WiC tells a similar story, although not to such a large extent.

As it happens, I will be installing Vista on another HDD in this PC. I was curious about DX10 and got Vista Home Premium for a tenner. :eek: I will run the same tests under Vista with COH and Crysis. Are there any other DX10 games about? Preferably demos with built-in benchies. It will probably be the end of the week before I get it all installed and the benchies run, though.
 
That's cool - I apologise for making the blanket assumption about anything over 1024x768. What is the load - core-wise - on the Q6600?

I would be curious to see if the effect were replicated at 1600x1200.
 
It really doesn't matter about the max; it's all about the minimum. COH, for example: the minimum with just the card clocked is 21fps, which could lead to some stuttering. Overclock the CPU and it goes up to a whopping ...

Min FPS doesn't always matter; you may experience near-identical performance after a few seconds once the game has loaded. The min FPS isn't good enough to go by on its own; you have to take the average into account too. For example, when you run 4 loops of a Crysis benchmark, only the first will have a low minimum FPS; the others will be similar, because in a real game you only need to load the level once.

Since the average and maximum FPS are nearly identical in CoH, this shows that there is only a short period when the FPS drops, and that it doesn't keep dropping (otherwise the average would change too). This may be because there isn't enough going on in the game to drop the FPS much, if the FPS is dropping due to general gameplay rather than loading.
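
To put rough numbers on that (purely illustrative, picked to resemble the CoH pattern above): one bad second in a minute barely moves the average.

[code]
# Illustrative only: 59 seconds at 75fps plus a single 1-second dip to 20fps.
frames = 59 * 75 + 1 * 20    # total frames rendered over the 60-second run
print(frames / 60.0)         # ~74.1 average, even though min is 20 and max is 75
[/code]

So a min in the low 20s alongside an average in the mid 70s is exactly what a brief loading hitch looks like.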

In WiC you can see that the CPU makes a bit more difference, but nowhere near as much as overclocking the GPU. If you were comparing the AMD 6000+ to the Q6600, you would probably see a bigger difference in FPS.

If I were building a new PC I'd get the Q6600. I doubt he'll be using the PC only for gaming, and it'll make an improvement in general PC usage. Also, the AMD 6000+ has been around £100 for an age now.
http://www.overclockers.co.uk/showproduct.php?prodid=CP-161-IN
 
Sorry to disagree, but there is a difference. I game at 1280x1024 and have just rebuilt my rig as it is in my siggy ... So as you can see, having a powerful CPU really does make a difference and basically sets the card free.

It might be worth mentioning that both of the games you tested are RTS titles, which generally do benefit from a faster CPU or a quad core.

However, the OP stated that his friend will mostly be playing FPS games, so the CPU difference won't be as apparent (if at all).
 
It might be worth mentioning that both of the games you tested are RTS titles, which generally do benefit from a faster CPU or a quad core. ...

I know for a fact from my pre-upgrade rig (320MB GTS, [email protected], 2x1GB Geil Ultra PC2-6400) that the CPU made a big difference to my framerates in Oblivion. I will bench that using the in-game FPS counter, along with Far Cry. I would do Stalker, but the bloody disc blew up in my drive.
 