
Lowest CPU for an 8800 GTX?

juno_first said:
ok then,

Shadowrun with 14 bots:

AMD dualcore @ 2.64 = 30fps
Intel Quad @ 3.55 = 90fps

That's hardly a fair comparison though: AMD dual core versus Intel quad core. A better indication would be to compare each rig against itself at two speeds, such as the AMD at stock and then at 2.64GHz, and the same with the Intel rig.
 
juno_first said:
ok then,

Shadowrun with 14 bots:

AMD dualcore @ 2.64 = 30fps
Intel Quad @ 3.55 = 90fps


lol :p

as useful as the 3dmark one

on a side note... too many people have 4x drives (80 gig Hitachis most of the time) in raid

I think i need to move to 6 in raid 0, 5 isn't enough :p
 
I'll find a way to go 7x RAID so naa :p :p

stunner

2.75 was the best I could get out of my AMD with water, and even with both at stock, 2.2 dual vs 2.4 quad, a single core on the quad would be quicker than an AMD core.

I suppose a quad is a big leap from a 2-year-old AMD, but so are the fps, e.g. STALKER: 40 vs 80.

It's hard to say if a quad is better than a higher clocked dual; I reckon it depends on the software.
 
what are people talking about? cpus in all but 3-4 games aren't the limiting factor.

it DOES NOT MATTER if you get higher fps down at 1024x768, or even 1280x1024. you do not buy an 8800 of any kind to play at those resolutions, because an x1900 card can still play them fine.

if you can get 200fps in source at 1024x768, thats pretty much the cpu limit. if at 1920x1200 you are getting 100fps, the cpu is capable of double that, so you ARE gpu limited.
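The rule of thumb above can be sketched as a few lines of code. This is a minimal illustration of the poster's reasoning, not a real benchmarking tool; the function name and the 0.9 tolerance are assumptions for the sake of the example.

```python
# Illustrative sketch: infer the likely bottleneck by comparing fps at a
# low "CPU-ceiling" resolution against fps at your actual target resolution.
def likely_bottleneck(fps_low_res: float, fps_target_res: float,
                      tolerance: float = 0.9) -> str:
    """If fps at the target resolution stays near the low-res ceiling,
    the CPU is the limit; if it drops well below it, the GPU is."""
    if fps_target_res >= fps_low_res * tolerance:
        return "cpu"
    return "gpu"

# The Source example: ~200fps at 1024x768 (CPU ceiling),
# ~100fps at 1920x1200 -> the GPU is the limit at the high res.
print(likely_bottleneck(200, 100))  # gpu
print(likely_bottleneck(200, 195))  # cpu
```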

i can count the games that are cpu limited on my fingers; i couldn't count the games that are gpu limited on my fingers unless i had a couple hundred thousand fingers.

engines that are 5 years old can still be gpu limited. anyone that says otherwise is completely wrong.

as for dual gfx card setups, yes you can become more cpu limited at a higher resolution, but it won't be at the resolution the cards/price justify or are designed for. a single 8800gtx at top settings on a hard engine is gpu limited at say 1600x1200, but in sli it becomes cpu limited. there's little point having 2x 8800gtx's to play at that low a res though. people with 30" 2560x1600 screens get sli, and at those resolutions you can make 8800gtx's in sli crawl and be gpu limited again.


EDIT:- an e6600 is more than enough, way more than enough. honestly, a £40 2.2Ghz x2 would only be 1-2% off the fps you get with an e6600, and overclock either cpu to 3Ghz and you'd again only see a 2-3% increase in fps, if that.
 
drunkenmaster said:
you do not buy a 8800 of any kind to play at that resolution

Oh dear, I have to go to nvidia prison as I run at 1280x1024 :p

Funny thing, my 19x12 screen can only be driven by my 7950gtx :(:mad:

the lowest cpu i paired my gtx with was an athlon 64 2800+ clawhammer at stock. it wasn't too bad actually, apart from the graphics card having more ram than the system :eek:
 
Anyone who says the CPU is more than the GPU is also wrong, the last good test I seen was the specs above where the CPU could not feed the single 7800GTX nevermind SLI and as most games I play are mostly about the GPU I dont really bother about this bottleneck talk.

This may have changed but I aint seen any reviews stating so.

Obv I would not run a 8800GTX with a low end CPU and I know what your getting at.
 
helmutcheese said:
Anyone who says the CPU is more than the GPU is also wrong, the last good test I seen was the specs above where the CPU could not feed the single 7800GTX nevermind SLI and as most games I play are mostly about the GPU I dont really bother about this bottleneck talk.

This may have changed but I aint seen any reviews stating so.

Obv I would not run a 8800GTX with a low end CPU and I know what your getting at.

Play oblivion with every setting at max (you can't do HDR and AA at the same time though) @ 2560x1600 with a c2d at 4ghz and tell me it's holding back a 7800gtx :p
 
I had a 7900GTX and obviously I missed out on HDR+AA at the same time; I think ATI got it with the Chuck Patch, correct?

My 8800 will though :)

The 7800GTX (256MB) was not using a C2D lol, I said the Evesham rigs had some AMD *** and Intel's top CPU at that time. lol, CPUs have come on, but so have GPUs, through 7900s and now 8800s.

It is not the case in all games, or else SLI would not gain anything. Obviously in 3DMark 2005 a slow CPU with an 8800GTX will score more than the same CPU with a 7800GTX; 3DMark 2006 will be a different story though.
 
helmutcheese said:
AFAIK, from my reading, and I aint seen any actual reviews to show otherwise, no CPU can do more than the top GPUs, so the CPU is always the bottleneck eventually; the GPU will keep giving more.

I'm open to any links, but it's 1am after a long day, so I will read them tomorrow.

I posted some links above....

Essentially when playing at proper resolutions, aside from a few games (and 3dm06), even with an 8800GTX you do not get SEVERELY cpu limited, even on low-end C2D setups. Sure, maybe 5-10% at worst, but that difference is tiny compared to the gain from gpu upgrades.

To quote from something I wrote in PCG forum:

When you are looking at 1600x1200 and beyond, GPU is king and there is arguably a case to suggest that the best value systems for gaming are NOT well balanced systems with decent cpu/ram/gfx, but actually hideously unbalanced systems running a 8800gtx/ultra alongside a cheapo cpu.

Example: Say you have £450 to spend on gfx, cpu and RAM. The typical balanced approach would be say:

GFX: £175 (8800gts-320?)
CPU: £175 (Q6600?)
RAM: £100 (2gig PC8500?)

Whereas in reality a faster gaming rig for 'proper' resolutions would be something like:

GFX: £330 (8800gtx?)
CPU: £70 (E4300?)
RAM: £50 (2gig PC5400?)

In fact it wouldn't surprise me if even 8800ultra-SLI is gpu limited in some cases (high end cpu running very high res with max aa+af in some titles)
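The budget split above can be made concrete with a line of arithmetic. A hedged sketch follows: the prices are the poster's own examples, and the "GPU share" figure is simply the fraction of the £450 budget going to graphics, nothing more.

```python
# Compare how two £450 builds split their budget (prices from the post).
balanced = {"gfx": 175, "cpu": 175, "ram": 100}   # 8800gts-320 / Q6600 / PC8500
gpu_heavy = {"gfx": 330, "cpu": 70, "ram": 50}    # 8800gtx / E4300 / PC5400

def gpu_share(build: dict) -> float:
    """Fraction of the total budget spent on the graphics card."""
    return build["gfx"] / sum(build.values())

print(f"balanced:  {gpu_share(balanced):.0%} of budget on gfx")   # 39%
print(f"gpu-heavy: {gpu_share(gpu_heavy):.0%} of budget on gfx")  # 73%
```

The argument in the post is that at "proper" resolutions the 73% build wins, because the extra GPU headroom matters far more than the CPU it gives up.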
 
helmutcheese said:
Anyone who says the CPU is more than the GPU is also wrong,


i honestly don't mean to be rude, at all, but i'm not sure what you're trying to say. kinda guessing you're not english, because the way that sounds, you're saying the cpu isn't as important as the gpu, but your other posts and the rest of that one seem to say that a low cpu couldn't feed your gpu well enough, so the cpu is more important.

either way, 3dmark 06 isn't played at a resolution that you would game at.

with a 2.4Ghz conroe at stock, an 8800gtx and an 8800gts would probably score pretty close, probably around the 9k-9.5k mark. thats fine, thats cpu limited. overclock that up to 3.5Ghz and the gts at stock wouldn't get much past 10k, while the gtx should be closer to 12k, showing it WAS cpu limited. but thats at a LOW resolution without AA/AF. if you have the pro version and run the benchmark at 1600x1200 or 1920x1200 with full aa/af, you will score WAY lower, i dunno, say around the 6k mark. the score has dropped massively due ONLY to the resolution; the cpu's speed, or lack of it, is irrelevant. the cpu is capable of letting both cards hit 9k, and the only reason they don't is that the gpu's are completely limited. almost every single game around behaves in exactly the same way.
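The scoring logic in that paragraph boils down to a min() of two ceilings. A minimal sketch of the reasoning follows; the function and the specific ceiling figures just mirror the hypothetical numbers in the post, they are not real benchmark results.

```python
# Sketch of the post's reasoning: the observed score is capped by
# whichever ceiling (CPU or GPU, at the chosen settings) is lower.
def observed_score(cpu_ceiling: int, gpu_ceiling: int) -> int:
    return min(cpu_ceiling, gpu_ceiling)

# Stock 2.4Ghz conroe at default res: both cards pinned near the CPU cap.
print(observed_score(cpu_ceiling=9500, gpu_ceiling=12000))  # 9500 -> cpu limited
# Same CPU at 1920x1200 with AA/AF: the GPU cap collapses below the CPU cap.
print(observed_score(cpu_ceiling=9500, gpu_ceiling=6000))   # 6000 -> gpu limited
```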

e-mail anandtech, or any respectable review site, and ask them why, at least for cpu reviews, they only benchmark games at low resolutions and settings: it's so the gpus aren't the limit, letting us see the cpu's theoretical power. in gpu benchmarks they use much higher settings. they will tell you, and most state in their cpu reviews why they do it, and that super high resolutions don't show any real difference between cpus.


xbitlabs.com did a very good review, AGES ago now, looking at newish games with the same gfx card and different cpus; there was little difference between a top-clocked athlon fx and a much lower clocked sempron at high settings.

we've become way MORE gpu limited. instead of the tiny clock speed increases, which even then didn't matter between 1.8Ghz and 2.8Ghz at proper resolutions, we jumped straight from 2.4Ghz single cores to 2.4Ghz dual cores, and with most games shipping dual/multi core support for a while now the difference is massive. that jump has effectively doubled cpu power, which was already more than needed, while gpu power has definitely NOT doubled, and hdr and higher textures are being added to most games now.
 
drunkenmaster said:
i honestly don't mean to be rude, at all, but i'm not sure what you're trying to say. kinda guessing you're not english, because the way that sounds, you're saying the cpu isn't as important as the gpu, but your other posts and the rest of that one seem to say that a low cpu couldn't feed your gpu well enough, so the cpu is more important.

either way, 3dmark 06 isn't played at a resolution that you would game at.

with a 2.4Ghz conroe at stock, an 8800gtx and an 8800gts would probably score pretty close, probably around the 9k-9.5k mark. thats fine, thats cpu limited. overclock that up to 3.5Ghz and the gts at stock wouldn't get much past 10k, while the gtx should be closer to 12k, showing it WAS cpu limited. but thats at a LOW resolution without AA/AF. if you have the pro version and run the benchmark at 1600x1200 or 1920x1200 with full aa/af, you will score WAY lower, i dunno, say around the 6k mark. the score has dropped massively due ONLY to the resolution; the cpu's speed, or lack of it, is irrelevant. the cpu is capable of letting both cards hit 9k, and the only reason they don't is that the gpu's are completely limited. almost every single game around behaves in exactly the same way.

e-mail anandtech, or any respectable review site, and ask them why, at least for cpu reviews, they only benchmark games at low resolutions and settings: it's so the gpus aren't the limit, letting us see the cpu's theoretical power. in gpu benchmarks they use much higher settings. they will tell you, and most state in their cpu reviews why they do it, and that super high resolutions don't show any real difference between cpus.


xbitlabs.com did a very good review, AGES ago now, looking at newish games with the same gfx card and different cpus; there was little difference between a top-clocked athlon fx and a much lower clocked sempron at high settings.

we've become way MORE gpu limited. instead of the tiny clock speed increases, which even then didn't matter between 1.8Ghz and 2.8Ghz at proper resolutions, we jumped straight from 2.4Ghz single cores to 2.4Ghz dual cores, and with most games shipping dual/multi core support for a while now the difference is massive. that jump has effectively doubled cpu power, which was already more than needed, while gpu power has definitely NOT doubled, and hdr and higher textures are being added to most games now.

I'm sorry, but even at high settings I found going from an X2 4400 to a highly clocked Core 2 Duo made a huge difference in many games (DiRT/Oblivion/Supreme Commander etc.). I agree that some games are gpu limited all the way, but AI-intensive games certainly take advantage of better cpus.

It's not as clear cut as "oh, just put the resolution up and you won't see any difference".
 
drunkenmaster, what do you not understand? Games are eventually CPU limited, not GPU limited, by today's top GPUs. And I am British, actually Scottish, so correctly not English, so I have no clue what that is all about. I know my typing aint office-worthy, but it's readable.

Of course a game that needs a good CPU and GPU (let's say Oblivion) will run better with an AM2 X2 6000 + 8800GTX than an AM2 X2 3800 + 8800GTX (example), but the eventual bottleneck will be the CPU, not the GPU. Even in a game that's all about the GPU, like FEAR, it will still run better with the former setup.

I really don't see what's so hard to understand. It's been this way since about the 7000s, in that the GPUs are so damn powerful and they claim to double each gen.

There was also a helpful link in a thread, I think posted by Loadsamoney, showing that most games are fine with a decent CPU and an 8800GTX; you don't need an uber-OC'd quad at 4GHz to get good FPS in them.

If I am now wrong, fair enough, but I no longer visit Anandtech because of their lies in the past. The original post I made would have been single-core FXs, but I don't think dual core actually doubles it, and most games don't fully use it even today (not all games).

Again, try to find that post by the above member; it is an eye opener and can put a nail in the coffin of this modern talk of being CPU limited in games (depending on the game).
 
I can think of one game which has a major CPU limitation: MS Flight Simulator. Which happens to be my primary raison d'etre for a gaming rig.

MS have over the years made some strange design decisions in building such a processor-intensive application: e.g. heavy reliance on the CPU and, until recently, limited multi-core and SLI/Crossfire support.

Presumably this is one of the exceptions... :(

Still won't stop me getting an 8800GTX though :), but I will probably decide to pair it with an E6850 rather than a Q6600.
 
That's correct for that game: pair that GPU with the best CPU you can afford, and it still will not make the GPU the bottleneck; it will remain the CPU somewhere down the line.
 
As I mentioned above (SupCom/DiRT/~Source) - plus of course the traditional cpu junkies I forgot, flight sims - some games can be cpu limited but gpu limitations are still much more common at high settings.

Obviously if you can afford a very high end gfx setup then there is a small gain to be had from a better cpu, but aside from people building rigs for specific titles/genres that are cpu limited, I would recommend blowing your wad on the fastest gfx card you can afford first and foremost. Or in other words, certainly don't get anything slower than an 8800GTX if you are gonna be spending that cash on a flash cpu costing a ton or more. A dual-core setup with a GTX will blow a quad-core setup with a GTS out of the water. One reason for this is that any game which is quad-core 'aware' will by definition be dual-core 'aware' too, which of course drastically reduces the chance of things being cpu limited.

Oh, one more thing. Seriously hardcore gamers playing competitively may need to worry a bit more about cpu speed than I'm implying here. This is because in all likelihood they will be running in low res with the fancy stuff disabled => more likely to be cpu limited.
 
So how the CPU/GPU performs in games depends on the following:

High resolution = needs a better GPU
Lots of bots/strategy, e.g. Supreme Commander = needs a better CPU

Therefore:
Supreme Commander at any res needs a good CPU.
Shadowrun with 15 bots up to 1280x1024 needs a good CPU.
Shadowrun with 1 bot at 1920x1200 needs a good GPU.
Shadowrun with 15 bots at 1920x1200 needs a good GPU + good CPU.
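Those rules of thumb can be written down as a tiny lookup function. This is a hedged sketch only: the thresholds (15 bots, 1920x1200) come straight from the examples above, but treating them as hard cut-offs is an assumption for illustration, not a real performance model.

```python
# Toy model of the rules of thumb: high resolution stresses the GPU,
# lots of bots/AI stresses the CPU, and both can apply at once.
def needed_upgrades(width: int, height: int, bots: int) -> list:
    needs = []
    if width * height >= 1920 * 1200:   # "high resolution" threshold (assumed)
        needs.append("gpu")
    if bots >= 15:                       # "lots of bots" threshold (assumed)
        needs.append("cpu")
    return needs or ["either"]

print(needed_upgrades(1920, 1200, 1))   # ['gpu']
print(needed_upgrades(1280, 1024, 15))  # ['cpu']
print(needed_upgrades(1920, 1200, 15))  # ['gpu', 'cpu']
```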

I think at the end of the day, get the best graphics card within reason, e.g. 8800GTX or 2900XT, and spend as much as you can on a CPU.

I've noticed a big performance boost in games at 1360x768 with an 8800GTX going from an AMD dual core at 2.6 (20k MIPS) to a quad core at 3.5 (70k MIPS); the figures speak for themselves.
 