Bottlenecking 250GTS SLI?

Hi,

I've been reading a fair amount lately on these forums about CPUs bottlenecking GPUs. Specifically Dual Cores vs. Quad Cores.

This started me thinking, could my C2D E6850 @ Stock (3GHz) be bottlenecking my 250GTS SLI setup?

I've done some runs with the free version of 3DMark06.
With one 1GB 250GTS card installed I get about 12,000 (roughly).
With two 1GB 250GTS cards in SLI I get about 14,000 (again, roughly).

Now I was under the impression that SLI scaled better than that!
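Doing the maths on those scores (rough numbers, so the exact figure is only indicative), the second card is adding about 17%, where ideal scaling would be closer to 100%:

    # Rough SLI scaling check from the approximate 3DMark06 scores above
    single_card = 12000   # one 250GTS
    sli_pair    = 14000   # two 250GTS in SLI

    scaling  = sli_pair / single_card    # ~1.17x
    gain_pct = (scaling - 1) * 100       # ~17% gain from the second card
    print("SLI is %.2fx a single card (about %.0f%% gain)" % (scaling, gain_pct))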

My initial thought was that, since I've got an old motherboard, running SLI means both slots drop to 8x instead of 16x. The board is old enough to be PCI-e 1.0 (or whatever), so I thought it was a bandwidth issue.

Then I thought that because 3DMark06 only runs at 1280x1024 it wasn't pushing the two cards hard enough to show a difference.

But now I'm wondering if it's my CPU not feeding the cards quickly enough.

So could my E6850 be holding back my 250GTS SLI, or are 250GTS cards in SLI not that 'hungry'?

Thanks.
 
I had a similar problem a few days ago. I was testing my new 8800GT SLI test machine with an Athlon X2 4200+ @ 2.5 and running the Unigine Tropics benchmark.

At the default 1024x768 setting, running the cards in SLI actually scored less than a single card. Turning the resolution up to 1280x1024 and maxing the AA and AF settings meant the CPU was no longer the bottleneck and SLI started scaling properly.

Try a benchmark where you can turn the graphics settings up as high as possible and you should see the difference between one card and SLI :)
 
A Core 2 Duo at 3GHz won't (significantly) bottleneck a GTS 250 SLI setup for gaming, though you won't see the maximum possible results from that GPU setup in benchmarks.
 
Why are you even using that PC, mate?! You have four other high-spec gaming PCs that far surpass the components mentioned here. Ditch the old machine and use your newer PCs (btw I know a good charity you can donate those components to... me :p )
 
Two main reasons:
1) The PC generally seems to work fine, so I don't want to give it away, but I would like to make sure it's working as well as possible.
2) It's the quietest PC I have, so it's useful if I'm also trying to watch TV or alternating between TV and PC. But mostly reason 1.

And it's only 2 other PCs with higher specs :)
 
I would say probably not that much. GTS 250s in SLI, correct me if I'm wrong, are about as fast as a GTX 280 or thereabouts.

I think if you can get your dual core to 3.4 or more it won't be; at 3GHz it might be a little, but only in certain games.

I had a dual core at 3.8 and a GTX 280, gamed at 1680x1050, and felt they were a fair match, with neither really holding the other back, except maybe in Bad Company, which I hear does fare better on quads.
 
I'd like to overclock the CPU to see if that helps, but unfortunately, in my wisdom all those years ago, I bought a motherboard using the nForce 650i SLI chipset. It seems to be unoverclockable :(

I think 250GTS SLI is basically an Nvidia 9800GX2 but with more memory. I'm not sure quite how that compares to the 200-series cards, but I would guess a GTX 280 probably isn't far off.
 
Have you tried them in anything other than 3DMark yet?

I've been trying different resolutions and settings in Tropics with my setup, and the higher you turn the graphics options up, the more you take the CPU out of the equation and the better SLI scales :)
 
No, I didn't try anything else.
If the PCI-e lanes being 8x instead of 16x isn't causing an issue, I can just change the SLI profile for something to use only one GPU, and that'd be the same as removing a card, wouldn't it? (It's a bit awkward to keep removing the card.)
 
Yeah, that's what I have been doing in testing, and the other card sits twiddling its thumbs when SLI is disabled through the Nvidia control panel.
 
Well, I didn't disable SLI; instead I just changed the SLI profile for a specific game (BC2) to Single GPU. Judging by the temps the cards hit, that worked: one card never got above idle temperatures.
The odd thing was that in BC2 I was getting pretty much identical framerates with SLI enabled and disabled!
Slightly worrying/annoying...
 
Trouble is, you can't run one piece of software to work out where your bottlenecks are.

I suggest you run a few games whilst monitoring your CPU usage (Task Manager is fine).
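If you want something a bit more detailed than eyeballing Task Manager, a small script can log per-core usage while you play. This is only a rough sketch using the third-party psutil library, and the 90% threshold is an arbitrary example, not a hard rule:

    import psutil  # third-party: pip install psutil

    # Print per-core CPU usage once a second while the game is running.
    # On a dual core, one core pinned near 100% during gameplay is the
    # classic sign of a CPU bottleneck.
    try:
        while True:
            per_core = psutil.cpu_percent(interval=1.0, percpu=True)
            flag = "  <-- possible bottleneck" if max(per_core) > 90 else ""
            print("  ".join("%5.1f%%" % c for c in per_core) + flag)
    except KeyboardInterrupt:
        pass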
 
Don't have Vsync on, do you?
 
I do.
The average framerate in both cases though was around 42, so I don't think Vsync was limiting it (well, it's not like it was 60 in both cases, put it that way).

Maybe I need to check the CPU usage while playing (as suggested); maybe the CPU is limiting it in both cases.
 
Easiest way to find out is to record your FPS, then overclock your GPUs a touch; if there is no increase in FPS, then the CPU is bottlenecking. You can confirm by overclocking your CPU a bit and seeing if there is an increase.

Overclocking the GPUs a bit with RivaTuner would be the easiest; there's no need to worry about voltages etc. if you are only increasing the speed a bit, and it will give you your answer....
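The comparison itself is trivial; a rough sketch of the logic (the FPS figures and the 3% noise margin are made-up placeholders, so substitute your own recorded averages):

    # Crude bottleneck check: compare average FPS before/after a small GPU overclock.
    baseline_fps = 42.0   # average FPS at stock GPU clocks (placeholder)
    gpu_oc_fps   = 42.5   # average FPS after a mild GPU overclock (placeholder)

    NOISE = 0.03          # treat anything within ~3% as run-to-run noise

    if gpu_oc_fps > baseline_fps * (1 + NOISE):
        print("FPS rose with the GPU overclock -> you were GPU limited")
    else:
        print("FPS barely moved -> likely CPU limited; a small CPU overclock should confirm it")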
 
Bad Company 2 will be CPU limited with your CPU. What graphics settings are you running it on?

Run it at the highest graphics settings possible, try running HWiNFO while running the game windowed and it will show you the load on each GPU, and use Core Temp or Task Manager to monitor the CPU.
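If HWiNFO is a faff, and assuming your driver is recent enough to ship nvidia-smi and expose these query fields on that card (which I'm not certain of), you can also poll per-GPU load from a script; a rough sketch:

    import subprocess
    import time

    # Poll per-GPU utilisation and temperature via nvidia-smi once a second.
    QUERY = ["nvidia-smi",
             "--query-gpu=index,utilization.gpu,temperature.gpu",
             "--format=csv,noheader,nounits"]

    try:
        while True:
            out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
            print(out.stdout.strip())   # one line per GPU: index, % load, temp C
            time.sleep(1)
    except KeyboardInterrupt:
        pass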
 