
Why are modern graphics cards CPU dependent?

Associate
Joined
21 May 2004
Posts
859
Location
Cheltenham
With graphics cards being ultra powerful and fast, why are we still talking about "CPU bottlenecks"?

Surely if you spend £1000 on a graphics set-up, shouldn't this EASE the burden on your CPU?

Can someone please explain :confused:
 
Nope, because a graphics card can only draw the graphics as quickly as the CPU can work out the physics and the like.
 
In other words, the graphics card needs to be told what to do by the processor.
If the processor sucks, the graphics card can't do anything.
 
Are we at the point where CPUs have levelled out then?
As two GPUs and a physics card are still CPU bound?
 
rayb74 said:
With graphics cards being ultra powerful and fast, why are we still talking about "CPU bottlenecks"?

Surely if you spend £1000 on a graphics set-up, shouldn't this EASE the burden on your CPU?

Can someone please explain :confused:

This isn't true. ;)

All games are limited by either the CPU or the GPU at some point. If you think you're hitting a CPU limit in a particular spot, turn up the eye candy. If you can't max out the eye candy, then you are GPU bound.
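That rule of thumb can be sketched as a toy model (all numbers are invented for illustration, not from any benchmark): each frame costs the CPU a fixed slice of time for game logic, costs the GPU a slice that grows with the eye candy, and the slower of the two sets the frame rate.

```python
# Toy model: the slower of the CPU and GPU work sets the frame time.

def frame_ms(cpu_ms, gpu_ms):
    """A frame isn't finished until both the CPU and GPU parts are done."""
    return max(cpu_ms, gpu_ms)

def bound_by(cpu_ms, gpu_ms):
    return "CPU" if cpu_ms >= gpu_ms else "GPU"

cpu_ms = 10.0  # assumed fixed per-frame cost of game logic

# Turning up the eye candy only raises the GPU's share of the frame.
for gpu_ms in (4.0, 8.0, 16.0):
    fps = 1000.0 / frame_ms(cpu_ms, gpu_ms)
    print(f"gpu {gpu_ms:4.1f} ms -> {fps:5.1f} fps, {bound_by(cpu_ms, gpu_ms)} bound")
```

Until the GPU's cost passes the CPU's 10 ms, the extra eye candy is "free": the frame rate stays pinned at the CPU limit, which is exactly why maxing the settings is a quick way to tell which side you're on.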
 
rayb74 said:
Are we at the point where CPUs have levelled out then?
As two GPUs and a physics card are still CPU bound?

I don't think we are. Stuff like Oblivion is very obviously GPU bound (once the eye candy is turned up, there does not seem to be much difference beyond an A64 3200...)

If you look at yesteryear's games, then you will be able to max out the eye candy, but that doesn't really indicate a CPU limit, just that the visual effects are no longer cutting edge.
 
"CPU limited" is a technical thing thats measurable but never really effects your in game experiance, by the time you get CPU bound, eg all effects and highest res with AA and AF, by the time your CPU limited your FPS will be over the refresh rate of your monitor.
 
Rick_Barnes said:
"CPU limited" is a technical thing thats measurable but never really effects your in game experiance, by the time you get CPU bound, eg all effects and highest res with AA and AF, by the time your CPU limited your FPS will be over the refresh rate of your monitor.


Depends what hardware you're running (and I'd know).
 
m3csl2004 said:
Depends what hardware you're running (and I'd know).



Yes, always blaming the graphics... EDIT: FINE THEN, "hardware"! :rolleyes:
 
Last edited:
m3csl2004 said:
erm... CPU limitation? I think not

maybe you should start your own thread rather than trash this one

hush.. :p

Basically, the CPU is the bottleneck only at low resolution. If the game really is CPU limited, increasing the resolution won't decrease performance until you reach the point where the graphics card becomes the bottleneck...
 
Last edited:
.::lawrywild::. said:
hush.. :p

Basically, the CPU is the bottleneck only at low resolution. If the game really is CPU limited, increasing the resolution won't decrease performance until you reach the point where the graphics card becomes the bottleneck...

I remember when I had my Celeron Northwood and 9800 Pro: the stress test at 640x480 low was 40 fps, and at 1024x768 all high with 4x AA and 4x AF I got 39.9 :rolleyes:
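Those numbers fit the idea of a frame costing max(CPU time, GPU time), where only the GPU's side grows with pixel count. A sketch with invented per-frame costs (not the actual Celeron/9800 Pro figures):

```python
# Hypothetical per-frame costs, chosen only to illustrate the shape.
CPU_MS = 25.0             # fixed CPU cost per frame (-> 40 fps ceiling)
GPU_MS_PER_MPIXEL = 20.0  # GPU cost per million pixels drawn

def fps_at(width, height):
    gpu_ms = GPU_MS_PER_MPIXEL * (width * height) / 1e6
    return 1000.0 / max(CPU_MS, gpu_ms)

for w, h in [(640, 480), (1024, 768), (1600, 1200)]:
    print(f"{w}x{h}: {fps_at(w, h):.1f} fps")
```

At the two lower resolutions the GPU finishes well inside the CPU's 25 ms, so both land on the same 40 fps; only at 1600x1200 does the GPU become the wall and the frame rate finally drop.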
 
The processor does things like physics and AI. This includes smoke and shadows (working out how they are supposed to be drawn, etc.).

The graphics card uses techniques to make effects look realistic.

Say you are playing Oblivion. You walk around a town with lots of people. Those people need their AI and shadows processed, as well as any physics involved.

Once this is done, the graphics card applies nice effects to the scene to make it look realistic. If the processor is taking too long to process the AI and other stuff, then the graphics card has to wait before it can apply the effects.

Oblivion is both CPU and GPU bound, but it is more CPU bound in towns, where there is AI to be processed, and a lot more GPU bound outside, where there are more effects to be processed.

This was proven in a thread here about Oblivion's CPU performance. A custom demo was made and a few of us did the testing.

The guy with a dual-core CPU was something like 50% faster than us single-core guys, because the scene was a very grassy area and the physics of the grass moving in the wind are VERY CPU hungry. And since Oblivion uses multi-threading in its programming, it made good use of the second CPU.

Edit - This is where physics cards come into play. If Oblivion were coded to make use of physics cards to process the grass and shadows etc., the performance would be greatly enhanced.
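The town-versus-outdoors split and the dual-core result described above can be sketched with the same max() idea, with the CPU side broken into AI plus physics (all costs invented; the crude assumption is that physics parallelises across cores while AI stays on one):

```python
def cpu_frame_ms(ai_ms, physics_ms, cores=1):
    # Assumes physics splits evenly across cores; AI stays single-threaded.
    return ai_ms + physics_ms / cores

def frame_ms(ai_ms, physics_ms, gpu_ms, cores=1):
    # The GPU can't draw the frame until the CPU has told it what to draw,
    # so the slower side sets the frame time.
    return max(cpu_frame_ms(ai_ms, physics_ms, cores), gpu_ms)

# Grassy outdoor scene: physics-heavy (grass in the wind), moderate GPU load.
single = frame_ms(ai_ms=4.0, physics_ms=20.0, gpu_ms=12.0, cores=1)  # 24 ms
dual = frame_ms(ai_ms=4.0, physics_ms=20.0, gpu_ms=12.0, cores=2)    # 14 ms
print(f"single core: {1000 / single:.1f} fps")
print(f"dual core:   {1000 / dual:.1f} fps")
```

With these made-up numbers the second core buys a big jump, the same shape as the roughly 50% gain reported in the thread; a physics card would shift the physics term off the CPU entirely.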
 
Last edited: