What graphics card next?

Actually my discussion with andybird123 was more about the overclocked Q6600... he insisted an overclocked Q6600 wouldn't bottleneck a GTX580 and would keep it at a constant 99% GPU usage, whereas my own Q6600 overclocked to 3.6GHz bottlenecked my single 5850 in various games, with GPU usage anywhere from 90% down to 40% and frame rates dipping into the low 20s at times. I also posted a link to another thread where someone showed much higher frame rates on a single 5850 with an i5 3570K compared to a Q6600, both at 3.5GHz, and other overclocked Q6600 users have mentioned seeing GPU usage of only around 80% at times on their 5850s as well. All I was saying is that he shouldn't generalise about all the games out there just because he gets a constant 99% GPU usage in the games he plays.

Either he is right that an overclocked Q6600 won't bottleneck a GTX580 and will hold a constant 99% GPU usage, or I and the other overclocked Q6600 users are delusional about seeing GPU usage and frame rate drops in various games, even on the much slower 5850. If a Q6600 can bottleneck a 5850, then it is only logical to assume that a Phenom II X4 at only 3.7GHz would bottleneck the 7850 as well, considering the 7850 is around 35-40% faster than the 5850 when both are overclocked.
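To put that last line of reasoning in rough numbers (a minimal Python sketch; the CPU-limited and GPU-limited frame rates below are made-up illustrative assumptions, not measurements), the frame rate you actually see is roughly the lower of what the CPU can feed and what the GPU can render, so a CPU cap that already holds back a 5850 will hold back a ~35-40% faster 7850 even harder:

# Illustrative sketch of the bottleneck argument; every number is an assumption.
def effective_fps(cpu_fps_limit, gpu_fps_limit):
    # The slower of the two limits decides the frame rate you actually see.
    return min(cpu_fps_limit, gpu_fps_limit)

def approx_gpu_usage(cpu_fps_limit, gpu_fps_limit):
    # Rough GPU usage: the fraction of the GPU's capacity the CPU can keep fed.
    return min(1.0, cpu_fps_limit / gpu_fps_limit)

cpu_limit = 45.0                      # hypothetical CPU-limited fps in a demanding scene
hd5850_limit = 60.0                   # hypothetical GPU-limited fps for a single 5850
hd7850_limit = hd5850_limit * 1.37    # 7850 assumed ~35-40% faster than a 5850

for name, gpu_limit in [("HD 5850", hd5850_limit), ("HD 7850", hd7850_limit)]:
    print(f"{name}: ~{effective_fps(cpu_limit, gpu_limit):.0f} fps, "
          f"GPU usage ~{approx_gpu_usage(cpu_limit, gpu_limit):.0%}")

# With the same CPU cap the faster card just idles at lower GPU usage;
# the frame rate barely moves in CPU-limited scenes.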

And regarding this topic, to be honest from a performance standpoint on paper, going from CF 4890s to a 7850 isn't much of an upgrade, and doesn't seem worth it considering the cost involved... however, if the OP believes CrossFire performance is questionable or not very reliable, then the 7850 might be good as a reasonable minor upgrade.


It's all about resolution: the lower your res, the less work your GPU has to do and the more your CPU needs to do, meaning a faster CPU is more noticeable the fewer pixels there are on screen. Crank it up to at least 1920x1080 and most CPUs perform give or take the same, as you're more than likely GPU bound.
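A quick way to picture that trade-off (a rough Python sketch, assuming the GPU's work scales with pixel count while the CPU's per-frame work stays roughly fixed; every number here is hypothetical):

# Hypothetical figures only: a CPU that can prepare 80 frames/s regardless of
# resolution, and a GPU that could render 150 frames/s at a baseline 1280x720.
cpu_fps = 80.0
gpu_fps_at_720p = 150.0
base_pixels = 1280 * 720

for w, h in [(1280, 720), (1680, 1050), (1920, 1080), (2560, 1440)]:
    # Assume GPU-limited fps falls off roughly in proportion to pixel count.
    gpu_fps = gpu_fps_at_720p * base_pixels / (w * h)
    limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
    print(f"{w}x{h}: ~{min(cpu_fps, gpu_fps):.0f} fps ({limiter} bound)")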

I also feel upgrading to the 7850 would be money well spent, in my opinion.
 
It's all about resolution: the lower your res, the less work your GPU has to do and the more your CPU needs to do, meaning a faster CPU is more noticeable the fewer pixels there are on screen. Crank it up to at least 1920x1080 and most CPUs perform give or take the same, as you're more than likely GPU bound.

I also feel upgrading to the 7850 would be money well spent, in my opinion.
I already know that logic, thank you, but the thing is that the other overclocked Q6600 users and I who game at 1920 res have all seen our single 5850s getting bottlenecked. It depends on the game. Yeah, I think we all want games to use 4 or more cores, but the fact remains that not all games do (in fact the majority don't), even among new releases. Take Guild Wars 2 for example... it is one of the most anticipated games... and it doesn't use 4 cores, and not having a faster CPU is what's holding back the frame rate at the moment... lowering the "graphics" settings doesn't help increase the frame rate at all. Game developers either like doing lazy console ports for most games, or enjoy trolling the players by making CPU-demanding games that use fewer than 4 cores.
 
Actually my discussion with andybird123 was more about the overclocked Q6600... he insisted an overclocked Q6600 wouldn't bottleneck a GTX580 and would keep it at a constant 99% GPU usage, whereas my own Q6600 overclocked to 3.6GHz bottlenecked my single 5850 in various games, with GPU usage anywhere from 90% down to 40% and frame rates dipping into the low 20s at times.

I hadn't until now bothered looking at the relative performance of a 5850.
These results (dips to the low 20s) look to be in line with what a 5850 should deliver:

http://www.anandtech.com/bench/Product/512?vs=547

I had a 560ti and got the same results as those listed on that link - when I switched to a 580 I got a massive improvement and similar, albeit slightly lower, results to this:

http://www.anandtech.com/bench/Product/512?vs=517

I played through the single-player campaign of StarCraft II with a 560ti and never once noticed a severe dip, which is what a 560ti should deliver based on those benches. Did I sit there constantly looking at an FPS counter while I did it? No, but that's because it wasn't a problem.

I did experience dips into the 20s or even 10s in one game with the 560ti - and that was Battlefield 3... a problem that went away when I switched to a 580.

Picking out 1 or 2 badly coded games and using them as a crass generalisation about all games is pointless... saying "if you play game x or y you might not get the results you want", yeah fair enough, but harping on that anyone who wants to spend more than £50 on a GPU needs to spend £500+ on a new PC first is very short-sighted - in all the links you keep posting up, there are more games that are NOT limited than games that ARE, yet you're accusing me of cherry picking a few results and applying that to "all games".

I'm not even saying "all games" will run fine on a Q or a Phenom... but what I am saying is try it first before condemning it, because I had been largely happy with the results I was getting until I planned on getting a 1440p monitor - and then yes, switching to 670 SLI, I had to upgrade my processor - well dur, didn't see that one coming.


edit: also, where are all these 5850 owners who have severe problems with a Q6600?... because all the other people (apart from you) commenting on this thread are saying that a Phenom / Q6600 has no real issues with a 5850 or better
 
My Q6600 (stock) and Radeon 5850 played SC2 single player well enough, with perhaps the odd slowdown during heavy action...

I think a lot of folk see these new games coming, see how intense they are and just upgrade lol
 
Also going to throw something else out there... perhaps the games where you were seeing low usage weren't optimised properly for the drivers you were using? Perhaps the game engine didn't scale well?

There are too many factors out there to just say CPU x will definitely heavily bottleneck GPU y. The only way to be sure is to test it, and it's far easier (and cheaper) to stick a new GPU in than it is to gut the whole system, especially when it's probably fine.

The low-resolution benchmarks don't really prove anything in my eyes because they're hardly what you'd call real-world benchmarks.

Anyway, if the bottleneck is a few %, so what? Not everyone wants to run maxed-out utilisation; they just want a good end-user experience. Another thing to consider: the dreaded VRAM... how much do the 4890s have again?
 
Lots of people are suggesting the 7850 - when I went from 2x 4890 to my 7870, I saw minimal improvement in games that agreed with CrossFire at all. This might also be because I went from DX10 to 11 in games like Metro. Huge upgrade in games that don't like CrossFire though. I upgraded more for the consistency of a single card, DX11 and less faffing about.
 
I originally posted:


Why are people still suggesting the 7850, when in this case it is £200 for hardly any improvement over his CrossFire 4890s?

sheer madness

To which there was this reply:

Here's 6 reasons off the top of my head

1. When overclocked a single 7850 will be more powerful than CF 4890's
2. A single 7850 uses less power
3. 7850 is DX11
4. More driver compatibility in games
5. Takes up less space in case
6. Less heat

Now both quotes seem to be confirmed by this nice quote here.

Lots of people are suggesting the 7850 - when I went from 2x 4890 to my 7870, I saw minimal improvement in games that agreed with CrossFire at all. This might also be because I went from DX10 to 11 in games like Metro. Huge upgrade in games that don't like CrossFire though. I upgraded more for the consistency of a single card, DX11 and less faffing about.

But realistically it is all immaterial, because the opening poster, who has probably run a mile due to the other discussion/argument that dominates his thread (I really do hope he is sitting there laughing his backside off over all of this), has said quite clearly that he cannot afford £200 for a 7850.

Oh I realise it's not, I can't afford £200 quid!
 
The low-resolution benchmarks don't really prove anything in my eyes because they're hardly what you'd call real-world benchmarks.

Anyway, if the bottleneck is a few %, so what? Not everyone wants to run maxed-out utilisation; they just want a good end-user experience. Another thing to consider: the dreaded VRAM... how much do the 4890s have again?
Yes and no. The purpose of low-res testing is strictly to show what the CPU can deliver when it is not GPU bound.

One example I can give is BF3... a friend of mine has a GTX560Ti 2GB together with an i3 2120. If he puts everything on Ultra, then of course the GPU usage is pushed to 99%, as BF3's Ultra settings aren't meant for a single GTX560Ti in the first place, but when dropped from Ultra to High, on the 64-player maps the frame rate at times falls below 40fps while the GPU usage is only around 80% at those moments. What this means is that the frame rate is held back by the CPU rather than the graphics card, and lowering the "graphics" settings further still wouldn't help improve the frame rate.
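Roughly, the way I'm reading those numbers can be sketched like this (a minimal Python sketch with invented log values, not my friend's actual figures; the 95% cut-off is just an assumed rule of thumb): low fps while the GPU sits well below full usage points at the CPU, whereas the GPU pegged near 99% points at the GPU:

# Hypothetical (fps, GPU usage %) samples of the kind an overlay/logging tool records.
samples = [
    (38, 78),   # 64-player firefight: low fps, GPU only ~80% busy
    (62, 99),   # quiet moment: GPU pegged
    (41, 82),
    (70, 98),
]

def likely_limiter(gpu_usage_percent, busy_threshold=95):
    # Crude heuristic: a GPU well below full usage means the CPU can't keep it fed.
    return "GPU" if gpu_usage_percent >= busy_threshold else "CPU"

for fps, usage in samples:
    print(f"{fps:3d} fps at {usage:3d}% GPU usage -> likely {likely_limiter(usage)} limited")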

My Q6600 (stock) and Radeon 5850 played SC2 single player well enough, with perhaps the odd slowdown during heavy action...
Yeah, it was in moments like those that I saw my 5850's GPU usage fall to 40%, with the frame rate also dropping to hell.
 
One example I can give is BF3... a friend of mine has a GTX560Ti 2GB together with an i3 2120. If he puts everything on Ultra, then of course the GPU usage is pushed to 99%, as BF3's Ultra settings aren't meant for a single GTX560Ti in the first place, but when dropped from Ultra to High, on the 64-player maps the frame rate at times falls below 40fps while the GPU usage is only around 80% at those moments. What this means is that the frame rate is held back by the CPU rather than the graphics card, and lowering the "graphics" settings further still wouldn't help improve the frame rate.

Was the i3 overclocked at all?
Because according to the AnandTech bench the i3 and Q6600 flip back and forth at stock settings, but obviously we were talking about a Q @ 3.6GHz, so an i3 dual core might be limited, but that doesn't extrapolate directly to a Q @ 3.6 or a Phenom X4 @ 3.7.

Unless you can show me some BF3 benchmarks of a 2120 @ whatever your friend had vs. a Q @ 3.6, this is yet another unrelated post.

The only 2120 gaming benches are your favourite kind - old games / low res with a low-end graphics card (e.g. does it matter if you get 101 or 120 FPS in L4D2, really?)

I can tell you that my 560ti 1GB with a Q6600 @ 3.6GHz played BF3 online on 64-player servers absolutely smoothly with no noticeable dips, and that with a 580 it could play Ultra with no major dips. If BF3 is CPU limited with a Q6600, then why did I get a big boost to FPS going from a 560ti to a 580?
 
Yes and no. The purpose of low-res testing is strictly to show what the CPU can deliver when it is not GPU bound.

One example I can give is BF3... a friend of mine has a GTX560Ti 2GB together with an i3 2120. If he puts everything on Ultra, then of course the GPU usage is pushed to 99%, as BF3's Ultra settings aren't meant for a single GTX560Ti in the first place, but when dropped from Ultra to High, on the 64-player maps the frame rate at times falls below 40fps while the GPU usage is only around 80% at those moments. What this means is that the frame rate is held back by the CPU rather than the graphics card, and lowering the "graphics" settings further still wouldn't help improve the frame rate.


Yeah, it was in moments like those that I saw my 5850's GPU usage fall to 40%, with the frame rate also dropping to hell.
Sounds like BS, as BF3 is one of the games where CPU scaling barely makes any difference at all.

Look at this page:
http://www.techspot.com/review/458-battlefield-3-performance/page7.html
The only reason your friend's CPU sucks at gaming is that it's only dual core; modern games are already using 3-4 cores effectively.
The OP would have ZERO problem maxing out any GPU in BF3.

In BF3 the difference between an X4 @ 3.1GHz and an i7 2600K @ 3.4GHz is 2fps...

Doesn't the 560ti only have 1GB of memory as well? BF3 can easily use 1.4GB at 1920 with Ultra settings.
 
Doesn't the 560ti only have 1GB of memory as well? BF3 can easily use 1.4GB at 1920 with Ultra settings.

+1 to everything you said :D

The 560ti did come in a 2GB version (which he refers to), and the 2GB version does get better frame rates at Ultra than the 1GB, but still not really what you would call playable - 560ti 2GB SLI is a decent setup (not one you'd rush out to buy now with the newer 2GB cards available though).
 
Sounds like BS, as BF3 is one of the games where CPU scaling barely makes any difference at all.

Look at this page:
http://www.techspot.com/review/458-battlefield-3-performance/page7.html
The only reason your friend's CPU sucks at gaming is that it's only dual core; modern games are already using 3-4 cores effectively.
The OP would have ZERO problem maxing out any GPU in BF3.

In BF3 the difference between an X4 @ 3.1GHz and an i7 2600K @ 3.4GHz is 2fps...

Doesn't the 560ti only have 1GB of memory as well? BF3 can easily use 1.4GB at 1920 with Ultra settings.
Somehow I thought the infamous GPU-bound single-player campaign bench from TechSpot was going to pop up sooner or later... and in case you don't realise, that so-called 2fps difference is just margin of error when the test is GPU bound. The i3 2120's gaming performance is on par with, if not better than, a Phenom II X4 at 3.7GHz. And read again: I said it is the GTX560Ti 2GB. The only game where the i3 2120 would do worse than a Phenom II X4 at 3.7GHz is probably that coding disaster GTA IV.

Running BF3 on Ultra always tends to be GPU bound, but people who don't have something like a GTX670 or above would sensibly play on High settings in multiplayer, because frame rate matters more than the small graphical difference between High and Ultra... and it's then, when playing on High, that the GPU usage drop clearly shows during the intensive scenes.
 
VRAM 4890 anyone??
And regarding this topic, to be honest from a performance standpoint on paper, going from CF 4890s to a 7850 isn't much of an upgrade, and doesn't seem worth it considering the cost involved... however, if the OP believes CrossFire performance is questionable or not very reliable, or that having 1GB of VRAM is going to be a problem for the games that HE plays, then the 7850 might be good as a reasonable minor upgrade.
Lots of people are suggesting the 7850 - when I went from 2x 4890 to my 7870, I saw minimal improvement in games that agreed with CrossFire at all. This might also be because I went from DX10 to 11 in games like Metro. Huge upgrade in games that don't like CrossFire though. I upgraded more for the consistency of a single card, DX11 and less faffing about.
 