
Phenom II 965 vs 2500k or 3570k

My 3 main PCs are

i7 @ 3.4GHz
Phenom II 955 @ 3.2GHz
Phenom II 1050 @ 3.2GHz

They all play games pretty much 100% the same... or at least I've never noticed any difference at all, even under load.

They do, however, all show massive differences when converting AVI & FLV to DVD.
 
25fps was referring to semi-CPU-demanding games in general (especially those that use fewer than 4 cores)... but specifically for BF3, it would be more like the low 30s on the 64-player maps.

The quick drop from 60fps down to 30fps, rather than to 40fps, within seconds is by far more noticeable.

Which games exactly :S? With my 955 I never saw a dip below 30fps, tbh, at max settings in most games with some AA enabled.
 
I run BF3 on a Q6600 @ 3.2GHz with a 7970 and 4GB RAM. I haven't had frame rate problems thus far. I don't see why people are giving such bad advice. Save your money till Haswell, or get a 3570k. All settings on ultra, btw, with CCC at defaults except texture optimization set to high quality.

Perhaps a graphics upgrade would help more than a CPU?
 
I run BF3 on a Q6600 @ 3.2GHz with a 7970 and 4GB RAM. I haven't had frame rate problems thus far. I don't see why people are giving such bad advice. Save your money till Haswell, or get a 3570k. All settings on ultra, btw, with CCC at defaults except texture optimization set to high quality.

Perhaps a graphics upgrade would help more than a CPU?

I don't think a graphics upgrade would sort this out; the 955 would bottleneck a high-end GPU for sure. I remember when I had my 955 paired with a GTX 680, the card's usage would never go above 60-70%.
 
I don't think a graphics upgrade would sort this out; the 955 would bottleneck a high-end GPU for sure. I remember when I had my 955 paired with a GTX 680, the card's usage would never go above 60-70%.
He's happy to use a 7970 with a Q6600 at 3.2GHz... and I wasn't even happy with my 5850 on my old Q6600 overclocked to 3.6GHz. The 5850 got bottlenecked frequently when playing online games, with GPU usage constantly jumping up and down between 50% and 80%. It is ALWAYS people who haven't had the opportunity to use Sandy/Ivy themselves who keep banging on about how the Core2Quad/Phenom II CPUs are still "great" for gaming. They are still "ok" for today's gaming, but they certainly ain't "great".
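
To put rough numbers on that 60-70% figure: the frame rate is set by whichever side takes longer per frame, and when the CPU is the slow side, the GPU simply sits idle for the difference. A minimal back-of-the-envelope sketch (all the millisecond figures are made-up illustrative numbers, not measurements):

    # Toy frame-time model: each frame, the CPU must prepare work before
    # the GPU can render it, so the slower of the two stages sets the pace.

    def frame_stats(cpu_ms, gpu_ms):
        """Return (fps, gpu_utilisation) for given per-frame CPU and GPU times."""
        frame_ms = max(cpu_ms, gpu_ms)   # the bottleneck sets the frame time
        fps = 1000.0 / frame_ms
        gpu_util = gpu_ms / frame_ms     # fraction of each frame the GPU is busy
        return fps, gpu_util

    # Hypothetical numbers: a card that could render a frame in 10ms (100fps)
    # held back by a CPU that needs 15ms to prepare each frame.
    fps, util = frame_stats(cpu_ms=15.0, gpu_ms=10.0)
    print(f"{fps:.0f} fps, GPU busy {util:.0%}")   # -> 67 fps, GPU busy 67%

In this model a faster CPU raises the frame rate and the reported GPU usage together, which is consistent with the 60-70% the 680 was showing.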

Which games exactly :S? With my 955 I never saw a dip below 30fps, tbh, at max settings in most games with some AA enabled.
Try the Total War series, Starcraft 2, and random MMOs (which, for some odd reason, ALWAYS use only 3 cores or fewer, despite how CPU-intensive they get with lots of players in the same area)?
 
My solution to BF3 is to play it on medium with no AA, using FXAA instead. TBH, on a 1080p 22-inch screen I can barely see the pixels when I'm sat 3ft away from it, so I don't even know if I need it enabled. I prefer the gameplay to the visuals; set too high, they're just distracting in MP.
 
My 3 main PCs are

i7 @ 3.4GHz
Phenom II 955 @ 3.2GHz
Phenom II 1050 @ 3.2GHz

They all play games pretty much 100% the same... or at least I've never noticed any difference at all, even under load.

They do, however, all show massive differences when converting AVI & FLV to DVD.

Same experience here. I went from a Phenom II 965 @ 3.8GHz to a Phenom II 1090T @ 3.8GHz to an i7 2600k @ 4.2GHz, and saw no noticeable difference in gaming at all.
Also, several games I tested at 4.2GHz showed no more than a 1% frame rate improvement over stock, so I didn't see any point in spending hours finding the best overclock, and I set it back to stock.
There were clear and substantial differences in media conversion, but in gaming - nope.

I don't play BF3, though. Maybe it makes a difference in that one game.
 
Same experience here. I went from a Phenom II 965 @ 3.8GHz to a Phenom II 1090T @ 3.8GHz to an i7 2600k @ 4.2GHz, and saw no noticeable difference in gaming at all.
Also, several games I tested at 4.2GHz showed no more than a 1% frame rate improvement over stock, so I didn't see any point in spending hours finding the best overclock, and I set it back to stock.
There were clear and substantial differences in media conversion, but in gaming - nope.

I don't play BF3, though. Maybe it makes a difference in that one game.
From the sound of it, you not only don't play BF3, but you don't play many of the online games out there either. And no... unless you play online games, or Total War, Starcraft 2, etc., or game on a 120Hz monitor, you probably won't notice a performance difference between a 2600K at stock and at 4.2GHz, as a stock 2600K is more than fast enough for most FPS single-player campaigns and other single-player games.
 
I upgraded from a QX9650 to a stock (for now) 3570k and I have seen a vast improvement.

Biggest improvement was Skyrim and, to some degree, BF3.

The improvement is in the minimum frames... when the shizzle hits the fan, the 3570k will keep my 670 fed at a higher minimum than the QX9650 could ever maintain.
 
From the sound of it, you not only don't play BF3, but you don't play many of the online games out there either. And no... unless you play online games, or Total War, Starcraft 2, etc., or game on a 120Hz monitor, you probably won't notice a performance difference between a 2600K at stock and at 4.2GHz, as a stock 2600K is more than fast enough for most FPS single-player campaigns and other single-player games.

That's true, I'm mainly a single-player gamer (exceptions being co-op games like Borderlands, Serious Sam 3, and Saints Row 3). Are online multiplayer games really that much more demanding on the CPU? Why is that?
 
That's true, I'm mainly a single-player gamer (exceptions being co-op games like Borderlands, Serious Sam 3, and Saints Row 3). Are online multiplayer games really that much more demanding on the CPU? Why is that?
Well, a simple example: in an FPS single-player campaign like BF3's, the number of enemies you face (or that are in your area) is usually just a few at a time, usually with fairly simple attack patterns. Online, though, you could potentially have as many as 20-30 players in one area, shooting and spamming grenades and other explosives at each other.

As for MMOs, one of the biggest problems is that, for some unknown reason, developers ALWAYS design the game to use only 3 cores or fewer. You can usually hold a solid 60fps while soloing, but in a raid, with lots of players on your screen plus an army of mobs at the same time, those moments are more demanding than older CPUs like the Core2Quad or Phenom II X4 can cope with; GPU usage falls through the floor, and so does the frame rate. This is because the CPU isn't fast enough to feed the graphics card the workload it needs, thus bottlenecking it. In the MMOs I play, my Q6600 overclocked to 3.6GHz used to drop as low as the low-20s fps in those intensive moments; after upgrading to an i5 2500K overclocked to 4.5GHz, I get a solid 55fps+ in those same scenes, and that's with the same 5850.

So it's mostly down to how much is going on at the same time on your screen/in your area.
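
A rough way to see why player count hits the older quads so hard, continuing the point above about the 3-core cap: if the engine only ever spreads its work across 3 cores, per-frame CPU time grows with the number of characters in the scene no matter how many cores the chip has. A toy sketch of that CPU-side ceiling (every constant is invented, tuned only so the output lands near the fps figures quoted above; the 2500K's ~2x per-clock speed over the Q6600 is a rough guess, not a benchmark):

    # Toy model of the CPU side of an MMO frame. All numbers are invented
    # for illustration; only the overall shape matters.

    ENGINE_CORE_CAP = 3   # assumption: the engine never uses more than 3 cores

    def cpu_fps_cap(players, cores, speed):
        """CPU-side fps ceiling; speed is relative to a Q6600 @ 3.6GHz = 1.0."""
        usable = min(cores, ENGINE_CORE_CAP)       # spare cores go to waste
        work_ms = 10.0 + 2.6 * players / usable    # fixed cost + per-player cost
        return 1000.0 / (work_ms / speed)

    # 2500K @ 4.5GHz guessed as ~1.25x the clock and ~2x per-clock throughput.
    for label, speed in [("Q6600 @ 3.6GHz", 1.0), ("2500K @ 4.5GHz", 2.5)]:
        print(f"{label}: soloing ~{cpu_fps_cap(5, 4, speed):.0f} fps cap, "
              f"raiding ~{cpu_fps_cap(40, 4, speed):.0f} fps cap")

With 40 players on screen the Q6600's ceiling in this model drops to the low 20s while the 2500K stays in the mid 50s. Note that adding a 5th or 6th core would change nothing here, because the per-player work just piles onto the same 3 cores.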
 