
P4 3.2 ---> Athlon 64 3700

  • Thread starter: Seirrah

I have the opportunity to swap my mobo for a Socket 939 board while still keeping my AGP card.

I won't lie, if I want my PC to be faster at anything, it's games; it's fine for everything else I do. I was wondering how much an upgrade from a P4 3.2 to an Athlon 64 3700 would mean in real terms. I know it's hard to say, as some games are more CPU-intensive than others, but is the gain negligible, or is there a significant difference?

Thanks
 
pastymuncher said:
Check the thread that thefranklin posted with the games tests, but put the 3.2 P4 in instead of the 2.6 P4, and the Athlon 64 is miles ahead.

I'm not saying it isn't, but look at the figures: in Far Cry the A64 gets 190 and the P4 gets 160.

Can anybody honestly tell the difference between 190 fps and 160 fps?
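
To put those numbers in perspective, here's a quick back-of-the-envelope frame-time comparison (just a sketch, using the Far Cry figures quoted above):

    # Convert the quoted Far Cry frame rates into per-frame times.
    for label, fps in [("Athlon 64 3700", 190), ("P4 3.2", 160)]:
        frame_time_ms = 1000.0 / fps  # milliseconds spent rendering each frame
        print(f"{label}: {fps} fps -> {frame_time_ms:.2f} ms per frame")

    # Output:
    # Athlon 64 3700: 190 fps -> 5.26 ms per frame
    # P4 3.2: 160 fps -> 6.25 ms per frame
    # Both are far above a typical 60-75 Hz refresh rate, so the ~1 ms gap per
    # frame is very hard to notice in practice.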
 
Northwood cores with an 800 MHz FSB (2.4, 2.6, 2.8, 3.0, 3.2) are all Hyper-Threading enabled.

Northwood cores with a 533 MHz FSB have HT disabled, apart from the 3.06 GHz, which is HT enabled.

Willamette cores, and Northwood cores with a 400 MHz FSB, are all HT disabled.
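
For reference, that boils down to a simple lookup (a rough sketch, covering only the desktop P4 models mentioned above):

    # Hyper-Threading support for the Pentium 4 cores discussed above.
    # Keys are (core, FSB in MHz); values are whether HT is enabled.
    HT_SUPPORT = {
        ("Northwood", 800): True,    # 2.4/2.6/2.8/3.0/3.2 GHz all have HT
        ("Northwood", 533): False,   # except the 3.06 GHz part (handled below)
        ("Northwood", 400): False,
        ("Willamette", 400): False,
    }

    def has_ht(core: str, fsb_mhz: int, clock_ghz: float) -> bool:
        """Return True if this P4 has Hyper-Threading, per the notes above."""
        if core == "Northwood" and fsb_mhz == 533 and clock_ghz == 3.06:
            return True  # the single 533 MHz exception
        return HT_SUPPORT.get((core, fsb_mhz), False)

    print(has_ht("Northwood", 800, 3.2))   # True
    print(has_ht("Northwood", 533, 3.06))  # True
    print(has_ht("Willamette", 400, 1.8))  # False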

A 3.2 GHz, 800 MHz FSB Northwood is still a pretty decent processor, and will make good use of pretty much any top-end AGP graphics card, even the Gainward 7800GS+ (which uses a 7900GT chip).

Assuming you play games at 1280x1024 or higher, the graphics card makes more of a difference than the CPU.

As Defcon 5 says, can you really tell the difference between 160 and 200 fps? I'm playing 'Oblivion', one of the latest high-end games, on a GeForce 6800GT AGP and a P4 Northwood 3.2 GHz, and it's both lovely to look at and smooth to play. (Although upgrading to the Gainward 7800GS+ would probably give me a 100% performance improvement in some outdoor areas.)

You may notice that the Far Cry benchmark in that Tom's Hardware set is taken at 1280x1024 resolution, but with LOW quality settings. Increasing the quality places a greater load on the graphics card, and would make the difference between a 3.2 P4 and an Athlon 3700 much less noticeable.
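
A crude way to picture where the bottleneck sits (a simplified model, not a benchmark; the frame rates below are hypothetical, purely to illustrate CPU-bound versus GPU-bound):

    # Simplified bottleneck model: the frame rate you actually see is roughly
    # capped by whichever of the CPU or GPU finishes its share of a frame last.
    def effective_fps(cpu_limited_fps: float, gpu_limited_fps: float) -> float:
        return min(cpu_limited_fps, gpu_limited_fps)

    # Low settings: the GPU is barely loaded, so the CPU gap shows up in full.
    print(effective_fps(cpu_limited_fps=190, gpu_limited_fps=300))  # 190
    print(effective_fps(cpu_limited_fps=160, gpu_limited_fps=300))  # 160

    # High settings/resolution: the GPU becomes the limit and the gap vanishes.
    print(effective_fps(cpu_limited_fps=190, gpu_limited_fps=70))   # 70
    print(effective_fps(cpu_limited_fps=160, gpu_limited_fps=70))   # 70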
 
Games are pretty much GPU bound anyway.

Oblivion sees a nice performance increase with dual-core CPUs over single-core processors at 800x600 resolution, for example, but put it up to 1280x1024 and the FPS are exactly the same across the board, because you are now GPU-limited.

With games utilising vertex and pixel shaders more heavily, I can't see the trend changing tbh.
 
I'm running at 1680x1050; I really should have mentioned this earlier... sorry.

So there would be a very small difference in frame rate between CPUs when the graphics card is working hard (i.e. struggling)? I guessed as much.
 