Hi all
Long time lurker, first time poster
I'm looking at getting a new desktop system, purely for gaming and, to a much lesser extent, as a streaming media server. Having spent time looking through all the reviews and benchmarks, I'm somewhat baffled!
Having always had low to mid end systems, I've no idea what it's like to have a game set to max and run a smooth 30+ fps, so I can't really relate to benchmarks stating 90+ fps on max settings.
In the real world, to someone sat playing, is there any noticeable difference between a system pulling a 50 fps minimum and one at a 90 fps minimum in the same game at the same detail settings? Basically, I'm trying to justify the price difference between the more budget option and the full-on overclocked i7.
Games I will be playing cover:
Everquest 2 (mainly cpu related, but a new update to help with gfx performance due soon)
Eve Online
TF2/CSS/any number of Source games
Fallout 3
Call of Duty Modern Warfare 2 (Nov time I think)
Left 4 Dead
Left 4 Dead 2
I'll be running a 24" monitor (probably Samsung) at 1920x1080 and wanting to run at the highest possible settings.
To the hardware!!....
For the tin itself, I'd stretch to £1.5k if it genuinely made a difference to experience, but the closer to £1k the better.
I'm really tempted by the OCers ready-built systems as it just makes life easier for me; however, if I could save £100+ I would look at building it myself.
Apart from the maxed benchmarks, could anyone advise if there is much 'actual' difference for a gamer between an AMD Phenom II X4 955 Black Edition system at 3.8GHz (4GB RAM) and an Intel i7 920 at 3.8 or 4GHz (6GB RAM)? Or should I consider a dual core Wolfdale and crank it to 4GHz plus? The gfx card I'll be looking to run will be one of the new ATIs launching in the (very) near future. HDD initially just +/- 1TB 7200rpm, looking to move to an SSD in the future.
Oh, and finally - quiet is good. Is there a particular case I should look at for that aspect?
Any and all advice is much appreciated, cheers!