Soldato
- Joined: 18 Oct 2002
- Posts: 6,672
I've already made my mind up, but I'm intrigued to hear OcUK's views on whether they would upgrade the CPU (hence mobo and RAM too) if gaming at 1920 x 1200.
System is in sig - CPU is running at 4GHz.
My opinion is that there will be almost zero benefit from upgrading both the CPU and GPU as opposed to just the GPU. The only reason I can see is to have a new CPU to overclock (which is always fun!).
Does anyone disagree, and if so, do you have benchmarks to show it? Remember that this is purely about gaming at high resolution.
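One way to see why the GPU dominates at high resolution: per-frame cost is roughly the slower of the CPU's work and the GPU's work, and the GPU's share grows with pixel count. Here's a toy model of that (all the timings are made-up illustrative numbers, not benchmarks from any real hardware):

```python
# Toy bottleneck model: each frame takes as long as the slower component.
# All timings below are invented for illustration, not measured results.

def fps(cpu_ms: float, gpu_ms_per_mpix: float, megapixels: float) -> float:
    """Frames per second when frame time = max(CPU time, GPU time)."""
    gpu_ms = gpu_ms_per_mpix * megapixels
    return 1000.0 / max(cpu_ms, gpu_ms)

mp = 1920 * 1200 / 1e6        # ~2.3 megapixels

old_cpu, new_cpu = 6.0, 4.0   # hypothetical ms of CPU work per frame
old_gpu, new_gpu = 5.0, 3.0   # hypothetical ms of GPU work per megapixel

# At 1920x1200 the GPU term (~11.5 ms) dwarfs the CPU term, so a faster
# CPU barely moves the FPS, while a faster GPU gives a big jump.
print(fps(old_cpu, old_gpu, mp))  # baseline (GPU-bound)
print(fps(new_cpu, old_gpu, mp))  # CPU upgrade only: essentially unchanged
print(fps(old_cpu, new_gpu, mp))  # GPU upgrade only: large gain
```

At a low resolution (say 0.5 MP) the same model flips CPU-bound, which is why CPU upgrades show up in 1024x768 benchmarks but not here.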

) but I think it's due to the type of workload placed on a CPU during a game. If you're encoding/folding it's a pretty regular stream of numbers for your CPU to crunch, so flat-lining is possible. When gaming I'd imagine it's a far less uniform workload, changing dynamically with the pace of the game, which would probably explain the peaks and troughs.
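That steady-vs-bursty idea can be sketched with a quick simulation (the load figures are made up purely to illustrate the shape of each workload, not taken from any real usage logs):

```python
import random
import statistics

# Illustrative sketch with invented numbers: compare how "uniform" CPU
# load is for an encode-style workload vs a game-style workload.
random.seed(0)

# Encoding/folding: roughly the same amount of work every time slice,
# so usage hovers near a flat line.
encode_load = [95 + random.uniform(-2, 2) for _ in range(1000)]

# Gaming: load swings with what's on screen (AI, physics, level loads),
# producing visible peaks and troughs on a usage graph.
game_load = [random.choice([30, 45, 60, 90]) + random.uniform(-5, 5)
             for _ in range(1000)]

print(statistics.pstdev(encode_load))  # small spread: flat-lined graph
print(statistics.pstdev(game_load))    # large spread: peaks and troughs
```

The standard deviation is just a stand-in for how jagged the usage graph looks over time.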

