I have recently been considering an upgrade and I'm struggling to answer this question from reviews and other benchmarks.
Current Spec:
Phenom II X4 955 @ 3.2 GHz (C2 stepping, poor overclocker)
4 GB RAM
AMD HD 6950 2 GB @ stock
As an example, I have seen a few Crysis 3 benchmarks with various CPUs which show that there are a lot of frames to be had with newer processors. However, the game was tested with a GTX 690.
My questions:
1. Assuming I keep the same graphics card (HD 6950 2 GB) and change only the CPU, will I see a tangible improvement in games? Or am I fairly balanced between CPU and GPU at present, such that I'd need a new CPU + GPU to see a tangible benefit?
2. With the next-gen consoles being x86, will games be better optimised, so much so that newer titles may actually run smoother and be easier on the CPU, making it not worth changing the CPU or GPU at all?
3. If I choose to upgrade the GPU to a 290X, for example (I wish), how much performance would I be losing if I kept my current CPU?