Nvidia says Core i7 isn't worth it?

Nvidia said:
Nvidia would prefer that you spend your money on GPUs rather than CPUs

The Intel Core i7 chip is an awesome CPU – this we know. If we were to build a gaming rig, we’d want to have one of these inside it. But Nvidia is telling everyone that the CPU isn’t everything.

Intel claims that gaming performance goes up by 80 percent when you use a Core i7 chip. This impressed Nvidia’s technical marketing director Tom Petersen, who decided to take a closer look at Intel’s claim.

“I was impressed by that claim, and I was trying to figure out how they could possibly say such a thing, and it turns out that Intel is basing that claim on only 3DMark Vantage’s CPU test.”

Of course, a CPU test is just that – a test of the CPU. Petersen goes on to explain his view: “…it doesn’t actually measure gameplay, it doesn’t actually measure anything about game performance. Sure enough, if you do that test you will see Core i7 running faster, but I think it’s a little disingenuous to call that game performance.”

Petersen then moved on to an example to further his case that the Core i7 isn’t the clearly superior choice for a gaming PC. He compared two systems, calling the Core i7 965-based one a “Hummer” and likening the one with a Core 2 Duo E8400 to a BMW.

Nvidia showed benchmark graphs of various systems running Crysis Warhead, Fallout 3, Call of Duty: World at War and Far Cry 2 at 1920 x 1200 (no AA or AF). According to bit-tech.net, the Core 2 Duo E8400 and a GeForce GTS 250 scored an average of 41.6 fps. The frame rate moved slightly up to 42.4 fps after upgrading to a Core i7 965, but jumped all the way up to 59.4 fps after upgrading to a GeForce GTX 260 (216 stream processors) SLI setup.

Here we have a case where the games running at 1920 x 1200 are fillrate-bound rather than CPU-bound. A faster CPU did little to help the GPU, but upgrading to a significantly stronger 3D acceleration setup opened up the headroom for more frames.
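To put those averages into perspective, here is a quick back-of-the-envelope calculation (a rough Python sketch based on the bit-tech figures quoted above; the variable names are just for illustration and the exact per-game numbers will vary):

    # Rough sanity check of the relative gains, using the averages quoted above
    baseline_fps = 41.6   # Core 2 Duo E8400 + GeForce GTS 250
    cpu_upgrade  = 42.4   # after upgrading the CPU to a Core i7 965
    gpu_upgrade  = 59.4   # after upgrading to a GTX 260 (216 SP) SLI setup instead

    def gain(new, old):
        # percentage improvement over the baseline
        return (new - old) / old * 100

    print(f"CPU upgrade: +{gain(cpu_upgrade, baseline_fps):.1f}%")  # roughly +1.9%
    print(f"GPU upgrade: +{gain(gpu_upgrade, baseline_fps):.1f}%")  # roughly +42.8%

In other words, swapping the CPU moves the average by under 2 percent, while the stronger GPU setup adds over 40 percent.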

Petersen argues that at high resolutions, it’s smarter to spend on more fillrate: “…it is a fact, that when you’re gaming and you’re running at resolutions of 1920 x 1200 or better, the Core 2 Duo is perfect for running all of today’s games. In real gaming, there’s no difference between a Core i7 and a Core 2 Duo.”

 
i7 for gaming comes into its own in multi-GPU setups. In most games, if you ran an E8400 @ 3.6+ with something like GTX 285s in SLI, an OC'd i7 would show significant gains.
 
Nvidia is making perfect sense, and it's a fact: for gaming it's better to have a good GPU and a poor CPU than a good CPU and a poor GPU.

Of course, most people who buy an i7 won't have a poor GPU, so it's a bit pointless.
And there are a few games that support quad cores, and this trend will only increase.
 
I agree totally.

People spend far too much money on the CPU (for gaming).

My rule of thumb is to spend double on the GPU (for gaming).
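To make that concrete, here's what the 2:1 split looks like (a hypothetical Python sketch; the £450 budget is a made-up figure purely for illustration):

    # Hypothetical illustration of the "spend double on the GPU" rule of thumb
    def split_budget(total):
        # the GPU gets two thirds of the combined CPU + GPU budget, the CPU one third
        cpu = total / 3
        gpu = total - cpu
        return cpu, gpu

    cpu, gpu = split_budget(450)  # e.g. a £450 CPU + GPU budget (made-up figure)
    print(f"CPU: £{cpu:.0f}, GPU: £{gpu:.0f}")  # CPU: £150, GPU: £300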

(Although I notice my sig doesn't reflect that fact! I have been putting off a GPU upgrade for too long..) :p
 
Which is why you should go for AMD... and ATI. AMD and ATI work better together than AMD and nVidia. Intel works better with nVidia than with ATI, which is why I always recommend them for different processors.
 
This 4870 I'm using works fine with Intel.

Well, only the fans have started giving out, but I'll blame Gainward.
 
I think the CPU plays a part in some games, but most titles are GPU-intensive. Online play is where the CPU makes a real difference.
 
I've been telling people this for so long. CPUs have far outpaced GPUs, and the i7 really isn't needed at all; by the time it is, no doubt the current i7 buyers will have i9s or greater...
 
It's pretty clear they are both trying to sell their own products!

Both are required to experience the ultimate, but that's daft money. Just look at the CUDA cards for rendering; I believe they are something like £5000 each! Saw a config with 4 of them! No need! Now pair those with a low-spec CPU and you get naff results!
 
Purely from a gaming point of view, of course it isn't worth it. But for encoding and other programs that stress the CPU, it's worth it.
 
I can sense a real battle coming in the future of PC hardware. Graphics cards are turning into mini PCs with huge heatsinks, fans and large amounts of on-board memory.
I am sure games could now be written to utilise the CPU rather than the GPU, with one thread/core for physics etc. So will future GPUs become less complex? I am sure this isn't the way nVidia would like to see it go.
 
Evidence?

Well, it's a general rule of thumb. The drivers for ATI cards work better with AMD boards because, since AMD bought out ATI, they've designed and marketed them in such a way that they want you to use them together, and they can say that because they've made it so.
 
Well, it's a general rule of thumb. The drivers for ATI cards work better with AMD boards because, since AMD bought out ATI, they've designed and marketed them in such a way that they want you to use them together, and they can say that because they've made it so.

In theory then, that could be the future of PC gaming; having both parts complement each other must be the way forward.
 
They are aiming this more at the “8400GS with a Q9550 from a high street shop” crowd.

They would prefer a GTS 250 with a Q8200 as a better pairing for gaming.

For the ultra high end, the combination of an i7 with a 285/295 is still the way to go. Translate this to the real low-end parts, and them trying to educate Joe Public makes complete sense and gets them a better system at the end of the day. :)
 
I can sense a real battle coming in the future of PC hardware. Graphics cards are turning into mini PCs with huge heatsinks, fans and large amounts of on-board memory.
I am sure games could now be written to utilise the CPU rather than the GPU, with one thread/core for physics etc. So will future GPUs become less complex? I am sure this isn't the way nVidia would like to see it go.

I could agree with that, but I am sure we would have seen a shift by now. I can't see nVidia going backwards, or ATI for that matter, but definitely a closer relationship between the CPU & GPU!
 
They are aiming this more at the “8400GS with a Q9550 from a high street shop” crowd.

They would prefer a GTS 250 with a Q8200 as a better pairing for gaming.

For the ultra high end, the combination of an i7 with a 285/295 is still the way to go. Translate this to the real low-end parts, and them trying to educate Joe Public makes complete sense and gets them a better system at the end of the day. :)

All they were doing was disproving Intel's claim that it can improve your gaming performance by over 70%. What this means is up to you to interpret.
 