
Core i7 Extreme Vs Core 2 Quad Extreme

Some interesting food for thought for you guys on what gains Core i7 has actually made.

Two systems, both with clock speeds of 3.30GHz: one with a Core i7 Extreme 965 coupled with 6GB of RAM, the other with a Core 2 Quad Extreme QX9650 and 8GB.

Both systems used 3x Nvidia GTX 280 graphics cards on benchmarks where the CPU should be the limiting factor.

Have a look at these babies:

[Vantage screenshot: QX9650 @ 3.30GHz]

[Vantage screenshot: i7 965 @ 3.30GHz]

A good little comparison. I thought it might be interesting to see what a decently clocked Core 2 rig scores with similar hardware, and what it would take to match the i7 at 3.30GHz...

[Vantage screenshot: QX9650 @ 4.20GHz]

Even at 4.20GHz the Core 2 is still bottlenecking the graphics cards in Tri-SLI. The Core i7 gear, although pricey, certainly has Core 2 beaten.

From the results above, to get a GPU score in Vantage similar to the i7 rig's, the Core 2 would have to be clocked at 5.10GHz or beyond.
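
If anyone wants to reproduce the extrapolation, it's just a straight-line fit between the two QX9650 runs, assuming the GPU score scales roughly linearly with CPU clock while the CPU is the bottleneck. The scores in this sketch are placeholders rather than the exact figures from the screenshots, so it shows the method, not the real numbers:

    # Rough sketch of the clock-for-clock extrapolation (Python).
    # The scores below are PLACEHOLDERS for illustration, not the actual
    # figures from the screenshots above.
    c2_points = [(3.30, 14000), (4.20, 16500)]  # (clock in GHz, Vantage GPU score) for the QX9650
    i7_target = 19000                           # placeholder GPU score for the i7 965 @ 3.30GHz

    # Assume the score scales roughly linearly with clock while the CPU is the bottleneck.
    (c1, s1), (c2, s2) = c2_points
    score_per_ghz = (s2 - s1) / (c2 - c1)
    needed_clock = c1 + (i7_target - s1) / score_per_ghz
    print(f"Core 2 clock needed to match the i7 score: ~{needed_clock:.2f} GHz")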
 
As much as I like to play a good game of Vantage, some game numbers would be nice.

That wasn't the purpose of what I was aiming for; I was using it as an easy-to-use and reproducible way of seeing what clock speed the CPU would need to hit the same score. There are thousands of reviews around the net of i7 performance in games, but I have yet to find a clock-for-clock review or comparison.

I do have some DMC4 benchmark screenshots I can post up for the 3.30GHz i7 vs the 4.20GHz Core 2. Will get them up in a short while.
 
[DMC4 benchmark screenshot: i7 965 @ 3.30GHz]

[DMC4 benchmark screenshot: QX9650 @ 4.20GHz]

Same sort of results with the DMC4 benchmark.
1920x1200 - 16xQ - Vsync off - 60FPS Disable - All on Super High (Maximum Settings)

Keep in mind that the 3.30GHz i7 chip I have here can do 3.93GHz quite easily on an Akasa Nero cooler, and 4.00GHz when the volts are really flying.
 
The cards are still bottlenecked in Vantage at 3.30GHz; it still shows large gains for each 100MHz you stick on the i7.
 
Just what I wanted to hear, Yewen. :)

Wallet's gonna find it difficult to stay shut this xmas. :D

gt

:D

Just go for the i7 920 with a good setup of cooling, pair with some decent RAM and away you go.

Internally, Gibbo and I are finding Kingston and Cellshock to be the best two brands at the moment; the 1800MHz Kingston I have in the Rampage II Extreme handles 2000MHz if you are prepared to work with it.
 
Excellent. Thanks for the heads up re: the DDR3! :)

Yeah, the chip would be going under water with a P120.2 in the loop (system linked in my sig), which should hopefully be sufficient for a 4GHz+ clock.

gt
 

If I can manage 3.8GHz on an Akasa air cooler, and the stock cooler is getting to 3.60GHz+ stable with good temps, I think it is a reasonable target. ;)
 
Is it only me who can only load the first image in both posts? The fact is Vantage has a CPU score in it; it's not a game, it doesn't behave like a game, and at no point in a game is the CPU doing rendering just to see how fast it is and get a score.

Games: no difference, Tri-SLI isn't CPU bottlenecked. You just have to realise that having Tri-SLI for 1920x1200 is completely ridiculous; it's designed and aimed at a higher resolution, and when playing at that resolution you are GPU limited again.

You get yourself a CPU, say a quad core at 3GHz: it's capable of 250fps at 1024x768, it's capable of 250fps at 1920x1200, and it's capable of 250fps at 2560x1600. The difference is the graphics cards; just because you get an increase at one resolution doesn't mean you will at the resolution you will actually game at.
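
Rough illustration of that point (the frame rates here are made-up numbers, just to show the shape of the argument): what you actually see is more or less the lower of what the CPU can feed and what the GPUs can draw at that resolution.

    # Illustrative only: hypothetical numbers to show the CPU fps ceiling argument.
    cpu_cap = 250          # fps the CPU can prepare frames at; roughly resolution-independent

    gpu_fps = {            # fps the graphics cards could render at each resolution (hypothetical)
        "1024x768": 400,
        "1920x1200": 180,
        "2560x1600": 90,
    }

    for res, gpu in gpu_fps.items():
        effective = min(cpu_cap, gpu)
        limiter = "CPU" if gpu > cpu_cap else "GPU"
        print(f"{res}: ~{effective} fps ({limiter} limited)")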

The i7, as shown in every review that doesn't pit a dual core against a brand spanking new quad core (the Guru3D review), makes no difference at any resolution you would actually game at with your card; some games faster, some slower. Buy i7 if you encode more than once a month, or work with 3D rendering, Photoshop or whatever else; for everyone else it's a waste of cash. Phenom 2 and the midrange i7s will be a far better option for an upgrade, but again neither will offer any gaming improvements at all.

EDIT: can you rerun at a higher res with AA/AF maxed as well, rather than default settings, assuming you can? I only ran it the once, ages ago. You might also want to notice that the pure graphics tests (the feature tests) show all but identical results; only the tests that are CPU limited are showing an increase, which isn't surprising.
 

Who said I was aiming for another game round-up? Lots of review sites can do that much better than I can with the time I have.

It is a comparison of what the two systems need to get the same score in Vantage... what the difference in clock speed is for the same performance.

To that effect I think it was a success. I just do not want to do "another" i7 round-up with the same benchmarks that everyone else is running; you can get that from 50+ websites already.
 

Thanks for the heads up Yewen! I think most Tri-SLI 280 owners will be gagging for the i7/Intel chipsets :D
 
You get yourself a CPU, say a quad core at 3GHz: it's capable of 250fps at 1024x768, it's capable of 250fps at 1920x1200, and it's capable of 250fps at 2560x1600. The difference is the graphics cards; just because you get an increase at one resolution doesn't mean you will at the resolution you will actually game at.

Resolution's not the thing though; SLI scales with CPU power.

Look at this page and scroll down to the CPU benchmark table:

http://www.anandtech.com/video/showdoc.aspx?i=3183&p=4

Bioshock and Oblivion don't see much of a gain from raising the CPU power, but in Crysis the SLI setup is getting bottlenecked by the CPU, and we see a difference of 13FPS (30FPS -> 43FPS) going from a QX9650 at 2GHz to one at 3.3GHz with three 8800 Ultra graphics cards, at High settings, 1920x1200. Obviously the benefit of a faster CPU than a fast Core 2 processor in gaming terms is limited to a niche, but that niche certainly exists, and CPU power can help in these sorts of situations.
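
For context, that's a 3.3/2.0 = 1.65x clock increase giving a 43/30 ≈ 1.43x frame rate increase, so at that resolution the Tri-SLI setup is picking up most, though not all, of the extra CPU speed.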
 