Difference between i7 CPUs?

Now why should Intel waste money on adding more cores in their consumer CPUs when they already hold such a performance advantage?

Even when you look at AMD's 8-core performance against Intel's quad-core offerings, AMD still suffers.

It is still strange how people judge CPUs by core count. AMD fanboys all too often talk about Intel as though they are behind in the game because they don't offer a mainstream 8-core part, completely ignoring that the only reason AMD have an 8-core processor at all is that their eight cores are only roughly as powerful as Intel's quads.

They bandy around benchmarks like those posted above, showing an FX-8350 beating an IB-E by 1-2 fps in a heavily GPU-bottlenecked situation (results well within the margin of error), or compare encoding times, a best-case scenario for lots of small cores, while completely ignoring the games where AMD's processors get left for dead by even an i3.

You only have to look at the state of the new consoles to see that lots of weak cores is not all it's cracked up to be. Developers are already talking about CPU bottlenecks preventing them from maintaining a solid 60 fps, so games are being capped at 30 fps; that would have been far less of an issue had the consoles had 2-4 much faster cores. No matter how well code is optimised for multiple cores, there will always be cases where fewer, faster cores are beneficial.
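To put rough numbers on that last point, here is a back-of-the-envelope sketch (not a benchmark) using Amdahl's law: a 60 fps target leaves about 16.7 ms of CPU time per frame, a 30 fps target about 33.3 ms, and the chunk of per-frame work that can't be split across cores only shrinks if the individual cores get faster. The workload size and the 40% serial fraction below are made-up illustrative numbers.

```python
# Back-of-the-envelope sketch (not a benchmark): why a serial chunk of
# per-frame work favours fewer, faster cores. The 40 ms workload and the
# 40% serial fraction are hypothetical numbers for illustration only.

def frame_time_ms(single_core_ms: float, cores: int, serial_fraction: float) -> float:
    """Amdahl's law applied to one frame's CPU work."""
    serial = single_core_ms * serial_fraction
    parallel = single_core_ms * (1.0 - serial_fraction) / cores
    return serial + parallel

BUDGET_60FPS = 1000.0 / 60.0   # ~16.7 ms of CPU time per frame
BUDGET_30FPS = 1000.0 / 30.0   # ~33.3 ms of CPU time per frame

# Hypothetical game: 40 ms of CPU work per frame on one weak core, 40% serial.
weak_eight = frame_time_ms(40.0, cores=8, serial_fraction=0.4)        # 19.0 ms: misses 60 fps, fine at 30 fps
fast_four  = frame_time_ms(40.0 / 2.0, cores=4, serial_fraction=0.4)  # cores twice as fast: 11.0 ms, hits 60 fps

print(f"8 weak cores: {weak_eight:.1f} ms per frame vs {BUDGET_60FPS:.1f} ms budget")
print(f"4 fast cores: {fast_four:.1f} ms per frame vs {BUDGET_60FPS:.1f} ms budget")
```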
 
Lots of tangents here, and mmj is only reiterating what many on here are bored of hearing.

Without prolonging the Intel vs AMD debate, there is no contest when you're in the £90-100 tier. It has been mentioned on here, and it's completely true, that if you're buying top GPUs and other high-end components you would not be considering an FX combo!

Boring fanboy comments and negative AMD slants only highlight that haters gonna hate. If you went through a period in the last couple of years where the pennies counted and chose an FX, you would have witnessed first hand that for the money they are great performers.
 
Ahh, being paid to say it, classic argument :p
I also left open the alternative of your being dishonest.

I agree SLI isn't normal and neither is 4K gaming, but being GPU-bound is very much the norm. SLI reduces the GPU bottleneck and so places more emphasis on the CPU, so this abnormality should be helping your side of the debate.
a) You get a significant performance drop in games going from a 2500K to an 8350.
b) Again, you can buy a better Intel processor for the same kind of money as an FX-83xx.

I'll just skip over the next bit about emerging 4K because I don't have time to list all the reasons it wasn't the best reasoning out there.

As I named a specific processor, the 8320, which you quoted, I'm curious which apparently superior i5 you're going to buy for the same money that makes my argument so wrong?

But with a limited budget it is almost always better to save money on the CPU and spend it on the GPU - and Intel have nothing to match a clocked-up 8320 for £95.
£105, not £95. And yes, you need to spend £125 vs £105 to move to the i5 4440. But David, finding a £20 gap in Intel's pricing doesn't change Intel and AMD's relative value for money. Surely you realise that?

You're the one who claimed AMD were NEVER value for money, which is the blanket statement. In case you've forgotten, what you said, with no reference to any single processor comparison, was:
I said they are never better value for money than Intel, even though as I've said the 8320 gets reasonably close.


If we have a limited budget for a full system, which is almost always the case, then any money we save on the CPU can be spent on the GPU. If we're going to be GPU-limited anyway, then saving even a few pounds to allow us a better graphics option is well worth it. If that money doesn't get us anything better in the GPU department then sure, spend it where you want; that doesn't make one CPU bad value for money.
But what does make it bad value for money is if it's worse value for money than another chip. See above.
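As a minimal sketch of that fixed-budget argument: the £20 CPU price gap comes from the exchange above, while the total budget and the lump sum for the rest of the parts are hypothetical placeholders, not a real parts list. With a fixed total, whatever the CPU doesn't take is exactly what the GPU can.

```python
# Minimal sketch of the fixed-budget trade-off discussed above. The £20 CPU
# price gap is taken from the thread; the £500 total and the lump sum for the
# other parts are hypothetical placeholders, not recommendations.

TOTAL_BUDGET = 500.0   # illustrative whole-system budget
OTHER_PARTS = 250.0    # board, RAM, case, PSU, storage as one illustrative lump sum

def gpu_budget(total: float, cpu_price: float, other_parts: float) -> float:
    """Whatever the CPU and the rest of the build don't take is GPU money."""
    return total - cpu_price - other_parts

for cpu_name, cpu_price in [("~£105 CPU", 105.0), ("~£125 CPU", 125.0)]:
    print(f"{cpu_name}: £{gpu_budget(TOTAL_BUDGET, cpu_price, OTHER_PARTS):.0f} left for the GPU")

# The £20 saved on the CPU is £20 more of GPU; whether that buys a meaningfully
# better card at the margin is the real point of contention in the thread.
```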

Lots of tangents here, and mmj is only reiterating what many on here are bored of hearing.

Without prolonging the Intel vs AMD debate, there is no contest when you're in the £90-100 tier. It has been mentioned on here, and it's completely true, that if you're buying top GPUs and other high-end components you would not be considering an FX combo!

Boring fanboy comments and negative AMD slants only highlight that haters gonna hate. If you went through a period in the last couple of years where the pennies counted and chose an FX, you would have witnessed first hand that for the money they are great performers.

I'm sorry Th0nt, I've played with a lot of Bulldozers and Piledrivers over the last few years, and these Bulldozer-derived cores are not up to the task of beating the Blue Team in any meaningful sense. That's just reality; everyone else has moved on.
 
Can the mods just delete all the posts about AMD? I've got one myself, but I'm interested in the original question. I want informed reasoning to base future buying decisions on.
 
I'm sorry Th0nt, I've played with a lot of Bulldozers and Piledrivers over the last few years, and these Bulldozer-derived cores are not up to the task of beating the Blue Team in any meaningful sense. That's just reality; everyone else has moved on.

I am not here to justify any such purchase, or wallow in pity. I bought what I did over a year ago and it has done all I need, without any of the negative placebo effects most bang on about.

I have also built many an Intel machine for people and got my hands on the kit to see first hand. I don't game at 4K. I stopped playing MMOs some time ago. I don't need a £300 CPU to play the games I have installed, and it certainly isn't required for the work spreadsheets and databases which, unfortunately, I tend to do more of these days.

So bully for you and the other superior beings who spent that extra £50 and went Intel! At least that's your justification, as it cost more when I looked.

I built the whole rig for under £500, which included the Carbide 540 (which was a ton on its own). It was impossible to get anywhere near that with an Intel option. Sadly, I was made redundant over that period and every penny mattered.

If I were buying a few months back, with the comfort of a full-time job, I would just save for an i7 and be done with it. However, jumping to that setup from what I have is not worth the money; it will be a different story in a couple of years. :cool:
 
Can the mods just delete all the posts about AMD? I've got one myself, but I'm interested in the original question. I want informed reasoning to base future buying decisions on.

Fine.

0th generation Core 2 Solo/Duo/Quad 'Conroe': Debuting on 65nm, with a later die shrink giving 45nm variants, this was the first decent Intel processor since the Pentium 3. Core 2 was based on the mobile-only 'Core' chip, which was arguably the real zeroth-generation 'Core'. Either way, what I'm calling the zeroth generation (Core 2) was a massive improvement on the Pentium D and Athlon 64.


1st generation Core i7/5/3 'Nehalem': Debuted on 45nm and later moved to 32nm. A significant improvement which moved the memory controller off the northbridge and onto the CPU die, and was Intel's first 'monolithic quad core', i.e. a quad core made from a single piece of silicon rather than two dual-cores stapled together in a Multi-Chip Module. FYI, the first x86/x64 monolithic quad was the original Phenom, which you have a die-shrunk version of, Talon.


2nd Generation Core i7/5/3 'Sandy Bridge': Started on 32nm and missed the planned move to 22nm when Intel's die shrink slipped. Sandy was a fairly so-so improvement CPU-wise which targeted two things: 1) TDP, making the chips more mobile-friendly, and 2) better integrated graphics. It was also the first mainstream performance Intel CPU to include an integrated GPU as standard.


3rd Generation Core i7/5/3 'Ivy Bridge': By the time 22nm was ready, Intel's design team were ready to improve the design (ever so slightly). Ivy thus mixed a tick and a tock: it had some architectural improvements, but it was primarily a die shrink and the first implementation of the '3D' Tri-Gate transistor design. In the end it further shifted the design towards low TDP, improved the GPU and was mobile-geared. It overclocked slightly worse in MHz than Sandy Bridge and was marginally faster per MHz.


4th Generation Core i7/5/3 'Haswell': Debuted on 22nm and will stay there, as 14nm has been too difficult to fab. It refines the 3D transistor design a bit and is slightly faster per MHz, but doesn't overclock any better than Ivy. However, Haswell does have greatly improved graphics (particularly on mobile with the Iris Pro GPU, which reintroduced Multi-Chip Modules for graphics memory/L4 cache) and uses much less power (at low speeds), making it a truly mobile-oriented CPU which just happens to have a high-power desktop cousin. It is the first chip that can legitimately be found anywhere from tablets to desktops.


So to sum up: the zeroth gen gave a largely new design, and the first improved upon it with many new (and many borrowed-from-AMD) design choices. The second, third and fourth all brought similar changes, giving small improvements (averaging about 7% per generation, iirc) while moving the Core series towards being a mobile CPU with a significant GPU on board. This is why they are very similar for desktop users.

The point of all this has been to:
- Deliver better CPUs for servers (which are low volume but massively more profitable per chip) and, to be honest, the only CPU customers who genuinely need more grunt.
- Keep, or slightly increase, the distance between Intel and AMD on desktop whilst enticing enthusiasts to buy new chips.
- Move the Core series towards mobile and tablet applications by lowering Thermal Design Power (TDP) and adding and improving the onboard GPU.



My pick for you, Talon, would be to keep the X4 until Broadwell is released, as that will be on 14nm, will have some genuine architectural changes, and may finally see MHz increase again.

Derailment undone IMHO.
 
Lots of tangents here, and mmj is only reiterating what many on here are bored of hearing.

Many of us are bored of hearing that Intel are the devil because they're milking everyone of their life savings, when they're doing basically the same thing that has driven PC technology forward since the 90s.

I suppose arguments against Intel are very limited though when AMD haven't done anything meaningful in years and have admitted that their architecture was a mistake and that they're scrapping it.
 
1st generation Core 2 Solo/Duo/Quad 'Conroe':
...
2nd generation Core i7/5/3 'Nehalem':
...
3rd Generation Core i7/5/3 'Sandy Bridge':
...
4th Generation Core i7/5/3 'Ivy Bridge':
...
5th Generation Core i7/5/3 'Haswell':

Not sure where you got that "generation" list from, but Intel describe Haswell as a 4th generation Core product / Ivy as 3rd / Sandy as 2nd / and Nehalem as 1st.

sources:
http://www.intel.com/content/www/us/en/processors/core/4th-gen-core-processor-family.html
http://ark.intel.com/products/codename/42174/Haswell#@All
http://ark.intel.com/products/codename/29902/Ivy-Bridge#@All
http://ark.intel.com/products/codename/29900/Sandy-Bridge#@All
 
Not sure where you got that "generation" list from, but Intel describe Haswell as a 4th generation Core product / Ivy as 3rd / Sandy as 2nd / and Nehalem as 1st.

That's correct; the Core and Core 2 were a different product set to the Core i line.

Hmm. They're not a different product in the sense that the first i7/i5/i3 were very directly descended, in terms of design, from the Core 2 and the Core (one). Still, Blue Team's chips, Blue Team's rules. Edited; thanks for the correction, Armageus.
 
Many of us are bored of hearing that Intel are the devil because they're milking everyone of their life savings, when they're doing basically the same thing that has driven PC technology forward since the 90s.

I suppose arguments against Intel are very limited though when AMD haven't done anything meaningful in years and have admitted that their architecture was a mistake and that they're scrapping it.

Nobody is saying "Intel are the devil". The irony with the mirror-image fanboys is that my last system was based on a C2D (E8500), and I have three laptops and a Dell Dimension right next to me, all with Intel processors... let me count for you: that's FIVE Intel machines to my ONE AMD machine. :confused:

Whatever axe you have to grind with AMD, they paved the way with 64-bit, and if I recall correctly Intel backtracked and U-turned on their own architecture, going back to the P6 as it was more efficient. :rolleyes:
 
I have to say that Th0nt seems pretty reasonable about things, mmj. Perhaps a little wistful for AMD, but not exactly illogical about it; something he's taken pains to point out to both of us.
 
If Intel weren't so obsessed with iGPUs we would have 6-core desktop chips in no more die area than the current chips with their pointless iGPUs.
 
My venerable i7 2600K, OC'd to 4.8GHz, keeps up with the latest Intels. The i7 is the first chip I have ever had that has been future-proof to some small extent, at least for a few years. The on-board graphics on the newer-gen Intel chips seem better and more efficient, but not worth buying new kit for, as most people will have a decent dedicated graphics card anyway.
 