Will inbuilt graphics kill desktop CPUs?

Hello

After reading a multitude of Ivy Bridge reviews, it seems to me there is a worrying trend developing with regard to desktop CPUs.
Integrated graphics on the CPU die seems to be the way of the future for Intel and AMD. While this is great for laptops, where does it leave desktops? I for one have absolutely no use for an integrated graphics chip on my CPU. If I want graphics, I buy a specialist card.
Looking at the Ivy Bridge heat issues, could they be linked to the improved graphics chip on the processor? Maybe it's causing the extra heat. If the graphics weren't embedded, or they had kept the same HD3000, would this chip have blown the SB chips out of the water for overclocking?
Will there ever be a CPU in the future that is just that, a CPU?!

(I'm bored at work and just had to get this off my chest ;))
 
I just find it puzzling to put an onboard GPU on a top-of-the-range desktop chip. Anyone spending £300 on a CPU is going to be backing it up with a proper card.

I guess it's for OEMs, but then you might as well just buy a laptop with Llano in it.
 
No, integrated GPUs are a long way off competing with graphics cards that can handle modern gaming at good settings.
 
No, integrated GPUs are a long way off competing with graphics cards that can handle modern gaming at good settings.

I agree that it's a long way off, but it seems to me this is the area they're focusing on more and more.
We desktop gamers seem to be a dying breed thanks to consoles and those social networking sites with games etc.
I suppose I'm just worried that the (poor) inbuilt graphics will take performance away from the raw processing speeds they could potentially be reaching.
Damn mass market!
 
I tried to make this point last night. It's like the iGPU is spread across the range the wrong way around. If they could keep just the best parts of it, the HD2000's video encoding etc., that would keep its size down, which would seem like a better option for the 'K' model chips; or they could even produce some without any iGPU. Anyway, I think they could have made four more models, a black edition series or something, which would have allowed better spacing of the transistors and provided greater OC headroom as a result.

http://forums.overclockers.co.uk/showthread.php?t=18395879&page=4
 
I feel the heat issue with the Ivy Bridge chips is more to do with the 3D transistors than the integrated graphics. If you think about it logically, only a small part of a 3D transistor is dissipating heat into the spreader/heatsink. The sides that aren't facing up just dump their heat into the surrounding components, causing the heat build-up we're seeing.
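
As a rough back-of-the-envelope, you can compare the average heat flux across the two dies. The TDP and die-area figures below are commonly quoted ballpark numbers, treated here purely as assumptions for the arithmetic:

```python
# Rough power-density comparison between two quad-core dies.
# TDP and die-area figures are ballpark assumptions, not measured data.

def heat_flux(tdp_watts, die_area_mm2):
    """Average heat flux across the die in W/mm^2."""
    return tdp_watts / die_area_mm2

sandy_bridge = heat_flux(95, 216)  # assume ~95 W over ~216 mm^2
ivy_bridge = heat_flux(77, 160)    # assume ~77 W over ~160 mm^2

print(f"Sandy Bridge: {sandy_bridge:.2f} W/mm^2")
print(f"Ivy Bridge:   {ivy_bridge:.2f} W/mm^2")
print(f"Increase:     {ivy_bridge / sandy_bridge - 1:.0%}")
# The die-wide average only rises ~9%, so local hotspots and how well
# heat escapes the fins into the spreader matter more than the average.
```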

In the long term, I think iGPUs will benefit gamers massively. Just look at the Lucid Virtu tech. It's very early days, but we're already seeing some FPS gains in certain circumstances. The performance increase will only get bigger as the technology matures and we get more powerful iGPUs.
 
So, as thenewoc said:

I tried to make this point last night. It's like the iGPU is spread across the range the wrong way around. If they could keep just the best parts of it, the HD2000's video encoding etc., that would keep its size down, which would seem like a better option for the 'K' model chips; or they could even produce some without any iGPU. Anyway, I think they could have made four more models, a black edition series or something, which would have allowed better spacing of the transistors and provided greater OC headroom as a result.

http://forums.overclockers.co.uk/showthread.php?t=18395879&page=4

The extra space used by the IGP could have been utilised to spread out the transistors, thus negating the heat issue and potentially allowing increased overclocks (in my tiny mind that makes sense; I may be completely wrong). Are Intel missing a trick here? Could they manufacture exactly the same chip without the IGP, rebadge it and make us all happy?
 
No, 22nm is the half-pitch (half the distance between identical features); if you spread it out, it goes back to 32nm or something.
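
For a sense of the numbers, here is the ideal area scaling between the two nodes. This is a toy calculation; real layouts never shrink perfectly:

```python
# Ideal area scaling between process nodes: linear features shrink by
# new/old, so area shrinks by the square of that ratio.
# Toy calculation; real designs never achieve the full ideal shrink.

old_node_nm = 32.0
new_node_nm = 22.0

area_ratio = (new_node_nm / old_node_nm) ** 2
print(f"Same logic at 22 nm takes ~{area_ratio:.0%} of its 32 nm area")
# -> ~47%; "spreading the transistors out" just hands that density win back.
```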
 
The extra space used by the IGP could have been utilised to spread out the transistors, thus negating the heat issue and potentially allowing increased overclocks (in my tiny mind that makes sense; I may be completely wrong). Are Intel missing a trick here? Could they manufacture exactly the same chip without the IGP, rebadge it and make us all happy?

If they did redesign a chip without an iGPU, then I don't see why they would waste silicon making the CPU bigger. They could fit more of the compact CPUs per wafer at their current transistor density and create better margins. That is most likely what they'd do.
Also, if I remember rightly (I'm sure someone will correct me soon enough if I'm wrong), the physical distance signals have to travel also plays a role in maximum clock speed, so you might lose some OC headroom; but I guess that would depend on whether a very early heat barrier is killing the clock speed before the design/transistor limitations do.
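
On the distance point, here is a quick sanity check of how far a signal can even travel in one clock period. The 0.5c effective propagation speed is an assumption; real on-die wires are RC-limited and considerably slower:

```python
# How far can a signal travel in one clock period?
# The 0.5c propagation speed is an assumption; real on-die wires are
# RC-limited and considerably slower.

SPEED_OF_LIGHT = 3.0e8       # m/s
clock_hz = 4.5e9             # a typical heavy Sandy Bridge overclock
effective_speed = 0.5 * SPEED_OF_LIGHT

period_ps = 1e12 / clock_hz
distance_mm = effective_speed / clock_hz * 1e3

print(f"Clock period:           {period_ps:.0f} ps")
print(f"Distance in one period: {distance_mm:.1f} mm")
# ~33 mm per cycle: once logic depth, wire delay and timing margins are
# taken off, signal distance really can limit achievable clock speed.
```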
 
AMD are way behind Intel, so I think Intel don't need to waste time and money producing large chips that provide cutting-edge performance for the mainstream. They can already afford to segment the market by forcing overclockers to buy the more expensive unlocked K chips.

Those willing to pay for the absolute best performance have SB-E and will get IB-E: expensive chips.

The demise of AMD has been very bad for competition.
 
AMD themselves are certainly partly to blame, but it's not just that...

There are aspects, including some games, where an FX-8### performs better than a 2500K and sometimes even a 2600K.

But go to most forums and put those benchmarks up, and you would be amazed how fast the thread fills with Intel fanboys behaving like spoilt six-year-olds whose chocolate has just been stolen. Some of them can get quite demented.

AMD have such fanboys too, yet the glory supporters are always the vastly larger group.

Not many people these days would dare to go against the grain and turn left when everyone else goes right.

I paid £40 less for my CPU than the apparently sensible option would have cost.

Here's the thing: media encoding is multi-threaded x264 these days, and my X6 beats the 2500K there hands down, so I can encode my movies faster with what I have.
It also renders at least as well, if not at a higher rate.
It packs and unpacks archives just as fast...
It will handle BF3 at exactly the same frame rates (when and if I get a 7870) as the 2500K would.

What more is there?
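
The "more slower cores vs fewer faster cores" trade-off in encoding is easy to model with Amdahl's law. A minimal sketch, where the per-core speed figures and the 95% parallel fraction are illustrative assumptions rather than benchmarks:

```python
# Toy Amdahl's-law model of a multi-threaded x264 encode.
# Per-core speeds and the parallel fraction are illustrative assumptions.

def throughput(parallel_fraction, cores, per_core_speed):
    """Throughput relative to a single core of speed 1.0."""
    serial = 1.0 - parallel_fraction
    return per_core_speed / (serial + parallel_fraction / cores)

P = 0.95  # assume x264 keeps 95% of the work fully threaded

x6 = throughput(P, cores=6, per_core_speed=0.8)  # six slower cores
i5 = throughput(P, cores=4, per_core_speed=1.0)  # four faster cores

print(f"Hypothetical X6:    {x6:.2f}x")
print(f"Hypothetical 2500K: {i5:.2f}x")
# With a highly parallel workload the six slower cores come out ahead,
# which is why core count can trump per-core speed for encoding.
```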

AMD have an image problem: partly their own fault, partly unwarranted, and partly down to trolling immature idiots.

If things don't improve all round, those same idiots will get what they're after, their favourite team totally dominating; what will come as a nasty surprise to them is the result of that.

I'm looking forward to what PB can do for me.
 
I just find it puzzling to put an onboard GPU on a top-of-the-range desktop chip. Anyone spending £300 on a CPU is going to be backing it up with a proper card.

I guess it's for OEMs, but then you might as well just buy a laptop with Llano in it.

Try using Lucid Virtu MVP on a Z77 board. It uses both the GPU and the IGP to improve input lag and graphics performance, so IGPs are starting to be used in the mainstream now.
 
I would say two more generations of AMD Fusion chips and the built-in GPU will match not just entry-level but also mainstream graphics, which should be plenty to run most games at max settings on normal setups. High-end and super-high-end will still need discrete cards though, and hopefully AMD will have caught up on CPU performance by then.
 