
What will Intel's solution be for the GPU "wars"?

Soldato
Joined
12 May 2005
Posts
12,631
What do people think they will do?

Create a GPU similar in design to, say, NV or ATI, or use their knowledge of CPUs to bring out a more familiar (to them) design?
 
I think they'll leave NV and ATI to it tbh.
I really don't think they'll want to get involved.
On the other hand... YAY for quad-core GPUs :D
 
Larrabee is Intel's development for GPU processing. It's equivalent to the processing cores on a graphics card. I don't think they have the texture units for it yet, but it's certainly got a lot of parallel in-order computing power, and is likely scalable to hundreds of cores.

Intel absolutely want to get involved, as they will want something which can match AMD's Fusion. Most likely the long-term plan would be to incorporate a few blocks of Larrabee cores into the CPU, then, for people who want faster graphics, a whole ton more Larrabee cores on a separate card, and allow all the cores to work together, be they on the CPU or on the plug-in card.

Intel are way ahead of TSMC, which fabs a lot (most?) of Nvidia's chips. I know Nvidia have some deals with IBM, but aren't TSMC fabs making the majority of Nvidia's products?

Pretty sure that 240 Larrabee cores on a 32nm process would be a lot smaller than the new GeForce GPUs. But don't expect Larrabee any time soon: the first samples are rumoured to be done by the end of the year, and it's quite a move from a low-core-count sample to a high-core-count working GPU product.
 
^^^^ Larrabee is Intel's response. As pointed out above, there is a huge developing market that Intel want to be involved in, and they have the upper hand in terms of manufacturing.

That said, all the information we've received has painted Larrabee as a pure GPGPU device, perhaps sacrificing some degree of raw processing power in favour of compatibility and ease of programming.

It's possible that, some time later, we could see a gaming-GPU version of the Larrabee architecture, but I'm not so sure that will ever happen.
 
At the moment I think they'll be sitting back and laughing at Nvidia during a short break. At the same time, they'll be looking at the price Nvidia are able to charge, and thinking "we want some of that".
 

This is the key point. TSMC's fab for the GT200 is still on the 65nm process, although moving to 55nm in the next few months. It's still MILES behind Intel's fab technology, which means that Intel can in theory produce much cooler chips for far less money, since the smaller die size gets them way more GPUs per silicon wafer. If Intel ever do get going, both NV and ATI are potentially in v deep poo-poo!
 
IIRC Intel are already the world's largest graphics chip company, in terms of chips produced. Some of their more recent chips are actually pretty decent, and can deliver the kind of performance you need for '2D' (Aero/Compiz) these days.

Intel have always made graphics chips that deliver good 2D and passable 3D. I guess that with Vista, and how that is driving the graphics card market, Intel will need to step up a level, but I doubt they will ever go for building 'the fastest' card. I imagine they will continue to fill their boots being the best-value product.
 
Larrabee has a lot of potential in the gaming markets too, though; it all depends how Intel want to proceed. Imagine a high-end GPU running a CUDA/PhysX engine: sure, it might be 15-20 times faster than a quad-core processor, but what impact does dedicating so many of the Nvidia cores to PhysX have on a game's FPS?

Larrabee would certainly be able to be used as a hardware Havok accelerator, for example; it could also do a shedload of the work that the CPU is still doing in most game engines.

Picture a gaming system where the CPU is basically just controlling the overall system (combining input from devices, managing the network, and organising everything), with Larrabee running a lot of the heavy maths and then handing its results over to a high-end GPU for AA/AF etc. No need for SLI, just split the workload between the various processing resources available.

Larrabee is supposed to have a powerful vector unit as part of it. What it currently appears to lack are texture units.

Pretty sure Intel will eventually wish to market these parts to gamers, but perhaps to enhance standalone GPUs rather than replace them.
 
I think I read somewhere that Intel realise their faults with graphics cards, and were pushing for some "new technology". I forget what it's called, but it was in a recent PC Format/Custom PC mag.
 
Intel's history with graphics cards is a joke; their integrated graphics has gone without working drivers for literally years.

Strange, they are extremely good under FreeBSD (Linux too, I expect). There's a whole team at Intel working on them :P

Larrabee looks extremely interesting, and I'm sure Intel want a piece of the 'enthusiast' market: you can even buy an Intel-branded mobo on OcUK. You normally have to hunt around a bit for the Intel boxed boards, but they have awesome NICs.

Everything software-wise at Intel atm is about efficient MT/MP techniques, and I expect they will be able to come up with some sort of interface for Havok and Larrabee that allows some sort of offload. I'm actually very interested in Larrabee as a very open, programmable vector array. You could very easily use this to accelerate video codecs, and seeing how (allegedly) it will be programmed as an x86 instruction-set extension, it's immediately open to being used by anyone with a compiler. :)
 

I don't recall a single Intel graphics card ever being available. All their solutions are chipset-integrated, with only the most basic 3D features and performance. The 3D performance market, particularly since the advent of unified architectures and GPGPU applications, is another matter entirely.
 

I must say that the few integrated GPUs I have seen in people's PCs have always worked fine and installed drivers OK. They are, though, a little useless for a game of Quake.
 

I've not had any problems with the ones I use at work either; they've always been stable. They offer pretty poor performance for 3D work of course (I use Java 3D occasionally...), but anyone who needs 3D performance will simply buy an add-in card. For well under £100 you can improve your performance by a factor of 20 easily.
 