
*Breaking News* Intel Larrabee is canned

There is no room in the GPU market for a 3rd player. The R&D cost in designing chips is huge, and it's increasing all the time. ATI barely make profits on their current market share.

Yes, but placing a CPU and GPU into a single device has a great return, although it's probably cheaper to partner and forgo the costs of buying in the GPU solution.
 
The problem was down to drivers - if they could convince game devs to develop native engines they had the potential to do awesome stuff, but for DirectX they had to run everything through a translation layer and emulate the fixed functions in software, which costs a lot more cycles than having dedicated hardware for those functions.
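To make that concrete, here's a toy Python sketch (nothing to do with Intel's actual driver code; the function is made up purely for illustration) of the kind of fixed-function work - a per-pixel alpha blend - that dedicated ROP hardware does for free, but a fully software pipeline has to spend general-purpose instructions on for every covered pixel:

    # Toy illustration only: a fixed-function blend stage that dedicated GPU
    # hardware handles in silicon, but a software pipeline must emulate.
    def blend_over(src, dst):
        # Standard source-over alpha blend for one RGBA pixel (floats 0.0-1.0).
        sr, sg, sb, sa = src
        dr, dg, db, da = dst
        out_a = sa + da * (1.0 - sa)
        if out_a == 0.0:
            return (0.0, 0.0, 0.0, 0.0)
        out_r = (sr * sa + dr * da * (1.0 - sa)) / out_a
        out_g = (sg * sa + dg * da * (1.0 - sa)) / out_a
        out_b = (sb * sa + db * da * (1.0 - sa)) / out_a
        return (out_r, out_g, out_b, out_a)

    # A DirectX title expects this for every covered pixel of every triangle;
    # in software that's multiplies, adds and a divide per pixel that
    # fixed-function ROPs would otherwise do without touching the cores at all.
    print(blend_over((1.0, 0.0, 0.0, 0.5), (0.0, 0.0, 1.0, 1.0)))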
 
I think they mean games-wise; I've had a lot of issues trying to get some games working with Intel onboard GPUs.

I would say it's a software/dev issue too. I think a lot of devs had difficulty finding the time to make sure their game works on a niche graphics card, which offers insanely low performance anyway.

I was games testing a few years back on a LARGE movie IP, and support for Intel graphics chip sets was added fairly late into development. It wasn't going to be in there at all, but they thought it best to add it to make sure as much of the market as possible could play the game.

So that's my 2 cents on that issue anyway :)
 
I don't really think Intel will be bothered about the GPU market, to be totally honest, when (IMO) the days of dedicated cards like the GTX / Radeon ranges are numbered. A good article on El Reg the other week commented on how IBM aren't really focusing on the PS3 being nodes in a supercomputer any more and are much more interested in GPGPU-type approaches. The Larrabee announcement coming the same week as Intel demos a 48-core CPU should tell everyone what's coming.

I reckon before long we're looking at highly integrated CPU/GPU technology: banks of non-independent programmable "SPUs" flanked by a 12-24 core CPU, all integrated at a low level for high-bandwidth, low-latency data transfer. It's what the "science bods" need for their work and what gamers need for their fix.

We can't keep using low-bandwidth PCI-esque interfaces for GPUs generating teraflops of processing power. It completely limits how you can use the SPUs on them.
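For a rough sense of that gap, here's a back-of-envelope sketch in Python; the figures are era-appropriate assumptions, not measurements:

    # Back-of-envelope sketch: why a PCIe-class link throttles GPGPU-style work.
    # Numbers are rough assumptions for the era, not measured figures.
    pcie2_x16_bytes_per_s = 8.0e9    # ~8 GB/s per direction, PCIe 2.0 x16
    gpu_flops_per_s       = 1.0e12   # ~1 TFLOP/s of single-precision compute
    bytes_per_flop        = 4        # assume every op wanted one fresh 32-bit float

    demand = gpu_flops_per_s * bytes_per_flop          # ~4 TB/s if fed over the bus
    print("shortfall: %.0fx" % (demand / pcie2_x16_bytes_per_s))   # roughly 500x

    # The only way that works is keeping data resident in local memory and
    # reusing it heavily, which is exactly what limits how you can use the
    # compute units for general-purpose work.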
 
I think Intel should just concentrate on CPUs, or even go for motherboards with their own built-in graphics (like mini-ITX systems) but make them better than Nvidia's, instead of going full-on high-end GPU.
 
I'm not sure exactly what Rroff thinks is bad performance compared to a 280GTX. A quarter of the performance of Nvidia's top-end card right now, on an unreleased card nobody has seen, with non-final drivers and no optimisation at all, is actually downright impressive.

I'm not sure if that was actually 1/4 the speed of a 280GTX or 1/4 the speed they were expecting... and that was extrapolated performance anyhow, as from what I can make out it's based on a prototype that's running a lot slower than that.
 
Intel has the worst graphics track record ever

and almost unlimited resources in terms of cash for R&D

If Intel want to make a market-leading GPU, they will. It's just a matter of time


Sure, their integrated solutions were a bit rubbish, and their initial Larrabee benches were not great

but this is a product in its earliest stages of development that they haven't had time to refine or create fully capable software for

You have to remember, the only time we have seen a Larrabee GPU compared against anything is when they compared a one-of-a-kind, crude engineering sample to that year's best-selling flagship GPU, which had/has excellent ongoing driver support
 
Short of buying up an already successful business or hiring someone with the wisdom and balls to make changes, I don't think Intel will manage a successful entry into the gaming/enthusiast market - no matter how much money they throw at it - their entire approach and mentality is too rigid. They need to engage more with third-party developers and consumers, and be prepared to be a bit more flexible than their current one-size-fits-all approach.
 
I expect they'll keep chugging away at the drivers (especially since I expect they will start using Larrabee tech in their CPUs), and standalone boards will sell well in the HPC / raytracing market. (A 48-core Larrabee theoretically pushes out around 3 TFLOPS SP or 1.5 TFLOPS DP vs the top Fermi board managing around 1.2 TFLOPS SP / 0.6 TFLOPS DP.)
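For what it's worth, that "around 3 TFLOPS SP" figure lines up with a simple peak-rate calculation; here's a sketch, where the clock speed is an assumption since Intel never published final Larrabee clocks:

    # Rough sketch of where "around 3 TFLOPS SP" could come from.
    # The 2 GHz clock is an assumption - final Larrabee clocks were never published.
    cores          = 48
    sp_lanes       = 16      # 512-bit vector unit = 16 x 32-bit floats
    flops_per_lane = 2       # a fused multiply-add counts as two ops
    clock_hz       = 2.0e9   # assumed ~2 GHz

    sp = cores * sp_lanes * flops_per_lane * clock_hz
    dp = sp / 2              # double precision at half rate
    print("SP: %.2f TFLOPS, DP: %.2f TFLOPS" % (sp / 1e12, dp / 1e12))
    # -> SP: 3.07 TFLOPS, DP: 1.54 TFLOPS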

As GPU manufacturers are pushed into spending more transistors on making flexible GPUs, Larrabee starts looking better and better. Eventually Intel's massive manufacturing advantages will start to tell - they can be at least a year ahead of TSMC & Co, allowing them a considerable edge in transistor budgets.
 
The interesting thing about Larrabee isn't necessarily the raw performance but the flexibility it gives you in the pipeline. Because the whole pipeline - including the rasterizer - is implemented in software (and is at least semi-open source, from my understanding), it has the potential to allow some novel rendering techniques. Carmack's proposed sparse voxel octree rendering is one such example.

It's only a matter of time before something better than triangle based rasterization comes along IMO. Larrabee style graphics may be impractical at the moment, at least for the high-end, but it is definitely the future.
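To illustrate what "the rasterizer is just software" means in practice, here's a minimal toy example in Python (not Larrabee code, just a plain half-space triangle fill). The point is that this inner loop is an ordinary function you could replace wholesale with a voxel traverser, a ray caster, or whatever comes next:

    # Toy half-space triangle rasterizer - an ordinary function, which is the
    # whole point: on a software pipeline this stage can be swapped for any
    # other visibility algorithm (voxels, ray casting, etc.).
    def edge(ax, ay, bx, by, px, py):
        # Signed area: which side of edge A->B the point P lies on.
        return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

    def rasterize_triangle(v0, v1, v2, width, height):
        covered = []
        for y in range(height):
            for x in range(width):
                px, py = x + 0.5, y + 0.5          # sample at pixel centres
                w0 = edge(v1[0], v1[1], v2[0], v2[1], px, py)
                w1 = edge(v2[0], v2[1], v0[0], v0[1], px, py)
                w2 = edge(v0[0], v0[1], v1[0], v1[1], px, py)
                # Inside if all three edge tests agree (either winding order).
                if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                    covered.append((x, y))
        return covered

    # Covers one corner of an 8x8 grid with a single triangle.
    print(len(rasterize_triangle((0, 0), (7, 0), (0, 7), 8, 8)))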
 
On a side note, don't all the movie studios that use massive render farms rely on Larrabee-style CPU software rendering for movie CGI, with not a graphics card in sight?...
 