
Intel's Larrabee Architecture Disclosure: A Calculated First Move

Anandtech said:
Oooh this is dangerous.

It started with Intel quietly (but not too quietly) informing many in the industry of its plans to enter the graphics market with something called Larrabee.

NVIDIA responded by quietly (but not too quietly) criticizing the nonexistent Larrabee.

What we've seen for the past several months has been little more than jabs thrown back and forth, admittedly with NVIDIA being a little more public with its swings. Today is a big day: without discussing competing architectures, Intel is publicly unveiling, for the first time, the basis of its Larrabee GPU architecture.

Full 16 page article: http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=3367
 
In AMD/NVIDIA GPUs, DirectX/OpenGL instructions map to an internal GPU instruction set at runtime. With Larrabee Intel does this mapping in software, taking DX/OGL instructions, mapping them to its software renderer, and then running its software renderer on the Larrabee hardware.
Does this mean that these cards will support any future version of DirectX?
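The quoted mapping idea can be sketched roughly like this (a toy illustration only; all function and table names here are hypothetical, nothing like Intel's actual driver code): API-level calls get looked up in a table and dispatched to software routines, so supporting a new API version means patching the table, not changing hardware.

```python
# Illustrative sketch: hypothetical names, not Intel's real driver.
# Idea: API calls map to a software renderer at runtime instead of
# a fixed hardware instruction set.

def sw_clear(state, color):
    """Software 'clear' call: fill the framebuffer with one color."""
    state["framebuffer"] = [color] * (state["width"] * state["height"])

def sw_draw_triangle(state, verts):
    """Stand-in for rasterization: just record the primitive."""
    state.setdefault("primitives", []).append(("triangle", verts))

# The 'driver' is just a lookup table; a new API version would mean
# adding or patching entries here.
DISPATCH = {
    "Clear": sw_clear,
    "DrawTriangle": sw_draw_triangle,
}

def submit(state, call, *args):
    """Translate an API-level call into the software renderer."""
    DISPATCH[call](state, *args)

state = {"width": 4, "height": 4}
submit(state, "Clear", 0x000000)
submit(state, "DrawTriangle", [(0, 0), (1, 0), (0, 1)])
```

On Larrabee the dispatched routines would themselves run as x86 code on the chip's many cores, which is what makes the "patch in support for new versions" speculation below plausible.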
 
Does this mean that these cards will support any future version of DirectX?

I'm guessing it means Larrabee will do DX/OGL support at the driver level, so to me that suggests they should only need to patch in support for new versions. A pretty nice feature if it doesn't hamper performance. :D
 
Wow :)

It looks like future Intel desktop chips will be a mixture of these large Nehalem-like cores surrounded by tons of little Larrabee-like cores. Your future CPUs will be capable of handling whatever is thrown at them, whether that is traditional single-threaded desktop applications, 3D games, physics or other highly parallelizable workloads. It also paints an interesting picture of the future - with proper OS support, all you'd need for a gaming system would be a single Larrabee, you wouldn't need a traditional x86 CPU.

This could potentially open up PC gaming to anyone who buys a Larrabee based computer which is only good news for us gamers. A bigger market could give the industry a much needed boost.

Thinking from a competition and pricing point of view, what worries me is whether AMD and Nvidia will have something in response if Larrabee is successful.
 
That CPU is going to have to be several times quicker than the competition to make up for the software rendering part... and we all know Intel can't write drivers for **** as the G3x/4x launches showed.
 
Well it's not gonna be god-awful software rendering in quite the same way as it would be on a typical CPU, as the heavily threaded architecture of Larrabee lends itself to GPU-style work.
 
Well it's not gonna be god-awful software rendering in quite the same way as it would be on a typical CPU, as the heavily threaded architecture of Larrabee lends itself to GPU-style work.

Yes, that "threaded" architecture is going to have to be amazingly quick to compete with even a low-cost dedicated GPU.
 
That's pretty much what GPUs are though, (recently) hugely programmable, parallel (i.e. threaded) pieces of hardware.
 
I'm confused: so this GPU will be integrated into the CPU? Will we still be able to use another GPU, like Nvidia or ATI, on the same mobo alongside Larrabee? :confused:
 
It does have me wondering: maybe Havendale = dual-core Nehalem with a scaled-down Larrabee attached to the back of it?
 
I wonder if Intel will keep the default driver setting of CTRL + ALT + DownArrow = flips your screen over.

dumbest default setting ever :p
 
Someone raised an interesting point in the comments section of that article: If the drivers are going to translate all the DirectX/OpenGL calls into the typical x86 instruction set, does this mean it could bring seamless DirectX/OpenGL support to OS X and Linux?
 
That was a damned invigorating article!
I truly have a lot of confidence in what Intel has planned.
I think it's also worth noting that their purchase of "Project Offset" is highly connected to Larrabee.
 
That's pretty much what GPUs are though, (recently) hugely programmable, parallel (i.e. threaded) pieces of hardware.

Well, it isn't clear how many threads Larrabee can handle at once. GMA X3100 has 8 unified shaders, but it's often worse than its predecessor in games...

Someone raised an interesting point in the comments section of that article: If the drivers are going to translate all the DirectX/OpenGL calls into the typical x86 instruction set, does this mean it could bring seamless DirectX/OpenGL support to OS X and Linux?

I thought Cedega was trying to do just that.
 
That was a damned invigorating article!
I truly have a lot of confidence in what Intel has planned.
I think it's also worth noting that their purchase of "Project Offset" is highly connected to Larrabee.

Ray tracing has been specifically mentioned in a number of Intel slides IIRC. ;) :D

I thought Cedega was trying to do just that.

Indeed, but maybe we'll see it in driver-level without any need for messing around. :)
 
Wow :)



This could potentially open up PC gaming to anyone who buys a Larrabee based computer which is only good news for us gamers. A bigger market could give the industry a much needed boost.

Thinking from a competition and pricing point of view, what worries me is whether AMD and Nvidia will have something in response if Larrabee is successful.

AMD likely will; that's why they purchased ATI.
 
Ray tracing has been specifically mentioned in a number of Intel slides IIRC. ;) :D

Yeah... man... that game's gonna be sweet! Unlimited funding from Intel :eek::D
I can actually see them releasing it in a bundle with Larrabee cards.

And about ray tracing: all the images and videos of that game are three years old now, and they look better than current games. They were taken a year after development started.
I can only imagine what godly creations could come out of such a prolific microprocessor manufacturer acquiring an independent games studio, with the potential of having the same effect on the gaming industry as Arnold Schwarzenegger on steroids would have on... well... nevermind.

Intel & Project Offset are my new gods.
All hail microprocessors and PC game developers!
 