
Nvidia CUDA ace to move on to AMD

Please read my post; the story itself is wrong. Hegde is not responsible for CUDA: he founded Ageia, and after Nvidia bought it he took the position of VP of PhysX and CUDA marketing. He is not a key person to CUDA in any shape or form.

Ian Buck is the guy behind CUDA; he also created the Brook programming language at Stanford University.

The site KitGuru (where the story originated) is highly questionable; they have had serious errors in almost all of their 'news' stories recently.

Now now, don't let facts get in the way of a chance to poke at nVidia...
 
Ah, a cunning plan: send Nvidia's top CUDA man to work for the opposition, then they will listen to him and start to use a very similar system; soon the two will be amalgamated, therefore making CUDA actually worthwhile... :D:p:D
 
Well, the funniest thing is that AMD could write a CUDA driver tomorrow for free (or could have had one over a year ago). CUDA is a proprietary technology, but there is no licence fee or royalty to be paid for using it. It's fully documented, and anyone who wants to write apps/drivers can do so.

It's not a question of CUDA not working on ATI cards for hardware reasons; it's that AMD don't support CUDA because it's owned by Nvidia. After Stream failed to take off they switched fully to OpenCL development, and I don't expect them to change that position: supporting CUDA would seem like AMD losing face, even though CUDA is widely used. The only way you will see CUDA on ATI cards is through hacked homebrew efforts.
 

So this person is only a marketing guy... great... *for NV*
 
Nvidia is, and always has been, about marketing.

Personally I don't see him as a major coup, but then I don't know anything about him.

He was big at Ageia. He could have been a marketing guy or a tech guy; he may have written the original code, designed the hardware to accelerate it, or been marketing only. Who knows.

You can't judge ANYTHING based on a job title. The CEO of Nvidia is a marketing guy, with a degree in business marketing (or management, I forget), and yet he has enormous influence over the path Nvidia hardware takes and what goes in and out of it. Marketing does not mean not involved in the hardware, not even close.

In terms of Nvidia, I have no idea what he did, though it's highly likely he was heavily involved in PhysX's evolution under Nvidia (which might be a bad sign, given the lack of improvements for years :p ). For AMD, it would appear his job will be to round up companies, offer support, and get a commitment to move towards APU-accelerated software in the future.

Intel and AMD will be pushing this HUGELY: with GPUs on die they essentially have more FPU power than ever under the hood, and it would be insane for them not to try to use it. Whoever uses it best, and gets the most power out of the space available to them, will likely end up with the fastest CPU.
 
Ageia was big news in the developer world. The idea of producing a physics library, much like we have graphics libraries such as Direct3D and OpenGL, is indeed visionary, but also an obvious extension of the standard computing ethos: if we see a standardised problem, we try to produce a single code-base from which to build more complex programs (i.e. games). It should also be said that Novodex (PhysX) was certainly not the first library to try this; there were others before it (ODE springs to mind). But it did introduce some quite significant improvements, and, having used them all myself recently, I'd say it was designed from the ground up to be easy for any C/C++ developer to use.
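The "standard physics library" idea described above can be sketched in a few lines. This is a hypothetical, minimal illustration in Python, not the real Novodex/PhysX or ODE API (the `World` and `RigidBody` names are invented): the game creates bodies, then calls a single `step()` per frame and lets the library hide integration, collision handling and so on, just as Direct3D/OpenGL hide the rendering details.

```python
# Hypothetical sketch of a physics-library API. Not real PhysX/ODE code;
# all names here are invented for illustration.

class RigidBody:
    def __init__(self, position, velocity=(0.0, 0.0)):
        self.position = list(position)
        self.velocity = list(velocity)

class World:
    def __init__(self, gravity=(0.0, -9.8)):
        self.gravity = gravity
        self.bodies = []

    def add_body(self, body):
        self.bodies.append(body)
        return body

    def step(self, dt):
        # The library hides integration, collision detection, etc.
        # behind this one per-frame call (semi-implicit Euler here).
        for b in self.bodies:
            for axis in (0, 1):
                b.velocity[axis] += self.gravity[axis] * dt
                b.position[axis] += b.velocity[axis] * dt

# A game's per-frame work then reduces to world.step(dt):
world = World()
crate = world.add_body(RigidBody(position=(0.0, 10.0)))
for _ in range(60):            # one second of simulation at 60 FPS
    world.step(1.0 / 60.0)
```

The point is the shape of the API, not the physics: the game never touches the integrator, which is what makes one code-base reusable across many games.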

WRT CUDA, NVIDIA saw PhysX as a nice way to promote their tech; however, it has taken them much, much longer than I imagine they anticipated to port the existing PhysX code to run on CUDA. It really isn't surprising: PhysX is a monolithic library, and when you see the source it's quite staggering. Plus, the problems PhysX solves simply don't lend themselves to the massive parallelism that stream processing (i.e. CUDA) demands. Simply put, the PhysX team have a massive job on their hands to do it well! The liquid and cloth simulations that can currently run on your card via CUDA were the trivial aspects; it's the rigid body stuff that is very tough.
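The contrast between the "trivial" effects and rigid-body work can be made concrete with a small sketch (hypothetical Python, not PhysX code; both function names are invented): a particle update reads and writes only that particle's own state, so it maps naturally onto one GPU thread per element, while a Gauss-Seidel-style contact solver reads impulses written earlier in the same sweep, which resists naive parallelisation.

```python
# Hypothetical illustration of why cloth/fluid particles parallelise
# easily while rigid-body contact solving does not. Not PhysX code.

def step_particles(positions, velocities, dt, gravity=-9.8):
    """Each particle is independent: nothing is shared between loop
    iterations, so on a GPU this becomes one thread per particle."""
    out = []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy += gravity * dt                     # local state only
        out.append(((x + vx * dt, y + vy * dt), (vx, vy)))
    return out

def solve_contacts(impulses, contacts, iterations=10):
    """Sequential sweep over contact pairs: each update reads impulses
    already written this sweep, so iterations carry dependencies and
    cannot simply be split across threads."""
    for _ in range(iterations):
        for i, j in contacts:
            correction = 0.5 * (impulses[i] - impulses[j])
            impulses[i] -= correction          # depends on earlier writes
            impulses[j] += correction
    return impulses
```

The first loop is embarrassingly parallel; the second has to be restructured (graph colouring, Jacobi-style iteration, etc.) before it can use thousands of GPU threads, which is a rough picture of why the rigid-body port is the hard part.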

Who knows why he's leaving. Perhaps he has simply been poached by AMD; if they doubled his salary, I can't say I blame him for going where the money is. Or maybe, as has been said, he simply didn't get on with those around him.

In terms of NVIDIA dying soon or dropping the CUDA brand, I'd be very surprised. They may start to switch direction by building CUDA on top of OpenCL (which is where AMD has put their money), but OpenCL is a different beast to CUDA: it aims to effectively replace all current parallelism libraries and offer a single interface to multi-core processors, GPUs and indeed other processing devices. CUDA is lower level than that, interfacing with the hardware directly, and as it's NV's tech it's hardly surprising they limit it to their own HW!

CUDA has taken off, especially within the research communities: implementing well-known libraries in CUDA is really in vogue right now, and researchers are getting used to the CUDA language. It could well be that in the future CUDA will effectively sit on top of OpenCL, with OpenCL providing the low-level access to the GPU, but as a language I reckon it'll stay.
 
There are probably many reasons why he's jumping ship, possibly down to Nvidia's actions; who knows.
 
It could be a case of: join AMD, learn what they're doing RIGHT, then go back to Nvidia to build cards that can actually outperform ATI's, with lower temps to boot.
 
Why Hardware Ace Left Nvidia for Rival AMD

By Don Clark

Manju Hegde, a hardware expert with a track record in startups and academia, has been helping to lead one of the most high-profile initiatives of chip maker Nvidia. In a sharp about-face, he’s decided to lead a related and equally ambitious quest at rival Advanced Micro Devices.

Hegde is best-known in the videogame world for founding a company called Ageia Technologies, which developed software and specialized chips for simulating physics: the animated motions, collisions and explosions that make videogames lively. Nvidia purchased the company in 2008.

After the deal, Hegde became involved in Nvidia’s CUDA program, a term that refers to the company’s stated goal to turn its graphics chips into more general-purpose computing engines. His most recent title was vice president of technical marketing for CUDA.

AMD, since its 2006 purchase of the graphics chip company ATI Technologies, has endorsed the same basic idea. But the company is better known for competing with Intel in selling x86 microprocessors, the chips that run general-purpose software in most of the world’s computers.

AMD’s own big idea is called Fusion, a term that refers to putting the x86 functions along with graphics and other circuitry on a single piece of silicon. At AMD, he will hold the title of vice president of the Fusion Experience program, which among other things will help galvanize software developers to exploit the technology.

Why did Hegde prefer one quest to another? He says AMD had that key weapon, the x86 technology, which is not currently in Nvidia’s arsenal.

“Nvidia is a great company to work for,” he says. “It’s like a big startup; they move extremely fast. But the opportunity of having all the processing capability on one chip was too good to miss.”

Hegde, 53 years old, was originally educated in India, but he went on to get degrees that included a doctorate from the University of Michigan, Ann Arbor, in computer information and control engineering. He was on the faculty of Washington University during the period he ran Ageia, he says. Other startup activities included serving as co-founder of Celox Networks Inc., a maker of network processors that didn’t survive the dot-com bust, he says.

An Nvidia spokesman declined comment on Hegde’s departure.
http://blogs.wsj.com/digits/2010/05/26/why-hardware-ace-left-nvidia-for-rival-amd/
 