
Prediction - Intel to buy Nvidia?

I was thinking that if AMD's Fusion technology becomes a hit, it will in turn put pressure on Intel to develop a better integrated graphics solution, and with the failure of Intel's engineers to build a consumer-level version of Larrabee, I feel there is a good chance Intel would buy out Nvidia just to get access to the IP, engineering skills and resources that Nvidia can offer them.

You would think that for the money it would cost to buy Nvidia (perhaps $10-15 billion) Intel could set up or further invest in its own consumer graphics department, but it's clear from the failure of Larrabee that it's not as simple as throwing money at it.

Thoughts?
 
Yup, I've said this at every suggestion. Right now Nvidia's current IP, current shaders and current designs have no chance of working in the future; they've fought tooth and nail for the large-die strategy, which brings worse manufacturing results with every generation. They've got a CEO who doesn't listen to reason and a philosophy that they'll do what they want and ignore production problems until it's too late.

They don't have any production facilities. Their mobile effort, Tegra, is doing incredibly badly on HUGE R&D spending and is based on licensed tech that's used far more effectively by dozens of companies with larger market share and high profits, with working chips out in hundreds of products shipping in the millions. Meanwhile Nvidia spend millions on R&D and manage to get a couple of products out, which failed, in the low thousands, making a massive loss in a market where almost every other company with an ARM chip out is making money hand over fist. Frankly, while ARM themselves are making money like crazy and everyone using ARM designs is making money like crazy with simple, working, on-time, on-power, on-spec products, Nvidia failing is, honestly, a complete joke and makes them look incredibly bad in the industry. If a couple of hundred companies can license ARM cores, get something going, produce it and get it out, while Nvidia, a hugely experienced chip design firm, is still trying to do too much, is too late and can't hit design targets, it looks VERY bad.


Their entire GPU/GPGPU range needs a massive overhaul to be manufacturable in the future, and the guys who can design that are just that: designers. Intel can simply buy all the designers. The only worth of a design company whose current products are a disaster (fast, but profit-wise a complete unmitigated disaster) is what those guys can design in the future.

Why spend, say, $2 billion for Nvidia's current designs and the staff, when they can just offer the staff contracts/jobs at Intel and save the $2 billion?

It also looks like Intel have a very committed and very, VERY much improved GPU for Sandy Bridge. To those who said they had no chance of high performance with few cores: their 12-core GPU is matching a 5450, and its drivers and image quality are, according to Anand, spot on compared to previous IGPs, which were awful. So they are WELL on their way to sorting drivers, they've got an architecture that's maturing quickly, and they've got production.

Nvidia don't have assets except for staff. If Tegra were based completely on their own CPU cores, not ARM's, and were selling well, that would be a different matter, but as the Fermi design won't continue to work in the future, they don't own anything worth much money at the moment.

The 280GTX was late and apparently missed its final clock-speed targets by 50% or so; the 285GTX was even later, took even longer to get working on a process, and was 100-150MHz short of targeted clocks. Fermi took even longer, had even more problems, missed clock targets by 200MHz or so, missed power targets by 50-100W and was painfully late. Big = bad, and as processes get smaller the problems get worse, and worse, and worse.

The problem is the industry is starting to view Nvidia as a joke. End users do great when a company is selling very fast cards at massive losses to stay competitive, but Nvidia are doing very, very badly.
 
IF Intel were to ever buy nVidia... it wouldn't be at anything close to their current market worth. Intel can be quite brutal when it comes to acquisitions; they'd insidiously bring nVidia to its knees and buy them up cheap (or just attract their main staff away and basically gut the company that way).

RE: Drunkenmaster - I heard the same things said about nVidia at the FX series launch... and they are still very much here now... so I don't think they are going anywhere quickly. Though the failures with Tegra will have hurt them badly, they've always come back from being on the back foot in the past, so I wouldn't be so quick to count them out.

Oh and ARM is just awesome :D

EDIT: Oh and yeah, Intel will never succeed in the performance graphics market however much money they throw at it, despite being capable of very good software and hardware development, unless there's a huge shift in mentality. Their whole approach to products, while working very well in other markets, doesn't suit performance graphics well.
 

See, you said this for a long time, every time Larrabee was brought up; however, their Sandy Bridge GPU shows a MASSIVE, monumental leap forward in low-end GPU power. Most importantly, Anand, who used to mock Intel IGPs for giving awful IQ and dodgy drivers, suggests their new GPU is massively better, giving IQ as well as performance equal to a 5450.

Let's be honest, there's VERY little difference between a small 12-core Sandy Bridge GPU and sticking ten of those units together on a discrete card.


They've clearly made significant progress on IQ, power and the ability to deliver it, and considering the supposed efficiency of Sandy Bridge chips, their 12-core GPU seems set to use very little power, to the degree that you could stick 20 of these units together and have a workable product.

I mean, a high-end or low-end Fermi, an Nvidia IGP or an AMD IGP is little more than one or two units of a high-end card; the architecture is all but the same.

Also no, no one said anything similar at the FX launch. No one predicted Nvidia's downfall or an unwillingness to change their ways; it was one bad product, with a good one before and after it. The 280GTX through to the current gen ALL show the same signs of sticking to a design philosophy that has made them progressively worse over three years.

The situation around the FX wasn't the same. The FX was memory-bus limited, it was on a new process, and it had a new type of memory that wasn't working well. The FX5900 came out not much later at all, with double the memory bus, back to the original DDR memory, and almost certainly with some ideas of fixes for the new process. The problem back then was that it was just too early for the process; it wasn't part of an ongoing problem, the volume part of the company wasn't about to fold, and they were selling GF4 MX through Ti in the rest of the range without issue, cards that made a profit and were good for the prices.


Same with AMD and the 2900XT: while that and the FX5800 were late and had problems, the ENTIRE rest of each range was selling like hotcakes, making the companies oodles of money, and neither company had ANY real problem. It took Nvidia three months to go from an uncompetitive 5800 (with the first dual-slot cooler) to a 5900 which was hugely better and had no problems.

These days, Nvidia are six months late, have EOL'd the previous chips, have no mid or high-end cards selling in quantity, and the low end is based on the previous gen and isn't that competitive. Likewise, around the 5800 era Nvidia had huge brand recognition with partners who loved them; this time around they've gone through two years of publicity disasters, hugely higher RMA numbers than normal due to faulty parts, and huge problems for many of their partners. Nvidia can't count on the console market, probably at all, in the next gen; they can't count on the low end; and they can't count on Tegra for a long while, as it quite obviously needs to be redone from the ground up. Considering everyone else can get working ARM cores done right, on spec and on time, the only difference is Nvidia using its own GPUs on board, so take two guesses which part of Tegra is over power, late and off spec. Again this points to the fact that it's their own GPUs that have fundamental issues: they are completely unsuitable for mass production on processes this small, they reflect old thinking, and it's killing them.

Remember, when the 5900 came out it was a high-profit part, the low and mid end were making money, and they had the massive share of the market.

This time, when Fermi finally launched, it made a loss on every sale and continues to be utterly slapped around at the current price the 470GTX sells at. The "5900" of the Fermi line, the GF104, isn't fast enough to compete: while the FX5900 was on par with a 9700, the gf104 is massively slower than cores that are smaller and sold at a higher cost, and the GF104 is barely scraping a profit.

The situations aren't remotely comparable. For it to be the same for AMD, while the 2900XT was screwed, the X1950 Pro would have needed to be making a loss, just breaking even, or EOL'd, and the low end would have had to be up against Nvidia's 8xxx low end. It wasn't: Nvidia's mid and low end weren't out before AMD's, so the X1950 was competing against the 7900, not the 8600 it should have been, and the same goes for the low end.

Nvidia are screwed across their entire product range; in the 2900XT or FX5800 days it was ONE product, with the rest selling at a healthy profit in huge volume. None of that is true of Nvidia now.
 
It couldn't happen because of competition issues, even if Intel wanted to (which they possibly don't). Oh, and don't count on Tegra being big; you have to emulate x86 instructions to run a regular desktop Linux distribution, or a Windows one.
 

I don't see any reason why a desktop distribution of Linux couldn't be built for ARM CPUs; you could at least get Debian with a desktop going on it. Of course, application support is another matter, but I think that would sort itself out once a powerful enough ARM nettop (or the like) is widely available and some open-source guys decide it's interesting enough to port their applications to.
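As a quick illustration of the architecture point (just a sketch, assuming a Debian-style system where the standard `uname`, `dpkg` and `file` tools are available), you can see which CPU architecture a system and its packages target without emulating anything:

```shell
# Print the host CPU architecture (e.g. x86_64 on a PC, armv7l on an ARM board)
uname -m

# On Debian/Ubuntu, print the package architecture the install targets
# (e.g. amd64, or armel on Debian's ARM port)
dpkg --print-architecture

# Inspect a compiled binary to see which architecture it was built for;
# an ARM build of the same package would report ARM here instead of x86-64
file /bin/ls
```

The point being that Debian already builds its archive per-architecture, so a desktop on ARM is a packaging and porting question, not an emulation one; only closed-source x86-only applications would need emulating.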
 
IF Intel were to ever buy nVidia... it wouldn't be at anything close to their current market worth. Intel can be quite brutal when it comes to acquisitions; they'd insidiously bring nVidia to its knees and buy them up cheap (or just attract their main staff away and basically gut the company that way).

I agree with this.

But you could argue that Intel have been doing this for years now. nVidia's chipset division is all but dead, the stock is low, and the graphics division is at its weakest for years (financially; I admit they're more than competitive with ATI/AMD on performance, but they were haemorrhaging money at one point).

I don't think a buyout is off the cards, but at the same time I don't think it's on either. Intel will have a good go at the discrete graphics high end and try to drain nVidia's profit from there (as well as from the low end with Sandy Bridge) before considering the company ripe for takeover.
 
IF Intel were to ever buy nVidia... it wouldn't be at anything close to their current market worth. Intel can be quite brutal when it comes to acquisitions; they'd insidiously bring nVidia to its knees and buy them up cheap (or just attract their main staff away and basically gut the company that way).

I was going to say it might not be that simple to take staff away, but since both companies are based in Santa Clara, I doubt it would be a big upheaval for Nvidia's engineers to switch sides (that might explain why AMD chose to buy ATI, who were based in Canada, rather than try to tempt their staff away).

straxusii said:
Nvidia's current market cap is $5.5 billion; I'm not sure what kind of deals Intel do, but I wonder if that is too rich even for them.

I wouldn't have thought $5.5 billion would be hard for Intel to find; they have just snapped up McAfee for nearly $8 billion and Infineon for $1.4 billion in the last month.
 
I think drunkenmaster is an ATI/AMD fanboy, all I get from reading his posts is how nvidia are terrible and how they are going under.

...right.

I think there are a lot of ATI/AMD fanboys out there right now simply because there isn't anything better to be. It's a fact that Nvidia's chips are getting worse, and if they don't change course now they will simply keep getting worse. The GF104 was a step in the right direction; hopefully their future chips will take a hint from ATI.

It's really sad, but that's how it is. It's good for the consumer right now, since at least someone at Nvidia has the brains to be competitive with the GF104, but it's not earning the company much money and isn't going to do them good in the long run... :(
 
His posts are a lot better than what some people post on here: well thought out, constructive to the thread, with information to back them up.

Good work I say
 
What will most likely happen is that ATI/AMD will squeeze them out of the profitable top-end market and Intel will put pressure on the low end. When nVidia is about to go bust, Intel will most likely do a 3dfx-style takeover, buying their IP and main staff.
 
His posts are a lot better than what some people post on here: well thought out, constructive to the thread, with information to back them up.

Good work I say

Let's pick one of the easier points to shoot down in flames.

the gf104 is massively slower than cores that are smaller and sold at a higher cost

Really now? Massively slower than what exactly?

You have to remember that these are the words of a self-confessed AMD shareholder who claims to have over 20K worth of AMD stock. I'd argue that it is hard to make an objective post (not that they even come close) in such circumstances, and these are not the words of somebody I would trust for impartial analysis of the industry. Each post always paints ATI in the best possible light and Nvidia in the worst. It gets tiresome after a while.
 