
Samsung to buy AMD, rumor?

HP are the biggest player in high-end supercomputers, not IBM, with 36% of the top 500 to IBM's 30%. IBM were the biggest in the past but have been losing market share rapidly.

HP are small fry: they have one installation in the top 50 (15th place). IBM have 13, including 4 in the top 10. Two of the top 10 are using Nvidia GPUs. Only one machine in the top 100 is using AMD GPUs, and it rocks in at a heady 93rd place.
 
Being first and having better tools. For the majority of stuff, OpenCL is now a better bet, certainly for almost anything that doesn't involve massive number crunching. Especially as AMD have a large architectural lead now and will have a gigantic one with the new-gen cards.

By the time Pascal hits, Nvidia should have well below 50% of the pro market. Most estimates currently put AMD at 15-20%, Nvidia at 70% and Intel with most of the rest. However, this is likely out of date, since the Mac Pro contract was huge for AMD.

Also, they have nothing to combat HSA with really.

I get the impression we are moving away from CUDA to OpenCL now.

I'm not seeing it - OpenCL is very good for certain things, but it doesn't have the ecosystem CUDA has to give it the same overall appeal. CUDA just has it beat as things stand for service support, documentation, tools, etc. in the mainstream GPGPU market. There was a huge migration away from OpenCL to CUDA for those reasons, and at least within the window of my awareness of the industry I'm just not seeing a reversal. I will say OpenCL is slightly expanding its market share proportionally, but both are seeing increasing uptake with minimal shift from one camp to the other among existing users. (Don't confuse the fact that OpenCL is seeing increasing usage with it making directly proportional inroads into CUDA's user base.)

On top of that, projects and deployments in the pro/high-end GPGPU markets tend to have long lead times. Even if there is a shift away from CUDA, it's going to take a lot more than 12 months before only the diehards are left on CUDA.
 

I agree with what you're saying Roff, CUDA is very well established. I know this; as you know, I have a 3D hobby and there is no denying CUDA is king there.
There are other areas where CUDA is also king.

But I think times are changing. Autodesk are increasing their OpenCL functionality with every new build; it's getting better and better. Adobe appear to be actively phasing CUDA out in favour of OpenCL, and it's one reason Apple switched from Nvidia to AMD. Even they can see the OpenCL march. Apple are not stupid, and their user base is almost entirely productivity orientated.

One thing that is becoming a sticking point with CUDA for a lot of people is that it is vendor specific.
Software engineers are beginning to realise that supporting CUDA and not OpenCL restricts them to just one segment of the vendor market; the 'others' are not just AMD, they are Intel, Qualcomm, ARM, Samsung...

CUDA isn't just going to go away, of course not, but its relevance is diminishing, and it's going to continue diminishing.
 
If Samsung were to buy AMD, it would be good for us, the consumers. I think Samsung would really make them competitive. I doubt they would stop producing GPUs and desktop CPUs; why else would they buy them? AMD are not exactly competitive in the mobile GPU/CPU department.
 
CUDA is still far ahead in terms of support and tool chains, and is the far superior option once you move beyond small-time single-workstation apps. There is a reason IBM (the biggest supercomputer vendor on the planet) deal solely with Nvidia for HPC GPU needs, and it's not because of marketing.

CUDA is definitely the tool of choice in industry - I have several colleagues working in distributed computing that use large arrays of Nvidia hardware. When I did my PhD the university spent a few million buying Tesla hardware.
The parent company I work for investigated GPU compute for a project requiring significant DSP, such as Fourier transforms of super large datasets, and CUDA hardware was top of the pile; they are waiting to find out if the DoD funding will be awarded.
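For a rough sense of the kind of DSP workload being described, here's a minimal CPU-side sketch in NumPy (sizes and shapes are made up for illustration) - batched Fourier transforms over a large dataset are exactly the operation that GPU libraries like Nvidia's cuFFT accelerate:

```python
import numpy as np

# Hypothetical example: a large batch of sampled signals,
# as in the DSP project described above.
rng = np.random.default_rng(0)
signals = rng.standard_normal((1024, 4096))  # 1024 signals, 4096 samples each

# Batched real-input FFT along the sample axis - the core operation
# a GPU FFT library would run across thousands of signals in parallel.
spectra = np.fft.rfft(signals, axis=1)

# Power spectrum per signal, a typical downstream DSP step.
power = np.abs(spectra) ** 2
print(power.shape)  # rfft of 4096 samples gives 2049 frequency bins
```

The GPU win comes from doing many such transforms concurrently; the maths is identical, only the hardware doing it changes.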

One of the big sticking points is the usual AMD-Nvidia story - Nvidia support is just massively better than AMD's. The tool chains, software, drivers and everything are just more mature, and Nvidia are more likely to send engineers over to help integrate or resolve technical issues.
 

No argument from me on that. But AMD are aware of this - they even agree, and are working to fix it.
Their market share is growing; people are starting to take them seriously and even take them on board.
 

See, you are talking about desktop applications.

The real money is in HPC; a company or institute might throw 10-50 million at Nvidia to buy CUDA-capable hardware.

Apple may have selected AMD hardware in part due to compute performance, but there were undoubtedly many other factors. AMD is more desperate than ever for sales, so could have accepted lower profit margins - Apple is infamous for forcing suppliers into incredibly small margins. Nvidia had a price point they didn't want to go below.
 

It probably is growing, but from a very small starting point, since they were late to market.

I do see it as an area that AMD could make a lot of money from, though - more so than desktop GPUs.

The whole market is growing quite fast, so even if AMD's market share doesn't grow as fast, their total revenue can grow rapidly. There is much more profit in selling hardware and services to industry than to consumers who want the lowest price points.
 

Yes, I'm talking about what I know and understand.

I know you're talking about massive supercomputers and other stuff I don't understand.

But I think I am right in saying that 5 years ago AMD had almost nothing, if indeed anything at all, in these markets. The fact that they do today is progress; they are even winning recognition within that industry.
They are still the new boys, but it's a far cry from where they were not so long ago.

You can't ignore that.
 