
AMD's GPU market share drops again, even after the release of Fury X

I kinda feel for AMD a bit. They don't make bad products, but in a market with only two competitors, where you're either considered the best or not, being the 'not' really hurts. They can provide cards that are on paper equal to what Nvidia do at a given price point, but that's just not good enough when there's a general perception that Nvidia provide better software and driver support.

AMD really have to knock things out of the park to get back in the game. And that's hard when they're not able to commit the same resources to the job as Nvidia can. AMD have certainly made efforts here, trying more innovative routes with things like async compute and stacked memory, but developers aren't utilising them enough at the moment for their advantages to show.

AMD try. They really do. They price aggressively, they innovate, and they don't build bad GPUs. But they've got to work on fixing their reputation somehow. It feels like their biggest problem is just an image problem.
 
Right now Intel and Nvidia have a symbiotic ecosystem: Nvidia offer the best graphics, Intel offer the best CPUs. Over time, though, with the rise of Intel's iGPUs negating the need for many people to bother with a dGPU, that ecosystem could be altered. Although Nvidia have stated they are not worried about iGPUs, saying they would simply raise the bar on lower-end dGPUs to compete.



Would love to see an Intel/Nvidia hybrid all-in-one chip. Would buy that in a heartbeat.

Maybe in the future Nvidia will look at offering CPU/GPU solutions, who knows?

The main issue is that AMD and Nvidia are dependent on TSMC and GF not only making nodes that can produce large chips like GPUs reliably, but also cost-effectively.

Remember this a few years ago:

http://www.extremetech.com/computin...y-with-tsmc-claims-22nm-essentially-worthless

Nvidia was deeply unhappy with the costs going forward and having to shoulder a lot of the burden regarding the risks.

No doubt things might have moved on from there now, but that's the real issue currently.

However, luckily in some ways, the problems with Intel's 14nm process might have given them a bit of respite.
 
They haven't said the dGPU market is "collapsing". It has shrunk, yes, but that isn't the same as collapsing. A small reduction over time isn't collapsing; as long as at least one company is making millions of dollars a year serving that market, there will continue to be a market.

Nvidia's turnover and profit from FY2011 to FY2015:

Year      2011     2012     2013     2014     2015
Turnover  3.54B    4B       4.28B    4.13B    4.68B
Profit    253.15M  581.09M  562.54M  439.99M  630.59M

Yeah, collapsing

It's going down at a 6% average annual rate over the last few years according to JPR ("Overall, the CAGR from 2014 to 2017 is now -6%.")
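For anyone unclear on what these CAGR figures actually mean: CAGR is just the constant annual rate that takes a value from its start to its end over N years. A minimal sketch, using the Nvidia turnover figures quoted above and JPR's -6% rate (illustrative arithmetic only, not anything JPR published):

```python
# Compound annual growth rate: the constant yearly rate that turns
# `start` into `end` over `years` years.
def cagr(start, end, years):
    return (end / start) ** (1.0 / years) - 1.0

# Nvidia turnover quoted above: FY2011 ($3.54B) -> FY2015 ($4.68B) is 4 years.
print(f"Nvidia turnover CAGR: {cagr(3.54, 4.68, 4):+.1%}")  # roughly +7.2%/year

# JPR's projected rate for overall dGPU shipments, 2014 -> 2017.
market_rate = -0.06
print(f"Market after 3 years at that rate: {(1 + market_rate) ** 3:.1%} of today")
```

The contrast is the whole argument in one place: shipments shrinking at -6% a year while Nvidia's own turnover compounds upward, which is what market share cannibalisation in a shrinking market looks like.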

Yeah, it is collapsing. What you don't seem to understand is that Nvidia has diversified into Tegra and into commercial compute markets, and the latter has higher profit margins anyway. Plus their entire sub-£400 stack now uses smaller GPUs and lower board costs than 18 months ago. You also don't seem to get that AMD lost massive market share in mobile after Kepler destroyed AMD in laptops. You are seeing the effects of market share cannibalisation in a shrinking market, i.e. the shift from around 40% market share for AMD to under 18% or thereabouts.

The GTX 980 and GTX 970, with a ~400mm² GPU and a 256-bit memory controller, now replace the 565mm² GK110 in the GTX 780, GTX 780 Ti, etc.

The GTX 960, with a 250mm² GM206 die, replaces the 300mm² GK104 in the GTX 770 and GTX 760, and only uses half the memory chips.

The GTX 750 Ti, with a 150mm² die, replaces/supplants the GTX 660, which used a 221mm² die and more memory chips.

So they have concentrated on decreasing the cost per card and making them cheaper to produce.
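The cost argument above can be made concrete with the standard gross-dies-per-wafer approximation. A rough sketch, using the die areas quoted above and a 300mm wafer; the formula ignores yield, scribe lines and wafer edge exclusion, so treat the numbers as illustrative only:

```python
import math

WAFER_DIAMETER_MM = 300  # standard 300mm wafer

def dies_per_wafer(die_area_mm2, wafer_d=WAFER_DIAMETER_MM):
    """Classic gross-dies-per-wafer estimate: usable wafer area divided
    by die area, minus a correction for partial dies lost at the edge."""
    wafer_area = math.pi * (wafer_d / 2) ** 2
    edge_loss = math.pi * wafer_d / math.sqrt(2 * die_area_mm2)
    return math.floor(wafer_area / die_area_mm2 - edge_loss)

# Die sizes quoted above: GM204 (~400mm^2) vs GK110 (~565mm^2).
print(dies_per_wafer(400))  # 143 candidate dies per wafer
print(dies_per_wafer(565))  # 97 candidate dies per wafer
```

That's roughly 47% more candidate dies per wafer for the smaller chip before yield is even considered, and smaller dies also yield better, so the per-card cost gap is bigger still.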

They also split their commercial and consumer card lines.

Nvidia knew where the market was going, so did the wise thing. AMD, OTOH, well...

Yet again, you do not seem to be able to answer some basic questions:

Are Q3 2015 and Q4 2015 card sales going to be more than Q3 2014 and Q4 2014?

Are 2015 sales going to be more than 2014 sales?

You repeatedly won't answer, since you know overall things will be worse, and every analyst has said the discrete PC market is getting smaller. The question you need to ask is: what happens when Nvidia has cannibalised as much of AMD's dGPU sales as possible?

Where is the growth going to come from?

We will have to agree to disagree.
 

Yeah, most of us are aware that there have been problems with shrinking; we've been stuck on 28nm for nearly 5 years now.

We're just about to get over the 28nm hump, with a die shrink finally coming next year.

Are you saying we might get stuck on the next node as well? I really hope we don't end up with a repeat of this.
 

The problem is everything is being focussed more towards mobile, i.e. phones, tablets, etc., and customers like Apple with small-die chips.

Larger products like big Pascal, which are meant for high-profit-margin commercial markets, might get some volume, but what about the rest of the AMD and Nvidia ranges, when they need to compete with companies like Apple?

These need volume at a reasonable cost.

You can see how Maxwell, for most of their range, has significantly reduced the BOM over Kepler, and Kepler probably did the same over Fermi. OTOH, AMD have done more of a Fermi, instead of doing what they did with the HD 3000, HD 4000 and HD 5000 series, where they used efficient engineering to make sure they could keep a reasonable cost base.

I have a feeling that 14nm/16nm might last a while, since I have no faith in TSMC or GF being able to deliver affordable, high-volume, large-die GPU products in 2017.

I hope to be very wrong on this!:(
 

We are near the end of silicon's viability; it costs way too much for fabricators to continue shrinking. I don't even think we'll get to the point where quantum tunnelling messes up the circuitry: no one will buy the damn things for being too expensive.

The only performance increases you'll find are different configurations on the dies, and obviously taking the software down to bare-bones interaction like consoles do.

Other than that, we simply await graphene, silicene or whatever to replace this largely EOL technology.
 
It's still a week of sales; I think it's fair to say the launch window is probably when the most units get purchased, stock or not.

You only have to look at the Fury owners' thread compared to the 980 Ti one, though, to see that AMD's market share will be dire in Q3 as well.


Congratulations on your win...you must be over the moon then.
 
DOOOOOMED!!! Wooo Whoooo DIE AMD DIE!

Wait... wut... without AMD, nVidia will gouge our wallets? And only release incremental improvements on their GPUs, much like Intel do with their CPUs?..

errrrhudurrrrr...

Don't die AMD don't die!!!
 

Graphene hit a few problems, but I recall reading that they are sorted and it is a viable step. Intel have ploughed money into something else too, though I can't remember what it is off the top of my head. Either way, silicon is pretty much finished, with only a couple of shrinks left.
 
We really need to move to optical components as much as possible as well. It's strange that we run our internet off light, but it's still the same ol' crap computing it all.
 
It's a real shame that AMD didn't get the Fury X launch right, because when the card works, it competes with and can beat the 980 Ti in some situations.

The price was a bit much too; I'd have liked it to have been 450-500 at launch. With good stock and no pump whine it would have done well.

Sadly, I think it's too late now; the damage is done regardless of how well the card performs, unless AMD release a miracle driver that allows it to beat the competition in most things.
 

We don't really run it off light, more like transmit with it! Most of our infrastructure is FTTC, so in the end it's still a good amount of copper to your house!

But optical would be a great direction to go, though we're years away from that yet!
 

I read somewhere that graphene wasn't suitable yet as well.

Considering how long the 28nm transition has taken, nearly 5 years so far...

Could we see a situation where each node lasts 5 years?! :eek: So maybe, for example, we have 3 shrinks left on silicon, then it's no longer viable.

Imagine that: another 15 years of slow transitions before something better is found, lol.

Hopefully 28nm is the exception, not the rule, and future nodes transition much quicker; otherwise much milking and drip-fed upgrades will happen all over again :P
 
Really? AMD's market share dropped in Q2: April, May and June... the Fury X launched on June 24th with low early launch supplies... do you really think the Fury X, regardless of quality, even if it had been 100% faster than it was and half the price, would have had a massive impact on Q2 sales?

I think what your needlessly daft thread title should have read is: Fury X launched late in Q2; sales predictably still dropped.

It won't be a surprise if we still see a drop in the next quarter either. Let's face it, the Fury X release was one big disaster. I waited months for the card to release, and when it did I had to sit here with the money for the card, being:

1. Unable to buy a Fury X... 2. Scared to when I could, due to the issues... And 3. I was right to be worried, because when the new stock finally did turn up it was a mix of both fixed and non-fixed (whistling) cards, meaning it was a lottery.

The saddest thing is that in the end I bought the Fury Tri-X, and it's a brilliant card which I'm very happy with.
They could have avoided so much trouble if they'd also offered Fury X Tri-X models to start with, instead of only AIOs. We'd all have got what we wanted then, because even though I'm very happy with the Tri-X, I'm disappointed I couldn't buy the full chip. And next time I won't wait so long for AMD; I'll simply move to Nvidia.
Taking everything into account, I can't imagine them doing any better in sales with Fiji; Grenada is their best chance of improving their standing. From what I can see they've done a good job of fixing up Hawaii, and they've got themselves a strong performance/value king in the 390.
 

http://www.theregister.co.uk/2015/08/04/stanene_2d_layer_tin/
 
Last I heard, they had a breakthrough with stanene (single-layer tin), but it is a matter of producing the stuff at high yields with good uniformity in the structure.

The only other options on the table currently are silicon-germanium, germanium and gallium arsenide transistors.
 