Officially 10 years since the ATI/AMD merger. Was it a good idea?

I think AMD should stop chasing the £400+ market; they don't sell enough GPUs in that segment anyway.

Let Nvidia have it. They already do, and there is no way AMD can take it from them.
So put all R&D into making good sub-£400 GPUs.

So don't try to make your GPUs as fast as possible, just make them fast enough and slap on a price that challenges Nvidia's offerings in the mid-range?

Sounds like a fail plan to me.

TBF they have tried to do something like this. They have made Polaris as efficient as possible while keeping good performance, to target the mid-range. Their Vega will follow on from this but target the high end. 480s that draw around 100 watts when overclocked to 1400MHz+ and are faster than 980s are not bad, TBH. They are even faster than the Nano, which uses more power.
 
I think AMD should stop chasing the £400+ market; they don't sell enough GPUs in that segment anyway.

Let Nvidia have it. They already do, and there is no way AMD can take it from them.
So put all R&D into making good sub-£400 GPUs.

The problem is that not having an offering competing in the high end, even if it isn't the fastest, tends to make consumers think twice subconsciously. AMD could have a considerably better offering in the mid-range and a lot of consumers would still lean towards the nVidia offering due to the association with the higher-end cards. People like those who frequent this part of the forums make up a small percentage of buyers.
 
I think AMD should stop chasing the £400+ market; they don't sell enough GPUs in that segment anyway.

Let Nvidia have it. They already do, and there is no way AMD can take it from them.
So put all R&D into making good sub-£400 GPUs.

So develop a new architecture and don't make it scale from the top of the range to the bottom?
Seems a waste of R&D to me. Basically what you're suggesting is what they have currently done with Polaris: topping out at, say, 1060 performance and leaving a good chunk of the market missed out.
IMO they should do what Nvidia has done and make an architecture that scales from top to bottom (yes, change ROP, shader and core counts etc. for the price point/performance you want).
Making two architectures like Polaris and then Vega, I'd assume, costs more than one.
 
I think AMD should stop chasing the £400+ market; they don't sell enough GPUs in that segment anyway.

Let Nvidia have it. They already do, and there is no way AMD can take it from them.
So put all R&D into making good sub-£400 GPUs.

Surely this strategy would put them right back into the budget GPU maker image that they have been trying to get out of.
 
I think they have clawed quite a bit back with the 480. It punches above its weight in DX12 and is well priced. That is the price bracket most people go for, so they should focus on that.

But the worst thing would be for Nvidia to be the lone manufacturer of GPUs. Prices would skyrocket, people would refuse to buy and PC gaming would start to die.
 
So develop a new architecture and don't make it scale from the top of the range to the bottom?
Seems a waste of R&D to me. Basically what you're suggesting is what they have currently done with Polaris: topping out at, say, 1060 performance and leaving a good chunk of the market missed out.
IMO they should do what Nvidia has done and make an architecture that scales from top to bottom (yes, change ROP, shader and core counts etc. for the price point/performance you want).
Making two architectures like Polaris and then Vega, I'd assume, costs more than one.

It is probably saving R&D and will have more to do with the time-to-market development cost of a large die. I don't know the cost for GPUs on the latest process, but I read a semiconductor article a few years back that broke down the cost (written by a guy in the industry, though not in GPUs) and it easily ran into the tens of millions. I'll try and find it if I have time.

AMD are effectively looking at their own reliably predictable cost sweet spot (taking sales/revenue generation into consideration, of course) and trying not to stray too far from it, which could mean sacrificing the development of large dies early on, at least for the moment. By the time a large die might be more cost-effective (X quarters later), you are near the launch of your next architecture iteration, where a smaller, medium die matches the (hypothetical) large die that could have been produced earlier. This is likely what you are seeing with Polaris/Vega (there is also the probability that the Polaris architecture was originally intended for 20nm but delayed to 14nm, which has to be taken into account timing-wise).
 
2017, the year that makes AMD the gamers' choice for CPU and GPU.

Best joke all day.
Although I get what you're saying with "gamers"... it doesn't have to be the fastest, just good value for the average gamer.

With both Intel and Nvidia spending more on R&D for their CPUs or GPUs than AMD probably has to spend on both types of processor, I doubt they'll suddenly have outstanding products in both areas. Who knows, but the best technology will likely come from the companies with the deepest R&D pockets.
Same with anything really. We love our car analogies; it's like one of the back-of-the-grid F1 teams competing with the big spenders for wins throughout an entire season, not just a lucky race. It doesn't happen.

Like all tech, it does get cheaper, so obtaining Nvidia 1080-level performance by 1H 2017 should be possible and much more cheaply done (in R&D terms).
I think it's more likely AMD's CPUs will take a great step forward than their GPUs, but that's just my speculation. I know I won't be impressed by 1080-level performance come 1H 2017, so they'd need to be another 20-40+% faster by then just to show they've caught up, let alone stepped ahead.
 
ATI was in trouble if I remember correctly, but in retrospect it seems that AMD was the wrong company to buy them out. My understanding from reading is that the fusion strategy was poorly thought through, and the merger itself badly implemented.
 
I think it's more likely AMD's CPUs will take a great step forward than their GPUs, but that's just my speculation.

It's looking more and more like, with their CPUs, they took a running leap at Intel, tripped over their laces, and by the time they were back on their feet Intel had already moved.

I'm not just hating on AMD for the sake of it; I believe strong competition is good for everyone, but realistically you need a lot of optimism about AMD right now.
 
He's talking about the future (Zen), not the past. ^^^^^ You're just looking backwards.

Surely this strategy would put them right back into the budget GPU maker image that they have been trying to get out of.

They have had this image for a lot of years despite making successful high-end cards.

The truth is it will never change, so work with it instead of against it. It's not a bad thing; it can be a good thing.
 
ATI was in trouble if I remember correctly, but in retrospect it seems that AMD was the wrong company to buy them out. My understanding from reading is that the fusion strategy was poorly thought through, and the merger itself badly implemented.

Not sure we are equipped with the knowledge to say it was badly implemented. A bit like if Brexit goes wrong: it'll be far easier to blame those who walked us through the pain of it (the Conservatives, Mrs May, the negotiators), without really knowing, than it will be to blame ourselves for taking us down that path :). Slightly off-topic.
 
It's strange that they weren't profitable because I remember their products being extremely competitive and their market share being high.

The 9700 and 9800 series (I had a wonderful 9800 Pro 256 MB) were legendary GPUs that dominated the Nvidia equivalents. The X800 series, X1800 and X1900 were all excellent too, if I remember correctly.

ATI were awesome at some point and made really good cards. I am not sure how they managed to screw that up so royally. I always used to have ATI cards until they screwed up the drivers. I remember I had to keep the same old drivers for about two years because, had I upgraded, I would have lost all 3D functionality. Pathetic.
 
ATI were awesome at some point and made really good cards. I am not sure how they managed to screw that up so royally. I always used to have ATI cards until they screwed up the drivers. I remember I had to keep the same old drivers for about two years because, had I upgraded, I would have lost all 3D functionality. Pathetic.

When was this? I've been using ATI/AMD cards since 2002 and used to update the drivers when I felt it was necessary. I think it was before this period that they picked up this ongoing nonsense that their drivers are rubbish. The drivers must have been terrible, as even a name change hasn't got rid of the damage done. They are not even the same company, so effectively AMD never made these rubbish drivers for GPUs. Just as with Nvidia, AMD's drivers go through good and bad patches.
 
When was this? I've been using ATI/AMD cards since 2002 and used to update the drivers when I felt it was necessary. I think it was before this period that they picked up this ongoing nonsense that their drivers are rubbish. The drivers must have been terrible, as even a name change hasn't got rid of the damage done. They are not even the same company, so effectively AMD never made these rubbish drivers for GPUs. Just as with Nvidia, AMD's drivers go through good and bad patches.

I never said it was AMD; I was talking about ATI. I don't remember the exact year to be honest, but it was a long time ago, pre-2000 possibly. Not 100% sure anymore. At that point I was using AMD processors and ATI cards exclusively.
 
When was this? I've been using ATI/AMD cards since 2002 and used to update the drivers when I felt it was necessary. I think it was before this period that they picked up this ongoing nonsense that their drivers are rubbish. The drivers must have been terrible, as even a name change hasn't got rid of the damage done. They are not even the same company, so effectively AMD never made these rubbish drivers for GPUs. Just as with Nvidia, AMD's drivers go through good and bad patches.

People seem to have really short memories when it comes to AMD's earlier GPU drivers. It wasn't until 2009 that AMD really brought the driver quality up to a good standard. I think it was the Jan 2009 release, or thereabouts, where they brought out a fixed driver for Far Cry 2 that gave a massive boost in performance and was also a sudden big step forward for their drivers overall, with a lot of legacy issues fixed, etc.

ATI's early drivers were pretty BAD. I had a few of their cards, like the Rage II+ 3D, where you literally had to swap between three different drivers depending on which game you wanted to play that day.
 
When was this? I've been using ATI/AMD cards since 2002 and used to update the drivers when I felt it was necessary. I think it was before this period that they picked up this ongoing nonsense that their drivers are rubbish. The drivers must have been terrible, as even a name change hasn't got rid of the damage done. They are not even the same company, so effectively AMD never made these rubbish drivers for GPUs. Just as with Nvidia, AMD's drivers go through good and bad patches.

I remember walking into a company in 1998 and finding the computer I was using constantly rebooting itself. Installed an updated graphics driver and all was good... it had an ATI card. A laptop couldn't put out the correct resolution to a monitor: AMD card, driver update, and all was good.
I had ATI cards around 2007 that had shocking first-release drivers; they couldn't even play movies smoothly (two different cards did this around the same period, the 4850 I think and the 5770), and in both cases it took a few driver releases to fix. I don't think ATI's drivers are rubbish generally, but their initial drivers have often been tripe at least. Talk about rush jobs. I know it's easy to download the latest drivers, but it leaves a bad product impression when the first few releases are not great, can't play movies smoothly, etc.

I hope one day to try their products again, but after the previously mentioned driver issues, the only graphics card so far that has been DOA (an AMD card), and the only unstable CPU (an AMD CPU) in my entire component-buying life, it puts me off. I will try again when they release something that does seem to stand ahead of the competition. I don't care about pricing, so putting out cheaper products doesn't interest me; I want perrrrfoorrrmmmaaance :D
 
I never said it was AMD; I was talking about ATI. I don't remember the exact year to be honest, but it was a long time ago, pre-2000 possibly. Not 100% sure anymore. At that point I was using AMD processors and ATI cards exclusively.

I know you didn't, but I would hazard a guess that after all these years it still puts you off. The drivers for ATI's best-remembered cards (the 9700/9800/X800/X1900/4870/5870/7970/290) were not bad though, yet this bad-driver reputation still gets talked about.
 