Officially 10 years since the ATI/AMD merger. Was it a good idea?

One thing to note: how did the name Wattman get past current management? That name alone would make me want to format my PC to remove all traces of it.

Fury = great name, Wattman!!! Come on now.
 
The whole high-end Broadwell-E line-up has a 140 W TDP slapped on it. I doubt the 8-core comes anywhere close to that in power draw or thermal dissipation at default settings - in fact it needs a substantial overclock past 4GHz to exceed 100 W power draw in normal use, and only significantly exceeds that under synthetic loads.


Funny that - my Devil's Canyon 4-core/4-thread exceeds its 88 W TDP at about 4.2 to 4.3GHz; at 4.6/4.7GHz it's pulling a country mile over 100 W, closer to 200 W.
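That tracks with how dynamic power scales. Here's a back-of-the-envelope sketch - the baseline figure and every voltage below are illustrative guesses, not measurements from any real chip - showing why the voltage bump needed for higher clocks makes power blow past TDP:

```python
# Dynamic CPU power scales roughly as P ~ C * V^2 * f, so the voltage
# increase needed for higher clocks hurts quadratically.
# All numbers here are illustrative guesses, not measured values.

def scaled_power(base_w, base_ghz, base_v, ghz, volts):
    """Estimate power at a new frequency/voltage from a known baseline."""
    return base_w * (ghz / base_ghz) * (volts / base_v) ** 2

# Hypothetical baseline: a chip drawing ~88 W at 4.0 GHz and 1.15 V.
for ghz, volts in [(4.3, 1.22), (4.6, 1.33), (4.7, 1.40)]:
    watts = scaled_power(88.0, 4.0, 1.15, ghz, volts)
    print(f"{ghz} GHz @ {volts} V -> ~{watts:.0f} W")
```

That alone lands around 150 W at 4.7GHz; leakage also rises with voltage and temperature, which is how the figure at the wall can creep towards 200 W.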
 
I had the Gainward Golden Sample 5900XT "Ultra" - the only card in the FX 5 series that didn't suck (basically a completely redesigned card by Gainward with uprated memory, etc.). Mine managed something like a 40% core overclock and a 50% memory overclock, which meant it could keep up with the ATI 9800 cards.

Sorry, but I really can't entertain anyone saying ATI's drivers from that period were anything but bad. They were a nightmare from an IT support perspective, tons of games from the period have extensive patch notes about the problems, etc. The evidence is out there, and I've posted it at least a couple of times on these forums to prove the point.

I had the 9500 Pro, my first ATI card ever, and it worked fine. I would never even have touched an ATI card before the FX came out (I had a Ti 4200 at the time, which walked all over the ATI 8000 series equivalent). Compare that to mates with cards like the FX5600 and FX5800, who had loads of problems with games like HL2, since Nvidia ***** up their hardware implementation of DX9, which was non-standard, and had ****e drivers to match. This is why so many people who got burnt by the FX series decided to buy ATI cards the next generation.

Companies like Valve had to make SPECIAL patches for the FX just to run HL2, and I'm sorry, HL2 was one of the biggest titles of its time and one of the most famous games ever created. Loads of people were waiting for that game. Those patches actually degraded image quality in DX9 mode so the cards could run at reasonable framerates. Some of my mates were ****ed off and lost faith in Nvidia at that point.

Sorry, but I really can't entertain anyone ignoring the disaster that was the FX, and it showed in sales - the X800 series which followed was the ONLY time ATI or AMD actually went past Nvidia in cards sold, because the FX series was an utter disaster on all levels. This is despite the Nvidia 6000 series being a superior design with DX9.0c support, which the X800 series lacked (I went back to Nvidia with the 6000 series). The FX series was the worst set of GPUs Nvidia ever created.

Even Nvidia famously mocked the FX series in a video afterwards. The Nvidia Focus Group came out of the FX debacle, as did the whole "The Way It's Meant to be Played" programme.

Edit!!

Plus that 9500 Pro had a second life for many years in a mate's rig. Long after all the FX series cards my mates bought were consigned to the waste bin, that card was still going, even running some newer games.
 

Pretty much sums up my thoughts and experience in that era.
 

Yep, it was an utter shock at the time - the ATI 7000 and 8000 series cards were middling. So when the 9700 Pro came out of the blue on an old process node, and then months later Nvidia launched a card with a two-slot cooler (rare for the time) on a new process node, with a borked DX9 implementation (hence needing patches to work properly) and rubbish drivers (at least by Nvidia standards), people were like WTF??
Loads of people still bought them because of the goodwill built up by awesome cards like the Ti 4200 I had, and because they thought Nvidia would fix it. Nvidia partially got there with the FX Mk2 refresh, but even that was too little, too late.

At that point people started to take ATI cards at least half seriously. Nobody expected the 9700 Pro, and it appears Nvidia were caught off guard.

This is why I laughed when ATI just re-used the same design for the 9800 series, extended it again for the X800 series, and ended up outselling Nvidia for the first time, even though Nvidia had the better design.

Luckily for Nvidia the X1800 was a bit middling (the X1900 series wasn't), so the 6000 series actually started to bring back some user faith (although it did have some capacitor issues), and the 7000 series repaired a lot of the damage.

Having said that, I had a few great 6000 and 7000 series cards myself. The unlockable 6800LE cards were pretty awesome.

Edit!!

Here is the spoof video Nvidia made at the time mocking the FX5800:

https://www.youtube.com/watch?v=H-BUvTomA7M

 
Sorry, but I really can't entertain anyone ignoring the disaster that was the FX

Uh, click my link in one of the posts about the 5900XT above.

Having said that, I had a few great 6000 and 7000 series cards myself. The unlockable 6800LE cards were pretty awesome.

I had this sucker in the 6800 generation (the left-hand side): http://www.aten-hosted.com/images/GPUZ.jpg (actual stable clock, though it could hit something like 105 or 109°C lol). It was around stock 7900GT performance :S

I miss the days when Gainward actually made custom cards from the ground up that were stupidly better than the normal cards.
 
Here is the spoof video Nvidia made at the time mocking the FX5800: https://www.youtube.com/watch?v=H-BUvTomA7M


Love that video. The X800s were decent and matched Nvidia, but like you say the forward-looking cards were the Nvidia 6 series. I had an X1900 XTX, and again AMD nailed it as far as my gaming was concerned.
 
Their drivers were terrible before the Catalysts, and that's what did the damage: it's stuck with them, and they'll never shake the 'good cards, but **** drivers that just don't work' tag.

It's the other way round now. Some of the recent Nvidia drivers have been awful and they just stop supporting cards as soon as the next ones come along, so their performance degrades. Like what happened to the 7xx series :/

Whereas AMD keep optimising for years, and we see their old cards overtaking Nvidia's equivalents.
 
Funny that - my Devil's Canyon 4-core/4-thread exceeds its 88 W TDP at about 4.2 to 4.3GHz; at 4.6/4.7GHz it's pulling a country mile over 100 W, closer to 200 W.

Are you aware that Haswell and Broadwell are built on different manufacturing processes? As such the TDP scaling is not identical, making any direct comparison pointless.

For example, a CPU built on a 22nm process will draw a different amount of power compared to a CPU built on a 14nm process, at the same voltage.

Also, which scientific method of measuring the power draw are you using? :)
 
It's the other way round now. Some of the recent Nvidia drivers have been awful and they just stop supporting cards as soon as the next ones come along, so their performance degrades. Like what happened to the 7xx series :/

Whereas AMD keep optimising for years, and we see their old cards overtaking Nvidia's equivalents.

I've had my 1070 for quite a few months now, absolutely no problems with the drivers whatsoever.

Before upgrading to my 1070, I'd used ATI/AMD cards for the last decade, never had any problems with their drivers either, though I always used a single card, no MGPU nonsense.
 
Funny that - my Devil's Canyon 4-core/4-thread exceeds its 88 W TDP at about 4.2 to 4.3GHz; at 4.6/4.7GHz it's pulling a country mile over 100 W, closer to 200 W.

Don't forget that TDP and the power drawn from the wall aren't the same thing - the 6900K can be pushed up to 200 W power draw with synthetic loads, and I suspect Zen would be similar in that respect. In real-world usage I suspect performance-per-watt will be broadly similar, maybe with an advantage to Zen, but nothing earth-shattering.

I've had my 1070 for quite a few months now, absolutely no problems with the drivers whatsoever.

It's been pretty smooth with my 1070, though there's been the odd driver I've skipped due to problems - for some people it hasn't been as smooth with the 1070.
 
It's been pretty smooth with my 1070, though there's been the odd driver I've skipped due to problems - for some people it hasn't been as smooth with the 1070.

I've had endless problems with drivers on my 1080, so much so that I'm basically two drivers behind, as that one has no issues. I might give the latest a try though, as it's supposedly optimised for Skyrim SSE. I just want to make sure I have a copy of the previous one so I can roll back just in case.
 
Don't forget that TDP and the power drawn from the wall aren't the same thing - the 6900K can be pushed up to 200 W power draw with synthetic loads, and I suspect Zen would be similar in that respect. In real-world usage I suspect performance-per-watt will be broadly similar, maybe with an advantage to Zen, but nothing earth-shattering.

I haven't bothered to get the power meter out, but I suspect my 6950Xs @ 4.0GHz use very little extra power. When they're idling they draw very little, and when they're doing something like gaming for a few hours a day the extra watts don't add up to much.
 
I never said it was AMD, I was talking about ATI. I don't remember the exact year to be honest, but this was a long time ago, possibly pre-2000. Not 100% sure anymore. At that point I was using AMD processors and ATI cards exclusively.

Surely if you had problems that bad the products involved would be salient in your memory?

As for the merger, there have been mixed results card-wise, but at this point a lot is riding on the upcoming series of products.
 
Up to 2016: NO

In 2015 AMD was at one point worth a quarter of what they paid for ATI - a lot of mismanagement and general ffing around. Selling their mobile graphics division to Qualcomm was an idiot's move.

2017 and beyond: YES, but

Reforming ATI as a proper graphics division again has done wonders over the past 12 months. But even with the massive upswing in share price, taking inflation into consideration ATI+AMD is only worth a little more than what AMD paid for ATI in the first place.

In short, AMD with ATI has fumbled badly over the past 10 years, but there's light at the end of the tunnel.

That's one way to put it. We shouldn't hold our breath for 2017 and beyond, as AMD hasn't released Vega yet. Although, up until 2016, their graphics division did have a few hit products: the HD 48XX series, 58XX, 69XX, 79XX, and R9 2XX series; the Fury series was okay.

Yeah, she's doing a fantastic job, no longer able to compete with Nvidia, and massively behind on market share.
You do realise it takes about three years to design a brand-new GPU from the ground up. She took over a little more than two years ago, at a time when AMD's graphics division had been hit hard by the Maxwell series and its market share had started a steep decline, down to 27% back in Q3 2014; now they're close to 34%. She created the Radeon Technologies Group and put in charge the guy who worked on the legendary 9700 Pro GPU.
And 34% isn't massively behind. That's about what ATI, and later AMD's graphics division, traditionally had, with the exception of the X800 years and the HD 48XX and 58XX years, when they had over 50% and up to 40% market share respectively.
For the consumer it was a good thing AMD came along and purchased ATI, otherwise Nvidia would have had a monopoly on the discrete video card market in 2006 rather than 2014.

nVidia would not have had a monopoly in 2006, that's absolutely ridiculous.

In Q2 2006, ATI had 48% GPU market share. Hardly a monopoly. Yes, they lost market share when the HD 2000 series couldn't compete with nVidia's 8800 series, and their share dropped to 34%, but to say nVidia would have had a monopoly back in 2006 is absolutely ridiculous.

Here is a graph of their market share pre- and post-merger:

http://techavenue.wixsite.com/techavenue/amd-critic

The way I look at it is that ATI were going under, and looking at how AMD's CPUs have been over the last good few years, if they hadn't had the GPU side of things from ATI they might have gone under as well. So in my opinion it was a good move.

On the second part you might be right, as the graph below shows that their GPU division has been consistently profitable, except perhaps from the Maxwell launch all the way up to the Polaris launch - which, based on their recent earnings call, I assume gained them ground or at least stopped the bleeding on the graphics side of things.

CPU vs GPU division profits:



But to say ATI was going under post-merger is ridiculous.

Also, as the above graph shows, their GPU division actually made a profit of nearly $530 million from 2007 all the way through to Q2 2014, while their CPU division was exactly the opposite, with losses of over $100 million in the same period. They got hammered in 2007 and 2008 because Phenom couldn't compete with Core 2 Duo.

From what I recall, wasn't ATI on its knees, ready to go bankrupt? From that perspective, then yes, it was a good idea from the consumer's point of view, or there would be no competition whatsoever. But obviously that doesn't fit with the doom-and-gloom, let's-bash-AMD agenda most people love to revel in.

Uh, no, they were not on the verge of going bankrupt. ATI was actually profitable prior to the merger.

Profit:
2004: $200+ million
2005: ~$15 million
Q2 2006: ~$70 million
 
The Tom's Hardware article was years out of date before AMD spun off its fabs.

Was it a good idea for AMD to buy ATI?

I voted HELL NO!!!

AMD should NEVER have bought ATI back in 2006, and would have spun off its fabs as early as 2008 if the management team had identified the root cause of the heavy losses sooner.

AMD was incredibly lucky to get out of the fab business in 2009. So how is GlobalFoundries doing nowadays? They never got any better after being spun off from AMD, and things got far worse after they bought IBM's fabs two years ago - GlobalFoundries' losses have been massive, off the scale, every year. They have never made a profit since the day they opened for business in 2009: GlobalFoundries lost $1.2bn in 2011, $2.5bn in 2015, and the first six months of 2016 saw $1.6bn in losses, a rate of $8.79m every day, meaning the full 2016 result could see losses soar to around $3.2bn. GlobalFoundries' owner Mubadala Technology has been exploring options for years to offload 100% of the toxic GlobalFoundries business for around $20bn, but talks with the Chinese government about acquiring a stake in the business fell through at the last minute a few days ago, because they complained that the equipment expenditure for the 12-inch fab was too cheap.
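As a quick sanity check of that loss-rate arithmetic (the 182-day half-year and the straight-line extrapolation are my assumptions, purely to show the quoted figures are self-consistent):

```python
# Sanity-check the GlobalFoundries loss figures quoted above.
# Assumes a 182-day first half and straight-line extrapolation.

h1_loss = 1.6e9                 # reported H1 2016 loss, in dollars
per_day = h1_loss / 182         # daily loss rate over the first half
full_year = per_day * 365       # naive full-year extrapolation

print(f"Loss per day: ${per_day / 1e6:.2f}m")      # -> ~$8.79m
print(f"Annualised:   ${full_year / 1e9:.2f}bn")   # -> ~$3.21bn
```

So the $8.79m/day and ~$3.2bn full-year figures do at least hang together.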

Maybe the owner Mubadala Technology (the former ATIC) wish they had never bought AMD's fabs back in 2009, having realised they were fooled: Hector Ruiz cashed them out and laughed it off like J.R. Ewing.

Things would be very different if AMD had not bought ATI for $5.4bn back in 2006. AMD would have licensed ATI GPUs for use in AMD CPUs, ATI would still be making Imageon chips for mobile phones and digital TVs today, with both Broadcom and Qualcomm licensing them for their products, and ATI would still be here, with annual revenues over 2006-2015 close to Nvidia's after collecting licence and royalty fees from AMD, Broadcom, Qualcomm, Microsoft, Nintendo and Sony. ATI would have had big Polaris to compete with Nvidia's GP104 back in the summer, and AMD would have launched Zen back in 2013 or 2014, having spun off its fabs in 2008 as GlobalFoundries with a different partner.

That's an interesting post. I've actually had similar thoughts. I always wonder what would have happened if AMD had just licensed ATI's GPUs for their APUs. That way they wouldn't have had to cough up $5.4 billion to buy ATI, and could have poured that cash into their CPU division. ATI would probably have kept the handset division rather than selling it to Qualcomm, and the TV division rather than selling it to Broadcom.

Now, AMD's CPU division would have been screwed anyway, because Phenom was already in development when they bought ATI and they would have been hammered by Intel regardless, but the blow would have been softer without the $5.4 billion payment and the subsequent debt.

There could have been some challenges, though, because subcontracting the GPU side to ATI, without all the engineers in-house working on the chip, could pose problems - like when 3dfx outsourced the 2D processing in their Voodoo Rush to a different contractor and it was a disaster; everything after that was designed in-house by 3dfx. Then again, 3dfx isn't a good example, as they ultimately failed. And I suppose it's a moot point anyway: Apple uses PowerVR graphics based on tech from a totally different company, all put together in a single SoC, so having one part of the chip designed by another company needn't be an issue.

I agree with you that the only "smart" and clever thing AMD did was spin off their fab division to ATIC, which then created GlobalFoundries. They played ATIC like a fiddle, giving them the impression that the Saudis were going to buy the fabs (even though they had no interest in them), forcing ATIC to pull the trigger on buying AMD's fabs.

It would have been interesting to see, though: as their CPU division crumbled, the load would have been lighter without the $5.4 billion cost. Who knows, AMD's CPU division might have been able to focus on competing directly with Intel instead of having to fight two giants at the same time.
 
Hopefully AMD are improving now, but from reading or listening to what Lucy Lui said, or whatever her name is, I think she said that with time they can improve their R&D spending, and also mentioned it can't compare with Nvidia's spending at present, which to me means they're limited for now. I think they might be taking the fight to Intel on the processor side first, but I personally doubt they can take a good fight to both Intel and Nvidia at the same time - they don't have the pockets.

It's Lisa Su not Lucy Lui from Charlies Angles. :p But you are right AMD did reduce their debt payments which freed up some cash and their R&D will increase to over $100 million+ from previously.
 
I always wonder what would have happened if AMD had just licensed ATI's GPUs for their APUs.


There is a huge flaw with all these types of arguments: they are based on a fantasy - the fantasy that ATI was a healthy business.
AMD couldn't work with or license anything from a dead business.

AMD paid $5.4bn because that was the cash injection ATI needed to save it.
 