The whole of the high-end Broadwell-E line-up has a 140 W TDP slapped on it - I doubt the 8-core's power draw or thermal dissipation gets anywhere close to that on default settings - in fact it needs a substantial overclock past 4GHz to exceed 100 W power draw in normal use, and only significantly exceeds it under synthetic loads.
I had the Gainward Golden Sample 5900XT "Ultra" - the only card in the 5 series that didn't suck (basically a completely redesigned card by Gainward with uprated memory, etc.) - mine did something like a 40% core overclock and a 50% memory overclock, which meant it could keep up with the ATI 9800 cards.
Sorry, but I really can't entertain anyone saying ATI's drivers from that period were anything but bad - they were a nightmare from an IT support perspective, and tons of games from the period have extensive patch notes about the problems. The evidence is out there, and I've posted it at least a couple of times on these forums to prove the point.
I had the 9500 Pro, which was my first ATI card ever, and it worked fine. I never would have even touched an ATI card before the FX came out (I had a Ti 4200 at the time). Compare that to mates who had cards like the FX5600 and FX5800, who had loads of problems with games like HL2, since Nvidia ***** up their hardware implementation of DX9, which was non-standard, and they had ****e drivers to match. This is why so many people I knew who got burnt by the FX series decided to buy ATI cards the next gen.
Companies like Valve had to make SPECIAL patches for the FX just to run HL2, and I'm sorry, HL2 was one of the biggest title launches of the time. Those patches actually degraded image quality in DX9 mode so it could run at reasonable framerates.
Sorry, but I really can't entertain anyone ignoring the disaster which was the FX, and it showed in sales - the X800 series which followed was the ONLY time ATI or AMD actually went past Nvidia in cards sold, since the FX series was an utter disaster on all levels. It was the worst set of GPUs Nvidia ever created.
Even Nvidia famously mocked the FX series in a video afterwards. The Nvidia Focus Group came out of the FX debacle, as did the whole "The Way It's Meant to be Played" programme.
Pretty much sums up my thoughts and experience in that era.
Sorry but I really can't entertain anyone ignoring the disaster which was the FX
Having said that, I had a great few 6000 and 7000 series cards myself. The unlockable 6800LE cards were pretty awesome.
Yep, it was an utter shock at the time - the ATI 7000 and 8000 series cards were middling. So when the 9700 PRO came out of the blue on an old process node, and then months later Nvidia launched a card with a two-slot cooler (which was rare for the time) on a new process node, with a borked DX9 implementation (hence needing patching to work properly) and rubbish drivers (at least by Nvidia standards), people were like WTF??
Loads of people still bought them due to the goodwill Nvidia had built up with awesome cards like the Ti 4200 I had, since they thought Nvidia would fix it. They partially got there with FX MK2, but even that was too little, too late.
At this point people started to take ATI cards half seriously. Nobody expected the 9700 PRO and it appears Nvidia were caught off guard.
This is why I lol'ed when ATI just re-used the same design for the 9800 series, then extended it for the X800 series, and ended up outselling Nvidia for the first time, when Nvidia had the better design.
Luckily for Nvidia the X1800 was a bit middling (the X1900 series wasn't), so the 6000 series actually started to bring back some faith with users (although it did have some capacitor issues), and the 7000 series repaired a lot of the damage done.
Edit!!
Here is the spoof video Nvidia made at the time mocking the FX5800:
https://www.youtube.com/watch?v=H-BUvTomA7M
Their drivers were terrible before the Catalysts, and that's what's done the damage to their reputation - it's stuck with them, and they'll never shake the "good cards, but **** drivers that just don't work" tag.
Funny that, my Devil's Canyon 4-core, 4-thread exceeds its 88 W TDP at about 4.2 to 4.3GHz; at 4.6/4.7 it's pulling a country mile over 100 W, closer to 200 W.
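For anyone wondering how an 88 W part ends up that far past its rating: dynamic CPU power scales roughly with frequency times voltage squared (P ~ f·V²), and big overclocks need big voltage bumps. Here's a rough sketch of that scaling - the stock and overclocked voltages are illustrative assumptions, not measured values for any particular chip:

```python
# Rough illustration of why an overclocked CPU can blow well past its rated TDP.
# Dynamic power scales roughly as P ~ f * V^2, so both the frequency bump and
# the voltage bump needed to reach it multiply the power draw.
# The voltages below are assumed/illustrative, not measured figures.

def scaled_power(p_stock, f_stock, v_stock, f_oc, v_oc):
    """Estimate overclocked power from stock power via P ~ f * V^2."""
    return p_stock * (f_oc / f_stock) * (v_oc / v_stock) ** 2

# Assumed stock: 88 W at 4.0 GHz on 1.10 V; assumed overclock: 4.7 GHz at 1.35 V
estimate = scaled_power(88, 4.0, 1.10, 4.7, 1.35)
print(f"Estimated draw at 4.7 GHz: {estimate:.0f} W")  # roughly 156 W
```

Even this simple model (which ignores leakage, so it's on the conservative side) lands well past the rated TDP, so real-world readings closer to 200 W aren't surprising.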
It's the other way round now. Some of the recent Nvidia drivers have been awful, and they just stop supporting cards as soon as the next ones come along, so their performance degrades. Like what happened to the 7xx series :/
Whereas AMD keep optimising for years, and we see their old cards overtaking Nvidia's equivalents.
I've had my 1070 for quite a few months now, and I've had absolutely no problems with the drivers whatsoever.
Before upgrading to my 1070, I'd used ATI/AMD cards for the last decade and never had any problems with their drivers either, though I always used a single card - no MGPU nonsense.
Been pretty smooth with my 1070, though there has been the odd driver I've skipped due to problems - for some people, though, it hasn't been as smooth with the 1070.
Did nvidia buy any of the R&D talent that AMD made redundant from ATI?
Don't forget that TDP and the power drawn from the wall aren't the same thing - the 6900K can be pushed up to 200 W power draw with synthetic loads, and I suspect Zen would be similar in that respect. I suspect that in real-world usage the performance per watt will be broadly similar, maybe an advantage to Zen, but not by anything earth-shattering.
I never said it was AMD; I was talking about ATI. I don't remember the exact year to be honest, but this was a long time ago - pre-2000, possibly. Not 100% sure anymore. At that point I was using exclusively AMD processors and ATI cards.
Up to 2016: NO
In 2015, at one point, AMD was worth a quarter of what they paid for ATI - a lot of mismanagement and general ffing around. Selling their mobile graphics division to Qualcomm was an idiot's move.
2017 and beyond: Yes, but
Reforming ATI as a proper graphics division again has done wonders over the past 12 months. Even with the massive upswing in share price, taking inflation into consideration, ATI+AMD is only worth a little more than what AMD paid for ATI in the first place.
In short, AMD with ATI has fumbled badly over the past 10 years, but there's light at the end of the tunnel.
You do realize it takes about 3 years to design a brand new GPU from the ground up. She took over a little more than two years ago, at a time when AMD's graphics division got hit badly by the Maxwell series and their market share started a steep decline - down to 27% back in Q3 2014; now they are close to 34%. She created the Radeon Technologies Group and put in charge the guy who worked on the legendary 9700 Pro GPU. Yeah, she's doing a fantastic job, "no longer able to compete with Nvidia, and massively behind on market share".
For the consumer it was a good thing AMD came along and purchased ATI; otherwise Nvidia would have had a monopoly on the discrete video card market in 2006 rather than 2014.
The way I look at it is that ATI were going under, and looking at how AMD's CPUs have been over the last good few years, if they hadn't had the GPU side of things from ATI they might have gone under as well. So in my opinion it was a good move.
From what I recall, wasn't ATI on its knees, ready to go bankrupt? From that perspective, then yes, it was a good idea from the consumer's point of view, or there would be no competition whatsoever. But obviously that doesn't fit with the doom and gloom and let's-bash-AMD agenda most people love to revel in.
The Tom's Hardware article was years out of date before AMD spun off its fabs.
Was it a good idea for AMD to buy ATI?
I voted HELL NO!!!
AMD should NEVER have bought ATI back in 2006, and AMD would have spun off the fabs much sooner - as early as 2008 - if the management team had identified the root cause of the heavy losses earlier.
AMD was incredibly lucky to get out of the fabs business in 2009 - so how is GlobalFoundries doing nowadays? They never got any better after being spun off from AMD, and things got far worse after they bought IBM's fabs 2 years ago; GlobalFoundries's losses have been absolutely massive, off the scale, every year. They have never made a profit since the day the business opened in 2009: GlobalFoundries lost $1.2bn in 2011, $2.5bn in 2015, and the first 6 months of 2016 saw $1.6bn in losses - a rate of $8.79m every day - which means the full 2016 result could see losses soar to $3.2bn. GlobalFoundries's owner Mubadala Technology has been exploring all options for a few years now to dump all 100% of the toxic GlobalFoundries business for around $20bn, but talks with the Chinese government to acquire a stake in GlobalFoundries fell through at the last minute a few days ago because they complained that the equipment expenditure for the 12-inch fab was too cheap.
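The run-rate figures above are easy to sanity-check: take the reported first-half 2016 loss, divide by days, and annualise. A quick back-of-the-envelope check (assuming the loss rate simply holds all year):

```python
# Sanity check on the GlobalFoundries loss figures: $1.6bn over the first
# 6 months of 2016, annualised under the assumption the rate holds.

first_half_loss = 1.6e9   # USD, reported loss for H1 2016
days = 182                # roughly half a year
per_day = first_half_loss / days
full_year = per_day * 365

print(f"~${per_day / 1e6:.2f}m per day")                     # ~ $8.79m/day
print(f"~${full_year / 1e9:.1f}bn if the rate held all year")  # ~ $3.2bn
```

Both numbers line up with the figures quoted in the post, so the arithmetic at least is internally consistent.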
Maybe the owner Mubadala Technology, the former ATIC, wish they had never bought AMD's fabs back in 2009, having realised they were fooled - Hector Ruiz cashed them out and laughed it off like JR Ewing.
Things would be a lot different if AMD had not bought ATI for $5.4bn back in 2006. AMD would have licensed ATI GPUs for use in AMD CPUs, and ATI would still be making Imageon for mobile phones and digital TVs today, with both Broadcom and Qualcomm licensing it for use in their products. ATI would still be here, and their 2005-2015 annual revenues would be close to Nvidia's after collecting licence and royalty fees from AMD, Broadcom, Qualcomm, Microsoft, Nintendo and Sony. ATI would have had big Polaris to compete with Nvidia's GP104 back in the summer, and AMD would have launched Zen back in 2013 or 2014 after spinning off its fabs in 2008 as GlobalFoundries with a different partner.
Hopefully AMD are improving now, but reading or listening to what Lisa Su said (I think that's her name), I think she said that with time they can improve their spending on R&D, and she also mentioned, I think, that it cannot compare with Nvidia's spending at present, which to me means they are limited for now. I think they might be trying to take the fight to Intel on the processor side for now, but who knows - personally I doubt they'll take a good fight to both Intel and Nvidia at the same time. They don't have the pockets.
That's an interesting post. I actually had similar thoughts. I always wonder what would have happened if AMD had just licensed ATI's GPUs for their APUs. That way they wouldn't have had to cough up the $5.4 billion to buy ATI, and could have poured that cash into their CPU division. ATI would probably have kept the handset division rather than selling it off to Qualcomm, and the TV set division rather than selling it to Broadcom.
Now, AMD's CPU division would have been screwed anyway, because the Phenom was already in development when they bought ATI and they would have gotten hammered by Intel, but the damage would have been less, as they wouldn't have had to deal with the $5.4 billion payment and the subsequent debt.
There could have been some challenges, though, because subcontracting the GPU side of things out to ATI, without all your engineers in-house working on the chip, could pose problems - like when 3DFX outsourced the 2D processing in their Voodoo Rush graphics chip to a different contractor and it was a disaster; everything after that was designed in-house by 3DFX. But then again, 3DFX isn't a good example, as they ultimately failed. I suppose it's a moot point anyway, as Apple is using PowerVR chips based on tech provided by a totally different company, all put together in a single SoC - so having one part of the chip designed by a totally different company may not be an issue.
I agree with you that the only "smart" and clever thing AMD did was to spin off their fab division to ATIC, which then created GlobalFoundries. They played ATIC like a fiddle, giving them the impression that the Saudis were going to buy their fabs (even though they had no interest in it), forcing ATIC to pull the trigger on buying AMD's fabs.
It would have been interesting to see, though: as their CPU division crumbled, the load would have lessened if they hadn't had to deal with the $5.4 billion cost. Who knows - AMD's CPU division might have been able to focus more on directly competing with Intel instead of having to fight two giants at the same time.