Inventor of AMD's Eyefinity Technology criticizes AMD. Your thoughts?

Associate · Joined: 8 Jan 2016 · Posts: 94 · Location: Across the pond
A tech blog published an article last year compiling a series of posts by former AMD engineer Carrell Killebrew, in which he criticized the current state of AMD's graphics division. Mr. Killebrew was let go in late 2011 during a series of job cuts after then-CEO Rory Read was hired, and a number of AMD graphics people subsequently left the company. If you remember, AMD lost a significant amount of market share last year. As Mr. Killebrew stated:

"AMD is losing a lot of the best people. That has to be laid at the feet of its current leadership, which includes the Board of Directors and the Chairman Bruce Claflin, not just (former CEO) Rory Read. "

Mr. Killebrew even references Star Wars on this point:

"To try and stem that tide through fear, whether it has some foundation in fact or not, reminds me of Princess Leia in the original Star Wars, "The more you tighten your grip Tarkin, the more star systems will slip through your fingers." Positive leadership attracts people. Repelling people, the best people, is commentary enough on AMD's leadership."



"AMD is losing a lot of the best people. That has to be laid at the feet of its current leadership, which includes the Board of Directors and the Chairman Bruce Claflin, not just (former CEO) Rory Read. "


Mr. Killebrew on AMD losing talent:

"Many people at AMD are looking to leave. They talk to their friends, especially those who have told people privately, "I'm' leaving", or friends who have just left. They ask, "hey, is there any room for me where you are going?"

"AMD's losses of top-rate graphics talent is appalling. In order of losses, AMD lost Rick Bergman, me, Eric Demers, Clay Taylor, Bob Feldstein, Mark Leather, Fritz Kruger, and too many others to name. They've lost a substantial part of the Orlando design team to Apple (about a dozen people I hear). In our business we all know the difference between success and failure is a few percent. Lose key leadership and you've probably lost the critical few percent. Make a graphics chip a bit too power hungry, a bit too expensive, a couple of features substandard, and even more importantly miss market cycles and you start the downward spiral."

"It's a real shame. ATI in its latter days and as part of AMD through 2011 was contending against nVidia head-to-head. It is very hard to see how that will happen in the future, and even worse to see the destruction and disintegration of a a world class team such as that."


It's hard to argue against what he is saying. AMD had up to 38% market share only a year and a half ago; then Maxwell hit, and it went downhill from there. Granted, this article was published before Radeon Technologies Group was formed to make AMD's graphics division more "agile", as AMD's CEO Lisa Su put it.

What do you guys think? Is his argument valid? I think the loss of talent certainly played a part in AMD losing nearly half their market share in a year and a half. Had the old ATI guys still been around, I don't think this would have happened.

This is not to bash AMD. I have a Fury X; I got it cheaper than the 980 Ti with a similar level of performance, which is why I bought it. I just think it was priced incorrectly, and it could have shipped with more mature drivers offering better performance (as the newer drivers do now) when it launched. Also, the R9 300 series was late to market to counter Maxwell.

Source:

TechAvenue


Hopefully AMD will have a great 2016 with Polaris.
 
Would have been nice to see something a little more substantial, or some eye-opening insider knowledge of how things were run, rather than what could just as easily be sour grapes from someone sore at being sacked.
 
Well, it wasn't easy for AMD to keep the ship afloat given the situation they were in in 2011. They had to cut jobs, and they are still not out of it yet.
 
Would have been nice to see something a little more substantial, or some eye-opening insider knowledge of how things were run, rather than what could just as easily be sour grapes from someone sore at being sacked.

It could definitely be interpreted that way. But a lot of the names he mentions were not "sacked"; they left willingly, such as their former graphics head Eric Demers and a number of others mentioned in the article.

Granted, after the recent creation of Radeon Technologies Group, AMD hired new talent, like the guy from nVidia who developed TXAA (I think that's the name of the tech that was implemented in Watch Dogs), and they are doubling down on their drivers with Crimson. They are also trying to improve their frame pacing by hiring the guy from Tech Report who first started using frame pacing as a measurement in his tests. They made Raja Koduri the head, who came back from Apple 2+ years ago. But needless to say, if he is right, that's two dozen or more graphics talents, going back to before AMD bought ATI, who have left.

I think a lot of what he is saying is true. ATI did build a world-class graphics engineering team, and made acquisitions like ArtX back in 2000 and XGI back in 2006 to build on that talent.

It seems like with the new RTG they are trying to go back to what ATI was: a company primarily focused on bringing in the best graphics talent.
 
Brand new account, already posted 2 AMD-trashing threads... whose alt is this?

As for the story, typical narcissist nerd lashing out after being fired, seen it a hundred times. Nice Star Wars reference, dork.
 
Brand new account, already posted 2 AMD-trashing threads... whose alt is this?

As for the story, typical narcissist nerd lashing out after being fired, seen it a hundred times. Nice Star Wars reference, dork.

I see constructive criticism.
 
Brand new account, already posted 2 AMD-trashing threads... whose alt is this?

As for the story, typical narcissist nerd lashing out after being fired, seen it a hundred times. Nice Star Wars reference, dork.

No man. I have a Fury X.

I am just curious as to what others think.

And exactly how am I bashing? The criticism from the inventor of AMD's Eyefinity seems reasoned. I mean, he did list several names (many of whom left AMD willingly), not to mention the dozen or so others he alluded to who left.
 
Well, it wasn't easy for AMD to keep the ship afloat given the situation they were in in 2011. They had to cut jobs, and they are still not out of it yet.

I could see them cutting jobs. But why the engineering team? And why the guy who invented Eyefinity, who was the brains behind the architecture of the very successful HD 4800 series and also worked on the 5800 series?

The new CEO Lisa Su cut jobs too, first when she became CEO in 2014 and then again last year. But she didn't cut engineering jobs.

I can see cutting management and marketing, but not engineers. If engineers can push out great products in a timely manner, then the products will succeed no matter what. There is less need for "marketing" if a product is good.

The fact is they were late to market with the 300 series. Had they launched those products back in September or October of 2014, they would have greater market share; many people had already bought a GTX 970 or 980 by the time the R9 300 series came out. Not to mention the GTX 900 series benefited from a stronger "perception" from being on the market longer.
 
Well, it wasn't easy for AMD to keep the ship afloat given the situation they were in in 2011. They had to cut jobs, and they are still not out of it yet.
This, plus companies lose talent all the time naturally, and they've actually started to do a few new things (HBM, Polaris, new software, FreeSync, HDR, etc.) along with catching back up on updates. I'd say if they're losing their best people, it's not showing in what they are delivering. Fury could have been a tad better (not sure what's up with the lack of overclocking, but that's really its only main issue), but it wasn't all that bad to be fair. If it could overclock it'd have been a fair bit better, and their other stuff has been good enough in my eyes.
 
This, plus companies lose talent all the time naturally, and they've actually started to do a few new things (HBM, Polaris, new software, FreeSync, HDR, etc.) along with catching back up on updates. I'd say if they're losing their best people, it's not showing in what they are delivering. Fury could have been a tad better (not sure what's up with the lack of overclocking, but that's really its only main issue), but it wasn't all that bad to be fair. If it could overclock it'd have been a fair bit better, and their other stuff has been good enough in my eyes.

Well, this was posted before the creation of the RTG. I think they have realized their mistake and are trying to correct it.

I think the main effect was being slow to market with the R9 300 series and the subsequent loss of market share. I don't think that would have happened if the people who left had still been around, as they had a tremendous amount of experience in the GPU market.

To be fair, they made Raja Koduri the head, as he knows the GPU market, and they have brought in other top graphics talent too.
 
With GPUs companies are only as good as their latest family of chips.

In less than 6 months Maxwell and all the success that it brought NVidia will be history and will count for nothing.

I think the upcoming Polaris chips/GPUs will be very good indeed and may even have the edge on Pascal.

What will people be writing in 12 months' time?
 
Well, this was posted before the creation of the RTG. I think they have realized their mistake and are trying to correct it.

I think the main effect was being slow to market with the R9 300 series and the subsequent loss of market share. I don't think that would have happened if the people who left had still been around, as they had a tremendous amount of experience in the GPU market.

To be fair, they made Raja Koduri the head, as he knows the GPU market, and they have brought in other top graphics talent too.

The R9 300/Fury launch was botched IMO, unless they had good reason to believe nVidia was about to launch next-gen cards relatively soon. Too much overlap with the 200 series, too many cards vying for close performance tiers. They should have launched the 390X as the Fury LE, pushed the Fury a little more, and held the Fury X back for the Christmas market, when hopefully they'd have had more stock and could have made it a little more competitive against, or even decently beat, the 980 Ti.
 
This, plus companies lose talent all the time naturally, and they've actually started to do a few new things (HBM, Polaris, new software, FreeSync, HDR, etc.) along with catching back up on updates. I'd say if they're losing their best people, it's not showing in what they are delivering. Fury could have been a tad better (not sure what's up with the lack of overclocking, but that's really its only main issue), but it wasn't all that bad to be fair. If it could overclock it'd have been a fair bit better, and their other stuff has been good enough in my eyes.

The Fury's GPU is a far denser part than Maxwell. It also has hardware-based memory management and scheduling, which is mostly done in drivers on Maxwell. This adds a lot more power usage to the Fury parts, as moving data around in a processor uses around 40 times more power than the data processing itself.

Essentially the hardware scheduling and the sheer size of the chip, as well as the use of dense libraries, limited its overclocking headroom.
 
The Fury's GPU is a far denser part than Maxwell. It also has hardware-based memory management and scheduling, which is mostly done in drivers on Maxwell. This adds a lot more power usage to the Fury parts, as moving data around in a processor uses around 40 times more power than the data processing itself.

Essentially the hardware scheduling and the sheer size of the chip, as well as the use of dense libraries, limited its overclocking headroom.
Can you cite a source for this? I would appreciate it.
 
The guy who is complaining, is it the same dude who made Eyefinity, the feature AMD has been trying for more than a year now to get FreeSync working with?

Because if he had done a better job, things could have been better.

And while everyone is entitled to an opinion, I am not taking someone who was sacked seriously, not even with a tonne of salt.
 
FreeSync works in Eyefinity; the only requirement is that the monitors must be identical and support FreeSync.
 
With GPUs companies are only as good as their latest family of chips.

In less than 6 months Maxwell and all the success that it brought NVidia will be history and will count for nothing.

I think the upcoming Polaris chips/GPUs will be very good indeed and may even have the edge on Pascal.

What will people be writing in 12 months time ?

Remember the 9700? Fast.
The GTX 480, nVidia's hot vacuum-cleaner powerhouse?
People forget fast.
Why do Nvidia produce such power-draining, hot cards?
Silly really. Never buying that brand.

Ah, all eyes on Polaris, the future brighter than ever, gotta bring shades.
Sexy shades for me.
 
FreeSync works in Eyefinity; the only requirement is that the monitors must be identical and support FreeSync.

10320x1440 HDR Eyefinity
on a single Polaris card.
OK, Santa, you know what I want, send them to me. :D
If AMD want me to write a review of gaming on such a setup, just send the hardware to me and I'll do the YouTube video of the year with it. :D
 
The R9 300/Fury launch was botched IMO, unless they had good reason to believe nVidia was about to launch next-gen cards relatively soon. Too much overlap with the 200 series, too many cards vying for close performance tiers. They should have launched the 390X as the Fury LE, pushed the Fury a little more, and held the Fury X back for the Christmas market, when hopefully they'd have had more stock and could have made it a little more competitive against, or even decently beat, the 980 Ti.

I actually thought the 390X (back then we would've called it the 380X, third card down in the stack) would be a second-tier salvage part. Obviously the supply did not allow for that; yields were probably good on the chip itself, so they could have lasered some, but they had a real nightmare when it came time to actually assemble the final thing.
 