
Officially 10 Years since the ATI AMD merger. Was it a good idea?

If you're saying that because the review of the XFX GTR 480 showed much better power and thermal efficiency, that's probably down to binning rather than process improvement. It would be good if it were due to an improved process, but I'm not optimistic.

It'd be good to get some owner feedback on the XFX GTR, as that card was sent to 2cents for the purpose of showing it off to the masses and, as you suggest, it was undoubtedly cherry-picked.
 
Not having anything to compete with the high end can never be considered a positive.

Not having a high-end alternative ready was handled in the best way it could be under the circumstances. Focusing on what they did have ready, rather than trying to make something into something it wasn't and possibly releasing a flawed product, was the best way to go.

They played it safe and that was the smart move.
 
just bought xfx xxx gtr 8gb
That's the model below the one 2cents got, but it'd still be good to get your thoughts on it once you've had a play with it, maybe in the owners' thread.

There's a £40 difference between the two, which seems like a lot at that level. They're both 8GB 480s with very little difference visually. The clocks are different but not massively so, a 50MHz higher boost on the Black.

EDIT: Yours is the model J2C tested. I was wrong, sorry.
 
It'd be good to get some owner feedback on the XFX GTR, as that card was sent to 2cents for the purpose of showing it off to the masses and, as you suggest, it was undoubtedly cherry-picked.

You say that as if it's unusual. You don't think EVGA do the same?
 
1.34Ghz Stable

My card was able to hit [email protected], but I prefer running undervolted to [email protected].

Edit: [email protected] stable and 2250MHz memory stable.

Edit: I bought this from pccasegear in Australia

He hit 1.39GHz stable on stock voltage, but he said he prefers running the card undervolted.

So he managed to hit the factory boost clock with a decent undervolt and a decent memory overclock.

A few other people who have the card noted it's quite a decent implementation. The same goes for the PowerColor Devil RX480, which according to a German site had noticeably better performance per watt than a normal RX480.

They were among the last of the non-reference RX480 cards to actually become available; the Sapphire was the first. I expect that, as with the HD4870, we'll start to see more improvements as time progresses.
 
Look at that post on Reddit I linked to - the chap got around 1.4GHz and the voltages were not that high.

It really seems to be the best RX480.

Yeah, there are more volts to be had from it; 1.15V is nothing.

He's only getting overclocked temperatures of 65°C, which would indicate that, like J2C's GTR, it's pulling about 85W on the core as opposed to 125W on other/previous 480s.

J2C got 1475MHz stable out of his; that's a good overclock for any card at roughly 17%, and way more than my 970 does 24/7 from its out-of-box speed of 1354MHz.
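For anyone wanting to check that percentage themselves, here's the arithmetic as a small sketch. It assumes the reference RX 480 boost clock of 1266MHz as the baseline; the 1475MHz figure is the stable clock quoted above.

```python
# Rough overclock-headroom arithmetic for the clocks quoted above.
# Assumed baseline: the reference RX 480 boost clock of 1266 MHz.

def overclock_percent(stock_mhz: float, achieved_mhz: float) -> float:
    """Return the overclock as a percentage gain over the stock clock."""
    return (achieved_mhz / stock_mhz - 1.0) * 100.0

# J2C's GTR: 1266 MHz reference boost -> 1475 MHz stable
print(round(overclock_percent(1266, 1475), 1))  # prints 16.5
```

Which comes out at about 16.5%, i.e. the "roughly 17%" quoted.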

XFX may well be the guinea pigs for revised Polaris cards.
 
It's a shame that AMD seems to have had issues producing Polaris 10 and 11. If the launch cards had been like some of these later ones, it would have been a slam dunk for them. It does worry me for Vega; literally every new GPU launch they've had since the R9 290/290X has had some issue.

Nvidia seems to have had near-perfect launches since then! :(
 
It's a shame that AMD seems to have had issues producing Polaris 10 and 11. If the launch cards had been like some of these later ones, it would have been a slam dunk for them. It does worry me for Vega; literally every new GPU launch they've had since the R9 290/290X has had some issue.

Nvidia seems to have had near-perfect launches since then! :(

Global Foundries.

I think the early production chips just weren't good enough, and AMD missed its power targets by a mile on Polaris.

If you look at how the PCB is designed, it suggests the chip it was designed around was expected to be much more efficient.

Reference Polaris PCBs have a 6-phase power VRM. That's a lot - even the much more power-hungry Hawaii only had 5.

The GPU power phases are split between the PCIe slot and the 6-pin connector, which again is unusual. It's as if the core power was originally designed to come from the 6-pin only, with an extra phase on the PCIe slot added later.

You would only do that if the SKUs didn't meet your expected power levels.

Think about it: no one designs core power phases this way deliberately, it causes problems - can you guess what they are?
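To make the problem concrete, here's a back-of-the-envelope sketch of why splitting the core phases between the slot and the 6-pin matters. The connector limits (75W each for the slot and the 6-pin) are from the PCIe spec; the core wattage and the 50/50 split are illustrative assumptions, not measurements of any actual card.

```python
# Back-of-the-envelope power-budget check for a split core VRM.
# Connector limits per the PCIe spec; card wattages are illustrative.

PCIE_SLOT_LIMIT_W = 75  # max board power allowed through the slot
SIX_PIN_LIMIT_W = 75    # max power through a 6-pin auxiliary connector

def slot_draw(core_power_w: float, slot_share: float,
              other_board_w: float = 0.0) -> float:
    """Power pulled through the slot when a fraction of the core
    phases are fed from it, plus memory/fan/etc. also on the slot."""
    return core_power_w * slot_share + other_board_w

# If the core ends up hungrier than planned, the slot's share can
# blow past its 75 W ceiling even while the 6-pin still has headroom:
print(slot_draw(core_power_w=130, slot_share=0.5, other_board_w=25))  # 90.0
```

With an assumed 130W core split 50/50 plus 25W of other board power, the slot would be asked for 90W against a 75W limit - which is exactly the kind of problem you invite by feeding core phases from the slot.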
 
Global Foundries.

I think the early production chips just weren't good enough, and AMD missed its power targets by a mile on Polaris.

If you look at how the PCB is designed, it suggests the chip it was designed around was expected to be much more efficient.

Reference Polaris PCBs have a 6-phase power VRM. That's a lot - even the much more power-hungry Hawaii only had 5.

The GPU power phases are split between the PCIe slot and the 6-pin connector, which again is unusual. It's as if the core power was originally designed to come from the 6-pin only, with an extra phase on the PCIe slot added later.

You would only do that if the SKUs didn't meet your expected power levels.

Think about it: no one designs core power phases this way deliberately, it causes problems - can you guess what they are?

Apparently so, and I think it's also impacted by the best chips going to mobile. For instance, AMD seems to have eked out a few more than the zero design wins they previously had in mobile - the mobile RX470 has started popping up in some HP and Alienware laptops.

That's significant, since Nvidia has ruled the roost for the past few years, so even seeing an AMD option is a big deal in some way, IMHO OFC.
 
Not having a high-end alternative ready was handled in the best way it could be under the circumstances. Focusing on what they did have ready, rather than trying to make something into something it wasn't and possibly releasing a flawed product, was the best way to go.

They played it safe and that was the smart move.

I agree. Clearly Vega wasn't ready in the summer of 2016. Hopefully with Vega they'll get into the swing of things in 2017, and with Navi they'll be able to go head to head with nVidia's Volta in 2018.

AMD needs to get back to the pre-2011 days when, with the exception of the HD 2900XT launch, they were in lock step with nVidia - not only in performance but in releasing products right around the time nVidia launched theirs. Ideally I'd like to see them get back to the mid-2000s, when they weren't just matching nVidia but beating them on performance.
 
It's a shame that AMD seems to have had issues producing Polaris 10 and 11. If the launch cards had been like some of these later ones, it would have been a slam dunk for them. It does worry me for Vega; literally every new GPU launch they've had since the R9 290/290X has had some issue.

Nvidia seems to have had near-perfect launches since then! :(

I would agree. I think the R9 290/290X was the last graphics card that really went head to head with nVidia at the high end successfully. The only issue was that it ran hotter.

I think this has to do with the fact that they lost a lot of talent in the last 5 years - heavy-duty talent that had worked at AMD, and at ATI before it, going back as far as 16-17 years - and I think those losses had a detrimental effect on AMD's graphics division.

As Carrell Killebrew, the guy who invented AMD's Eyefinity technology and was laid off back in late 2011, put it:

"AMD's losses of top-rate graphics talent is appalling. In order of losses, AMD lost Rick Bergman, me, Eric Demers, Clay Taylor, Bob Feldstein, Mark Leather, Fritz Kruger, and too many others to name. They've lost a substantial part of the Orlando design team to Apple (about a dozen people I hear). In our business we all know the difference between success and failure is a few percent. Lose key leadership and you've probably lost the critical few percent. Make a graphics chip a bit too power hungry, a bit too expensive, a couple of features substandard, and even more importantly miss market cycles and you start the downward spiral."

Looks like they lost those key few percentage points, especially with the launch of the Fury X, which I think adversely affected their sales and image. Granted, he said this before the creation of the Radeon Technologies Group with Raja Koduri as its head, so we'll have to see how things pan out this year and the next. It takes around 3 years to design and bring a GPU to market, so we're unlikely to see the results of these changes until Vega or Navi launches.

Source of the quote:

Link
 
They're not broke at all - AMD has plenty of cash reserves and a bunch of new console deals from Sony and MS, plus the Apple deal, to keep them going. The issue with the lack of a competitive top-end product may be due to a lack of talent; by all accounts GPU designers are being snapped up by the likes of Qualcomm to help them build better graphics chips for their SoCs.

This is true. After their last CEO let go of a lot of graphics talent, including the guy who invented Eyefinity, there was an exodus. As a matter of fact, one of their top graphics guys, Eric Demers, who worked on the GCN architecture, left for Qualcomm shortly after the layoffs in late 2011. Here is what the inventor of Eyefinity had to say about this:

"Many people at AMD are looking to leave. They talk to their friends, especially those who have told people privately, "I'm leaving", or friends who have just left. They ask, "hey, is there any room for me where you are going?"

Granted, this was before the creation of the Radeon Technologies Group. They've since hired back some people, especially in their driver division, and surprise, surprise - look how much their drivers have improved since the Crimson launch in late 2015.
 
This is true. After their last CEO let go of a lot of graphics talent, including the guy who invented Eyefinity, there was an exodus. As a matter of fact, one of their top graphics guys, Eric Demers, who worked on the GCN architecture, left for Qualcomm shortly after the layoffs in late 2011. Here is what the inventor of Eyefinity had to say about this:

"Many people at AMD are looking to leave. They talk to their friends, especially those who have told people privately, "I'm leaving", or friends who have just left. They ask, "hey, is there any room for me where you are going?"

Granted, this was before the creation of the Radeon Technologies Group. They've since hired back some people, especially in their driver division, and surprise, surprise - look how much their drivers have improved since the Crimson launch in late 2015.

It's not great when people leave, but there's always new talent that can come along. I've worked with a lot of companies and sometimes have conflicting opinions about the quality of some employees versus what the company pays to keep them. Companies need a healthy employee turnover because it often brings new ideas, creativity and drive. It would be good for AMD, for example, if some of their past employees who joined Nvidia later return, bringing with them experience they might never have picked up if they'd stayed in their "same ole" job.

I work with different companies and find it funny how sometimes I enter an organisation and see that they fail to spot the obvious. My experience then allows me to point them in the right direction, and that's nothing to do with me being brilliant - it's all down to experience. I get to see how different companies work, what works and what doesn't, and can then share that to help companies improve, while of course ensuring I don't breach any confidentiality agreements.

If companies struggle when employees leave, then to me that means they didn't have good communication, cross-training, skill sharing etc. in place, which is their own fault.
 
AMD culled a lot of people, but assuming they were all the best is daft.

They haven't hired a huge number of people, and certainly have lower staff levels than when Read streamlined the company... yet AMD are doing better in both CPU and GPU design.

One issue is that you take a graphics guy who was great at design and make him a management guy. It's a natural progression, but not every engineer makes a good manager. So it's entirely possible that Demers was great as an engineer when being told what to do, but when he became more senior he was making bad decisions - maybe he was a bad judge of talent when hiring, maybe he was too unfocused when it came to development and had his team working on too many, or too few, things.

Maybe he was pushed out for failing but allowed to leave on good terms and find another job - who knows. In general, a lot of the "I'm leaving and everyone else is leaving" rumours were obviously made up. It's like the guys who complained about AMD moving to a computer-driven layout process and acted like the company was in freefall, because surely no one would be stupid enough to do layout by computer. The entire industry now outsources layout to extremely specialised people because it's vastly cheaper and superior. That "AMD are so stupid for doing it" talk when this started happening probably 5-6 years ago was just a bunch of people who got ****** because they were losing their careers to automation.

AMD have become a far more streamlined company since Read came in with the directive to start a new architecture, push out the staff who weren't deemed to be doing well enough or whose areas made more sense to outsource (like layout), and then bring in the right people, such as Keller. Financially, AMD are in a far better position today to profit from Zen than they were 5 years ago.


Also, as Darren says, turnover isn't abnormal. Honestly, people move between AMD, Nvidia, Intel, Qualcomm, Apple and many others relatively frequently.

Considering AMD decided to tighten their belts and get to work on a new architecture, the initial indications of Zen's performance suggest they made a lot of the correct calls on who to get rid of, who to keep and who to bring in.
 
It'd be good to get some owner feedback on the XFX GTR, as that card was sent to 2cents for the purpose of showing it off to the masses and, as you suggest, it was undoubtedly cherry-picked.
You say that as if it's unusual. You don't think EVGA do the same?

Not at all. If EVGA and everyone else aren't sending cherry-picked examples of their cards to reviewers, they're stupid. I was just saying it'd be good to see if the average version runs as well as this one. I've almost grabbed one myself a few times since, just to play around with it, but I keep stopping myself because the reality is I can't afford to, and it'd mean an extra couple of months saving up for Vega.
 
This is true. After their last CEO let go of a lot of graphics talent, including the guy who invented Eyefinity, there was an exodus. As a matter of fact, one of their top graphics guys, Eric Demers, who worked on the GCN architecture, left for Qualcomm shortly after the layoffs in late 2011. Here is what the inventor of Eyefinity had to say about this:

"Many people at AMD are looking to leave. They talk to their friends, especially those who have told people privately, "I'm leaving", or friends who have just left. They ask, "hey, is there any room for me where you are going?"

Granted, this was before the creation of the Radeon Technologies Group. They've since hired back some people, especially in their driver division, and surprise, surprise - look how much their drivers have improved since the Crimson launch in late 2015.

That was a turning point for AMD and their software. I got my Fury 4 or 5 weeks after launch, and within 3 or 4 months the driver support had been transformed for single-GPU users. For the last year my experience with the driver support has been exemplary - not perfect, but they've kept on top of any issues and been quick to fix them, which is why I was willing to commit to a FreeSync monitor at the end of last year.

RTG has gotten off to a good start and I look forward to Vega and beyond.
 