AMD's gaming scientist Richard Huddy on the firm's 300-Series graphics cards and more

They look promising, but the whole "100Hz is considered an overclock, so the customer is liable" stance is really off-putting, considering the massive price tags of the X34 and the ROG version.

Edit: this isn't really the place to be discussing monitors, is it?
 
Fair enough. I've never used VR myself, but the Oculus Rift CV1 is 1080x1200 per eye, isn't it?

Yes, which is a fair improvement over previous models, and I believe it is 75Hz too. Latency and tracking are also meant to be much better.

What is interesting is how they can take advantage of VR by having different cards in a multi-GPU setup render each 'eye' without having to mirror each other's memory.
 
90Hz for the consumer Rift, I believe.

VRAM-wise, the two eyes are still rendering virtually the same scene, with the same textures and geometry needed on each card, so I doubt there will be big savings.

The way I understand it is that instead of rendering 2160x1200 across two graphics cards (as it would be on an equivalent monitor), they could run 1080x1200 per card. Each card delivers the same amount of eye candy under a very similar load, but the resolution per card is essentially halved (saving on VRAM?). It doesn't mean you will use less GPU grunt, rather that less VRAM will be used per card.

The two cards can pretty much run independently of each other, and it doesn't matter, because one eye can't see the other screen and the two images are practically identical. Whereas if you tried to split a monitor so one half rendered separately from the other, you would have a visible line down the middle most of the time, since one side of the screen is usually more intensive and would run at a different frame rate.

So the saving I'm talking about is just in VRAM, compared with running the equivalent combined resolution on a monitor. Have I got this all wrong?
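As a rough back-of-the-envelope illustration of that per-card saving: the sketch below only counts a single colour-plus-depth render target, with assumed buffer formats (RGBA8 colour, 32-bit depth/stencil, no MSAA), so the numbers are illustrative rather than what a real engine would actually allocate.

```python
# Rough comparison of render-target memory for
# (a) one GPU driving the full 2160x1200 Rift panel pair, versus
# (b) two GPUs each rendering one 1080x1200 eye.
# Byte counts are assumptions: RGBA8 colour + 32-bit depth/stencil per pixel,
# no MSAA, single buffering. Real engines allocate many more targets.

BYTES_PER_PIXEL = 4 + 4  # assumed: 4 bytes colour + 4 bytes depth/stencil

def render_target_bytes(width, height):
    """Memory for one colour+depth render target at the given resolution."""
    return width * height * BYTES_PER_PIXEL

combined = render_target_bytes(2160, 1200)   # both eyes on a single GPU
per_eye  = render_target_bytes(1080, 1200)   # one eye per GPU

print(f"single GPU, both eyes : {combined / 2**20:.1f} MiB of render targets")
print(f"per GPU, one eye each : {per_eye / 2**20:.1f} MiB of render targets")

# The render-target footprint per card roughly halves, but textures and
# geometry still have to be resident on both cards, so the total VRAM
# saving is much smaller than half, which is the counterpoint made above.
```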
 
"Hopefully everyone will be buying Fury Xs and Furys, so everyone will be in a state of pure ecstasy"

The more I think about it the more the Nano seems like an incredibly odd business decision for AMD.

Why would you release an overpriced product into an incredibly limited, niche market when you are struggling with supply of your flagship card, which is part of a much larger market? Surely AMD should have been using the GPUs that go into the Nano to supply the limited numbers of Fury and Fury X cards they have on the market.

Why would they spread the limited Fiji chips across MORE product lines, especially when one of those product lines sits in a niche market where sales will be extremely limited (they haven't even sold through the very limited stock they had on release day yet...)?

It just seems bizarre. Surely they should be using every good Fiji chip as a Fury X at the moment?

At OCUK, AMD now have a fair few Nanos in stock (which will just sit rotting on the shelves at their current price), whilst they have no Fury Xs in stock.

Is a monkey in charge at AMD or something?
 
Quite a decent read for Huddy.

Really wish they'd drop this 'scientist' tag, though; it's a laughable concept at best, and he'd sound more credible with an actual job title.
 
I think part of the problem with AMD is how they approach gaming in general.

They know they sell a product which is used to play games, so for some reason they don't think they should treat this product in a serious, professional manner, and they assume that "gamers" want to talk about hardware like children on a playground.

It gives the impression that they view gaming as silly, and that their own hardware is silly by extension. So instead of being professional, they have all this cringe-worthy PR and marketing that does a great job of putting people off AMD.

Don't talk to us like children. It's not funny, it's not endearing. You just make yourselves look like idiots. "Gaming scientist" -- that's exactly how you might explain your job to a child.
 
Maybe Huddy thought of the title himself. If you watch any of his interviews, you'll see he likes to make a lot of puns. I suspect the ecstasy statement in the article was one of his usual bad jokes.
 