
AMD’s Next-Generation Polaris GPU Architecture Leaked

I imagine it is not the cost of the power but the detrimental side effects of that power (the heat, mostly) that are behind the reasoning for wanting lower power usage.

It's been gone over so many times; I really struggle to see why people are still bleating about the "price of electricity" when it comes to GPU power usage.

GPU makers don't want to make 500 W graphics cards, and OEMs want cards that fit into a particular power budget. The cost of running the cards is irrelevant; not needing four power connectors and a three-slot heatsink/AIO is the issue!

How much performance you can get per watt determines what the performance will be at 250-300W, which is pretty much the limit for what they are willing to release as a single card.
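
To make that point concrete, here is a toy sketch. Every figure in it is invented for illustration (the 0.2 perf-units-per-watt baseline, the 275 W cap, and the assumption that performance scales linearly with power), so treat it as a back-of-an-envelope model rather than anything measured:

```python
# Toy model: once board power is capped at roughly 250-300 W for a single
# card, perf-per-watt is the only lever left for raising performance.
# Assumes performance scales linearly with power, which is a simplification.

def perf_at_power_cap(perf_per_watt: float, power_cap_w: float) -> float:
    """Relative performance achievable within a fixed board-power budget."""
    return perf_per_watt * power_cap_w

POWER_CAP_W = 275  # middle of the 250-300 W range mentioned above

current_arch = perf_at_power_cap(perf_per_watt=0.20, power_cap_w=POWER_CAP_W)
next_arch = perf_at_power_cap(perf_per_watt=0.20 * 2.5, power_cap_w=POWER_CAP_W)

print(f"Current architecture at {POWER_CAP_W} W: {current_arch:.0f} perf units")
print(f"A 2.5x perf/W architecture at {POWER_CAP_W} W: {next_arch:.0f} perf units")
```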
 
Well it's AMD's new card time here in the OCUK Big Brother house... Page two and we are already discussing Gsync...

Carry on lads, I've got plenty of chocs and twiglets left over from Christmas :D
 
Well it's a good job he didn't tweet about his old car only doing 60mph and his new one doing 180mph, else we'd be saying that the new AMD card was going to be three times as fast. :rolleyes:
 
If only AMD spent more time on their graphics cards instead of BS marketing. Bulldozer and Fury X were both flops. I've lost faith in AMD; I hope they get bought out.

The closest AMD have been to a star is the temps on the 290X; that was supernova.
 
A die shrink, 2.5 times more efficiency this generation, and cards with HDR screens: it really is a brighter future for all AMD buyers. It's even 2.5 times brighter than before, and that's bright!

Polaris the star of AMD
 
This geezer is full of it (I mean the Joker doing the clickbait article not you G)

Agreed. That is one awful article, lol. Mind you, they don't have much to go on, and wccftech are back to their usual clickbait, but hey, we have not had any news for a while, so go with it I say :)
 
I imagine it is not the cost of the power but the detrimental side effects of that power (the heat, mostly) that are behind the reasoning for wanting lower power usage.

Ha, when was the last time you read in this forum the argument "get the 970, it's less on the power bill", or, when the 290X was coming out, "the 290X will destroy your electricity bill, get a 780 Ti or a Titan Black"?
And when I or others proved that the argument of "power efficiency" and "electricity cost savings" is a ridiculous one, they just carry on with the same motto?
And we did prove in here (I can find my posts, from when electricity was also 30% more expensive than it is now) that you would have to run the 780 Ti or Titan Black for 25-40 years MORE than the 290X to justify the extra cost, at 10 hours per day, 365 days per year, all of it at 100% load. Meanwhile most of them had no clue how much they pay per kWh on their electricity bill.

Even in here, on this very page, you see someone mention the power efficiency of the Nvidia cards, which is completely bonkers.
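
For anyone who wants to redo that back-of-an-envelope calculation themselves, here is a minimal sketch. The price premium, wattage gap, running hours and pence-per-kWh figure are placeholder assumptions picked for illustration, not numbers taken from the post above or from any review:

```python
# Break-even sketch: how many years of use it takes for an electricity
# saving to repay a higher purchase price. All inputs below are assumed.

def breakeven_years(price_diff_gbp: float, watts_saved: float,
                    pence_per_kwh: float, hours_per_day: float = 10.0) -> float:
    """Years of use needed for the electricity saving to repay a price premium."""
    kwh_saved_per_year = watts_saved / 1000.0 * hours_per_day * 365.0
    saving_per_year_gbp = kwh_saved_per_year * pence_per_kwh / 100.0
    return price_diff_gbp / saving_per_year_gbp

# Example: a card costing 250 GBP more that draws 30 W less at full load,
# electricity at 13 p/kWh, 10 hours of 100% load every single day.
years = breakeven_years(price_diff_gbp=250, watts_saved=30, pence_per_kwh=13)
print(f"Break-even after roughly {years:.1f} years")  # about 17.6 years with these inputs
```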
 
Ha, when was the last time you read in this forum the argument "get the 970, it's less on the power bill", or, when the 290X was coming out, "the 290X will destroy your electricity bill, get a 780 Ti or a Titan Black"?
And when I or others proved that the argument of "power efficiency" and "electricity cost savings" is a ridiculous one, they just carry on with the same motto?
And we did prove in here (I can find my posts, from when electricity was also 30% more expensive than it is now) that you would have to run the 780 Ti or Titan Black for 25-40 years MORE than the 290X to justify the extra cost, at 10 hours per day, 365 days per year, all of it at 100% load. Meanwhile most of them had no clue how much they pay per kWh on their electricity bill.

Even in here, on this very page, you see someone mention the power efficiency of the Nvidia cards, which is completely bonkers.

A lot of the discussions I witnessed weren't so much to do with the power draw as with the actual TDP, and AMD did a great job with that via the Fury X (it totally changed my mind on using an AIO cooler) but failed dismally with the 290X reference cooler. Another thing that gets missed is power efficiency and how much more performance can be gained through it.
 
Ha, when was the last time you read in this forum the argument "get the 970, it's less on the power bill", or, when the 290X was coming out, "the 290X will destroy your electricity bill, get a 780 Ti or a Titan Black"?
And when I or others proved that the argument of "power efficiency" and "electricity cost savings" is a ridiculous one, they just carry on with the same motto?


Not often; in fact, the cost of the power draw is only really ever mentioned by people arguing against the importance of power draw.

The cost shouldn't be an issue for anybody, but the knock-on effects should be. Here's a more relevant question: how often do you see people trying to compare power draw across brands at stock clocks, again as an argument against the importance of power draw, on the Overclockers UK forums? I'll tell you what: that happens a lot more often than people mentioning the £££ saving in electricity.
 
People who go on about power consumption always say the Fury Nano is the best card.

Oh.... wait, they never do :confused:

 
You know things are close when people start to talk about the cost of electricity. More relevant would be any extra heat generated by a more power hungry card.

I would think the criteria would be:

Enough grunt to run the frame rate you require.
Affordable
Quiet
Cool
 
A lot of the discussions I witnessed weren't so much to do with the power draw as with the actual TDP, and AMD did a great job with that via the Fury X (it totally changed my mind on using an AIO cooler) but failed dismally with the 290X reference cooler. Another thing that gets missed is power efficiency and how much more performance can be gained through it.

I agree with you. The 295X2 is effectively two 290Xs, and with mine overclocked to 1100/1625 the AIO cooler handles it fine. Let alone a single Fury X!

The biggest mistake AMD made was putting the initial 290Xs out with blower coolers. All the bad reviews over that blower heatsink cost them.

And it stuck like a stigma, even though the custom-cooled ones were fantastic cards.
 
I agree with you. The 295X2 is effectively two 290Xs, and with mine overclocked to 1100/1625 the AIO cooler handles it fine. Let alone a single Fury X!

The biggest mistake AMD made was putting the initial 290Xs out with blower coolers. All the bad reviews over that blower heatsink cost them.

And it stuck like a stigma, even though the custom-cooled ones were fantastic cards.

What you seem to be forgetting is that AMD released the 290X for nearly half of what the GTX Titan cost, yet it was every bit as fast. Things like noise can easily be overlooked in the face of value, IMO. Nobody was really complaining that much about it at launch, so it shouldn't be used against it now.

They've now refined it, and the 390 is an absolute cracker for the money.
 
I thought they were sticking with GCN for at least another generation; is this just a renaming of GCN 2.0?

I wonder how much of this, if anything, is down to the Samsung partnership. I have a suspicion Sammy have thrown in with AMD because of the bad blood with Nvidia.

Do we think this is just for the high end, like the advent of GCN was, or are we looking at top to bottom?
 
I thought they were sticking with GCN for at least another generation; is this just a renaming of GCN 2.0?

I wonder how much of this, if anything, is down to the Samsung partnership. I have a suspicion Sammy have thrown in with AMD because of the bad blood with Nvidia.

Do we think this is just for the high end, like the advent of GCN was, or are we looking at top to bottom?

Samsung have been making some stuff for Nvidia too. I think they are just trying to finance 10nm faster, and yes, they will stick with the GCN architecture, hopefully adding some interesting new features.
 