
AMD Polaris architecture – GCN 4.0

As much as the forum warrioring would be unbearable, I hope we have a really close fight between AMD and Nvidia this time around. I hope they both bring us some ridiculously good products.

Time to dust off the old savings account and kick her back into action.


+1, I want the competition between Nvidia, AMD and Intel to be so tight that the most hardened of fanbois on all sides make fools of themselves shilling their corner.

That'd be awesome. :)
 
A benchmark slide comparing the power consumption of a Polaris GPU with a GTX 950 @ 1080p:

The test is dated 2nd December 2015, so they have working samples of a next-gen GPU, which seems to be a low-power notebook part. Hopefully more leaks are coming in the next few months. Both cards are on the medium preset for Battlefront, but for comparison we know that the ultra preset on a GTX 950 gets about 48 fps.

http://www.techspot.com/review/1096-star-wars-battlefront-benchmarks/

Not a notebook GPU. Both systems used a 4790K.
 
+1, I want the competition between Nvidia, AMD and Intel to be so tight that the most hardened of fanbois on all sides make fools of themselves shilling their corner.

That'd be awesome. :)

I really want AMD to decimate Nvidia this time; that's what they need in order to really get back in the game. One can dream, I suppose.
 
It's all about the power now... I would rather it needed 3x 8-pin connectors but could manage 60fps at 4K tbh...

:D

I really don't care about power efficiency as an end goal. However, there is a limit to how much heat a GPU can dissipate, so I do care insofar as power efficiency largely determines where the performance limit for an air-cooled card sits (not interested in going water, personally).

If that power efficiency doesn't get recycled back into a bigger, faster part, I couldn't give a hoot.
 
I'm no expert on that, but in my mind 15% overhead means the PSU is comfortable.
If my system peaks at 550W I'd want at least 630W or so (550W × 1.15 ≈ 633W).

Ideally I'm happier with one that has way more than enough, so it's never working particularly hard.

Even with my overclocks, 3 HDDs, an SSD, 2x 200mm fans, a water pump, a rad fan and 4 RAM sticks, I doubt I'm pulling 400 watts.

My PSU has an ECO switch on it, so the fan only spins up when it gets hot enough, and the dust filter under the PSU is always clean.

Apparently PSUs are most efficient at 50% load. So if your system pulled 200W 99% of the time, you'd go for a 400W PSU.

Of course you need to make sure that peak consumption doesn't exceed 380W or so... there is a huge drop-off in efficiency as you reach maximum output, and a huge increase in heat generated.
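
Put as a quick sketch (the 50% sweet spot and a ~95% peak ceiling are just this thread's rules of thumb, not any standard):

```python
# Toy PSU-sizing sketch based on the rules of thumb above: aim for ~50% load
# at typical draw, and keep peak draw under ~95% of the rating. Both numbers
# are this thread's rules of thumb, not a standard.

def recommend_psu_watts(typical_w: float, peak_w: float,
                        sweet_spot: float = 0.50, peak_cap: float = 0.95) -> float:
    """Smallest rating that satisfies both the efficiency and peak constraints."""
    by_efficiency = typical_w / sweet_spot  # 200W typical -> 400W rating
    by_peak = peak_w / peak_cap             # 380W peak   -> 400W rating
    return max(by_efficiency, by_peak)

print(recommend_psu_watts(200, 380))  # 400.0, matching the example above
```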
 
OK, I did say I was no expert ^^^^ :)

It's all about the power now... I would rather it needed 3x 8-pin connectors but could manage 60fps at 4K tbh...

:D

It's always been about the power; the 390 uses 50 watts more, that's like 4 times 8 bazillion watts, man! :D
 
Why would they choose medium settings in that case?

It doesn't add up. My guess is that medium settings are the highest they can use without dropping below 60 FPS.


Looks to me like they are not showing performance with that example, only a power efficiency comparison.
 
PC gaming people aren't bothered about how little power it's going to use; we just want a single card that can game at 4K @ 60fps+. 1440p has been conquered by single cards for a while now. Unless this can do 4K, what's the point...

I bet it offers the same performance as the 980 Ti but uses a bit less electricity, to keep the hippie earth-lovers happy...
 
I really don't care about power efficiency as an end goal. However, there is a limit to how much heat a GPU can dissipate, so I do care insofar as power efficiency largely determines where the performance limit for an air-cooled card sits (not interested in going water, personally).

If that power efficiency doesn't get recycled back into a bigger, faster part, I couldn't give a hoot.

I agree, enthusiasts only care about efficiency as a means to an end, i.e. they want much more performance at the same wattage. I mean, that's why 1000+W PSUs are popular, after all. I don't care about power efficiency for its own sake; however, it is a very important part of the quest for more performance.

I would even be willing to bet that if going significantly above 300W TDP enabled 60+fps at 4K, or 90+fps minimums for VR, in demanding current and future games with all settings at maximum, then enthusiasts would think that was a great thing, so long as the card was cooled really well and had a brilliant noise profile. That's certainly how I, as an enthusiast, think.
 
No. It's definitely on Samsung 14nm FF LPP ... definitively not TSMC 16nm FF+.

According to the AT article it's on both TSMC and GF.

As for RTG’s FinFET manufacturing plans, the fact that RTG only mentions “FinFET” and not a specific FinFET process (e.g. TSMC 16nm) is intentional. The group has confirmed that they will be utilizing both traditional partner TSMC’s 16nm process and AMD fab spin-off (and Samsung licensee) GlobalFoundries’ 14nm process, making this the first time that AMD’s graphics group has used more than a single fab.
 
PC gaming people aren't bothered about how little power it's going to use; we just want a single card that can game at 4K @ 60fps+.

Unless people are asking which GPU or PSU to buy, that is. There are numerous threads where people ask whether to get a 390 or a 970 and the power argument always comes up, just like when I was thinking of buying an EVGA 850 B2 Bronze and people told me to get a Gold-rated one because it's more efficient. As it happens, the power difference between the 390 and the 970 is probably bigger than the difference between the B2 Bronze and an EVGA Gold-rated unit.
But of course, who cares, when you can use it to get your point across on the internet, right? :D
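
For a ballpark comparison (a sketch with assumed numbers: ~85% vs ~90% efficiency at half load for Bronze vs Gold, ~300W system draw, and the commonly cited ~100W gaming gap between a 390 and a 970):

```python
# Rough check of the claim above. All figures are ballpark assumptions:
# 80 Plus Bronze ~85% and Gold ~90% efficient at 50% load (115V targets),
# ~300W DC system draw, and ~100W as the commonly cited 390 vs 970 gap.

dc_draw_w = 300.0

wall_bronze = dc_draw_w / 0.85  # ~353W pulled from the wall
wall_gold = dc_draw_w / 0.90    # ~333W pulled from the wall

psu_saving = wall_bronze - wall_gold  # ~20W saved by going Gold
gpu_gap = 100.0                       # ~100W between a 390 and a 970

print(f"Gold saves ~{psu_saving:.0f}W; the GPU choice swings ~{gpu_gap:.0f}W")
```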
 
PC gaming people aren't bothered about how little power it's going to use; we just want a single card that can game at 4K @ 60fps+. 1440p has been conquered by single cards for a while now. Unless this can do 4K, what's the point...

I bet it offers the same performance as the 980 Ti but uses a bit less electricity, to keep the hippie earth-lovers happy...
I don't think we'll see a card that can do 4K/60fps at high/ultra settings in all the latest, most demanding games until 2017/2018. The 980 Ti is decent for this at 1440p, but 4K is a much bigger jump than 1080p -> 1440p.
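
The pixel counts bear that out; quick arithmetic with the standard resolutions, nothing assumed:

```python
# Pixel-count arithmetic behind "4K is a much bigger jump than 1080p -> 1440p".
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["1440p"] / pixels["1080p"])  # ~1.78x the pixels of 1080p
print(pixels["4K"] / pixels["1440p"])     # ~2.25x the pixels of 1440p
```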
 
Why would they choose medium settings in that case?

It doesn't add up. My guess is that medium settings are the highest they can use without dropping below 60 FPS.

So, that doesn't mean they won't go above 60fps either. It's just saying: hey, we can offer a box that does X fps at Y power.

When on earth will people learn? These slides are NOT selling to gamers; they're selling to the likes of Dell, who want to sell a box they can advertise as providing, let's say, 60fps in Battlefront. What does that mean for them? If the card can do it at 86W, it can go in a cheaper box, a smaller box, one with a smaller, cheaper, maybe quieter PSU. I.e. it can hit a different price point.

Most people on forums like these struggle to understand that every single piece of information isn't aimed at them, and then they act upset when the info isn't what they wanted.

It's like dev days, where AMD, Intel, Apple and anyone else holding similar events have to talk about future hardware; that's how you get industry support, by letting partners know what is coming. But then someone leaks slides intended for devs, talking about coding features or something that is neither gaming nor ultra high end, and people accuse AMD of sucking at marketing because slides not meant for them didn't tell them what they wanted.

99% of 'marketing' materials for events like CES aren't meant to help the end user decide anything. Being able to put a cheaper, lower-power card into a Dell box that hits a certain performance target is a very powerful message to send to a Dell buyer/system designer.
 
I have a bad feeling about this.

"optimized" finfet design
touting unimpressive bench figures and focusing on ppw

Of course I predicted all this. Pascal will be the same way. 980 all over again. But with no Titan this time.

Not unimpressive, and every generation for 7-8 years has focused on performance per watt. When you're limited to a certain amount of power out of a card, performance per watt is literally the limiting factor for performance. Predicted what? That this architecture would follow the same pattern as all before it? Well done. Guess what: the last GPUs were made on optimised processes too; if you don't optimise your chip for a process, you're wasting power for no reason at all.

980 all over again? Outside of Nvidia's pricing, bringing more performance in a smaller area is a good thing. Again, as with power: if you're limited by area, the more you can fit into that area the better. Make a new architecture and your 550mm^2 card gets faster, and your 300mm^2 card gets faster. Your midrange card won't beat your high-end card; it's not physically possible. The 980 was what Nvidia determined to be the best size/yield/performance ratio; a midrange card will never chase the maximum possible performance, you target the best balance of factors to get a better product. Going 10% larger might get you 15% more performance but, due to lower yields, increase costs by 30%, making it not worthwhile. Or dropping size by 15% might only drop performance by 7% (say, keeping the same ROPs/TMUs with fewer shaders, so you're limited in some situations but not all) while bringing cost down 20%, making it worthwhile.
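
To put rough numbers on that trade-off (a toy model; the percentages are the hypothetical ones from the paragraph above, not real yield or cost data):

```python
# Toy perf/cost model of the die-size trade-off described above. All figures
# are the hypothetical percentages from this post, not real data.

chips = {
    "baseline":    {"perf": 1.00, "cost": 1.00},
    "10% larger":  {"perf": 1.15, "cost": 1.30},  # +15% perf, +30% cost
    "15% smaller": {"perf": 0.93, "cost": 0.80},  # -7% perf, -20% cost
}

for name, c in chips.items():
    print(f"{name}: perf/cost = {c['perf'] / c['cost']:.2f}")

# baseline:    1.00
# 10% larger:  0.88 -> worse value, "not worthwhile"
# 15% smaller: 1.16 -> better value, "worthwhile"
```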

There is nothing fundamentally wrong with the 980... except for Nvidia selling it as high end and daft consumers not telling Nvidia where to shove the pricing.
 
Who cares how much power they use!? We wanna see how fast and powerful they are.

Power used = heat output = fan noise/pump noise.

I for one don't enjoy sitting next to a loud system, though I may just be more sensitive to background noise than others.

Having FuryX performance (or higher) with a much quieter cooling solution will be amazing :)
 
Power used = heat output = fan noise/pump noise.

I for one don't enjoy sitting next to a loud system, though I may just be more sensitive to background noise than others.

Having FuryX performance (or higher) with a much quieter cooling solution will be amazing :)

'Only' Fury X / 980 Ti performance on the new node is likely to be an upper-midrange card imo, a 380X replacement. The only issue with the smaller node is that the die is going to be so much smaller and denser that GPU operating temps might become an issue... but Samsung's LPE has shown very, very low leakage, at least on ARM chips. Hopefully LPP is the same on big GPUs. The 0.85V core voltage for that demo (Fury / Fury X are about 1.3V IIRC) is certainly promising. Also, one of the guys in the video said that they were increasing frequencies with Polaris, which I wasn't expecting... it'll be interesting to see how much by. I thought frequencies would likely be similar or slightly lower.
 
Power used = heat output = fan noise/pump noise.

I for one don't enjoy sitting next to a loud system, though I may just be more sensitive to background noise than others.

Having FuryX performance (or higher) with a much quieter cooling solution will be amazing :)

Exactly this, plus you can save a bit on the PSU.

It's only the first week of Jan and there's news already. Looking forward to seeing what unfolds, and time to get saving!
 