
AMD Polaris architecture – GCN 4.0

The few? How do you know this?

I definitely agree that for us regulars (well, I am mainly a lurker, but I only recently went crazy posting) we will likely upgrade every 12-24 months.

However, many people come here lurking or asking for advice on what GPU is best for them, and you can see a lot of them don't know what's what; they don't keep up with things like we do, or game as much. Chances are these people do not upgrade every year.

Not many give genuine advice that is best for the user. They don't stop and ask questions to understand what the user's needs are so they can tailor the advice.

Yes, and recently on here the 390 has been recommended far more frequently than the 970, and rightly so. However, it is only very recently that the 390 has started consistently beating the 970, and even then there have been some popular recent titles where the 970 is just as fast if not faster. The 300 series still has its caveats, though, like nearly twice the TDP.
 

Yeah, anyway let's move on to Polaris news ;)

Next... :p
 
So if I am not likely to get a significant boost this summer with the RX 480, what's the deal with CrossFire?

Is the future of DX12 an anti-dual-card situation for AMD?

I don't want to go without FreeSync. Variable refresh is a must for me.
 

AMD have been making more noise about CrossFire/dual GPUs, and there is speculation about future consoles having multi-GPU as well. From an engineering perspective there is certainly something to be said for it. With CPUs, the move to multicore happened simply because it gave the most significant performance boost within a finite transistor budget. Graphics is inherently parallel, so at a high level multi-GPU should be a walk in the park, but a lot of other constraints make it incredibly complex in practice.

However, although DX12 does facilitate better multi-GPU capabilities, it requires significant resources from developers who are increasingly constrained by budgets and timelines. Multi-GPU users are few and far between, so this is rarely a high priority.

I can see AMD trying to promote Polaris 10 CrossFire if it is priced right. For 300W you could have a setup equal to 2x 390X, which draws 600W.
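
As a rough illustration of that power argument, here is a minimal sketch of the perf-per-watt comparison; the board-power and performance figures are the rough numbers quoted above plus assumptions for the sake of the example, not measured values.

```python
# Rough perf-per-watt comparison: hypothetical Polaris 10 CrossFire vs 2x R9 390X.
# All figures are assumptions for illustration, not measured values.

def perf_per_watt(relative_perf, board_power_w):
    """Return relative performance delivered per watt of board power."""
    return relative_perf / board_power_w

# Assume both setups land at roughly the same total performance (index = 100),
# with the figures quoted above: ~300 W for the Polaris pair, ~600 W for 2x 390X.
polaris_cf = perf_per_watt(100, 300)
r9_390x_cf = perf_per_watt(100, 600)

print(f"Polaris 10 CF : {polaris_cf:.2f} perf/W")
print(f"2x R9 390X    : {r9_390x_cf:.2f} perf/W")
print(f"Efficiency gain: {polaris_cf / r9_390x_cf:.1f}x")  # ~2x at equal performance
```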
 
That 480 leak states the price will be $199 to $250 USD, so dirt cheap really.

With VAT that is £165 to £210. So if it is around R9 390X level for that price, it is a good deal IMHO. That is nearly twice the speed of the GTX960 and R9 380.

The GTX1070 is meant to be Titan X level and will probably start around £320 to £330, which means you are paying £100+ for only a 25% performance increase:

https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_1080/images/perfrel_2560_1440.png

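
For anyone checking the maths on that price range, here is a minimal sketch of the conversion; the exchange rate is an assumed figure and UK VAT is taken as 20%.

```python
# Convert a leaked USD price range to an approximate UK retail price including VAT.
# The exchange rate below is an assumed figure for illustration only.

USD_TO_GBP = 0.69   # assumed exchange rate
VAT_RATE   = 0.20   # UK VAT

def uk_retail_price(price_usd, fx=USD_TO_GBP, vat=VAT_RATE):
    """Rough UK street price: convert to GBP, then add VAT."""
    return price_usd * fx * (1 + vat)

for usd in (199, 250):
    print(f"${usd} USD -> ~£{uk_retail_price(usd):.0f} inc. VAT")
# -> roughly £165 and £207, matching the £165 to £210 range above
```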
 
£100 for an extra 25% performance is a bargain though. Look how much more money people pay for a 1080 over a 1070, or for one of the premium cards over a budget brand.

But that is all irrelevant. The real question is what is the price and performance of the incoming Pascal 1060.
 
The point about power draw is what made me ask. Worst case scenario considered, I looked at getting another 390, but it is a 2.5-slot card and the power draw would be excessive.
 

I know no gamer who would spend £100, i.e. 50% to 60% more, for a 25% increase, which is meaningless to them. Only hardware enthusiasts or people with 4K-type displays would. It is not a bargain. The market over £300 is far smaller than the £200 to £300 market, let alone the sub-£200 one.

Even JPR figures confirm that.

People buying £150 to £200 cards don't buy £320 to £350 GPUs all of a sudden. They might stretch to a £250 one.

The GTX1070 isn't a bargain when the GTX670 came out at the same dollar price 4 years ago and was 20% faster than the GTX580. That was far closer to a bargain.

That £100 would be better spent on getting a better CPU, i.e. a Core i5 instead of a Core i3, and one of the adaptive sync monitors.

That is what £100 buys you.

I really, really, really hope for Nvidia's sake the R9 480 is not £165 for R9 390X level performance, because that would mean the GTX1070 is double the price for 25% extra performance - that would be an HD4870-level upset.

I somehow doubt it TBH and think the pricing seems a tad low.
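
A minimal sketch of the value argument being made here; the card prices and the performance gap are the assumed figures from the posts above, not benchmarked numbers.

```python
# Compare extra cost vs extra performance between two cards.
# Prices and the performance gap are the assumed figures from the discussion above.

def value_comparison(price_a, price_b, perf_gain):
    """Return (extra cost %, extra performance %) going from card A to card B."""
    extra_cost = (price_b - price_a) / price_a * 100
    return extra_cost, perf_gain * 100

# e.g. a £210 card vs a £330 card that is ~25% faster
extra_cost_pct, extra_perf_pct = value_comparison(210, 330, 0.25)
print(f"Extra cost:          {extra_cost_pct:.0f}%")  # ~57% more money
print(f"Extra performance:   {extra_perf_pct:.0f}%")  # 25% more speed
print(f"Cost per perf point: {extra_cost_pct / extra_perf_pct:.1f}x")
```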
 

True. Plus, not sure where the £100 came from; there will be close to double that difference between the 1070 and 1080 once all the custom cards are out.

Overall I am a little disappointed with these first-gen 16nm cards; it was not the leap in performance I was expecting from either camp, unless by some miracle AMD surprise us tonight. But it is what it is. At the very least, 980Ti performance can now be had for much cheaper.

This is probably me being optimistic, but I am expecting Vega with HBM2 to be at least as fast as 980Tis in SLI.
 

Not everyone likes to throw money away; some people try to see what value that 25% extra performance would actually bring them. It won't make 4K more playable, nor will 100fps feel that different from 80fps.
Personally I game at 1080p and I am fine with my usual 70-90fps; at the upper end I VSR to 1440p for a nice 60fps, so I'm certainly not the kind to drop an extra €300 on something I will hardly notice.
 
Nice, if it really is £165, that kinda leaves Nvidia in an awkward spot. Price drops hopefully, but I ain't counting my chickens.
 

Does VSR drop frames to 60fps on a higher refresh monitor? I recall seeing a Polaris demo running VSR at 60fps, but the monitor was capable of 120Hz IIRC.
 

I was talking about the R9 480 at £165 to £210 versus a GTX1070 at £320 to £330. £100 to £120 for 25% extra performance would be spending at least 50% more, even if the R9 480 is £210 and the GTX1070 is £320.

Personally I think the £165 to £200 price for R9 390X level performance probably looks too good to be true, but if by some chance it is real, that would be a big improvement.

It would nearly double what a GTX960 or R9 380 does at that price point, and be 50% faster than an R9 380X.

Even the GTX660 and HD7870 at the last node shrink, at £200ish, were only 30% to 45% faster than the HD6870 and GTX560 Ti:

https://tpucdn.com/reviews/MSI/GTX_660_Gaming/images/perfrel_1920.gif
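
For comparison, a minimal sketch of the sort of uplift calculation behind those percentages; the relative-performance index values below are illustrative assumptions, not figures pulled from that chart.

```python
# Generational uplift at a fixed price point, from relative-performance indices.
# The index values below are illustrative assumptions, not figures from the linked chart.

def uplift(new_index, old_index):
    """Percentage performance increase of the new card over the old one."""
    return (new_index / old_index - 1) * 100

# Hypothetical indices: an R9 480 at roughly R9 390X level (100) vs older cards
print(f"vs GTX 960 class: +{uplift(100, 52):.0f}%")   # ~90%, i.e. close to double
print(f"vs R9 380X class: +{uplift(100, 67):.0f}%")   # ~50% faster
```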

Plus, if AMD has improved DX11 performance and made improvements to DX12 performance too, then even if the Fury line is a bit quicker, it pretty much becomes pointless as well.

OFC, we might just get an R9 390 at £260 though, and we go WTF AMD!!

:p
 

The latest I read was that the monitor could run 1440p at 144Hz and AMD used their Hitman 60FPS cap to mask performance, which makes sense, because demoing 1440p performance on a 1080p monitor through VSR would be pretty pointless.
 