
AMD Polaris architecture – GCN 4.0

Associate
Joined
26 Aug 2010
Posts
554
None of them will. Vega is the one you want if you are coming from Fiji. Polaris is more 290/290x/390/390x, while Vega is going to be the high-performance part.

I don't think that will be the case. If it is, that is a huge fail by AMD.

New architecture and a doubling of available transistors, even on a much smaller die, and it still can't beat a Fury X? It will surely beat it by 20-30%, and that will be their 490X. The 490 will likely be 10-20% faster than the Fury X. This should be the worst-case scenario.

Vega should be close to double the speed of Fiji, surely 70-80% better at the least. Again: newer architecture, a bigger die than Polaris, and HBM2 on top of that...
 
Associate
Joined
4 Nov 2013
Posts
1,437
Location
Oxfordshire
I don't think that will be the case. If it is, that is a huge fail by AMD.

New architecture and a doubling of available transistors, even on a much smaller die, and it still can't beat a Fury X? It will surely beat it by 20-30%, and that will be their 490X. The 490 will likely be 10-20% faster than the Fury X. This should be the worst-case scenario.

Vega should be close to double the speed of Fiji, surely 70-80% better at the least. Again: newer architecture, a bigger die than Polaris, and HBM2 on top of that...

Agreed. The high/mid-range 390X replacement should be a bit faster than Fury, and the Vega cards should be a lot faster.
 
Associate
Joined
30 Nov 2015
Posts
166
I don't think that will be the case. If it is, that is a huge fail by AMD.

New architecture and a doubling of available transistors, even on a much smaller die, and it still can't beat a Fury X? It will surely beat it by 20-30%, and that will be their 490X. The 490 will likely be 10-20% faster than the Fury X. This should be the worst-case scenario.

Vega should be close to double the speed of Fiji, surely 70-80% better at the least. Again: newer architecture, a bigger die than Polaris, and HBM2 on top of that...

I'm left slightly confused by the AMD roadmap, as it seems Vega should have a noticeable, even if exaggerated, performance-per-watt increase over Polaris. I'm left wondering why AMD don't release a big Polaris, or even just a Hawaii-sized Polaris, before Vega, as this would make a big Vega very powerful.
 
Soldato
Joined
1 Jun 2013
Posts
9,315
I'm left slightly confused by the AMD roadmap, as it seems Vega should have a noticeable, even if exaggerated, performance-per-watt increase over Polaris. I'm left wondering why AMD don't release a big Polaris, or even just a Hawaii-sized Polaris, before Vega, as this would make a big Vega very powerful.

Risk mitigation. They want to get the smaller chips working on a new process before they try it with bigger chips. It's better for yield and thus profitability. Starting first with a big chip has bitten both AMD and Nvidia before, because it's much harder to do.
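
To put some rough numbers on the yield point: under a simple Poisson defect model, the fraction of defect-free dies drops off exponentially with die area, which is why launching small first on a new node is so much safer. The defect density and die sizes below are arbitrary assumptions purely to illustrate the shape of the effect, not real process figures.

```python
# Toy yield comparison under a Poisson defect model: yield = exp(-D * A).
# D (defects per cm^2) and the die areas are assumptions for illustration only.
import math

D = 0.2  # assumed defects per cm^2 on an immature process

for name, area_mm2 in [("small Polaris-class die", 230),
                       ("big Vega-class die", 500)]:
    area_cm2 = area_mm2 / 100
    good = math.exp(-D * area_cm2)
    print(f"{name}: {area_mm2} mm^2 -> roughly {good:.0%} of dies defect-free")
```

And the bigger die also gives you fewer candidates per wafer in the first place, so each lost die costs more, which compounds the problem.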
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
Not always reachable through game settings only. In many games you can go beyond "max" settings simply by editing game config manually.

He's saying that just turning every setting on, or up to its highest value, doesn't mean max IQ. As in, DoF, blur and a half dozen other frequently used settings reduce IQ, cost performance and can be disabled with in-game options. You can randomly turn on these features, reduce your performance and have worse IQ, or be sensible, turn them off, gain performance and have higher IQ.

In The Division I've disabled every stupid POS blurring nonsense option that is available. Likewise, if there were a higher texture setting that didn't increase IQ but only reduced performance, I wouldn't enable that either.
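
On the config-editing point a couple of posts up, here's a minimal sketch of how you might script those tweaks for an INI-style settings file. The path, section and key names are entirely hypothetical (every game uses its own, often undocumented, names), so treat it as a pattern rather than something aimed at The Division specifically, and keep the backup it makes.

```python
# Sketch: force common post-processing effects off in a hypothetical
# INI-style game config. Section/key names are made up for illustration.
from configparser import ConfigParser
from pathlib import Path
import shutil

CONFIG = Path.home() / "Documents" / "SomeGame" / "settings.ini"  # hypothetical path

# Effects that cost performance without (arguably) improving IQ
OVERRIDES = {
    "DepthOfField": "0",
    "MotionBlur": "0",
    "ChromaticAberration": "0",
    "LensFlare": "0",
    "Vignette": "0",
}

def disable_post_effects(path: Path) -> None:
    shutil.copy(path, path.with_name(path.name + ".bak"))  # keep a backup first
    cfg = ConfigParser()
    cfg.optionxform = str          # preserve the file's key capitalisation
    cfg.read(path)
    if not cfg.has_section("PostProcessing"):
        cfg.add_section("PostProcessing")
    for key, value in OVERRIDES.items():
        cfg.set("PostProcessing", key, value)
    with path.open("w") as fh:
        cfg.write(fh)

if __name__ == "__main__":
    disable_post_effects(CONFIG)
```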
 
Caporegime
Joined
18 Oct 2002
Posts
32,623
I may have missed it but I don't think anyone is arguing that max isn't max. It's that turning on an option that degrades IQ yet impacts performance is a pointless setting.

Chromatic aberration (the clue is in the name)
Depth of Field (deliberately rendering out of focus)
Lens Flare
Vignetting (tunnel vision)

None of these settings reflect how the human eye works in real life; they are image-degradation effects only seen through photo/camera lenses. You are in effect deliberately making your game look worse, and in all cases at a massive performance hit. IMHO, of course.



Totally agree with this.

Human eyes do experience depth of focus; all optical systems do. They also suffer from lens flare and chromatic aberration, as again almost all optical systems do. The thing is you get used to it, e.g. much of your vision is out of focus if you are reading a book or on an iPad, but you simply don't care about anything you aren't focusing on.
 
Associate
Joined
26 Aug 2010
Posts
554
Human eyes do experience depth of focus; all optical systems do. They also suffer from lens flare and chromatic aberration, as again almost all optical systems do. The thing is you get used to it, e.g. much of your vision is out of focus if you are reading a book or on an iPad, but you simply don't care about anything you aren't focusing on.

I don't particularly care if the eye experiences it. Why would one want to waste performance on a feature that does not improve IQ, but arguably does the opposite?

Say your eyes could focus on more and your brain could handle it: would you class that as an upgrade or a downgrade?

Each to their own, but DoF is a waste of performance for me, and I consider having it off, not on, to be maximum settings for a game.
 
Soldato
Joined
30 Nov 2011
Posts
11,356
It depends what the game is trying to do with it. If it's a single-player game telling a story, then I think the game maker should have a say in how they want their story to be portrayed. In that respect I'd want GPU comparisons to be like for like and not based on someone else's arbitrary "turn that off" set of settings.
 
Associate
Joined
27 Aug 2008
Posts
1,877
Location
London
Human eyes do experience depth of focus; all optical systems do. They also suffer from lens flare and chromatic aberration, as again almost all optical systems do. The thing is you get used to it, e.g. much of your vision is out of focus if you are reading a book or on an iPad, but you simply don't care about anything you aren't focusing on.

It's just unfortunate that game devs seem to treat the human eye like a camera lens.
Amusingly, as you say, loss of detail in peripheral vision is a free effect of the eye. The only reason to add DoF would be to reproduce a cinematic aesthetic or to focus the player's attention (which could be legitimately useful at times in the narrative, I suppose, but a blanket setting makes no sense for that); otherwise it's restrictive and I find it prevents enjoyment of the game environment. I want to see some detail in the distant mountain when I decide to look at it, as that is what my eye expects.
 
Soldato
Joined
30 Dec 2011
Posts
5,553
Location
Belfast
Human eyes do experience depth of focus; all optical systems do.

They also suffer from lens flare and chromatic aberration, as again almost all optical systems do. The thing is you get used to it, e.g. much of your vision is out of focus if you are reading a book or on an iPad, but you simply don't care about anything you aren't focusing on.

As you state yourself, though, it's not artificially forced and is compensated for by your brain. DoF, chromatic aberration and lens flare in games seriously diminish image quality in a way your brain cannot compensate for, because it is forced and as such unrealistic. Adding these effects in games degrades IQ in an unrealistic way and seriously impacts performance.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
The issue is that DoF chooses which part of the screen you are focusing on for you; in real life you do that. Look out of the window and focus anywhere: that part will be in focus and the rest will be out of focus, but you chose to look there, and everything in your peripheral vision is at varying levels of out of focus and slightly distorted. The point is you can choose to look anywhere you want by moving your eyes' focus and NOT your head, or you can move your head and look straight ahead.

In-game DoF generally brings only the middle of the screen, the direct focus of the crosshair, into focus and attempts to make everything else out of focus. It stops you from keeping your view forwards while moving your eyes to the top left of the screen, because it doesn't know you're doing this and that area is instead out of focus, which is unnatural and stupid.

The daft thing being, if you are looking dead centre, the outside edge of the screen isn't actually in your own eyes' best focus anyway, so blurring it in any way is a complete waste. If you are looking at the corner of the screen, then it's out of focus when it shouldn't be. It's either a waste or shouldn't be there at all; there is no time it's actually good. It only removes the natural ability to move your eyes' focus without moving your head.

The only time it should be at all acceptable is during in-game video. In live gameplay the player chooses where they look, even if the dev can prompt you towards where they think you should be looking. But in a video the dev can say, hey, this part is important to the story, and they are effectively able to control where the character's eyes and head are looking. When the game is live they can control neither, and they shouldn't attempt to simulate doing so, because it's a joke.
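
For anyone curious what that centre-weighted behaviour looks like in code, here's a rough sketch of the naive approach: sample the depth under the crosshair and blur every other pixel in proportion to how far its depth is from that single value. The array names and blur formula are illustrative only, not taken from any particular engine.

```python
# Sketch of a naive centre-focused depth-of-field pass.
# The focal depth is read from the pixel under the crosshair (screen centre),
# so the effect has no idea where the player's eyes actually are.
import numpy as np

def dof_blur_radius(depth: np.ndarray, max_radius: float = 8.0,
                    focus_range: float = 2.0) -> np.ndarray:
    """Return a per-pixel blur radius (in pixels) from a depth buffer.

    depth       -- HxW array of view-space depths
    max_radius  -- largest blur the effect will apply
    focus_range -- depth band around the focal plane that stays sharp
    """
    h, w = depth.shape
    focal_depth = depth[h // 2, w // 2]          # "chooses the focus for you"
    # Blur grows with distance from the focal plane, clamped to max_radius.
    radius = np.abs(depth - focal_depth) / focus_range
    return np.clip(radius, 0.0, 1.0) * max_radius

# Toy example: a flat scene 10 units away with a nearby object in one corner.
depth = np.full((480, 640), 10.0)
depth[:120, :160] = 2.0                           # something close, top-left
radius = dof_blur_radius(depth)
print(radius[240, 320])   # 0.0 -> centre stays sharp
print(radius[60, 80])     # 8.0 -> top-left gets blurred even if you look there
```

Fancier implementations use a proper circle-of-confusion model or blur in depth bands, but the focal depth is still read from a fixed screen position (or scripted), which is exactly the complaint above: the effect cannot know where your eyes actually are.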
 
Soldato
Joined
18 Oct 2002
Posts
11,038
Location
Romford/Hornchurch, Essex
As you state yourself, though, it's not artificially forced and is compensated for by your brain. DoF, chromatic aberration and lens flare in games seriously diminish image quality in a way your brain cannot compensate for, because it is forced and as such unrealistic. Adding these effects in games degrades IQ in an unrealistic way and seriously impacts performance.

All things I turn off, not because of performance issues, but because they look horrible and so fake.
 
Caporegime
Joined
18 Oct 2002
Posts
32,623
As you state yourself, though, it's not artificially forced and is compensated for by your brain. DoF, chromatic aberration and lens flare in games seriously diminish image quality in a way your brain cannot compensate for, because it is forced and as such unrealistic. Adding these effects in games degrades IQ in an unrealistic way and seriously impacts performance.

In your opinion; others actually like those effects when done properly.
Depth of focus, for example, our brain doesn't correct for; you just tend not to care too much.

And then there are cases where these effects are used in cut-scenes to replicate the look of a video camera. Shallow DoF during a cut-scene has a large benefit in its ability to convey the story.


I'm looking forward to seeing if Nvidia's light field VR technology becomes mainstream. A problem with current VR headsets is that they can make you feel sick: although you get stereoscopic disparity, the actual light hitting your eyes doesn't carry any depth information because the light field is flat. Nvidia's headset replicates the actual observed light field, so the depth cues of the light rays are appropriately modified; when you look at a close-up or distant object your visual focus adjusts, you experience a natural depth-of-focus effect, and you feel a lot better.
http://www.wareable.com/vr/nvidia-stanford-university-vr-light-field-2016
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,359
Location
kent
OK the two sides of this argument are never going to agree.

So can we just agree that different people can see/prefer different settings, and that not all settings improve image quality/gameplay experience for all people?


Back on topic.

So 4th generation GCN (GCN 4.0) or whatever AMD finally end up calling it


4th Generation GCN
HDMI 2.0a
DP 1.3
h.265 Main10 decode up to 4K
4K h.265 encode/decode at 60 FPS
Largest performance per watt jump in the history of AMD GPUs

The ports are good. The decoding stuff, I'm not sure how different that is from what the previous 3rd-gen GCN could do. And then there's the largest performance-per-watt jump in AMD's history.

Well, it sounds good. As we have seen from the earlier leaked shots, there seems to be a DVI connection as well, at least on that one particular card, which surely cannot be a bad thing.
Anyone care to comment on the h.265 decode side of things?

Performance per watt: well, AMD have been a little bit behind Nvidia in this respect of late. This new architecture certainly looks to be continuing the good work that the Tonga architecture started, and it looks like it will easily surpass the efficiency of Maxwell. Of course Nvidia are also going to increase their energy efficiency this time round; whether it will be enough after the gains they have already made with Maxwell, who knows. But it will be very interesting to see who comes out on top in that area with the new chips.
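
On the h.265 decode question: one practical way to see whether a card's fixed-function decoder is actually being used is to push a 4K HEVC clip through ffmpeg with hardware acceleration enabled and watch CPU versus GPU video-engine usage while it runs. A minimal sketch below; it assumes ffmpeg is installed and on the PATH, and the clip name is just a placeholder.

```python
# Sketch: decode a 4K h.265 clip with hardware acceleration if available,
# discarding the output. Watch CPU vs. GPU video-engine load during the run.
# Assumes ffmpeg is on PATH; "clip_4k_h265.mp4" is a placeholder file name.
import subprocess

subprocess.run(
    ["ffmpeg", "-v", "error",
     "-hwaccel", "auto",           # let ffmpeg pick a GPU decoder if one exists
     "-i", "clip_4k_h265.mp4",     # placeholder input clip
     "-f", "null", "-"],           # decode only, throw the frames away
    check=True,
)
```

If CPU usage stays low and it still keeps up with 4K60, the fixed-function block is doing the work; if the CPU pegs, you're on the software fallback.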
 
Associate
Joined
4 Nov 2013
Posts
1,437
Location
Oxfordshire
OK the two sides of this argument are never going to agree.

So can we just agree that different people can see/prefer different settings, and that not all settings improve image quality/gameplay experience for all people?


Back on topic.

So 4th generation GCN (GCN 4.0) or whatever AMD finally end up calling it




The ports are good. The decoding stuff, I'm not sure how different that is from what the previous 3rd-gen GCN could do. And then there's the largest performance-per-watt jump in AMD's history.

Well, it sounds good. As we have seen from the earlier leaked shots, there seems to be a DVI connection as well, at least on that one particular card, which surely cannot be a bad thing.
Anyone care to comment on the h.265 decode side of things?

Performance per watt: well, AMD have been a little bit behind Nvidia in this respect of late. This new architecture certainly looks to be continuing the good work that the Tonga architecture started, and it looks like it will easily surpass the efficiency of Maxwell. Of course Nvidia are also going to increase their energy efficiency this time round; whether it will be enough after the gains they have already made with Maxwell, who knows. But it will be very interesting to see who comes out on top in that area with the new chips.

Don't forget that Maxwell's power efficiency seemed so good because it was compared with three-year-old AMD tech. They got kicked in the balls by the cancellation of 20nm, so they had to use what they had.
Other than Maxwell the two were usually close on power efficiency, so I don't expect much difference between Polaris and Pascal either.
 