AMD Polaris architecture – GCN 4.0

Soldato
Joined
25 Jun 2011
Posts
5,468
Location
Yorkshire and proud of it!
In a year's time would you like to buy a 4GB card? I may even give you a special discount off one of mine, but I suspect in a year's time you will have an 8GB card lol.

That's a substantial shift away from arguing that it matters today, however.

As for games that use more than 4GB, I have posted the GPU-Z screenshots on a number of occasions for various games; perhaps if you called your Watch Dogs off you might spot them. People could be committing Grand Theft Auto right in front of you and you would not know what is going on. :D

I've seen this from you on previous occasions and the response is the same. Just because memory is filled doesn't mean it is needed. Caches (whether disk, GPU memory or browser) aren't normally cleared proactively all the time; doing so would harm performance. A parallel example: there are things in my browser cache from websites I haven't visited in months, yet they still count towards its size.

Only measurable differences in output based on memory (and beyond the margin of error) would show that 8 or 12GB provides a benefit. A screenshot showing filled VRAM does not.
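
As a toy illustration of why 'filled' is not 'needed', here is a minimal Python sketch (purely illustrative; real drivers are far more sophisticated): a cache only evicts when capacity forces it to, so the reported usage tracks capacity, not the working set.

[code]
from collections import OrderedDict

class TextureCache:
    """Toy LRU cache: keeps everything it has loaded until capacity
    forces an eviction, much like data left resident in VRAM."""

    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.entries = OrderedDict()  # name -> size in MB, oldest first

    def request(self, name, size_mb):
        if name in self.entries:
            self.entries.move_to_end(name)  # hit: mark recently used
            return
        # Evict least-recently-used entries only when we actually must.
        while sum(self.entries.values()) + size_mb > self.capacity_mb:
            self.entries.popitem(last=False)
        self.entries[name] = size_mb

    def reported_usage_mb(self):
        # What a GPU-Z style counter shows: everything resident,
        # whether or not the current frame touches it.
        return sum(self.entries.values())

cache = TextureCache(capacity_mb=4096)
for area in range(20):                    # wander through 20 areas...
    cache.request(f"area_{area}_textures", 300)
print(cache.reported_usage_mb())          # 3900 MB 'used'...
# ...yet rendering the current area only touches the last 300 MB.
[/code]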
 
Soldato
Joined
30 Dec 2011
Posts
5,553
Location
Belfast
In a year's time would you like to buy a 4GB card? I may even give you a special discount off one of mine, but I suspect in a year's time you will have an 8GB card lol.

Important news flash shocker: in future more VRAM will be required. You keep moving the goalposts here, as nobody disputes that in future more VRAM will be required. The fact that quad SLI or CrossFire has the grunt to push modern games beyond 4GB with unrealistic settings at 4K does not prove your point. There have always been VRAM problems as you push graphical settings to extremes.

Here is a quote from you taking the exact opposite stance in this argument, from a few years back when another poster argued 2GB was not enough for 1440p.

https://forums.overclockers.co.uk/showpost.php?p=24554987&postcount=21

Here is another where you claim the developers of Watch Dogs are having a laugh for stating that 3GB of VRAM is not enough.

https://forums.overclockers.co.uk/showpost.php?p=26358738&postcount=48

Yet you now claim that some devs are bad, so we need as much VRAM as possible to counter this, and that more VRAM is better.

My stance has always been the more VRAM the better, especially if you plan multi-GPU, but the fact remains that currently 4GB at 4K is fine, and for 1080p it is plenty with a single GPU. If you plan multi-GPU then the more VRAM the better.

My own stance on 2GB VRAM limits. Ironically I have also contradicted myself somewhat here and it seems we have swapped roles slightly since 2013. :D
https://forums.overclockers.co.uk/showpost.php?p=26364789&postcount=78
 
Last edited:
Man of Honour
Joined
21 May 2012
Posts
31,922
Location
Dalek flagship
That's a substantial shift away from arguing that it matters today, however.



I've seen this from you on previous occasions and the response is the same. Just because memory is filled doesn't mean it is needed. Caches (whether disk, GPU memory or browser) aren't normally cleared proactively all the time; doing so would harm performance. A parallel example: there are things in my browser cache from websites I haven't visited in months, yet they still count towards its size.

Only measurable differences in output based on memory (and beyond the margin of error) would show that 8 or 12GB provides a benefit. A screenshot showing filled VRAM does not.

Try running XCOM 2 maxed at 2160p on a single GTX 980 Ti or Fury X; it is not a nice experience, as you run out of memory. A Titan X, on the other hand, is quite playable.
 
Last edited:
Man of Honour
Joined
21 May 2012
Posts
31,922
Location
Dalek flagship
Important news flash shocker: in future more VRAM will be required. You keep moving the goalposts here, as nobody disputes that in future more VRAM will be required. The fact that quad SLI or CrossFire has the grunt to push modern games beyond 4GB with unrealistic settings at 4K does not prove your point. There have always been VRAM problems as you push graphical settings to extremes.

Here is a quote from you taking the exact opposite stance in this argument, from a few years back when another poster argued 2GB was not enough for 1440p.

https://forums.overclockers.co.uk/showpost.php?p=24554987&postcount=21

Here is another where you claim the developers of Watch Dogs are having a laugh for stating that 3GB of VRAM is not enough.

https://forums.overclockers.co.uk/showpost.php?p=26358738&postcount=48

Yet you now claim that some devs are bad, so we need as much VRAM as possible to counter this, and that more VRAM is better.

My stance has always been the more VRAM the better, especially if you plan multi-GPU, but the fact remains that currently 4GB at 4K is fine, and for 1080p it is plenty with a single GPU. If you plan multi-GPU then the more VRAM the better.

My own stance on 2GB VRAM limits. Ironically I have also contradicted myself somewhat here and it seems we have swapped roles slightly since 2013. :D
https://forums.overclockers.co.uk/showpost.php?p=26364789&postcount=78

Those are very old posts from when no one was using 2160p; things have changed a lot in the 2 or 3 years since.

It is good that you have quoted my old posts, because it highlights that we have gone from being OK with 2GB on cards to needing more than 8GB in the space of 2 or 3 years. This only reinforces my point that someone buying an 8GB card this year could find it inadequate next year.
 
Caporegime
Joined
17 Mar 2012
Posts
48,334
Location
ARC-L1, Stanton System
These days, rendering static and rigid geometry detail to the point where its form is indistinguishable from real objects is becoming fairly standard.

Lighting and shading are also getting close to photo-real.

The next step is getting the look and texture of surfaces to appear real. Aside from lighting and shading, that means multiple texture layers per surface: bump mapping, specular light reflection, occlusion, dirt mapping etc. More importantly for the GPU's buffer, it means the resolution of those textures; over the years we have gone from 0.5K to 1K, 2K, 4K and 8K textures that get ever sharper, more detailed and bigger.

They all need to be stored in memory; the bigger they are, the more memory you need.

Game devs are pushing 4K textures as a matter of course these days, so not only are textures demanding more buffer, they also come in multiples layered on the same surface.

A 4GB buffer is pretty minimal for 1440p or even 1080p in the very near future, never mind 4K.
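
To put rough numbers on that, here is an illustrative Python sketch; it assumes uncompressed RGBA8 textures with full mip chains, and the five layers are just the ones named above:

[code]
def texture_mib(width, height, bytes_per_texel=4, mip_chain=True):
    """Approximate footprint of one texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_texel / 2**20
    return base * 4 / 3 if mip_chain else base

# One uncompressed RGBA8 4K texture with mips: ~85 MiB.
print(round(texture_mib(4096, 4096), 1))

# Five layers on one surface (colour, bump, specular, occlusion, dirt):
print(round(5 * texture_mib(4096, 4096), 1))    # ~427 MiB per material

# Block compression (e.g. BC7 at 1 byte per texel) claws back a factor of 4:
print(round(5 * texture_mib(4096, 4096, bytes_per_texel=1), 1))  # ~107 MiB
[/code]

On those assumptions, a handful of uncompressed material sets would fill a 4GB buffer on their own, which is why texture compression matters so much for staying inside it.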

 
Last edited:
Soldato
Joined
25 Jun 2011
Posts
5,468
Location
Yorkshire and proud of it!
Try running XCOM 2 maxed at 2160p on a single GTX 980 Ti or Fury X; it is not a nice experience, as you run out of memory. A Titan X, on the other hand, is quite playable.

I don't have either of those cards, or that game, so I can't. How about you link to some benchmarks comparing XCOM 2 performance on different cards that support your point?

EDIT: Also, what the Hell kind of settings are you using that make this game unpleasant to play on a 980 Ti or Fury X at 2160p?
 
Last edited:
Soldato
Joined
30 Dec 2011
Posts
5,553
Location
Belfast
Doesn't the fact that 4GB is not enough to run ROTTR with very high textures at 1080p, let alone 4K, highlight the trend?

No, because even though ROTTR uses uncompressed 4K textures, a single 4GB 970 can run it at very high settings at 1080p. A 980 Ti or Titan X can't run the game at 4K at acceptable FPS with very high settings, so it's not a VRAM issue, it's a GPU grunt issue at 4K.

A single GTX 970 does not run into VRAM spikes at very high, even at 1440p.
http://www.pcper.com/reviews/Graphi...Performance-Results/Adding-GTX-970-and-R9-390

This is of course for single GPU use.

Edit: I should make it clear that I agree 4GB will be problematic at 4K before the end of this year, but not at 1080p or even 1440p, IMHO.
 
Last edited:

bru

Soldato
Joined
21 Oct 2002
Posts
7,359
Location
kent
I myself am so looking forward to some people's opinions and how they might change depending on the amount of memory on the new AMD cards.

Personally I don't understand how people can argue about max settings. If a dial goes up to 11, then 10 isn't the max. If a car will do 200 mph max, then 195 isn't flat out.
The point being, if a game has a setting above the previous setting, then the previous one isn't maxed. It doesn't matter whether the difference is visible to one person or even 100 people; if a single individual can tell the difference then there is a difference, simple as that.

As for 4GB at 4K: just about enough for 'most' things, but it won't be going forward.
 
Soldato
Joined
30 Dec 2011
Posts
5,553
Location
Belfast
I myself am so looking forward to some people's opinions and how they might change depending on the amount of memory on the new AMD cards.

Personally I don't understand how people can argue about max settings. If a dial goes up to 11, then 10 isn't the max. If a car will do 200 mph max, then 195 isn't flat out.
The point being, if a game has a setting above the previous setting, then the previous one isn't maxed. It doesn't matter whether the difference is visible to one person or even 100 people; if a single individual can tell the difference then there is a difference, simple as that.

I may have missed it, but I don't think anyone is arguing that max isn't max. It's that turning on an option that degrades IQ yet costs performance is a pointless setting.

Chromatic aberration (the clue is in the name)
Depth of Field (deliberately rendering out of focus)
Lens Flare
Vignetting (tunnel vision)

None of these effects reflects how the human eye works in real life; they are image-degradation artefacts only seen through photo/camera lenses. You are in effect deliberately making your game look worse, and in all cases at a massive performance hit. IMHO of course.
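
Take chromatic aberration as the clearest case; here is a toy NumPy sketch (illustrative only, nothing like any game's actual shader) of what the effect does to a frame:

[code]
import numpy as np

def chromatic_aberration(frame, shift=2):
    """Toy post-process: mis-register the red and blue channels
    against green - the defect a real camera lens exhibits."""
    out = frame.copy()
    out[:, :-shift, 0] = frame[:, shift:, 0]   # red fringes one way
    out[:, shift:, 2] = frame[:, :-shift, 2]   # blue fringes the other
    return out

frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
shifted = chromatic_aberration(frame)
# The output is, by construction, less faithful than the input:
print(np.abs(frame.astype(int) - shifted.astype(int)).mean() > 0)  # True
[/code]

Every pixel it touches moves away from the rendered scene, and you pay a full-screen pass for the privilege.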

As for 4GB at 4K: just about enough for 'most' things, but it won't be going forward.

Totally agree with this.
 
Last edited:

bru

Soldato
Joined
21 Oct 2002
Posts
7,359
Location
kent
I may have missed it, but I don't think anyone is arguing that max isn't max. It's that turning on an option that degrades IQ yet costs performance is a pointless setting.

Chromatic aberration (the clue is in the name)
Depth of Field (deliberately rendering out of focus)
Lens Flare
Vignetting (tunnel vision)

None of these effects reflects how the human eye works in real life; they are image-degradation artefacts only seen through photo/camera lenses. You are in effect deliberately making your game look worse, and in all cases at a massive performance hit. IMHO of course.


I suppose that is the whole crux of the matter. People easily assume that max settings mean the best image quality, whereas of course they don't. As you have pointed out, there are many effects that may make a game feel more realistic, or not more realistic but simply achieve a look the developer wants, while greatly decreasing the image quality.

Bottom line: it is down to the developer to decide what the max settings for their game are, not us the consumers. I suppose even that isn't strictly true, as it is sometimes possible to force extra settings in the GPU driver.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
I myself am so looking forward to some people's opinions and how they might change depending on the amount of memory on the new AMD cards.

Personally I don't understand how people can argue about max settings. If a dial goes up to 11, then 10 isn't the max. If a car will do 200 mph max then 195 isnt flatout.
The point being if a game has a setting that is above the previous setting then the previous one isn't maxed. It doesn't matter if the difference is visible to one person or even 100 people if a single individual can tell the difference then there is a difference simple as that.

As for 4GB at 4k. Just about enough for 'most' things but won't be going forward.

If 11 looks identical to 10, literally identical, and is purposely using a less efficient method, then no, 11 isn't max; 10 is max, and 11 is just the developer, for whatever reason, trying to convince you that you need more memory or performance... and there are three games that do this, all Nvidia GameWorks games.

I don't care about max settings; I care about the best IQ for a given performance level. For max IQ you don't need more than 4GB of memory in any of the games Kaap lists. For max settings, i.e. a completely worthless alternative very-high texture option, you do, but you gain nothing by it: you reduce performance for no IQ increase.

If Shadow of Mordor renamed its very high and ultra texture settings to 'fast very high' and 'inefficient yet identical quality to very high'... who would choose the latter? Those are the more accurate descriptions, and insisting in every thread, as Kaap does, that you NEED more than 4GB for this one ridiculous option added in a few Nvidia games is incredibly misleading.
 
Last edited:
Caporegime
Joined
18 Oct 2002
Posts
33,188
These days, rendering static and rigid geometry detail to the point where its form is indistinguishable from real objects is becoming fairly standard.

Lighting and shading are also getting close to photo-real.

The next step is getting the look and texture of surfaces to appear real. Aside from lighting and shading, that means multiple texture layers per surface: bump mapping, specular light reflection, occlusion, dirt mapping etc. More importantly for the GPU's buffer, it means the resolution of those textures; over the years we have gone from 0.5K to 1K, 2K, 4K and 8K textures that get ever sharper, more detailed and bigger.

They all need to be stored in memory; the bigger they are, the more memory you need.

Game devs are pushing 4K textures as a matter of course these days, so not only are textures demanding more buffer, they also come in multiples layered on the same surface.

A 4GB buffer is pretty minimal for 1440p or even 1080p in the very near future, never mind 4K.


The amount of memory listed as in use includes everything left sitting in the buffer, simply because nothing else needs that memory yet. There is no point evicting data that might only have to be reloaded. In those scenes maybe only 1.5GB is 'needed' and the rest is just effectively cached; maybe it needs 2.5GB, maybe 3.8GB. It isn't a valid way to measure memory requirements.

The simple fact is that almost all games currently work within a 4GB memory buffer at 4K; the only ones seen that don't are using uncompressed textures to achieve the higher memory usage.
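
In plain terms (hypothetical numbers, matching the example above):

[code]
# The counter reports everything resident; the frame touches far less.
resident_gb = {
    "current_scene_textures": 1.5,  # actually sampled this frame
    "older_scenes_cached": 2.3,     # kept around in case you backtrack
}
print(f"tool reports {sum(resident_gb.values()):.1f}GB in use")      # 3.8GB
print(f"the frame 'needs' {resident_gb['current_scene_textures']}GB")
[/code]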
 
Associate
Joined
13 Oct 2009
Posts
778
I'm confused: does this mean neither Polaris 10 nor 11 will replace Fiji in terms of performance?

Neither will. Vega is the one you want if you are coming from Fiji. Polaris is more of a 290/290X/390/390X replacement, while Vega is going to be the high-performance part.
 