
Ashes of the Singularity Coming, with DX12 Benchmark in thread.

Your statement was directed at the present, rather than the future.

In the future of course we'll all be running 4K, 8K etc.

I was simply pointing out that your "games that need the best part of 12GB exist" statement is not currently true.

Of course, as the years go on, developers will add higher resolution textures and more advanced AA/filtering options, which will increase the amount of VRAM games use. But future GPUs will also have more VRAM, so that is a fairly obvious ramification.

Of course it is true, as these games exist; whether you want to use the settings that consume that much memory is up to the user.
 
I have to disagree with this statement because it is simply a rehash of the same "xxGB is not enough" argument we see every year: "See, I can force this game to hit the VRAM limit with unplayable and unnecessary MSAA/SSAA settings." There has always been someone running games at unplayable settings to prove that XX MB/GB was not enough.

Using 8X MSAA at 4K is utterly pointless and I would defy anyone to see the benefit over 4X MSAA (or even 2X MSAA) during actual gameplay without stopping to stare at every little polygon edge just to find a jaggy.

Don't take a limited scenario where 4x Titan X GPUs can be forced to hit a VRAM limit at 4K with 8X MSAA in a few games and arbitrarily declare it a fact that "we need 12GB". Those very same games run fine with 4GB of VRAM for plenty of users, even at 4K, because the vast majority do not push unrealistic graphical settings.
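As a rough sense of scale for why forced MSAA inflates VRAM numbers (my own back-of-envelope figures, not from any benchmark above): the multisampled render targets alone grow linearly with the sample count, and deferred engines multiply that again across several G-buffer targets. A quick sketch in C++:

```cpp
// Rough MSAA render-target sizes at 4K (3840x2160), uncompressed.
// Assumes 4 bytes per colour sample (RGBA8) and 4 bytes per depth/stencil
// sample; real drivers add compression and padding, so treat as ballpark.
#include <cstdio>

int main() {
    const long long pixels = 3840LL * 2160LL;
    for (int samples : {1, 2, 4, 8}) {
        const long long colour = pixels * samples * 4; // colour buffer bytes
        const long long depth  = pixels * samples * 4; // depth/stencil bytes
        std::printf("%dx MSAA: ~%lld MB for colour + depth\n",
                    samples, (colour + depth) >> 20);
    }
    return 0;
}
```

That works out to roughly 0.5GB for the 8X targets alone, before textures, shadow maps, or any G-buffer duplication are counted.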
 

The point is the games exist, and if you want to max them you need 12GB.

The rest is up to the user to decide what settings they want to use.
 

Still a matter of pointless IQ settings with no noticeable benefit, just to try and prove a point.

But GPUs in general do need to start increasing their memory sizes as standard. Even in the most modern games the textures look like arse up close. Making higher quality textures of course takes more development time, but texture quality is also limited by frame buffer size.

Also, when HDR-enabled games drop, textures will increase in size by a noticeable amount due to 10-bit colour, etc.
 
Didn't we have this discussion before in the GTA V thread? The outcome of that discussion was: Kaap loves max AA at 4K, while the rest of the world realises that any AA is completely pointless at 4K.
This is just Kaap's preference for some reason (he argues he sees the difference) ;)
 

I am not telling anyone what settings they should use; I am pointing out that there are games that can use 10GB or more of memory.

Everyone has to make their own mind up as to what settings to use.
 

Kaap, if you see VRAM usage in certain tools, it does not mean the game is using all of that VRAM at that given point. It has been proven time and time again that VRAM usage on AMD cards is smaller than on NVIDIA cards with more VRAM, yet the AMD cards keep performing without any issues. Why make the argument like that, when we all know that displayed VRAM usage does not tell you the whole story? The game engine sees 12GB of VRAM and caches textures in it to reduce data swapping; on an AMD card it sees a smaller amount of VRAM, listens to the AMD drivers, and caches fewer textures. Unless performance is actually influenced by such usage, we can forget about all this.
Obviously, when certain people start making their argument from the fact that they deliberately push unnecessary settings to max, then we have a problem.
I only know of one game where the 4GB Fury X had performance issues: that Mordor game at 4K. I have no clue if AMD addressed that or not. Other games have no issues whether the card has 4GB or 12GB, even though supposedly those games use more than 4GB of VRAM.
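To illustrate the allocated-versus-needed distinction, here is a minimal sketch of my own using the public DXGI 1.4 budget API (not any specific monitoring tool): CurrentUsage is what the process currently has resident in VRAM, caching included, and Budget is what the OS will currently grant. Neither number is the game's true per-frame working set.

```cpp
// Query per-process VRAM usage and budget via DXGI 1.4 (Windows 10+).
// CurrentUsage includes everything resident, cached textures and all,
// so it is an upper bound on what the game actually needs each frame.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) continue;

        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
            std::printf("Adapter %u: %llu MB resident of %llu MB budget\n",
                        i, info.CurrentUsage >> 20, info.Budget >> 20);
        }
    }
    return 0;
}
```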
 

XCOM 2 on max settings has no problem running on TXs; on Fury Xs it is a 2fps slideshow.

The games exist and there are more coming.
 

We have always been able to bump settings up to levels that cause performance issues; it does not mean the IQ is significantly better or that it was sensible to do so.

More VRAM will be needed only once games start getting far better textures and HDR.
 

I see this said a lot on here about HDR and I'm not sure why; maybe it just needs explaining to me why it means bigger textures.

HDR in TVs, I believe, is mostly a set of instructions to tell the TV where to use the backlighting, and a better use of colours?

...also this game is out next week, huh! It will be interesting to see if it's popular. It looks a little slow paced for me, but I might give it a go.
 
I am not telling anyone what settings they should use; I am pointing out that there are games that can use 10GB or more of memory.

Everyone has to make their own mind up as to what settings to use.

I always believe more is better when it comes to VRAM, and that 4GB Fiji is cutting it fine for 4K. Having said that, you are not pointing out that some games use (require) 10GB of VRAM; you are showing that you can force some games to use 10GB of VRAM in extremely limited scenarios that are not in any way indicative of the vast, vast majority of cases.
 

One game that sticks in my mind is COD: AW, where 4GB wasn't enough to run full settings, but frames were good enough to cope on a single GPU, and this was at 1440p. There are games out there that require a decent amount of VRAM, and that is that, regardless of how buggy they are.
 
Fair enough, Kaap, there are games that can use more than 6GB and go anywhere up to 12GB, but how many users are going to experience this? How many people are gaming at 4K on PC? The majority are still at 1080p, and even fewer at 1440p, never mind 4K. So why would we need 12GB cards at present? In the future, when next-gen cards come out and have the grunt to run 4K with two cards, it may be plausible.
 
Yes, today if you want a playable 4K experience you need two GPUs. 99% of today's games can be managed with 4GB of VRAM. Tomorrow's games should mostly be DX12, which, if you want to play them at 4K with max settings, will probably still need two GPUs.
So we take two Fury Xs for tomorrow's DX12 game at 4K, and use this neat DX12 feature where the game sees two VRAM pools as one. I am sure DX12 supports this; I remember seeing it among the features of the API. So now I have 8GB of HBM memory. How well that would work is another question, but seeing how AMD prepared their hardware for DX12, I have quite a lot of confidence that the 4GB Fury X will be fine.

In case tomorrow's games are still DX11, AMD has shown that it can optimise memory usage, so again we would see just one or two games problematic enough to have this discussion.
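For reference, the DX12 feature being described is explicit multi-adapter. A minimal sketch of the linked-node flavour follows (assuming a linked SLI/CrossFire-style pair; the buffer size and masks are just illustrative). Note the caveat: the two VRAM pools become addressable from one device, but the application still has to place and manage resources per node, so memory does not simply double for free.

```cpp
// Sketch: linked-node explicit multi-adapter under D3D12.
// A single device exposes each linked GPU as a "node". Resources live on
// the node named by CreationNodeMask and can be read by peers named in
// VisibleNodeMask; the app manages both pools explicitly.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    UINT nodes = device->GetNodeCount(); // >1 on a linked multi-GPU setup
    std::printf("Linked GPU nodes: %u\n", nodes);

    // Place a committed buffer in node 0's VRAM, visible to every node.
    UINT allNodes = (1u << nodes) - 1;
    D3D12_HEAP_PROPERTIES props = {};
    props.Type = D3D12_HEAP_TYPE_DEFAULT;
    props.CreationNodeMask = 1;       // physically lives on GPU 0
    props.VisibleNodeMask = allNodes; // peers may read it over the link

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width = 64 * 1024 * 1024;    // 64 MB example allocation
    desc.Height = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    ComPtr<ID3D12Resource> buffer;
    HRESULT hr = device->CreateCommittedResource(
        &props, D3D12_HEAP_FLAG_NONE, &desc,
        D3D12_RESOURCE_STATE_COMMON, nullptr, IID_PPV_ARGS(&buffer));
    std::printf("Cross-node buffer: %s\n", SUCCEEDED(hr) ? "ok" : "failed");
    return 0;
}
```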
 
One game that sticks in my mind is COD: AW, where 4GB wasn't enough to run full settings, but frames were good enough to cope on a single GPU, and this was at 1440p. There are games out there that require a decent amount of VRAM, and that is that, regardless of how buggy they are.

I am not arguing that 4GB is perfect for 4K; 4GB is cutting it close at 4K in my experience. While it is perfectly fine in the majority of games, some games will push that 4GB to its limits and require dropping graphical settings, but with a single GPU you will be dropping those settings even on a Titan X due to lack of GPU grunt. This was the same when people argued the GTX 680 2GB was not enough at 1440p, because it never had the grunt to max that res anyway. I have always believed that VRAM limits only come into play in multi-GPU scenarios, when you have the GPU grunt to push higher settings at higher resolutions. While Kaap is seeing this with his 4x TX cards at 4K, this is a limited scenario that is in no way indicative of the vast majority.

I am disputing the claim that games require 12GB of VRAM at 4K. Clicking max settings, adding 8X MSAA, and claiming games use 10GB, ergo more VRAM is a must, is inaccurate.
 
I see this said a lot on here about HDR and I'm not sure why; maybe it just needs explaining to me why it means bigger textures.

HDR in TVs, I believe, is mostly a set of instructions to tell the TV where to use the backlighting, and a better use of colours?

...also this game is out next week, huh! It will be interesting to see if it's popular. It looks a little slow paced for me, but I might give it a go.

HDR in this case means using 10 bits per colour channel or more. The Rec. 2020 standard being used means this, I believe, although you can have up to 12-16 bits per colour for finer granularity. But 10-bit can be considered a good balance as it is.

In terms of textures, this means 10 bits of colour information per channel, so 30 bits per pixel uncompressed, instead of the 24 bits (3 x 8) of standard 8-bit colour. In practice GPUs pad channels to byte boundaries, so 10-bit content typically ends up stored in 16-bit-per-channel formats, which is closer to double the size.

You need a screen and software that will then display the 10 bits per channel. And that's not including any alpha channel, which adds an extra 8 or 10 bits per pixel worth of data.
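If it helps, here is the arithmetic as a quick sketch (my own figures for a single uncompressed 4096x4096 texture; real games use block compression, so the absolute numbers will be smaller, but the ratio holds):

```cpp
// Size of one uncompressed 4096x4096 texture at different channel depths.
// GPUs pad channels to byte boundaries, so "10-bit" assets commonly land
// in 16-bit-per-channel formats such as RGBA16, hence the doubling.
#include <cstdio>

int main() {
    const long long texels = 4096LL * 4096LL;
    const long long rgba8  = texels * 4 * 8  / 8; // 4 channels x 8 bits
    const long long rgba16 = texels * 4 * 16 / 8; // 4 channels x 16 bits
    std::printf("RGBA8:  %lld MB\n", rgba8 >> 20);  // 64 MB
    std::printf("RGBA16: %lld MB\n", rgba16 >> 20); // 128 MB
    return 0;
}
```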
 
I am disputing the claim that games require 12GB of VRAM at 4K. Clicking max settings, adding 8X MSAA, and claiming games use 10GB, ergo more VRAM is a must, is inaccurate.

There is nothing inaccurate about it.

XCOM 2 can use over 10.5GB of memory.

It is entirely up to the user if they wish to do this.

I don't presume to tell people they should do this, and others should not presume to tell everyone they should not.

I actually play XCOM 2 on a single Titan X, as it is a turn-based game and so does not need high fps.

For the next gen of cards I will not be buying any that have less than 12GB of VRAM; this is common sense, as in three years' time 12GB could be the minimum spec required by games.
 