NVIDIA ‘Ampere’ 8nm Graphics Cards

Soldato
Joined
18 Feb 2015
Posts
6,480
It's probably been debated a million times already, but running things at 4K native I don't think I've ever seen a single game come close to using the 11GB on my 2080Ti.

Why the disappointment at only 12GB?

I think mine uses around 9 or 9.5GB in AC Odyssey at 4K ultra.

12GB on the new one does seem like a potential limiting factor in the nearish future at 4K, with consoles about to start using much higher-quality assets.

Although I think I read something about the 3000 series maybe using less VRAM for the same scenario due to better compression tech or something? I could also be totally making that up. If untrue, then 12GB is very disappointing in my mind.
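(For what it's worth, the compression that actually shrinks VRAM usage is block compression of the texture assets themselves; as far as I know, the GPU's own memory compression is lossless bandwidth compression and doesn't save capacity. A rough sketch of the arithmetic; the function and figures below are made up purely for illustration:)

```python
# Back-of-envelope texture memory arithmetic (illustrative figures only).
# Note: GPU "memory compression" (delta colour compression) saves bandwidth,
# not capacity; it's block compression of assets (e.g. BC7) that cuts VRAM.
def texture_bytes(width, height, bytes_per_texel, mip_chain=True):
    """Approximate size of one texture, optionally with a full mip chain."""
    base = width * height * bytes_per_texel
    # A full mip chain adds roughly one third on top of the base level.
    return base * 4 // 3 if mip_chain else base

# One 4K texture: uncompressed RGBA8 vs BC7 block-compressed (1 byte/texel).
uncompressed = texture_bytes(4096, 4096, 4)  # ~85 MiB with mips
bc7 = texture_bytes(4096, 4096, 1)           # ~21 MiB with mips
print(f"uncompressed: {uncompressed / 2**20:.1f} MiB, BC7: {bc7 / 2**20:.1f} MiB")
```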

Also Shadow of War, as I recently found out. One of the few times it pays to have HBCC; it works out to about 11 GB RAM & 11 GB VRAM. And boy is the next one going to consume even more, as these assets aren't even that high-detail.

[Attached image: middleearth2.jpeg (Shadow of War memory usage screenshot)]
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
Also Shadow of War, as I recently found out. One of the few times it pays to have HBCC; it works out to about 11 GB RAM & 11 GB VRAM. And boy is the next one going to consume even more, as these assets aren't even that high-detail.
As you say, Mordor is nothing special in the slightest, which surely tells us that VRAM usage is not strictly tied to ultra-detailed graphical fidelity, and that with good development and reduction of asset sizes via compression etc., VRAM usage can be lessened.
 
Associate
Joined
14 Oct 2004
Posts
979
As you say, Mordor is nothing special in the slightest, which surely tells us that VRAM usage is not strictly tied to ultra-detailed graphical fidelity, and that with good development and reduction of asset sizes via compression etc., VRAM usage can be lessened.

It's a double-edged sword. As technology gets better, developers get lazier.
 
Soldato
Joined
18 Feb 2015
Posts
6,480
As you say, Mordor is nothing special in the slightest, which surely tells us that VRAM usage is not strictly tied to ultra-detailed graphical fidelity, and that with good development and reduction of asset sizes via compression etc., VRAM usage can be lessened.

That's not what I'm saying at all. The SoW assets aren't the highest detail, but that doesn't mean there isn't a lot going on in the game (remember, it's not just the assets that make the game!). What I'm saying is, if they were to keep its complexity and then further increase the detail of those assets, it would be even more demanding, beyond current tech. Or perhaps they'll find new ways of doing things thanks to new tech features (mesh shaders, SFS etc.) that aren't available for the current gen. See video above, and the rough sketch below.

@sadbuttrue To say devs got lazier would be a misreading of the situation. There's context in the video above.
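(Rough sketch of the SFS idea mentioned above, purely illustrative: with sampler feedback the engine can keep resident only the mip levels the renderer actually sampled, instead of a whole mip chain per texture. The real feature is D3D12 Sampler Feedback; the Python below just models the arithmetic with made-up numbers.)

```python
# Toy model of sampler-feedback streaming: keep only the sampled mip levels
# of a texture resident, rather than its full mip chain.
def mip_bytes(base_bytes, level):
    # Each mip level is a quarter the size of the previous one.
    return max(base_bytes >> (2 * level), 1)

base = 4096 * 4096 * 4  # one 4K RGBA8 texture: 64 MiB at mip 0
full_chain = sum(mip_bytes(base, lvl) for lvl in range(13))  # mips 0..12

sampled = {3, 4, 5}  # feedback reports only these levels were touched
resident = sum(mip_bytes(base, lvl) for lvl in sampled)

print(f"full chain: {full_chain / 2**20:.2f} MiB, "
      f"resident after feedback: {resident / 2**20:.2f} MiB")
```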
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
That's not what I'm saying at all. The SoW assets aren't the highest detail, but that doesn't mean there isn't a lot going on in the game (remember, it's not just the assets that make the game!). What I'm saying is, if they were to keep its complexity and then further increase the detail of those assets, it would be even more demanding, beyond current tech. Or perhaps they'll find new ways of doing things thanks to new tech features (mesh shaders, SFS etc.) that aren't available for the current gen. See video above.

@sadbuttrue To say devs got lazier would be a misreading of the situation. There's context in the video above.
So why would Mordor use a ton more VRAM than other games that are bigger and better looking, unless it was a developer-specific problem? I can't think of any logical alternative.

Nvidia shouldn't have to add more VRAM to compensate for the occasional developer that can't design and code efficiently.
 
Soldato
Joined
12 May 2014
Posts
5,225
So why would Mordor use a ton more VRAM than other games that are bigger and better looking, unless it was a developer-specific problem? I can't think of any logical alternative.

Nvidia shouldn't have to add more VRAM to compensate for the occasional developer that can't design and code efficiently.

1. You don't know how VRAM utilisation changes between the different settings in Shadow of Mordor (e.g. what is actually being loaded in and why).
2. You don't know how VRAM utilisation differs between Shadow of Mordor and the other mystery game you are referring to (e.g. what is actually being loaded in and why).
3. Different game engines work in different ways, therefore VRAM and RAM usage may not be comparable without a deep dive.
4. Let us assume you are right; you seem to be implying we don't need more VRAM, yet you are ignoring the fact that "better" developers can do more with more VRAM.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
1. You don't know how VRAM utilisation changes between the different settings in Shadow of Mordor (e.g. what is actually being loaded in and why).
2. You don't know how VRAM utilisation differs between Shadow of Mordor and the other mystery game you are referring to (e.g. what is actually being loaded in and why).
3. Different game engines work in different ways, therefore VRAM and RAM usage may not be comparable without a deep dive.
4. Let us assume you are right; you seem to be implying we don't need more VRAM, yet you are ignoring the fact that "better" developers can do more with more VRAM.
Regarding points 1 to 3, you are 100% correct in that I do not know the specific way in which the Mordor engine works, but that does not answer my question. What I am asking is: why does a developer design an engine that needs a crap ton of VRAM, when other developers that design bigger, better and prettier games do not? Answer: they are not creating efficient and optimised game engines. That's not a hardware specification fault; that is the developer's fault.

Point 4 makes no logical sense. I am talking about developers not using the VRAM they already have available to them (e.g. 12GB), and you are somewhat oddly asking: "But should we not add more VRAM for these efficient developers to make even better use of?" If they already don't use the maximum VRAM when creating the best games they are able to for a myriad of systems, then why would we need to add another 30%?

Do you really think Nvidia don't take this into consideration with all of the developers they speak to and engage with when designing their cards?
 
Soldato
Joined
12 May 2014
Posts
5,225
Regarding points 1 to 3, you are 100% correct in that I do not know the specific way in which the Mordor engine works, but that does not answer my question. What I am asking is: why does a developer design an engine that needs a crap ton of VRAM, when other developers that design bigger, better and prettier games do not? Answer: they are not creating efficient and optimised game engines.
?

If you do not know the specifics, how can you come to that conclusion? I am saying that you cannot answer your question without addressing points 1 to 3.

Point 4 makes no logical sense. I am talking about developers not using the VRAM they already have available to them (e.g. 12GB), and you are somewhat oddly asking: "But should we not add more VRAM for these efficient developers to make even better use of?" If they already don't use the maximum VRAM when creating the best games they are able to for a myriad of systems, then why would we need to add another 30%?

One answer to why they don't use the maximum is that they build games for consoles, which only had 8GB.
Another answer: why build a game, or put time into something, that only the top 1% of gamers could ever experience?

With the consoles moving to 16GB do you honestly think that developers will just use 8GB and leave the rest?
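(And 16GB on console isn't 16GB of VRAM either. Rough arithmetic using the published Xbox Series X figures; how the game budget splits between CPU-side and GPU-side data is up to each game:)

```python
# Rough console memory budget (Xbox Series X published figures).
total_gb = 16.0        # unified GDDR6 pool
os_reserved_gb = 2.5   # system reservation
game_budget_gb = total_gb - os_reserved_gb  # 13.5 GB left for the game
gpu_optimal_gb = 10.0  # the faster 560 GB/s partition, intended for GPU data

print(f"game budget: {game_budget_gb:.1f} GB, "
      f"of which {gpu_optimal_gb:.1f} GB is the GPU-optimal partition")
```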
 
Soldato
Joined
18 Feb 2015
Posts
6,480
So why would Mordor use a ton more VRAM than other games that are bigger and better looking, unless it was a developer-specific problem? I can't think of any logical alternative.

Because it does more than other games in various aspects. "Better looking", also, is 1) subjective; 2) dependent more on processing power than just VRAM. Again, watch the video if you really want to understand. You really can't compare apples & oranges. There are a million things going on under the hood of all games that you're not accounting for; you can't just determine a game's technical chops from a mere screenshot/clip.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
If you do not know the specifics, how can you come to that conclusion? I am saying that you cannot answer your question without addressing points 1 to 3.

With the consoles moving to 16GB do you honestly think that developers will just use 8GB and leave the rest?

Because it does more than other games in various aspects. "Better looking", also, is 1) subjective; 2) dependent more on processing power than just VRAM. Again, watch the video if you really want to understand. You really can't compare apples & oranges. There are a million things going on under the hood of all games that you're not accounting for; you can't just determine a game's technical chops from a mere screenshot/clip.

Ok guys, I just watched the video... and from your replies I don't think you guys did. The talk he is giving is all about the console development, which was different to the PC version in how it was coded and implemented. The PC version was vastly more inefficient, as you will see if you watch from 1:02:00, where a guy asks why certain techniques were not used and the dev explains why no effort was put into optimising it. His explanation is vague, and he looks uncomfortable and says:

"I have a bunch of slides talking about the PC version but we have no time to cover it for an extra 30 mins".

"In the end due to technical limitations on PC we ended up not sharing any of that memory and just duplicating it all and just (raises hands and shrugs) loading high MIPS the traditional way and not trying to like, exchange memory back and forth".


So, as I strongly suspected, both of you are talking absolute guff... the PC version was an unoptimised console-to-PC port with excessive VRAM usage. :D

Also, "better looking" is not subjective; it is measurable through the quality of the assets and the graphical fidelity on display. ;)
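(Incidentally, that "just duplicating it all" remark lines up with the screenshot earlier in the thread showing roughly 11 GB of system RAM alongside 11 GB of VRAM; a trivial sketch with hypothetical figures:)

```python
# Toy arithmetic: one shared asset pool vs. a full copy on each side,
# as the dev describes for the PC port. Figures are hypothetical.
asset_pool_gb = 11.0  # streamed world assets

shared_total = asset_pool_gb          # one copy, paged between RAM and VRAM
duplicated_total = 2 * asset_pool_gb  # full copy in RAM *and* in VRAM

print(f"shared: {shared_total:.0f} GB, duplicated: {duplicated_total:.0f} GB")
```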
 
Soldato
Joined
12 May 2014
Posts
5,225
@Richdog
You may have watched the video, but you still have not addressed or found answers to points 1 to 3. Watching a 1-hour video isn't enough to get the answers to those questions. You need to take a deep dive, analyse what exactly the game engine is doing, and understand what is being loaded on an instance-to-instance basis in order to understand what is using the memory and why. You would also need to do this for the other game which you have stated is running on a superior engine.

In doing that you can then compare scenes of the two games and understand what is actually using up the VRAM.

The guy said he had 30 minutes' worth of slides talking about what they did for PC, and you think that a highly specific 2-minute answer to a very specific question, regarding using a specific feature of DX12, proves your point that they have an inefficient, unoptimised game engine?

Also, at 54:31 he states that the mip system saves 1GB of memory, so that clearly doesn't account for the memory difference seen above.

It also doesn't answer the other question. With the consoles moving to 16GB do you honestly think that developers will just use 8GB and leave the rest?
 