
Known/suspected games that eat more than 1GB of video memory at 1920x1200

Permabanned · Joined 15 Feb 2011 · Posts 250
List of known/suspected games that eat more than 1GB of video memory at 1920x1200

This is a list of games that I have confirmed, or suspect, to eat more than 1GB of video memory at 1920x1200 in the worst case (not on average), with graphics settings set reasonably high, e.g. 4xAA. Please read the notes below the list before you dismiss any entry. If you only care about case (a), please ignore this article.

I am able to capture any vram usage below 3GB at 1920x1200, provided I have the specific game installed on my computer. (For a fair comparison between the 5870 1GB and the 5870 2GB, click here.)

Method for picking a suspected game: if I feel obvious lag spikes on a 1GB card, I add the game to the list. It works the same way as system memory: simulated video memory is a lot slower than dedicated video memory, just as virtual memory is a lot slower than physical memory.

Method for confirming a game: if I see a screenshot proving that the game, running at 1920x1200 or lower, can consume 1GB of video memory or more, I mark it as confirmed. All confirmed games are marked in orange. I can't guarantee I will always be able to post a link to the proof screenshot if you have doubts, because image hosting dies over time. You'll have to decide for yourself whether to trust me.

2007-11: Crysis / Warhead: No more lag while quickly rotating my camera. Typical vram usage: around 1300MB.
2008-12: GTA IV: Famous for requiring over 1GB of video memory to max out the graphics settings. Confirmed to exceed 1GB video memory easily.
2009-06: ARMA II: Reported to hit 1.5GB at 1080p. Waiting for confirmation from nVidia users.
2009-09: Unigine Heaven Benchmark: Terrible lag spikes on 1GB cards. Typical vram usage: around 1100MB.
2009-11: Call of Duty 6: Modern Warfare 2: Did not notice lag with 5870 1GB, however confirmed to exceed 1GB video memory by screenshots from nVidia users.
2009-12: Colin McRae DiRT 2: Confirmed to exceed 1GB video memory by screenshots from nVidia users.
2010-02: Battlefield: Bad Company 2: Did not notice lag with 5870 1GB, however confirmed to exceed 1GB video memory by screenshots from nVidia users.
2010-02: Napoleon: Total War: A 5870 1GB lags like hell during picture-in-picture scenes, such as troops taking over buildings. Typical vram usage: 1250MB.
2010-02: STALKER Call of Pripyat with complete mod: Confirmed to hit 1.2GB according to screenshots from nVidia users.
2010-02: Aliens vs. Predator: Reported to hit 1GB on a GTX 460 at 1080p, however no lag spikes noticed. Waiting for confirmation from nVidia users.
2010-03: Metro 2033: MSAA 4X (not AAA): Confirmed to exceed 1GB video memory easily. Even a 470/570 gets killed. Typical vram usage: 1500MB.
2010-07: Starcraft 2: Mothership cloaking *MANY* Carriers leads to 20 fps on a 5870 1GB but over 30 fps on a 6950 2GB. Confirmed to exceed 1GB video memory by screenshots from nVidia users.
2010-09: Civilization V: Confirmed to use up all 1.5GB of GTX 480 by screenshots from nVidia users.
2010-10: Lost Planet 2: Confirmed to hit 1.2GB by screenshots from nVidia users.
2010-12: World of Warcraft: Cataclysm: Running two instances (logging in two characters) concurrently in DX11 mode would definitely eat more than 1GB video memory, and 5870 1GB struggles at 2-3 fps, while 6950 2GB has no problem at 30 fps. Each DX11 instance is confirmed to approach 1GB video memory usage by screenshots from nVidia users. Typical vram usage of dual-instance: 1600MB.
2011-02: Bulletstorm: Confirmed to hit 1GB at 1080p according to screenshots from nVidia users.
2011-03: Dragon Age 2: No more lag/unbearable min fps with 6950 2GB. Confirmed to exceed 1GB video memory by screenshots from nVidia users. Typical vram usage: 1200MB.
2011-03: Total War: Shogun 2: DX11 patch released. Typical vram usage: 1600MB.
2011-03: Homefront: Reported to exceed 1GB video memory. Waiting for confirmation from nVidia users.
2011-03: Assassin's Creed Brotherhood: Confirmed to use 975MB at 2560x1440 with 4xAA. Waiting for confirmation from nVidia users on the worst case at 1200p.
2011-03: Crysis 2: DX9 confirmed to use up all 1.5GB of a GTX 480 by screenshots from nVidia users; DX11 confirmed to use up all 1.5GB of a GTX 580 by screenshots from nVidia users.
2011-04: Shift 2 Unleashed: Confirmed to hit 1.3GB by screenshots from nVidia users.
TBA: Digital Combat Simulator: A-10C: Reported to use all 1.5GB of GTX 480. Waiting for confirmation from nVidia users.
2011-05: The Witcher 2: Confirmed to hit 1.2GB by screenshots from nVidia users.


I will keep this list updated. However, if this thread is inappropriate, please remove it.

Notes: Less experienced users are recommended to read about virtual memory and get a basic understanding of paging/swapping. Graphics cards basically work in a similar way: simulated video memory is a lot slower than dedicated video memory, just as virtual memory is a lot slower than physical memory. nVidia's simulated video memory is called TurboCache, while ATI/AMD's simulated video memory is called HyperMemory. There are two categories of "out of video memory":

Case (a): Actual video memory usage is greatly above the capacity of the dedicated video memory on the graphics card. In this case, all of the content within the sight of the camera (in front of the player) "fights" for a place in the dedicated video memory, but swapping happens all the time, causing an unplayable average fps well below 30.

Case (b): Actual video memory usage is slightly above the capacity of the dedicated video memory on the graphics card. In this case, only the content within the sight of the camera (in front of the player) is loaded into the dedicated video memory, while the content outside the sight of the camera (behind the player) is pushed into the simulated video memory. High fps is still achieved as long as the player doesn't rotate the camera; when the player quickly rotates the camera, however, swapping happens between the dedicated and the simulated video memory, and the latency and bandwidth of the PCI-E link become the bottleneck. The result is temporarily low fps (aka lag, choppiness, lag spikes), which often makes up the "min fps" of a benchmark session.
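To put rough numbers on why case (b) stutters, here is a minimal back-of-envelope sketch. The bandwidth figures and the 200MB swap size are illustrative assumptions (roughly PCI-E 2.0 x16 versus GDDR5 on a 5870-class card), not measurements from any game on the list:

```python
# Back-of-envelope estimate of a case (b) lag spike: pulling data over
# PCI-E (simulated video memory) instead of reading it from dedicated VRAM.
# All figures are assumed/illustrative, not measured.

vram_bandwidth_gbs = 150.0   # GB/s, ballpark for GDDR5 on a 5870-class card
pcie_bandwidth_gbs = 6.0     # GB/s, realistic PCI-E 2.0 x16 throughput
swap_size_mb = 200.0         # data that must be swapped in after a quick camera turn

pcie_time_ms = swap_size_mb / 1024.0 / pcie_bandwidth_gbs * 1000.0
vram_time_ms = swap_size_mb / 1024.0 / vram_bandwidth_gbs * 1000.0

print(f"Over PCI-E:          {pcie_time_ms:5.1f} ms  (a whole 30 fps frame is ~33 ms)")
print(f"From dedicated VRAM: {vram_time_ms:5.1f} ms")
```

So a single burst of swapping can cost an entire 30 fps frame on its own, which is exactly the momentary lag spike described above.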

If you only care about case (a), please ignore this article. If you are an enthusiast who cares about both case (a) and case (b), then this list is for you.
 
Permabanned (OP) · Joined 15 Feb 2011 · Posts 250
Good findings there drifting, I have been in many debates regarding the vram issue.
Although you are no doubt going to get hit with the 'where's your proof?' line.

I might add links to the screenshots from various places; however, the links die over time. For now you'll have to choose whether or not to trust me :D

Edit: examples of the screenshots I used as evidence to support my list:

Screenshots for Metro 2033, 1920x1080 MSAA 4X. A GTX 580 1.5GB is going to get horrible min fps at 1920x1200.

[Six screenshots: 1.jpg to 6.jpg]
 
Soldato · Joined 26 Apr 2004 · Posts 9,356 · Location Milton Keynes
Stalker with the complete mod might hit that list too.

Also, I don't have any direct proof, but I'd heard previously that ATI's memory compression is better, so it could be that games showing slightly over 1GB of memory usage on Nvidia cards, but no noticeable lag, are being compressed enough to stay within the 1GB buffer on ATI cards.
 
Soldato · Joined 16 Jul 2007 · Posts 7,691 · Location Stoke on Trent
Odd as it might seem, Fallout 3 for me regularly goes to 940-960MB, and since I rarely check the VRAM usage it's very likely that it maxes out my 1GB card at times. GPU usage, however, is relatively low.
 
Associate · Joined 4 Jul 2009 · Posts 1,004
I have definitely seen Arma 2 consume my entire 1.5GB running at 1080p.
Also Stalker Call of Pripyat with the complete mod on highest settings and 4xAA: I just loaded up the start and it's hitting 1.3GB.

How do you check vram usage then? Be interesting to see how much my games use.

Use Afterburner or EVGA Precision. I think it's on by default, but if it's not, go to Settings > Monitoring, then tick memory usage and it will appear on the graph.
Also, if you want it shown on the OSD, just highlight it and tick the appropriate box below.
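If you'd rather log the numbers than watch the OSD, below is a minimal sketch using nVidia's NVML through the pynvml Python bindings. This assumes an nVidia card and that pynvml is installed; it's just an alternative to Afterburner/Precision, not what anyone in the thread used:

```python
# Minimal VRAM usage logger for nVidia cards via NVML (pynvml bindings).
# Assumed setup: nVidia GPU with recent drivers, `pip install pynvml`.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are in bytes
        print(f"VRAM used: {mem.used / 1024**2:.0f} MB of {mem.total / 1024**2:.0f} MB")
        time.sleep(1.0)  # poll once per second while the game is running
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Run it in a second window while the game is up and note the peak value.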
 
Associate · Joined 11 Jul 2007 · Posts 691 · Location Southampton
Interesting.....

I can provide a screenshot of AC Brotherhood later; which location eats the most VRAM, do you reckon?

Also got the 4 classic TW games including Napoleon, haven't played them yet though.....
 
Associate · Joined 27 Nov 2008 · Posts 2,268 · Location Cambridge
I can't confirm over 1GB, but AvP (the not-so-good newish one) used to consistently cripple the 896MB on my 295 in DX10 mode at 1680x1050. I'm sure someone could fire it up in DX11 and get over a GB of usage.

Does anyone know how to monitor VRAM usage on ATI cards, or is it an nVidia-only thing?
 
Soldato · Joined 30 Mar 2010 · Posts 13,053 · Location Under The Stairs!
^ I think you are getting mixed up with tummelv's summary.

When using a dual-card setup/xfire/SLI, you don't add each card's memory together to get a 'grand total'.

The only way I can explain it: the memory is used in a 1-to-1 scenario, not 1+1.

For example, using two 5870 1GB cards in xfire, each GPU can only utilise its own 1GB of memory; none of the memory is 'unused' (unless the game/app doesn't use it all).
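A tiny sketch of the same point, assuming the usual AFR (alternate frame rendering) behaviour where each GPU keeps its own full copy of the working set; the 1250MB figure is just borrowed from the Napoleon: TW entry in the OP's list as an illustration:

```python
# Illustrative only: in xfire/SLI with AFR, each GPU renders alternate frames
# from its OWN copy of the textures/buffers, so the working set must fit in
# EACH card's memory - two 1GB buffers do not add up to a usable 2GB.
per_card_vram_mb = 1024   # one 5870 1GB card (x2 in xfire)
working_set_mb = 1250     # assumed worst case, e.g. Napoleon: TW from the list

fits_single = working_set_mb <= per_card_vram_mb       # False - both cards swap
fits_summed = working_set_mb <= 2 * per_card_vram_mb   # True, but irrelevant under AFR

print(f"Fits in each card's 1GB: {fits_single}")
print(f"Would fit in a naive 1+1 = 2GB total: {fits_summed} (not how xfire works)")
```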
 