
10GB vram enough for the 3080? Discuss..

Each game checks for VRAM and lives within its limits. If you have too little VRAM, the game won't start. Most games use at most 3-6GB even at 4K, so really 8GB is enough.





The GPU can compress data https://developer.nvidia.com/gpu-accelerated-texture-compression and read it straight from the SSD, reducing the VRAM buffer size. DLSS 2.1 reduces the amount of VRAM needed to render. Developers can prune unneeded textures from VRAM instead of leaving them in memory. Most of the time you can get away with 6GB of VRAM and DLSS Ultra Performance mode, even at 4K in games like Control with all the RT settings on. All you have to do is turn some effect quality settings down to medium. Stay away from really expensive AA modes, which eat VRAM.
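As a sense check of why block compression matters for the VRAM budget, here is a minimal sketch. The 4096x4096 texture is just an illustrative assumption; the BC1/BC7 bytes-per-pixel figures are the standard GPU block-compression rates.

```python
# Back-of-the-envelope VRAM footprint of one 4096x4096 texture with a full
# mip chain, uncompressed vs block-compressed. Texture size is an assumed
# example, not taken from any particular game.
def texture_mib(width, height, bytes_per_pixel, with_mips=True):
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    return base * (4 / 3 if with_mips else 1) / 2**20

w, h = 4096, 4096
print(f"RGBA8 uncompressed: {texture_mib(w, h, 4):6.1f} MiB")
print(f"BC7  (1 byte/px):   {texture_mib(w, h, 1):6.1f} MiB")   # 4:1 vs RGBA8
print(f"BC1  (0.5 byte/px): {texture_mib(w, h, 0.5):6.1f} MiB") # 8:1 vs RGBA8
```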

Most software reads VRAM allocated, not VRAM used. Only the most demanding games get near 10GB, and many of those need extreme settings and AA modes. https://www.resetera.com/threads/ms...isplay-per-process-vram.291986/#post-46310192

For example, Watch Dogs at 4K with the HD texture pack, which states 11GB. https://www.game-debate.com/news/29...-1080p-1440p-4k-and-raytracing-specs-revealed

Games do have modes that won't run on anything but the highest-end GPU. This is where the 3090 with its 24GB of VRAM comes in. I bet the +20GB of HD textures still works on the 3080. It does https://youtu.be/bAiY3gaILrw; note the VRAM stated is allocated, not usage. A +20GB HD texture pack is very extreme for any game. Also, the VRAM usage could be higher than normal as this game is being recorded.

Watch Dogs Legion system requirements 4K Ultra Settings
  • OS: Windows 10 64-bit
  • CPU: Intel Core i7-9700K or AMD Ryzen 7 3700X
  • RAM: 16GB (Dual-channel setup)
  • GPU RAM: 11GB
  • GPU: Nvidia GeForce RTX 2080 Ti
  • HDD: 45GB (+20GB HD Textures Pack)
An example of software reporting VRAM allocated rather than VRAM used. Some games can be extreme, but most use a lot less VRAM than you think.
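To make the allocated-vs-used point concrete, here is a minimal sketch assuming the pynvml package (nvidia-ml-py) and an NVIDIA driver are installed. Note that everything NVML reports is still an allocation, not the memory a game actively touches each frame, so treat the numbers as upper bounds.

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Device-level counters: "used" is the sum of all allocations on the card,
# including other applications and the OS compositor.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device total: {mem.total / 2**30:.1f} GiB")
print(f"Device used (all allocations): {mem.used / 2**30:.1f} GiB")

# Per-process figures: still allocations, but at least attributed per app.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    label = f"{used / 2**30:.1f} GiB" if used is not None else "n/a"
    print(f"PID {proc.pid}: {label}")

pynvml.nvmlShutdown()
```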
I never asked or mentioned what the software reads. Also, RTX IO doesn't work yet and DLSS doesn't work in all games.

I asked you how you arrived at the 10GB limit for developers, and it's clear from what you posted above that it was because Nvidia said so.

It's also convenient that you mention things like "most" games while the outliers, for the moment, appear to be the latest open-world games.

I'm also well aware that you can turn down settings to reduce VRAM usage within the limits of the GPU you have now, but that misses the point: is 10GB enough for now?
If you are reducing a setting specifically because of VRAM, and not directly for FPS, then it is not enough.

And for anyone else that reads my post: I don't know if 10GB is or isn't enough, because no next-gen games have been released.

As a hypothetical, since Ubisoft likes to do "HD" texture packs: would 10GB be enough if Far Cry 6 ran fine at the same settings on a 6800 with it enabled, but the 3080 with 10GB suffered from hitching?
 
Sure, but with the new consoles developers have 16GB; in this case they will have to rein back the visuals/textures to 10GB. It would be the first time we might see worse visuals on a PC with a 3080 due to a lack of graphics memory ;)

No they don't. They have 16GB of shared memory, which means their system RAM also has to come out of that. On the Series X, at least, only 10GB is fast enough to be assigned as VRAM.
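For context on that 10GB figure, here is a rough sketch of the split pool using the publicly reported Series X memory layout (ten 14Gb/s GDDR6 chips, six 2GB plus four 1GB, one 32-bit channel each); treat the exact chip layout as an assumption for illustration.

```python
GBPS_PER_PIN = 14      # reported GDDR6 data rate on the Series X
BITS_PER_CHIP = 32     # one 32-bit channel per chip

chips_all = 10         # the first 10GB is striped across all ten chips
chips_2gb = 6          # the remaining 6GB lives only on the six 2GB chips

fast_bus = chips_all * BITS_PER_CHIP   # 320-bit
slow_bus = chips_2gb * BITS_PER_CHIP   # 192-bit

print(f"10GB region: {fast_bus}-bit -> {GBPS_PER_PIN * fast_bus / 8:.0f} GB/s")
print(f" 6GB region: {slow_bus}-bit -> {GBPS_PER_PIN * slow_bus / 8:.0f} GB/s")
# Roughly 560 GB/s vs 336 GB/s, which is why only ~10GB is treated as
# "VRAM-speed" memory and the rest is better suited to CPU/OS data.
```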
 
I never asked or mentioned what the software reads. Also, RTX IO doesn't work yet and DLSS doesn't work in all games.

I asked you how you arrived at the 10GB limit for developers, and it's clear from what you posted above that it was because Nvidia said so.

It's also convenient that you mention things like "most" games while the outliers, for the moment, appear to be the latest open-world games.

I'm also well aware that you can turn down settings to reduce VRAM usage within the limits of the GPU you have now, but that misses the point: is 10GB enough for now?
If you are reducing a setting specifically because of VRAM, and not directly for FPS, then it is not enough.

And for anyone else that reads my post: I don't know if 10GB is or isn't enough, because no next-gen games have been released.

As a hypothetical, since Ubisoft likes to do "HD" texture packs: would 10GB be enough if Far Cry 6 ran fine at the same settings on a 6800 with it enabled, but the 3080 with 10GB suffered from hitching?

Outliers are games with +20GB HD texture packs. Even then, 10GB appears to be enough. Game developers have to stay within hardware limits, so the software is designed to detect those limits.

When we measure VRAM, some third-party software does not report the correct amount of VRAM used. This inflates the figure, because other programs use VRAM too and not all allocated VRAM is really used by the game.

There is also texture compression, colour compression and compute data compression, which for Ampere is meant to increase the amount of data held in RAM by about 1.3x. If that held true overall, then 10GB would behave like 13GB, and with DLSS 2.1 more like 16GB. It's complicated to work out what the maximum amount of data that would fit into the 10GB of VRAM is.
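Reproducing that arithmetic as a minimal sketch; the 1.3x compression factor and the idea that DLSS stretches it further are the post's assumptions, not measured values.

```python
# If Ampere's data compression really averaged 1.3x across everything resident
# in VRAM, 10GB of physical memory would hold roughly 13GB worth of data.
physical_vram_gb = 10
assumed_compression = 1.3                    # assumption from the post above
effective_gb = physical_vram_gb * assumed_compression
print(f"Effective capacity at 1.3x: {effective_gb:.0f} GB")

# Any further gain from DLSS applies only to render targets (the frames being
# drawn), not to textures, so the post's ~16GB figure depends on how much of
# the budget is frame buffers rather than assets.
```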

If Watch Dogs Legion is anything to go by, there is no issue with an HD texture pack of +20GB. This is what Nvidia stated: there would be enough RAM for the game and any texture pack we wanted.

Q: Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?

We’re constantly analysing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price. In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory. Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.

https://www.nvidia.com/en-gb/geforce/news/rtx-30-series-community-qa/

If there are issues, you can always reduce the texture quality. Even my 2060 6GB can run with texture quality at Ultra and hit 4K with DLSS from 720p (all RT features on).
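For reference, the internal render resolutions behind that "4K with DLSS from 720p" claim, as a minimal sketch using the commonly quoted per-axis DLSS scale factors (Quality 66.7%, Balanced 58%, Performance 50%, Ultra Performance 33.3%).

```python
# DLSS renders internally at a fraction of the output resolution and upscales,
# which is where the render-target VRAM savings come from.
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 3840, 2160  # 4K output
for mode, scale in DLSS_SCALES.items():
    w, h = round(out_w * scale), round(out_h * scale)
    share = (w * h) / (out_w * out_h)
    print(f"{mode:17s}: {w}x{h} ({share:.0%} of the 4K pixel count)")
# Ultra Performance comes out at 1280x720, matching the post above.
```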
 
That's an impossibility due to the memory bus. 12GB or 24GB.
You can do it, but you will reduce the width of the memory bus to 256 bits. The RTX 3080 has 10GB on a 320-bit bus; 12GB would be 384-bit. Each RAM chip is 32 bits wide, so 8 x 32 = 256-bit. Next year 2GB chips will become available, so you could do it with 8 chips, but even with 16 x 1GB chips you would be at 256 bits. On the RTX 3080 this would change the bandwidth quite a bit.

3080: 9,500 effective speed (half the 19Gb/s IO data rate), 256-bit bus, 8 chips x 32 bits per chip.

9500 * 256/8 * 2 = 608,000MB/s, or 608GB/s, which reduces the bandwidth the 3080 would have by quite a bit. Normal bandwidth: 760GB/s.
9500 * 320/8 * 2 = 760,000MB/s = 760GB/s

This is the best I can do to work it out.
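The same arithmetic in one place, as a sketch: bandwidth in GB/s is the per-pin data rate (Gb/s) times the bus width (bits) divided by 8, with one 32-bit channel per GDDR6X chip and the 3080's 19Gb/s data rate assumed throughout.

```python
def bandwidth_gbs(gbps_per_pin, bus_bits):
    # e.g. 19 Gb/s per pin on a 320-bit bus -> 19 * 320 / 8 = 760 GB/s
    return gbps_per_pin * bus_bits / 8

for chips in (8, 10, 12):
    bus = chips * 32  # one 32-bit channel per chip
    print(f"{chips} chips -> {bus}-bit -> {bandwidth_gbs(19, bus):.0f} GB/s")
# 8 chips:  256-bit -> 608 GB/s (a ~20% bandwidth cut)
# 10 chips: 320-bit -> 760 GB/s (the stock 3080)
# 12 chips: 384-bit -> 912 GB/s
```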
 
You can do it, but you will reduce the width of the memory bus to 256 bits. The RTX 3080 has 10GB on a 320-bit bus; 12GB would be 384-bit. Each RAM chip is 32 bits wide, so 8 x 32 = 256-bit. Next year 2GB chips will become available, so you could do it with 8 chips, but even with 16 x 1GB chips you would be at 256 bits. On the RTX 3080 this would change the bandwidth quite a bit.

3080: 9,500 effective speed (half the 19Gb/s IO data rate), 256-bit bus, 8 chips x 32 bits per chip.

9500 * 256/8 * 2 = 608,000MB/s, or 608GB/s, which reduces the bandwidth the 3080 would have by quite a bit. Normal bandwidth: 760GB/s.
9500 * 320/8 * 2 = 760,000MB/s = 760GB/s

This is the best I can do to work it out.

They're not going to do that. It's 12GB or 24GB, or the performance will be terrible.

There are massive gains from increasing the memory bus width. If Nvidia reduced it to 256bit, they'd probably lose 20-30% performance
 
I never asked or mentioned what the software reads. Also, RTX IO doesn't work yet and DLSS doesn't work in all games.

I asked you how you arrived at the 10GB limit for developers, and it's clear from what you posted above that it was because Nvidia said so.

It's also convenient that you mention things like "most" games while the outliers, for the moment, appear to be the latest open-world games.

I'm also well aware that you can turn down settings to reduce VRAM usage within the limits of the GPU you have now, but that misses the point: is 10GB enough for now?
If you are reducing a setting specifically because of VRAM, and not directly for FPS, then it is not enough.

And for anyone else that reads my post: I don't know if 10GB is or isn't enough, because no next-gen games have been released.

The simplest reason why developers would target below 10GB is that the consoles only have 16GB of TOTAL RAM.

It would also be financial suicide not to keep the game running well on 8GB cards, given that so many people have 8GB cards and the cheapest cards from both Nvidia and AMD have 8GB.

As a hypothetical, since Ubisoft likes to do "HD" texture packs: would 10GB be enough if Far Cry 6 ran fine at the same settings on a 6800 with it enabled, but the 3080 with 10GB suffered from hitching?

I doubt you will be able to run 8K texture packs with 10GB.

Another hypothetical would be if Ubisoft released an ultra ray-tracing mode for said game. Would a 3080 with 10GB be playable, while a card such as the 16GB 6800 XT would be unplayable? That then raises the question: what will run out first, GPU power or VRAM? I doubt any of us buying a 3080, 3080 Ti or 3090 today will want to keep it after Hopper/RDNA3 launches, as those will have considerable GPU performance gains due to both ray tracing and competition.
 
No they don't. They have 16GB of shared memory, which means their system RAM also has to come out of that. On the Series X, at least, only 10GB is fast enough to be assigned as VRAM.
Yeah, exactly. It's a unified memory architecture, so the CPU and GPU share the same memory capacity and bandwidth, which could be a disadvantage. An advantage, though, is that their IO path might be even more efficient than RTX IO-type approaches in some cases, since whether the data is loaded by the GPU or the CPU it can be addressed by either if needed; there's no need to copy between the two at all. RTX IO will have to have the GPU basically read in the memory without involving the CPU, and I'm not sure whether this will be limited to PCIe-attached SSDs because of that.
 
The simplest reason why developers would target below 10GB is that the consoles only have 16GB of TOTAL RAM.

It would also be financial suicide not to keep the game running well on 8GB cards, given that so many people have 8GB cards and the cheapest cards from both Nvidia and AMD have 8GB.



I doubt you will be able to run 8K texture packs with 10GB.

Another hypothetical would be if Ubisoft released an ultra ray-tracing mode for said game. Would a 3080 with 10GB be playable, while a card such as the 16GB 6800 XT would be unplayable? That then raises the question: what will run out first, GPU power or VRAM? I doubt any of us buying a 3080, 3080 Ti or 3090 today will want to keep it after Hopper/RDNA3 launches, as those will have considerable GPU performance gains due to both ray tracing and competition.

Your hypothetical doesn't work because I'm not telling developers not to implement ray tracing just because my GPU can't run it.
 
They're not going to do that. It's 12GB or 24GB, or the performance will be terrible.

There are massive gains from increasing the memory bus width. If Nvidia reduced it to 256bit, they'd probably lose 20-30% performance


Maybe they will add their own totally original, er, Nvinity Cache.
 
It wasn't, and even if it was? Who decided 10GB was the MAXIMUM limit for max-quality settings for game developers?

Didn't you read?

The simplest reason why developers would target below 10GB is that the consoles only have 16GB of TOTAL RAM.

It would also be financial suicide not to keep the game running well on 8GB cards, given that so many people have 8GB cards and the cheapest cards from both Nvidia and AMD have 8GB.



I doubt you will be able to run 8K texture packs with 10GB.

Another hypothetical would be if Ubisoft released an ultra ray-tracing mode for said game. Would a 3080 with 10GB be playable, while a card such as the 16GB 6800 XT would be unplayable? That then raises the question: what will run out first, GPU power or VRAM? I doubt any of us buying a 3080, 3080 Ti or 3090 today will want to keep it after Hopper/RDNA3 launches, as those will have considerable GPU performance gains due to both ray tracing and competition.

I posted
In reply to
Your hypothetical doesn't work because I'm not telling developers not to implement ray tracing just because my GPU can't run it.
As it made no sense.
 