
10GB vram enough for the 3080? Discuss..

Yeah, those who keep it for longer than one gen may run into a problem. I will be building a family member a PC, monitor included, and for that build I will be going with the 6800 XT as I know he will likely be keeping that card for a long time. Even if he were not, I would still go with it, as it will pair nicely with the 5900X.

I paid £650 for my 3080 and for my needs I know 10GB will not be an issue until next gen. I am happy to turn the texture setting down one notch if need be in a handful of future titles. No big deal for me. The main games I am getting this card for will all likely not need more than 10GB: Cyberpunk 2077, Dying Light 2 and Bloodlines 2.

I try my best to say it how I see it without letting bias get in the way. That is why, even though I am getting a 3080, I am still very excited about AMD's offerings and think they have done very well. That said, having an OLED that is G-Sync and a monitor that is G-Sync, and with the top three games I am looking forward to having RT/DLSS, I choose the 3080 for myself. I will not cut off my nose to spite my face, even though I am no fan of Nvidia or the greedy leather jacket man.



That guy lost all credibility. I have put him in his place more than once, so he stuck me on ignore. Now he takes every opportunity he can to take a dig at me. Anyone who does not agree with him is apparently a mug.

He makes stuff up too. Like where have I said anyone is stupid for daring to think 10GB is not enough? Sad person; makes me laugh :D

Yeah, I fully understand your reasoning for going with the 3080. I was always going AMD due to my 4k FreeSync screen, as Nvidia don't support it. The 6800 XT's 16GB should see me good for a few years.
 
They did 8GB on the RTX 3070, did they not? They did 10GB on the 3080... Both are faster transfer-wise than AMD.

AMD will have Infinity Cache and SAM, which will outstrip GDDR6X. Obviously the numbers are theoretical until benchmarks come out, but it'll definitely be better than plain GDDR6.

Also, the 3070 is not using GDDR6X, it's using GDDR6.
 
I was just thinking that a 3080 won't be able to run my favorite sim at max settings, and it has nothing to do with VRAM. The game is just too demanding. The 3090 won't be able to do the job either.

If "flagship" means the ability to run every title in 4k at max settings or no buy, then they can't buy *any* graphics card on the market.

-At all.
Yep, absolutely agree. For some reason, however, it's more of an issue when we're talking about VRAM. Most strange ...
 
Yeah, I fully understand your reasoning for going with the 3080. I was always going AMD due to my 4k FreeSync screen, as Nvidia don't support it. The 6800 XT's 16GB should see me good for a few years.
Yep. Looks like a lovely card from what we have seen so far and I hope it does really well in reviews. Would be cool if they can hit £599.99 in the UK for it and actually have stock.

Whatever way you look at it, AMD is back. They have done even better than I expected from what I have seen so far, and I was one of the guys laughing at those who were saying AMD would not be able to beat a 2080 Ti (which I found ridiculous). But for them to be up there with a 3090? Great to see that.
 
AMD will have Infinity Cache and SAM, which will outstrip GDDR6X. Obviously the numbers are theoretical until benchmarks come out, but it'll definitely be better than plain GDDR6.

Also, the 3070 is not using GDDR6X, it's using GDDR6.

Not really. GDDR6X is not the only thing Nvidia are using. The data from the memory and cache is compressed to increase bandwidth; it's called compute data compression (Tensor Memory Compression). Good for DLSS etc.



Also you can compress textures, colour data etc. The compresion is less about space and more about bandwidth. The GPU itself can compress and decopress data as it enters and leaves the gpu.

There's also Tensor Memory Compression on Ampere, which will reportedly use Tensor Cores to both compress and decompress items stored in VRAM. This could see a 20-40% reduction in VRAM usage, or allow next-gen games to use higher textures with Tensor Memory Compression keeping that VRAM footprint 20-40% smaller.

Read more: https://www.tweaktown.com/news/7388...es-your-pc-more-like-playstation-5/index.html
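Taking that rumoured 20-40% figure at face value, here is a back-of-the-envelope sketch of how a footprint would shrink (the 12GB asset pool is just an arbitrary example, and the percentages are the rumour, not a confirmed spec):

[CODE]
# Back-of-the-envelope: how a rumoured 20-40% Tensor Memory Compression saving
# would shrink a game's VRAM footprint (percentages are the rumour, not a spec).
def effective_footprint_gb(uncompressed_gb: float, saving: float) -> float:
    """VRAM actually occupied after a given fractional saving."""
    return uncompressed_gb * (1.0 - saving)

for saving in (0.20, 0.40):
    print(f"{saving:.0%} saving: 12 GB of assets -> {effective_footprint_gb(12, saving):.1f} GB")
# 20% saving: 12 GB of assets -> 9.6 GB
# 40% saving: 12 GB of assets -> 7.2 GB
[/CODE]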

The RTX 3080 has a 320-bit memory controller with 19 gigabit per second per pin memory chips, giving it a bandwidth of 760 gigabytes per second. This speed matters because things often don’t need to be kept in VRAM forever, and a fast memory system allows a GPU to cycle more data in and out rapidly. AMD’s new Radeon lineup is rumored to be compromised here, using a maximum 256-bit memory bus with 16 Gbps GDDR6 memory chips, which means the maximum bandwidth is 512 GB/s. This is slower, but not awfully so, and if another rumor/patent find comes true, AMD may be using an on-GPU cache to facilitate other memory speed improvements by storing small bits of data in the cache instead of having to push to the main VRAM.
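To make the arithmetic explicit, here is a minimal sketch of that peak-bandwidth calculation (bus width × per-pin data rate ÷ 8 bits per byte), using the figures quoted above:

[CODE]
# Theoretical peak bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte
def peak_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    return bus_width_bits * pin_rate_gbps / 8.0

print(peak_bandwidth_gb_s(320, 19))   # RTX 3080: 320-bit @ 19 Gbps -> 760.0 GB/s
print(peak_bandwidth_gb_s(256, 16))   # Rumoured Radeon: 256-bit @ 16 Gbps -> 512.0 GB/s
[/CODE]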

Because of this, larger memory pools aren’t inherently better, because if a memory system is fast enough, it can overcome a smaller allocation of memory by being able to cycle data more effectively. Read this here https://kaylriene.wordpress.com/2020/10/22/sidenote-new-graphics-cards-and-vram-how-much-is-enough/

If you have a faster memory subsystem you don't need to allocate so much VRAM. This is where Nvidia RTX IO comes in: you can read the compressed data straight from the NVMe SSD. This data is decompressed on the GPU as needed and can sit in VRAM compressed.

At the moment, 8GB of VRAM is enough at 4k.


You have to note that there are settings in many games that couldn't be run at the time and still most likely can't be run. For example, in The Witcher 2 an RTX 2080 Ti can't run the game maxed out at 4k, and that is a DX9 game. Most likely a 3090 will have a chance. The settings are cinematic depth of field (a bandwidth virus) and ubersampling (basically looks 8k). This melts any GPU, and the game is nearly 10 years old (2011). Run the game yourself on a 3080 or 3090 with cinematic depth of field (insane) and ubersampling, at 4k with all other graphics options maxed, and watch your system melt.
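Assuming ubersampling costs roughly like ordinary supersampling, which is what the "basically looks 8k" remark suggests, the raw pixel counts alone show the scale of the problem:

[CODE]
# Pixel counts: plain 4k output vs the 8k-equivalent load the ubersampling comment implies
# (assumption: ubersampling costs roughly like rendering each frame at a higher resolution).
pixels_4k = 3840 * 2160        # ~8.3 million pixels
pixels_8k = 7680 * 4320        # ~33.2 million pixels
print(pixels_8k / pixels_4k)   # 4.0 -> roughly four times the shading work per frame
[/CODE]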


The lesson here is that some games have settings meant to test future hardware. Just because a setting exists does not mean the latest hardware was meant to run it.
 
Lol, if Godfall works nicely at ultra 4k on a 3080 I will happily just buy it without any second thoughts, since I'm going 1440p 144Hz.

The devs said 12GB would be needed for 4k ultra; many will probably measure the actual usage here, and then we will know... It is a true next-gen game, so it should tell us what to expect for the next 3-4 years :P

I think it will use a maximum of 8-10 gigs...
 
Lol, if Godfall works nicely at ultra 4k on a 3080 I will happily just buy it without any second thoughts, since I'm going 1440p 144Hz.

The devs said 12GB would be needed for 4k ultra; many will probably measure the actual usage here, and then we will know... It is a true next-gen game, so it should tell us what to expect for the next 3-4 years :P

I think it will use a maximum of 8-10 gigs...
Well, it will only use whatever video memory your GPU has (actual usage); it cannot use more video memory than what is present. After that it will start to use system memory instead of video memory.
 
Well, it will only use whatever video memory your GPU has (actual usage); it cannot use more video memory than what is present. After that it will start to use system memory instead of video memory.
Yeah, I know that. I will wait for people to post benchmarks with MSI's new beta that shows actual VRAM usage. If it's lower than 10GB, then that means I am good with the 3080 for 3+ years at ultra at 1440p; if it is using system RAM and the full 10GB, then I might wait for a 3080 Ti or get a 6800 XT.
 
The Godfall developer said the game requires 12GB of VRAM for the 4K textures. You could have everything else maxed out, but not the textures, with <12GB of VRAM.
 
Not really. GDDR6X is not the only thing Nvidia are using. The data from the memory and cache is compressed to increase bandwidth; it's called compute data compression.


I can understand why they would do this for non-real-time applications like scientific simulations that branch exponentially... but would it be optimal for games where every clock counts?
 
Compression is pretty common across all platforms.

Tensor data compression exists only on Ampere. It's decompressed in the core before being processed, so it effectively increases the size of the L2 cache by up to 14 times and increases the effective size of the VRAM.
 
I can understand why they would do this for non-real-time applications like scientific simulations that branch exponentially... but would it be optimal for games where every clock counts?

It's compressed/decompressed in real time inside the GPU. The data remains compressed in memory and in the L2 cache. It runs on the Tensor Cores. GPUs can already compress textures (S3TC/DXT block compression https://www.gamasutra.com/blogs/RobertBasler/20180202/313739/DXT_Texture_Compression_in_2018.php and even lossy ASTC https://developer.nvidia.com/astc-texture-compression-for-game-assets). Delta colour compression too (Pascal https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/8). It's all about reducing bandwidth.

If you can reduce 252MB of textures to 37MB when you have to transfer them, it uses less bandwidth. https://www.gamasutra.com/blogs/RobertBasler/20180202/313739/DXT_Texture_Compression_in_2018.php Compression like S3TC is heavily tuned to be extremely hardware-efficient. S3TC is a lossy compression.
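As a rough illustration of the kind of ratio involved (standard BC1/DXT1 format maths, not tied to any particular game or GPU): an uncompressed RGBA8 texture stores 4 bytes per pixel, while BC1 packs each 4x4 block into 8 bytes, i.e. 0.5 bytes per pixel.

[CODE]
# Footprint of one 4096x4096 texture: uncompressed RGBA8 vs BC1/DXT1 block compression
# (standard format maths; real savings depend on the format chosen and on mip levels).
width, height = 4096, 4096

rgba8 = width * height * 4                 # 4 bytes per pixel
bc1 = (width // 4) * (height // 4) * 8     # 8 bytes per 4x4 block = 0.5 bytes per pixel

print(rgba8 / 2**20)   # 64.0 MiB
print(bc1 / 2**20)     # 8.0 MiB -> an 8:1 reduction in what has to move over the bus
[/CODE]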

Tensor data compression is lossless and real-time, as is delta colour compression at 8:1 (Pascal has 4:1 and 8:1 compression modes https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/8).
 