10GB VRAM enough for the 3080? Discuss..

Seems so for the moment anyway.

If you buy the flagship just "for the moment" - then sure, 10GB is fine for 4K. Fast forward into next year, when next gen games release, I wouldn't be so sure. Consoles are getting a double memory upgrade (8GB of total memory on current consoles to 16GB on next gen - note this is shared memory, but they're still getting double!), so it's only logical that VRAM requirements will drastically increase, if not double, for the incoming next gen games.

It's also a matter of principle - who wants to spend £800 on a GPU that's barely enough for 4K in current games? Shouldn't the flagship have enough for 2-4 years, as has been the norm for previous generations? It's got less VRAM than the ancient 1080ti!

The 3080ti will be a fantastic card, I hope Nvidia use their industry power/finances to rush it through asap :)
 
1783 post thread TLDR:

Is 10GB GDDR6X enough for 4K ultra settings now? Yes.

How long until 10GB GDDR6X is not enough for 4K Ultra settings? Nobody knows. Some people say soon, others say it’s a while away.

What happens when 10GB GDDR6X isn’t enough for 4K Ultra settings? You will need to turn some settings down in that game.
 
1783 post thread TLDR:

Is 10GB GDDR6X enough for 4K ultra settings now? Yes.

How long until 10GB GDDR6X is not enough for 4K Ultra settings? Nobody knows. Some people say soon, others say it’s a while away.

What happens when 10GB GDDR6X isn’t enough for 4K Ultra settings? You will need to turn some settings down in that game.

TLDR of #1784 - Baaah. People on youtube tell me 3080 is top dog for many years. I believe them, and I'm prepared to turn down settings on next gen games so my brand new £650-870 flagship GPU can cope. Baaah
 
If you buy the flagship just "for the moment" - then sure, 10GB is fine for 4K. Fast forward into next year, when next gen games release, I wouldn't be so sure. Consoles are getting a double memory upgrade (8GB of total memory on current consoles to 16GB on next gen - note this is shared memory, but they're still getting double!), so it's only logical that VRAM requirements will drastically increase, if not double, for the incoming next gen games.

It's also a matter of principle - who wants to spend £800 on a GPU that's barely enough for 4K in current games? Shouldn't the flagship have enough for 2-4 years, as has been the norm for previous generations? It's got less VRAM than the ancient 1080ti!

The 3080ti will be a fantastic card, I hope Nvidia use their industry power/finances to rush it through asap :)
More rubbish from you as usual :)

Nvidia use industry power to rush? Lol. Have you seen the stock shortages? It's going to take them over 2 months, maybe even 3, to deliver the card I purchased within 90 minutes of it launching.

They won't give you the extra 10GB for free. You will be made to pay for that. I am guessing a 3080 Ti will be around £999 on average.

I paid £649.99 for my 3080, so what would roughly a third more money buy me? Around 10% extra performance and 10GB extra VRAM? Lol. By the time I would need those I will be on Hopper. And you call me a sheep? You just cannot see that everyone has different needs, hence you attack anyone who does not agree with your closed-minded thoughts ;):D

And all this coming from a guy who got a Radeon VII for the 16GB of VRAM that never came in handy once and now gets crushed by the 8GB 3070 in every game! Haha.
 
The answer is we don't know.

Future games designed for a 4K baseline will either require larger amounts of VRAM or be heavily dependent on the ability to stream data in very fast.
Time will tell which approach will be better in the short/long term, but do not expect game developers to be "efficient" with their use of VRAM; for the most part, when it comes to the PC version, everything just gets dialled up with no optimisation.
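
A rough back-of-envelope sketch in Python of what "stream in very fast" actually buys you per frame - the drive speed, overhead factor and resident working set are my own illustrative assumptions, not measurements from any game:

# Rough budget for how much texture data can be streamed in per frame
# versus what has to stay resident in VRAM. All numbers are assumptions.
nvme_read_gbs = 5.0          # assumed raw sequential read for a PCIe 4.0 class drive, GB/s
effective_factor = 0.5       # assume I/O and decompression overhead halve usable throughput
frame_time_s = 1.0 / 60.0    # targeting 60 fps

streamable_mb = nvme_read_gbs * 1000 * effective_factor * frame_time_s
print(f"Streamable per 60 fps frame: ~{streamable_mb:.0f} MB")   # ~42 MB

resident_working_set_gb = 6.0  # assumed textures/buffers a 4K scene keeps in VRAM
print(f"Assumed resident working set: {resident_working_set_gb:.0f} GB")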
 
The answer is we don't know.

Future games designed for a 4K baseline will either require larger amounts of VRAM or be heavily dependent on the ability to stream data in very fast.
Time will tell which approach will be better in the short/long term, but do not expect game developers to be "efficient" with their use of VRAM; for the most part, when it comes to the PC version, everything just gets dialled up with no optimisation.

I haven't seen any evidence that VRAM capacity is suddenly no longer important. Sure, the 3080 has higher memory bandwidth than the 2080 (the 3080 has 760.3 GB/s), though there have been cards with more bandwidth in the past, such as the Radeon VII with 1,024 GB/s. You still need the physical capacity to fit the required amount of data into VRAM.

The NVMe SSD streaming thing is more about game load times, not reducing VRAM requirements. We've seen how vitally important the BOM (bill of materials) is for consoles - do you think they'd have put 16GB in them if the NVMe SSD caching feature reduced physical memory requirements? Absolutely not; they've got 16GB of total shared memory because it's needed for the games they want to design.

Game developers are lazy. They'll tune games for the PS5/Xbox Series X and have plenty of VRAM to play with, considering their 16GB of TOTAL shared memory (double that of the previous gen).
 
I view VRAM as a kind of "insurance". More VRAM is worth more money to me, but probably not worth as much as Nvidia want to charge.

If they want $1k for 20GB of VRAM, I'll take my chances with 10 or just go AMD and get 16.

I realize that I am risking utter catastrophe. The day may come when, in some random new game, I must turn down a setting and it just ruins my entire experience. And at that moment I will wish that I had spent an extra $300 so that I could use that setting in that game.

I'm scared just thinking about it. Someone hold me.
 
I view VRAM as a kind of "insurance". More VRAM is worth more money to me, but probably not worth as much as Nvidia want to charge.

If they want $1k for 20GB of VRAM, I'll take my chances with 10 or just go AMD and get 16.

I realize that I am risking utter catastrophe. The day may come when, in some random new game, I must turn down a setting and it just ruins my entire experience. And at that moment I will wish that I had spent an extra $300 so that I could use that setting in that game.

I'm scared just thinking about it. Someone hold me.

Sarcasm? :p
 
3080 Ti 20GB pretty much confirmed at this point! There was a tweet linked on Reddit that's since been removed. Hopefully on TSMC 7nm.

Going to be fun quoting a few select people here who were adamant that the 10GB 3080 is enough for 4K for years, and who claimed Nvidia wouldn't release a 20GB 3080/3080 Ti variant... Wake up, sheep!

What? No - the 3080 20GB and 3070 16GB were cancelled a long time ago.

The 3080 Ti 12GB G6X and 3070 Ti 10GB G6X are the newly leaked refreshes.

The 3060 Ti and 3070 Ti will likely be $100 more than their base models.

Now I'm hoping the 3080 Ti releases this year (I don't know if it will), and if it's under $849 I'll buy that; otherwise I'll stick with the vanilla 3080 10GB for 1440p.
 
If you buy the flagship just "for the moment" - then sure, 10GB is fine for 4K. Fast forward into next year, when next gen games release, I wouldn't be so sure. Consoles are getting a double memory upgrade (8GB of total memory on current consoles to 16GB on next gen - note this is shared memory, but they're still getting double!), so it's only logical that VRAM requirements will drastically increase, if not double, for the incoming next gen games.

It's also a matter of principle - who wants to spend £800 on a GPU that's barely enough for 4K in current games? Shouldn't the flagship have enough for 2-4 years, as has been the norm for previous generations? It's got less VRAM than the ancient 1080ti!

The 3080ti will be a fantastic card, I hope Nvidia use their industry power/finances to rush it through asap :)

You have to take into account that the more memory chips you have on a GPU, the more bandwidth you get (there's a quick sketch of the arithmetic after the lists below).

Example - GeForce 20 series (GDDR6):
2060: 192-bit bus, 6GB VRAM, 336GB/s bandwidth
2070: 256-bit bus, 8GB VRAM, 448GB/s bandwidth
2080: 256-bit bus, 8GB VRAM, 448GB/s bandwidth
2080 Ti: 352-bit bus, 11GB VRAM, 616GB/s bandwidth
Titan RTX: 384-bit bus, 24GB VRAM, 672GB/s bandwidth

GeForce 30 series (GDDR6X, except the 3070 which uses GDDR6):
3070: 256-bit bus, 8GB VRAM, 448GB/s bandwidth
3080: 320-bit bus, 10GB VRAM, 760GB/s bandwidth
3090: 384-bit bus, 24GB VRAM, 936GB/s bandwidth

The RDNA2 series (GDDR6):
6900 XT: 256-bit bus, 16GB VRAM, 512GB/s bandwidth
6800 XT: 256-bit bus, 16GB VRAM, 512GB/s bandwidth
6800: 256-bit bus, 16GB VRAM, 512GB/s bandwidth
6700 XT: 192-bit bus, 12GB VRAM, 384GB/s bandwidth
6700: 192-bit bus, 12GB VRAM, 384GB/s bandwidth
6500 XT: 128-bit bus, 8GB VRAM, 256GB/s bandwidth
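
For anyone wondering where those bandwidth numbers come from, here's a minimal Python sketch: bandwidth is just bus width times per-pin data rate (the 14/16/19/19.5 Gbps figures are the published memory speeds for those cards).

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # bandwidth (GB/s) = (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps
    return bus_width_bits / 8 * data_rate_gbps

cards = [
    ("RTX 2080 (256-bit GDDR6 @ 14 Gbps)", 256, 14.0),
    ("RTX 3080 (320-bit GDDR6X @ 19 Gbps)", 320, 19.0),
    ("RTX 3090 (384-bit GDDR6X @ 19.5 Gbps)", 384, 19.5),
    ("RX 6800 XT (256-bit GDDR6 @ 16 Gbps)", 256, 16.0),
]
for name, bus, rate in cards:
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")   # 448, 760, 936, 512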

The more expensive 3080s can almost reach 3090 performance. [1] The 6900 XT has to have its stock power limit unlocked by Rage Mode to reach the RTX 3090 in AMD's slides. There is a rumour of much worse performance with RT - all of the AMD cards could be slower in RT. [2] If true, the 6800 XT cannot match the 3080 once a game uses RT; a stock 6900 XT is more likely the match for it in RT games.

Take Time Spy Extreme 4K and the rumoured estimated score for the 6800 XT (all with an AMD Ryzen 9 3950X CPU):
RX 6800 XT (estimated): 8232
RTX 3080 FE (stock): 8581
RTX 3080 Asus Strix OC (OC mode, 1935MHz): 9325
MSI RTX 3090 Gaming X Trio: 9819
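
Quick percentage comparison of those scores (just arithmetic on the numbers quoted above):

scores = {
    "RX 6800 XT (estimated)": 8232,
    "RTX 3080 FE (stock)": 8581,
    "RTX 3080 Asus Strix (OC mode)": 9325,
    "RTX 3090 MSI Gaming X Trio": 9819,
}
baseline = scores["RX 6800 XT (estimated)"]
for name, score in scores.items():
    # difference relative to the estimated 6800 XT score
    print(f"{name}: {score} ({(score / baseline - 1) * 100:+.1f}% vs the 6800 XT estimate)")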

Rage Mode increases the power limit of the card, which increases performance by 1-2% [4], so you would need to set the power slider to maximum on Nvidia cards to get the same. AMD's SAM also increases performance by approx. 11% [11]; if you use an Intel system or upgraded an older system you could lose that 11% of performance. [11]
SAM basically gives the CPU direct access to the full GPU memory (instead of a 256MB window), but only when configured in a system with one of the new Ryzen 5000 series CPUs. [11]
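
Sanity-checking how those uplifts combine - a tiny sketch assuming they stack multiplicatively, which is my assumption rather than anything AMD have stated:

rage_mode_uplift = 0.02   # ~1-2% from the raised power limit [4]
sam_uplift = 0.11         # ~11% from Smart Access Memory [11]
combined = (1 + rage_mode_uplift) * (1 + sam_uplift) - 1
print(f"Combined uplift: ~{combined * 100:.0f}%")   # ~13%, in line with AMD's "up to 13%" claim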

DLSS's performance uplift is much bigger, and every system can use it regardless of CPU. [12]

6900 XT vs 6800 XT at 1440p [6] & 4K [10], vs 3080 FE + Intel 10900K [7 & 8], vs 3080 + Intel 9900K [9]. GeForce RTX 3080 Founders Edition throughout.
Borderlands 3 Badass Settings
6900xt 73fps
6800 xt 1440p - 111fps
6800xt 4k - 63fps
3080 4k - 77.6 fps <--- faster than 6900xt @ 4K with a 10900k [7]
3080 1440p - 102.2fps [7]
Shadow of the Tomb Raider
6900xt 96fps
6800xt 1440p - 155fps
6800xt 4k - 88fps
3080 4k - 92fps with RT +DLSS disabled<---- close to the 6900xt [7]
3080 1440p - 170.6fps with RT +DLSS disabled<----- faster than the 6800xt [7]
3080 1440p - 150fps with RT +DLSS disabled [8]
3080 4k - 85fps with RT +DLSS disabled [8]
3080 1440p - 150fps with RT + DLSS pure hair on HBAO+ [9] 9900k stock
3080 4k - 103fps with RT + DLSS pure hair on HBAO+ [9] 9900k stock
Gears of war 5
6900xt 92fps
6800 xt 1440p - 132 fps
6800xt 4k - 78fps
3080 4k - 84.5fps <---- slower than 6900xt [7]
3080 1440p - 142.4fps <---- faster than 6800xt [7]
3080 4k - 75fps [8]
3080 1440p - 129fps [8]
3080 4k - 76fps [9] 9900k stock
3080 1440p - 111fps [9] 9900k stock
Doom Eternal
6900xt - 150 fps
6800 xt 1440p - 216fps
6800xt 4k - 138fps
3080 4k -152fps <--- faster than 6900xt [7]
3080 1440p - 252 fps <---- faster than 6800xt [7]
3080 4k -189 fps <--- faster than 6900xt [8] RS off
3080 1440p - 314 fps <---- faster than 6800xt [8] RS off
Battlefield V
6900xt - 122fps
6800 xt 1440p - 198fps
6800xt 4k - 113fps
3080 4k -115.3fps DX11 [7]
3080 1440p - 174.6fps DX11 [7]
3080 4k -110fps DX12 [9]
3080 1440p - 171fps DX12 [9] 9900k stock
3080 4k - 55fps DX12 RT on [9] 9900k stock
3080 1440p - 92fps DX12 RT on [9] 9900k stock
3080 4k - 73fps DX12 RT on + DLSS 1.0 [9] 9900k stock
Resident Evil 3
6900xt 129 fps
6800xt 1440p - 225 fps
6800xt 4k - 117fps
3080 4k - 105fps - AMD slides put nvidia 3080 ahead. [10]
3080 1440p - 199fps - AMD slides put nvidia 3080 ahead. [8]
3080 4k - 104fps - [9] 9900k stock
3080 1440p - 194fps - [9] 9900k stock

Looks like the RTX 3080 could be faster than the 6800 XT overall. I don't believe any of AMD's benchmarks use RT in any way. If you don't have an AMD 5000 series CPU you could lose up to 11% of the performance of a 6800 XT or 6900 XT [11] - this is why AMD's slides for the 6900 XT use the SAM feature. The 6800 XT could also be using SAM; we don't know. [13] AMD claims that the 6800 XT is faster than the 3080 with SAM and Rage Mode enabled. Enjoy buying a 5000 series CPU to be able to find out. Once a game supports DLSS 2.1 the 3080 should be far faster. Note that AMD's FidelityFX Contrast Aware Sharpening (CAS) is fast but can't hide the lower resolution like DLSS can; it's just an upscaler with sharpening.

CAS is worthwhile, though it comes at the cost of increased aliasing and a drop in graphical clarity in many scenes.

DLSS 2.0 quality is closer to the native image; DLSS 2.1 uses 16K ground-truth images for training.
Imagine a licence plate on a car far away, the numbers and letters barely readable. If you simply upscale it, it will blur and still be unreadable - worse, it's now blurred. DLSS has seen thousands of licence plates in the high-res footage it ingested during the training phase and knows what a licence plate should look like, so it adds in the details it learned. If the digit 5 is indistinct, DLSS has seen that 5 before and adds in the missing pixels to make the 5 sharp and full. So we end up with a full-res image with more detail than the rendered image. [14]

DLSS 2.0 still delivers the best picture quality - in most cases with even more detail than native resolution - and the combination with a temporal filter eliminates flicker. FidelityFX CAS remains an alternative for those users who cannot activate DLSS due to the lack of an Nvidia graphics card but still want to play in high definition. [15]

I don't think Nvidia need to do anything to compete with AMD atm; they could still be well ahead. With DLSS 2.1 you don't need that much VRAM for 4K - render at 1440p and then just let DLSS 2.1 work its magic to create the 4K image.
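
To put numbers on the "render at 1440p" point, a minimal sketch of what DLSS Quality mode renders internally when targeting 4K (the ~67% per-axis scale is Nvidia's published figure for Quality mode; other modes go lower):

target_w, target_h = 3840, 2160        # 4K output
quality_scale = 2 / 3                  # DLSS Quality mode renders at ~67% per axis
render_w = round(target_w * quality_scale)
render_h = round(target_h * quality_scale)
pixel_ratio = (render_w * render_h) / (target_w * target_h)
print(f"Internal render resolution: {render_w}x{render_h}")              # 2560x1440
print(f"Pixels actually shaded: {pixel_ratio * 100:.0f}% of native 4K")  # ~44%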

[1] https://www.overclock3d.net/reviews/gpu_displays/asus_rtx_3080_oc_strix_review/20
[2] https://www.thefpsreview.com/2020/1...short-of-nvidia-rtx-according-to-early-tests/
The Radeon RX 6800 XT is also 29 percent slower than the GeForce RTX 3080 in that regard. “Measured by AMD engineering labs 8/17/2020 on an AMD RDNA 2 based graphics card, using the Procedural Geometry sample application from Microsoft’s DXR SDK, the AMD RDNA 2 based graphics card gets up to 13.8x speedup (471 FPS) using HW based raytracing vs using the Software DXR fallback layer (34 FPS) at the same clocks. Performance may vary.” NVIDIA GeForce RTX 3080 630 FPS
[3] https://www.overclock3d.net/reviews/gpu_displays/asus_rtx_3080_oc_strix_review/20
[4] https://youtu.be/haAPtu06eYI?t=745
[5] https://hexus.net/media/uploaded/2020/10/333f1646-ab1e-4458-8663-5e5829cb07c1.PNG
[6] https://images.idgesg.net/images/article/2020/10/radeon-rx-6800-xt-1440-100863938-orig.jpg
[7] https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-amd-3900-xt-vs-intel-10900k/13.html
[8] https://www.techspot.com/article/2103-amd-vs-intel-rtx-3080/
[9] https://www.guru3d.com/articles_pages/geforce_rtx_3080_founder_review,11.html 9900k stock NVIDIA GeForce RTX 3080 (founder)
[10] https://images.idgesg.net/images/article/2020/10/radeon-rx-6800-xt-4k-100863937-orig.jpg
[11]
AMD claims that SAM combined with Rage Mode boosts frame rates by up to 13%, which means performance on Intel systems or upgraded systems won't be as good.

SAM basically gives the card direct access to main system memory, but only when configured in a system with one of the new Ryzen 5000 series CPUs.
https://www.cnet.com/news/amd-radeon-rx-6800-xt-and-6900-gpus-target-4k-gaming-start-at-579/ https://www.amd.com/en/technologies/smart-access-memory
Up to 11% extra performance.
[12] https://www.techspot.com/article/1992-nvidia-dlss-2020/ https://www.game-debate.com/news/29159/death-stranding-pc-full-dlss-2-0-performance-analysis-and-graphics-comparison https://www.youtube.com/watch?v=uMbIov1XQa8 https://www.nme.com/features/nvidias-dlss-2-0-tech-could-make-your-300-graphics-card-perform-like-a-700-one-2708925
Quality mode of DLSS 2.1 rendering at 1440p is so close to native 4K that you can't see the difference. Upscaling can't match that.
[13]
In an apples-to-oranges comparison we don't like, AMD further says Radeon RX 6800 XT is faster than GeForce RTX 3080 with Rage Mode and SAM turned on. Well, Nvidia cards can be overclocked too, so this comparison is not entirely fair.
https://hexus.net/tech/news/graphic...deon-rx-6000-series-promises-rtx-performance/ https://hexus.net/media/uploaded/2020/10/6c66ed53-e318-4ec0-b077-fbcc78f14cee.PNG
If you happen to have a Ryzen 5000 Series CPU paired alongside any Radeon RX 6000 Series GPU, a feature known as Smart Access Memory (SAM) enables the CPU to access the full complement of GPU memory, instead of just 256MB. It's said to offer an extra fps or two in complex scenes where the CPU historically bogs the GPU down.
[14] https://www.roadtovr.com/nvidias-latest-dlss-2-1-supersampling-now-supports-vr/
[15] https://translate.google.co.uk/translate?hl=en-GB&sl=auto&tl=en&u=https://www.hardwareluxx.ru/index.php/artikel/software/spiele/49974-test-death-stranding-na-raznykh-videokartakh-s-dlss-2-0-i-fidelityfx-cas.html?start=2

https://youtu.be/oWV5tECfTBQ
 
If you buy the flagship just "for the moment" - then sure, 10GB is fine for 4K. Fast forward into next year, when next gen games release, I wouldn't be so sure. Consoles are getting a double memory upgrade (8GB of total memory on current consoles to 16GB on next gen - note this is shared memory, but they're still getting double!), so it's only logical that VRAM requirements will drastically increase, if not double, for the incoming next gen games.

It's also a matter of principle - who wants to spend £800 on a GPU that's barely enough for 4K in current games? Shouldn't the flagship have enough for 2-4 years, as has been the norm for previous generations? It's got less VRAM than the ancient 1080ti!

The 3080ti will be a fantastic card, I hope Nvidia use their industry power/finances to rush it through asap :)

So 2 extra GB = a fantastic card... OK.
 
This thread has really gone round in circles. At the end of the day it's a judgement call that you have to make based on your own circumstances and future upgrade plans. Some people will say it'll be fine for a couple of years, and by that time you'll have upgraded to the next GPU release. Maybe less RAM is a valid trade-off for the features you want from the green team - many cite DLSS as a strong reason - and that may, in the interim, give you more performance/frames in the games you're interested in than having more RAM that isn't being utilised. Maybe some people are taking resale value into consideration, particularly if AMD GPUs have 16GB; it's inevitable that Nvidia will now release Ti/Super versions of their cards with more RAM, so will people want to buy a 10GB GPU in a few years' time? Maybe at the right price.

I always think there's an art to buying a PC component, there isn't one answer for everyone.
 
This thread has really gone round in circles, end of the day it's a judgement call

I always think there's an art to buying a PC component, there isn't one answer for everyone.


Quoted for truth. The facts keep getting ignored and the same arguments from the first page are repeated on page 90. I'm ready to start talking about a competitor in this thread so it goes where it belongs - locked or deleted. Hell, it would be worth the forum ban to see the conversation end.
 
1783 post thread TLDR:

Is 10GB GDDR6X enough for 4K ultra settings now? Yes.

How long until 10GB GDDR6X is not enough for 4K Ultra settings? Nobody knows. Some people say soon, others say it’s a while away.

What happens when 10GB GDDR6X isn’t enough for 4K Ultra settings? You will need to turn some settings down in that game.

It depends how ultra is defined. When ultra means 8K textures, it won't be. But there won't be any point in using 8K textures if you're playing on a 4K display...
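
A rough sketch of the texture memory maths behind that point - assuming BC7 block compression (1 byte per texel) and a full mip chain (~1.33x), which are illustrative assumptions rather than any particular game's numbers:

def texture_mib(side_px, bytes_per_texel=1.0, mip_factor=4 / 3):
    # size of one square texture with its mip chain, in MiB
    return side_px * side_px * bytes_per_texel * mip_factor / (1024 ** 2)

for side in (4096, 8192):
    print(f"{side}x{side} texture: ~{texture_mib(side):.0f} MiB")   # ~21 MiB and ~85 MiB
# At ~85 MiB each, a bit over a hundred unique 8K textures would fill a 10GB card on their own.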
 
TLDR of #1784 - Baaah. People on youtube tell me 3080 is top dog for many years. I believe them, and I'm prepared to turn down settings on next gen games so my brand new £650-870 flagship GPU can cope. Baaah

Really mature post, nice. Nobody thinks the 3080 is 'top dog for many years'. At some point in the future 10GB of VRAM will be insufficient; nobody knows when that will be.
 
They will upgrade
3080 Ti 20GB pretty much confirmed at this point! There was a tweet linked on Reddit that's since been removed. Hopefully on TSMC 7nm.

Going to be fun quoting a few select people here who were adamant that the 10GB 3080 is enough for 4K for years, and who claimed Nvidia wouldn't release a 20GB 3080/3080 Ti variant... Wake up, sheep!


Nvidia will release the 3080 Ti when all the 3080 pre-orders are done and the return period is dead.
 
The 3080 Ti 20GB is not "pretty much confirmed" - it's currently at the level of a rumour. 10GB of VRAM is enough. There is some evidence the 6800 XT is 29% slower than the 3080 in RT, if the source is accurate, and that's a problem that will affect the 6900 XT as well, so the Nvidia cards are faster in all RT games. There is no version of DLSS on AMD cards, and DLSS 2.1 is better than AMD's FidelityFX Contrast Aware Sharpening by a big margin. When you upscale RT games, you get the very problems DLSS was created to fix.
 