
Why are people holding out for an RTX 3080 with 20GB of VRAM?

Caporegime
Joined
18 Oct 2002
Posts
39,312
Location
Ireland
That just shows so many people are worried that 10GB will not be enough

For the most part I imagine it will be enough, though there are bound to be some outliers that push past it. The main problem is that games cache a lot of VRAM they might not actually be using, and I don't think there's currently any real way for a third-party app to monitor how much VRAM is genuinely in use for textures etc. versus just sitting as cache. Some games simply suck up all the VRAM they can because it's available.
 
Soldato
Joined
21 Jan 2016
Posts
2,915
8k gaming is totally pointless unless you happen to have one of those £100k+ 100" 8k screens, or you sit within about 2 feet of your TV! I don't think anyone needs to worry about 8k gaming, short of just ticking an epeen box!

Not currently, and arguably not in the future either until you look at VR... Abrash was predicting a need for 4k x 4k per eye to enable a really good fidelity experience with, iirc, a 130+ degree FoV.

My memory is a bit hazy, but I think he was predicting such headsets for around 2023. I'm ignoring the Pimax 8K here, perhaps unfairly, as it already exists.

So yeah, I agree it's not needed now, and arguably it may never be needed depending on how eye-tracked foveated rendering and machine-learning-based supersampling work out over the next few years. But theoretically at least there is a genuine use case for 8k resolutions in the relatively near future, just not on your telly.
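For anyone curious, the back-of-envelope pixel maths (a rough sketch only, using the 4k x 4k per eye figure quoted above and standard 8K UHD dimensions):

per_eye = 4096 * 4096            # ~16.8 million pixels per eye
both_eyes = 2 * per_eye          # ~33.6 million pixels across both eyes
uhd_8k = 7680 * 4320             # ~33.2 million pixels on an 8K UHD panel

print(f"VR at 4k x 4k per eye: {both_eyes / 1e6:.1f} MP")
print(f"8K UHD panel:          {uhd_8k / 1e6:.1f} MP")

Both land around 33 megapixels, which is why VR is about the only place an 8K-class render target plausibly turns up in the next few years.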
 
Associate
Joined
17 Sep 2020
Posts
171
The only thing that interests me about the 20GB model is the rumour that it will have better clocks than the current 3080s, which I assume is a guess based on Nvidia's 8nm manufacturing having matured by then.

I've got no chance of smashing the 10GB of VRAM on the card; the only time I could possibly fill it is if I ran some AI training on it. And like some others said, the direct I/O should massively help with swapping assets in and out, though I'm curious how it would work with a RAM cache in play.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Why wouldn't you hold out? People are being made to wait anyway. Getting a 3080 is down to luck at the moment, at least for a couple of months, and then perseverance to get one by December.

Most people don't give a **** and will wait it out until stock is readily available, card quality is known and other cards are released.

No rush is there? Not like this is at all important in the grand scheme of life.

Yeah, it's crazy, anyone might think we are on an enthusiast site :p
 

J.D

Soldato
Joined
26 Jul 2006
Posts
5,223
Location
Edinburgh
There might be people who ordered the 10GB 3080 who end up holding out for the 20GB card, but not through choice. :D

Imagine all these people waiting (including myself) on the 3080 to be shipped: it's mid October, RDNA2 hype ramps up, Nvidia starts leaking rumours about the 20GB card, and then the 10GB people still waiting have choices to make. Do we take a gamble, wait for the 20GB card launch to fail terribly and get the card in (2027....) possibly 2021, or go with AMD if it's an actual competitor to the 3080?

Strange year indeed.
 
Associate
Joined
11 Jun 2013
Posts
1,087
Location
Nottingham
For the most part I imagine it will be enough, though there are bound to be some outliers that push past it. The main problem is that games cache a lot of VRAM they might not actually be using, and I don't think there's currently any real way for a third-party app to monitor how much VRAM is genuinely in use for textures etc. versus just sitting as cache. Some games simply suck up all the VRAM they can because it's available.

Exactly ... VRAM allocated (what our monitoring utilities can tell us) is not the same as VRAM required (which is something only the game can tell, and few do).
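If anyone wants to see the limitation for themselves, here's a minimal sketch using Nvidia's NVML bindings for Python (the pynvml package); the figure it reports is allocation at the driver level, with no breakdown of working set versus cache:

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)    # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)     # byte counts from the driver

print(f"Total VRAM:     {mem.total / 2**30:.1f} GiB")
print(f"Allocated VRAM: {mem.used / 2**30:.1f} GiB")   # allocated, not actively touched
print(f"Free VRAM:      {mem.free / 2**30:.1f} GiB")

pynvml.nvmlShutdown()

It's essentially the same sort of figure the usual monitoring overlays show; nothing in it tells you how much of the allocation is live textures versus opportunistic cache.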
 
Associate
Joined
3 Jan 2010
Posts
1,379
If anything I'd be holding out for the 4080 in the hope it has more VRAM; it's not worth spending more for the sake of it, so I agree with you on that much. The 3080 seems like a bit of a misstep in some areas (big power draw, not great overclocking to my knowledge, VRAM concerns etc.). Overall I'd rather wait a year or so to get a card that is closer to running the majority of games at 4K at 120fps or more. The 3080 seems to handle modern 4K games but not excel at them. I'd like to see 120fps even on somewhat demanding games (maxed out, of course) at 4K, as well as more VRAM for future proofing.

For the asking price that's not unreasonable, but if the consoles can deliver 4K, then a PC with a graphics card more expensive than the entire console should at least be pushing high FPS, even with minor tweaks. I do believe that once the PS5 and new Xbox hit, diminishing returns will become very real, and the only thing that would get me to invest in a PC at those graphics card prices is if it can smash 4K 120fps reliably. At the moment that dream still seems a year or two away, and if the new consoles raise the bar for development, I can only imagine how many years it will take before 4K 120fps becomes reality.

Overall I think 12GB of VRAM would have been just enough to make the card future proof for the performance it has. You only need enough VRAM to prevent performance issues, and I don't think that happens too often over 10GB anyway, but the little extra would put you in a comfortable spot rather than on the knife's edge, chancing it down the line.
 
Last edited:
Soldato
Joined
21 Jan 2016
Posts
2,915
While I'd never be upset with more VRAM, given that DirectStorage / the velocity architecture (sorry Nvidia, I mean RTX IO of course) will be in place by then and well established on both consoles as a general technique, I'm not sure the argument that future games will struggle with 10GB is likely to age well. Said future games will also likely be designed to take advantage of rapid streaming of compressed data from high-speed NVMe SSDs directly into VRAM.

The rules of the game are ultimately changing.
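To make the idea concrete, here's a toy sketch of that streaming model (not the actual DirectStorage or RTX IO API; the asset names and the 8GB budget are made up for illustration). The point is that a fixed VRAM budget plus fast on-demand streaming replaces preloading everything:

from collections import OrderedDict

VRAM_BUDGET_MB = 8000                        # hypothetical budget for streamed assets

class StreamingCache:
    # Keeps only recently used assets resident, evicting the coldest first.
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()        # asset name -> size in MB

    def request(self, name, size_mb):
        if name in self.resident:                    # already in VRAM, nothing to stream
            self.resident.move_to_end(name)
            return "hit"
        while sum(self.resident.values()) + size_mb > self.budget:
            self.resident.popitem(last=False)        # evict the least recently used asset
        self.resident[name] = size_mb                # "stream in" from the NVMe drive
        return "streamed"

cache = StreamingCache(VRAM_BUDGET_MB)
for asset in ["rocks_4k", "trees_4k", "cliff_8k", "rocks_4k"]:
    print(asset, cache.request(asset, 2500))

Real engines obviously work at much finer granularity (individual mip levels and tiles), but it's roughly why resident VRAM requirements stop scaling one-to-one with texture quality once streaming like this is standard.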
 
Last edited:
Soldato
Joined
6 Feb 2019
Posts
17,589
People keep going on about VRAM, but isn't that what the new I/O feature is meant to solve? Seems similar to AMD's HBCC, so it might not be an issue.

Yes, but no games support it yet.

However, if in a year most new games support RTX I/O, then sure, 10GB of VRAM will no longer be a problem.

And I think that's what Nvidia's thinking was too - that games that use lots of VRAM will get RTX I/O support so they no longer require lots of VRAM.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
If anything I'd be holding out for the 4080 in the hope it has more VRAM; it's not worth spending more for the sake of it, so I agree with you on that much. The 3080 seems like a bit of a misstep in some areas (big power draw, not great overclocking to my knowledge, VRAM concerns etc.). Overall I'd rather wait a year or so to get a card that is closer to running the majority of games at 4K at 120fps or more. The 3080 seems to handle modern 4K games but not excel at them. I'd like to see 120fps even on somewhat demanding games (maxed out, of course) at 4K, as well as more VRAM for future proofing.

A year? You are going to be waiting at least 2 years.

Yes, but no games support it yet.

However, if in a year most new games support RTX I/O, then sure, 10GB of VRAM will no longer be a problem.

And I think that's what Nvidia's thinking was too - that games that use lots of VRAM will get RTX I/O support so they no longer require lots of VRAM.

That makes it sound more like a niche product. It's DirectX's I/O (DirectStorage) that comes first, which RTX IO then supports, I think anyway.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
There might be people who ordered the 10GB 3080 who end up holding out for the 20GB card, but not through choice. :D

That's true, but they could be waiting a while longer. Nvidia have already announced the GPUs being launched this year, and they will probably have another separate launch event / window for additional GPUs next year. One exception could be an RTX 3060; I don't know why this wasn't announced (unless it's also planned for next year).
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
If it were me and I was prepared to spend £650, I'd go for an RTX 3080 FE from the Nvidia website as soon as they have sufficient stock. Maybe get an overclocked model if its performance is 10% or more above the RTX 3080 FE and it only costs a little more.

I worked out a while ago that the RTX 3080 @ 2050 MHz has about the same theoretical performance as a stock RTX 3090, and that claim seems more likely to be true now :D
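For what it's worth, the rough maths behind that (using Nvidia's paper specs of 8704 CUDA cores for the 3080 and 10496 cores at roughly a 1695 MHz boost for the 3090; pure FP32 throughput, ignoring memory bandwidth and everything else):

def fp32_tflops(cores, clock_mhz):
    return cores * 2 * clock_mhz * 1e6 / 1e12    # 2 FP32 ops per core per clock (FMA)

print(f"RTX 3080 @ 2050 MHz: {fp32_tflops(8704, 2050):.1f} TFLOPS")    # ~35.7
print(f"RTX 3090 @ 1695 MHz: {fp32_tflops(10496, 1695):.1f} TFLOPS")   # ~35.6

Real game performance won't scale linearly with clocks like that, but on paper the two do land in the same ballpark.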
 
Last edited:
Soldato
Joined
6 Feb 2019
Posts
17,589
If it were me and I was prepared to spend £650, I'd go for an RTX 3080 FE from the Nvidia website as soon as they have sufficient stock. Maybe get an overclocked model if its performance is 10% or more above the RTX 3080 FE and it only costs a little more.

I worked out a while ago that the RTX 3080 @ 2050 MHz has about the same theoretical performance as a stock RTX 3090, and that claim seems more likely to be true now :D

Some 3080 cards will be hitting that or higher out of the box, like the Asus Strix OC.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
Yeah, it's gonna be awesome for anyone who can afford one.

EDIT - You mean with some manual overclocking, right?

It will be interesting to see reviews comparing it to the RTX 3090 :D
 
Last edited:
Associate
Joined
22 Jul 2012
Posts
29
Not sure there is scope in the die to get a Ti version out of it; the 3080 to 3090 gap is anywhere between 10-20% performance-wise. My guess is they will release a Super variant with 20GB in Q1/Q2 2021, when the 2GB modules are available, with a little performance uplift from AIBs being able to refine the PCB and BIOS somewhat, since by then they will know the die better, Samsung yields will likely have improved, and binning will be more refined.
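The 20GB figure falls straight out of the bus width, for what it's worth (rough numbers from the public specs, not insider info):

bus_width_bits = 320
chip_width_bits = 32                         # each GDDR6X chip has a 32-bit interface
chips = bus_width_bits // chip_width_bits    # = 10 chips on a 3080

for density_gb in (1, 2):                    # 8Gb modules now, 16Gb modules later
    print(f"{chips} chips x {density_gb}GB = {chips * density_gb}GB of VRAM")

So until the denser modules ship, anything in between 10GB and 20GB would need a different bus width, which is partly why the rumour jumps straight from 10GB to 20GB.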
 
Last edited:
Soldato
Joined
6 Feb 2019
Posts
17,589
Not sure there is scope in the die to get a Ti version out of it; the 3080 to 3090 gap is anywhere between 10-20% performance-wise. My guess is they will release a Super variant with 20GB in Q1/Q2 2021, when the 2GB modules are available, with a little performance uplift from AIBs being able to refine the PCB and BIOS somewhat, since by then they will know the die better, Samsung yields will likely have improved, and binning will be more refined.


What if they take the 3080 FE, bump it to 20GB of VRAM, stick the 3090's cooler on it, overclock it and call it a 3080 Ti?
 