
ASUS confirms ROG STRIX GeForce RTX 3080 Ti graphics card with 20GB memory

Even Steve from GN says the 3080 VRAM argument is a total non-issue. 10GB will be plenty until the GPU needs replacing anyway.
His opinion (expert as it is, based on his widespread experience) back in September 2020 was to 'stop worrying about it'. I don't believe he is correct, as objectively speaking, some current games are already using 8+GB VRAM at 4k. It is not and can never be a 'total non-issue' to say that 10GB VRAM may be breached within the next 2 years. We can do our best to make an educated guess to predict VRAM usage over the next 2 years, but again there is, objectively speaking, a realistic possibility that some games will come along that will push this boundary at Ultra/Extreme settings, especially with mods like 4k textures (I also take modding potential into account as I love modding games so it always factors into my forecasting). Can you always scale those settings down to reduce VRAM usage? Yes, of course, but it doesn't change the fact that you would otherwise have hit a VRAM limitation by simply doing what you want to do.
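To put a rough number on why 4K texture mods push VRAM so hard, here is a back-of-the-envelope sketch. It's purely illustrative, assuming uncompressed RGBA8 textures with a full mip chain; the function name and figures are my own, not from any engine:

```python
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Rough VRAM footprint of one uncompressed texture.

    A full mip chain adds roughly 1/3 on top of the base level
    (1 + 1/4 + 1/16 + ... ~= 4/3).
    """
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# One uncompressed 4K (4096x4096) RGBA8 texture with mips:
mb = texture_vram_bytes(4096, 4096) / (1024 ** 2)
print(f"{mb:.1f} MiB")  # roughly 85 MiB before compression
```

Real games use block compression (BCn formats are closer to 0.5-1 byte per pixel), so actual per-texture figures are several times lower, but the point stands: a mod pack with hundreds of 4K textures adds up fast.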

In the end, no-one can say for sure, definitively, that 10GB VRAM will not be a limitation within the next 2 years. At this point, it's a known potential risk. AMD have removed this concern from the equation for this generation by giving 16GB VRAM and they are to be applauded for that. Anyway, I will leave it there as the discussion has been done to absolute death by this point in numerous threads, so all that remains is to wait and see and gather more concrete information as 2021/2022 progresses.
 
In the end, no-one can say for sure, definitively, that 10GB VRAM will not be a limitation within the next 2 years

When 10GB Vram is a limitation it'll be a moot point for the 3080 because it'll have run out of grunt to operate at the ultra settings that require those large amounts of Vram, so the end user will be turning down the settings anyway... Or, upgrading to the 4000 series.
 
When 10GB Vram is a limitation it'll be a moot point for the 3080 because it'll have run out of grunt to operate at the ultra settings that require those large amounts of Vram, so the end user will be turning down the settings anyway... Or, upgrading to the 4000 series.
Let's agree to disagree with the opinion you are trying to present as fact and just wait and see what happens. There's nothing more for me to discuss here and I will revisit the topic in the coming 12-24 months when we have actual evidence to support it either way. :)
 
There is a huge VRAM thread which I don't wish to replicate, but we never hear it the other way around: when will the 3080 run out of GPU horsepower? Surely that's far more logical and likely to happen than running out of VRAM.

Perhaps it already has happened, as a 3080 can't hit 4K/60 in many games. VRAM limits haven't been hit, though...
 
Well, as serious as one should be about VRAM running out!
I meant: are you serious with what you posted about "never seeing arguments about horsepower running out before VRAM"? That is exactly what Dirk posted, and exactly what has been posted for months in the VRAM discussion threads as one of the core arguments of people who believe the cards do have enough VRAM. So for you to say you haven't seen it posted is just really bizarre.
 
His opinion (expert as it is, based on his widespread experience) back in September 2020 was to 'stop worrying about it'. I don't believe he is correct, as objectively speaking, some current games are already using 8+GB VRAM at 4k. It is not and can never be a 'total non-issue' to say that 10GB VRAM may be breached within the next 2 years. We can do our best to make an educated guess to predict VRAM usage over the next 2 years, but again there is, objectively speaking, a realistic possibility that some games will come along that will push this boundary at Ultra/Extreme settings, especially with mods like 4k textures (I also take modding potential into account as I love modding games so it always factors into my forecasting). Can you always scale those settings down to reduce VRAM usage? Yes, of course, but it doesn't change the fact that you would otherwise have hit a VRAM limitation by simply doing what you want to do.

In the end, no-one can say for sure, definitively, that 10GB VRAM will not be a limitation within the next 2 years. At this point, it's a known potential risk. AMD have removed this concern from the equation for this generation by giving 16GB VRAM and they are to be applauded for that. Anyway, I will leave it there as the discussion has been done to absolute death by this point in numerous threads, so all that remains is to wait and see and gather more concrete information as 2021/2022 progresses.

Throwing this one in there too: when people start using Resizable BAR, this will only make it a bigger headache, no? The whole point is to work with the system by utilising spare VRAM, but the 3080 has such a lack of it that I can see a lot of shuffling, or better still, don't bother enabling it for some games.
 
Throwing this one in there too: when people start using Resizable BAR, this will only make it a bigger headache, no? The whole point is to work with the system by utilising spare VRAM, but the 3080 has such a lack of it that I can see a lot of shuffling, or better still, don't bother enabling it for some games.
To be honest I know very little about the inner workings of BAR as I haven't got around to doing any reading up on it yet. However if it does use VRAM in the way that you say then it could indeed have a further impact if it is simultaneously using VRAM that a game would otherwise want to use.
 
To be honest I know very little about the inner workings of BAR as I haven't got around to doing any reading up on it yet. However if it does use VRAM in the way that you say then it could indeed have a further impact if it is simultaneously using VRAM that a game would otherwise want to use.

Just enhances your earlier point: any card that is the flagship or best value buy can suddenly be exposed as a weak link due to poor engineering selection.

Remember, when we talked about the lack of VRAM being the topic there was no SAM/Resizable BAR... now we have progressed and Nvidia are finally releasing this, you have to wonder even more: is this going to be the Achilles heel that exposes the lack of VRAM?

So far there aren't many games that show off its boost, but let's go with up to 10% for now. It is safe to say that at 4K you're not going to get any extra performance if you have a 3080.

For those slating the 3090 users though..
 
VR is another VRAM-heavy application, as resolutions are increasing, and even on lower-spec headsets supersampling is used to increase the visual quality.

Playing modded Skyrim VR without supersampling, I'm approaching 10GB VRAM usage on my 11GB 1080 Ti FTW3.

If it wasn't for VR I would be sticking with my 1080ti as it's perfectly fine for gaming at 1440p, and I don't really have an interest in RTX ray tracing until it becomes more widely supported.
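The supersampling point is easy to quantify: because supersampling scales each axis, memory for the render targets grows with the square of the factor. A toy sketch, with a hypothetical 2160x2160-per-eye headset and an assumed 8 bytes per pixel (HDR colour plus depth; real engines vary):

```python
def ss_framebuffer_mib(width, height, ss=1.0, eyes=2, bytes_per_pixel=8):
    """Rough per-frame render-target footprint for a VR headset.

    Supersampling scales each axis, so pixel count (and memory)
    grows with ss squared.
    """
    pixels = int(width * ss) * int(height * ss) * eyes
    return pixels * bytes_per_pixel / (1024 ** 2)

# Hypothetical 2160x2160-per-eye headset:
print(f"{ss_framebuffer_mib(2160, 2160):.0f} MiB at 1.0x")
print(f"{ss_framebuffer_mib(2160, 2160, ss=1.5):.0f} MiB at 1.5x")
```

A 1.5x supersample is a 2.25x jump in render-target memory, and that's before the engine's intermediate buffers, which scale the same way.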
 
Just enhances your earlier point: any card that is the flagship or best value buy can suddenly be exposed as a weak link due to poor engineering selection.

Remember, when we talked about the lack of VRAM being the topic there was no SAM/Resizable BAR... now we have progressed and Nvidia are finally releasing this, you have to wonder even more: is this going to be the Achilles heel that exposes the lack of VRAM?

I think 10GB was already borderline without SAM considering it was a regression vs the 1080Ti and 2080Ti. People I have seen in threads seem to think that the 3080 having 10GB was a really well thought out decision because Nvidia actually thought it was all gamers would need. It wasn't, it was a cost-related business decision and they said as much in interviews. It came down to a compromise based on the architecture they chose vs the associated costs and availability of the more expensive GDDR6x. More specifically, it was due to them going with a 320-bit bus combined with there only being 1GB or 2GB GDDR6x module sizes available; so it was either stick with 10GB or be forced to double the VRAM to 20GB on every 3080 which would have demolished their supply. They played it safe and chose 10GB to keep costs down and availability higher.
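The bus-width constraint described above comes down to simple arithmetic. A toy sketch, assuming one 32-bit GDDR6X module per memory channel (the 3090's 24GB uses two modules per channel in clamshell mode, which is the 2GB-per-channel case here):

```python
def vram_options(bus_width_bits, module_sizes_gb, module_width_bits=32):
    """Possible VRAM capacities for a given memory bus.

    Each GDDR6X device occupies a 32-bit channel, so the channel
    count is fixed by the bus width; capacity can only change in
    steps of a whole module size across every channel.
    """
    channels = bus_width_bits // module_width_bits
    return [channels * size for size in module_sizes_gb]

print(vram_options(320, [1, 2]))  # 3080-style 320-bit bus -> [10, 20]
print(vram_options(384, [1, 2]))  # 3090-style 384-bit bus -> [12, 24]
```

With only 1GB and 2GB densities available, a 320-bit bus leaves exactly two choices, 10GB or 20GB, which is the all-or-nothing trade-off the post describes.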

I would love to know, for the sake of my own curiosity, what performance would have been like with 20GB of regular GDDR6, but that's just wishful thinking.

I still think that the 3080 10GB is a great performing card; my only gripe is the risk of VRAM limitations during this generation. Otherwise, I think it's awesome, especially at the MSRP.

For those slating the 3090 users though..
Hah. I think it's certainly safe to say that at the very least, 3090 users are never going to run out of VRAM. :D
 
Is there anywhere that actually states enabling BAR/SAM will worsen performance with evidence to back it up?

Given how the tech works, in theory it should actually benefit ALL GPUs. From my understanding, it simply eliminates a bottleneck where we had to rely on a temporary swap system in which only 256MB chunks could be mapped at a time, whereas now the CPU has direct access to the full VRAM. It is almost like removing the middleman, allowing bi-directional direct transfers rather than a limited uni-directional transfer system.
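The 256MB window mechanism can be sketched as a counting exercise. This is only a simplified model of the legacy BAR aperture, not how any driver is actually implemented:

```python
import math

def aperture_transfers(total_mb, window_mb=256):
    """Number of window remappings needed to touch `total_mb` of VRAM
    through a fixed-size BAR aperture (the pre-Resizable-BAR model)."""
    return math.ceil(total_mb / window_mb)

# Touching 10 GB of VRAM through the legacy 256 MB window:
print(aperture_transfers(10 * 1024))             # 40 remaps
# With Resizable BAR the whole VRAM is mapped at once:
print(aperture_transfers(10 * 1024, 10 * 1024))  # 1 mapping
```

In this simplified view, Resizable BAR doesn't add bandwidth; it removes the repeated remap-and-copy overhead of funnelling everything through one small window.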

Plus, don't forget consoles have 16GB of combined VRAM/RAM and nothing else; I suspect this is where BAR/SAM is helping them, so it should help PC game ports too.

EDIT:

Having a quick read on reddit: if the CPU can only access VRAM with this BAR/SAM feature enabled, i.e. is limited purely to whatever VRAM said card has, then it could potentially have an impact.

Hard to say though.....

EDIT:

https://hardwaresfera.com/en/articulos/resizable-bar/

Yup, I don't see how having a low amount of VRAM is going to be a disadvantage, except maybe with 8GB or less at the likes of 4K with 4K texture packs etc.?

EDIT:

someone's take from reddit:

The GPU will reserve what it needs for the game and will not allow the CPU to get more than what it allows. This preserves the game's performance. This functionality in its current state only allows the CPU 256MB where the instruction handshake can happen; the rest of the data that needs to handshake is stored in main RAM, and the CPU will fetch that and provide it to the GPU on demand.

With Resizable BAR, even though the CPU technically has full access to GPU VRAM, say 10GB, the GPU will still reserve what it needs as a priority and give the CPU the remaining VRAM for storing instructions and data. This means that when the GPU needs additional information, it does not have to ask the CPU and can fetch it from VRAM itself, making performance better.

However, the less total VRAM, the less "extra" benefit, though still more than what we see today (never less). It will also depend on the game or software and the amount of data involved. A very demanding game with lots of data should reap higher benefits on a 3090 with 24GB VRAM than on a 3080 with 10GB VRAM, but it will still be better than what the 3080 gets today.

So all in all, cards with more VRAM might see more benefit, but ultimately cards with less VRAM will also see benefit, just not as much.
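The allocation model in that take can be sketched as a simple split. This is a toy model of the description above, not a claim about how any driver actually partitions VRAM:

```python
def split_vram(total_gb, game_reserved_gb):
    """Toy model of the split described above: the GPU reserves what
    the game needs first; only the remainder is exposed to the CPU
    for Resizable-BAR-style direct access."""
    cpu_visible = max(total_gb - game_reserved_gb, 0)
    return {"game": min(game_reserved_gb, total_gb), "cpu_window": cpu_visible}

# Same hypothetical 9GB game workload on two cards:
print(split_vram(24, 9))  # 3090: {'game': 9, 'cpu_window': 15}
print(split_vram(10, 9))  # 3080: {'game': 9, 'cpu_window': 1}
```

Under this model, a demanding game leaves a 3090 with a large directly-addressable window but a 3080 with almost none, which is the "more VRAM, more benefit" point in one line of arithmetic.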
 
When 10GB Vram is a limitation it'll be a moot point for the 3080 because it'll have run out of grunt to operate at the ultra settings that require those large amounts of Vram, so the end user will be turning down the settings anyway... Or, upgrading to the 4000 series.
That argument has never made sense, because most games are very tweakable and you don't have to just pick a preset and go. If you're running a card that's starting to struggle in modern titles and it doesn't have much VRAM either, it's down to low/medium settings across the board for you. If you have a card that's feeling the heat, but has a ton of VRAM, you can turn all the core-heavy settings down to low/medium but keep textures on Ultra Mega Nightmare Xtreme, which will result in a better-looking game. Red Dead Redemption 2 is a good example of a game that benefits greatly in such a scenario, where turning the texture setting down has a bigger impact on visual quality than all the other settings combined. It looks like a PS2 game on the lowest texture setting.

Of course, there are other settings that increase VRAM usage too, but none nearly as much as simply whacking the textures up to max in most games.
 
More rumoured info on the 3080 Ti that supports it simply being delayed for a couple or a few months... https://wccftech.com/nvidia-geforce...060-rtx-3050-graphics-cards-submitted-to-eec/

Also, unless I am misreading things (please correct me if I am), it seems to more or less be a 3090 on a 3080 PCB...

NVIDIA GeForce RTX 3080 Ti 'Rumored' Graphics Card Specifications

NVIDIA's GeForce RTX 3080 Ti FE (Founders Edition) graphics card is expected to feature the PG133-SKU15 PCB design and the GA102-250-KD-A1 graphics core. The GA102-250 GPU has also changed since the last time we saw it and is now exactly the same as the GeForce RTX 3090 at 10496 FP32 CUDA cores.
 