10GB VRAM enough for the 3080? Discuss...

It's not the VRAM causing the low fps.


The HD texture pack is 20GB+ of textures and needs 6GB of VRAM, so 8GB cards should be okay. The 3070 is not meant to be a 4K card. The game will allocate as much VRAM as possible; you need to look at used VRAM to see what is actually happening. There is a mod for Watch Dogs Legion with 8K textures.

Does this mean the 2080Ti that it's "faster than" wasn't meant to be a 4K card either?
 
WDL is actually almost hitting a wall with the RTX 3080. I'm not sure whether that's allocated or used (there are claims both ways on the net regarding the built-in bench), but in native 4K (fully maxed, RTX on, DLSS off) it does seem to go all the way to 10GB.

 
lol, I just don't get it. Some of you know there will be issues down the line with VRAM allocation, but you buy anyway. Lockdown is doing strange things to some. :)
 

If it's in-game, it's most likely used. As far as I know, games will use extra VRAM to cache textures so there is no need to load them again; this is why more VRAM can be better, up to a point. Also, the bigger the VRAM, the lower the level of texture compression that may be required, up to the point where you don't need compression at all. Waiting for features like DirectML texture upscaling won't help at the moment: no games are likely to use it for some time, and the game has to support the feature. The worst outcome is the GPU running a hybrid mode where the driver/GPU starts streaming texture data from system RAM over the PCIe bus to make up for the "missing" VRAM. That is an fps killer. 99% of the time this is not an issue, because games don't actually need more memory than the card has. Since the 3070 with 8GB has no fps issues in-game with the HD texture pack, I would guess more streaming of textures is taking place and the GPU has enough VRAM. This would imply the 3080 has enough VRAM but can keep more textures loaded.

You can always create a game that uses more memory than current GPUs can handle. Imagine if all games required 24GB of VRAM for the best textures: that would be a lot of effort for a small number of customers, and it would leave every other GPU, AMD or Nvidia, unable to run the game as intended, at least until GPUs with more VRAM become common.


Nvidia marketed the RTX 3070 as a 1440p card, if memory serves.


Games are not designed to cause VRAM issues; every game is designed to live within the most common VRAM limits. Considering consoles are designed with up to 10GB of VRAM, issues won't exist with most games. At most you won't be able to run the highest texture settings.
 

Jesus. I've got to say, you're incredibly devoted to defending the 3080 over its 10GB VRAM issue. So many posts from you in this thread, fearlessly defending your 10GB honour.

I wonder how long you'll keep it up, weeks, months, years? :eek:
 
I do find it interesting how the narrative is subtly and slowly changing from where it started at "10GB VRAM is enough for this generation, period" to statements like "issues won't exist with most games; at most you won't be able to run the highest texture settings".

So now we are fine with people not being able to run the highest texture settings within 12 months of buying an £800 GPU, when an AMD 6800 XT with 16GB of VRAM will do so? The level of self-rationalization and confirmation bias in this thread is strong.

I have not bought any card this generation and have had Nvidia for the last 5 years so have zero reason to want to criticize the 3080... but everyone in the community who is honest with themselves knows 10GB is skimping it. If I didn't genuinely believe that then I would easily have bought one by now.

It's going to be a bit painful to watch 3080 10GB owners slowly come around to this realization next year as increasingly visually complex games are announced and Nvidia rush in the higher VRAM card refreshes in Q1 to cope with Big Navi.
 
Haha, I remember that :)

The good ol' days when GPU stock was better and a handshake was welcomed :)

Lol, JD back from the dead to send me to hell. :D

I see this as me honouring the great LtMatt. To some people in this world, you're a legend. (At least you had the balls to have a laugh; Tommy did too, until the Scottish winds hit him and I think they returned inside.)

:D :D
 
Yeah :)


Lol at this guy. Making his own narrative as he goes along. A few cards short of a full deck it seems :p
 
I think people will more likely be turning settings down at 4K to hit 60fps than turning them down because the VRAM is running out.

Take Watch Dogs Legion and Assassin's Creed Valhalla, which both struggle to hit 60fps even with a 3090 at 4K.
 
Do you think the 3080 would have enough GPU horsepower to max all games now if not for the VRAM buffer?
Do I think that a flagship card of a new generation is going to have enough power to max a game of this or the new generation? If not... don't you think that would be a bit weird? At 1440p, at least, fully maxing any game at Ultra should be achievable, and 4K 60fps should be achievable in the majority of well-developed, optimised games.

RT is the only thing I have doubts about any card from Nvidia or AMD fully maxing, depending on how it's implemented.

Both are unoptimised Ubisoft pieces of crap. They do it every single generation.
 
Horizon Zero Dawn, Crysis Remastered... I fully expect Cyberpunk to be the same. All PC games are unoptimised; that should be held as the baseline.
 
What if you are happy to play at 40-60fps at 4K maxed in games like Watch Dogs or GTA, but now you have to reduce texture quality purely because of a lack of VRAM?

Anyway, the 3080 Ti is warming up, so the 3080 is getting closer and closer to midrange :p
 