
10GB VRAM enough for the 3080? Discuss.

Status
Not open for further replies.
My modded Skyrim SE, my most-played game of all time, uses 9GB of VRAM!

And I don't have that many mods installed compared to many people. This is at 3440x1440, by the way, not even 4K.

People can play mental gymnastics and indulge new-toy self-delusion all they want, but 10GB for a new flagship card just smells 'off'.

So what happens when you try to play your modded game with an 8GB card? It's also worth noting that Ampere has tensor memory compression running at somewhere between 20% and 40%.
 
Well stop slacking and install some more mods!

I'd love to try a heavily modded Skyrim, but the amount of work to get it running puts me off.

Look into Wabbajack; it can automatically install mod lists, the largest of which for Skyrim SE currently clocks in at 800+ mods. It also has lists for Fallout 4 and older Bethesda games.
 
We discussed this at length in the Ampere 8nm thread; basically, 10GB is fine.

You have to understand the notion of bottlenecks in PC gaming: making one component bigger or better in isolation is a waste. The real question is, if we give a 3080 10GB worth of assets to do number crunching on, can it do that fast enough for a playable frame rate? All the evidence right now says no, not even close.

We can get FS2020 up to 9.5GB of usage, but at that point it is completely and utterly unplayable. Same for Avengers.

We also have to consider that we've been measuring VRAM badly for years: most tools measure only what the engine requests to be allocated, not what the game actually assigns usefully into that memory, and the former tends to be a good 20% overestimate. FS2020's 9.5GB of actual usage presents as something like 12.5GB.

All these other arguments from emotion are tiresome. No one cares what you believe you "deserve" for a £xxx card, or what was on last gen, or whether going from an 11GB card (which never made use of it) to a 10GB one feels like a "step backwards".

These are all emotional arguments. What you should care about is: does the GPU have enough VRAM to service it at playable frame rates?

10GB looks fine. Yes, future games will demand upwards of 10GB when maxed out, but they will also be so demanding on the GPU that you won't be able to run them at those settings, and as you dial back the settings you dial back the space needed in VRAM.
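The settings-to-VRAM relationship described above can be sketched with some back-of-envelope arithmetic. These are illustrative figures only: real engines use compressed texture formats and stream assets in and out, so treat these as rough upper bounds rather than actual game footprints.

```python
def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """Memory for one full-resolution RGBA8 render target."""
    return width * height * bytes_per_pixel

def texture_bytes(width, height, bytes_per_pixel=4, mipmapped=True):
    """Uncompressed texture; a full mip chain adds roughly 1/3 extra."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmapped else base

MB = 1024 * 1024

# Render targets scale with output resolution:
print(f"4K target:    {framebuffer_bytes(3840, 2160) / MB:.1f} MB")
print(f"1440p target: {framebuffer_bytes(2560, 1440) / MB:.1f} MB")

# Dropping one texture-quality notch (e.g. 4096^2 -> 2048^2 assets)
# quarters the per-texture footprint:
print(f"4096^2 texture: {texture_bytes(4096, 4096) / MB:.1f} MB")
print(f"2048^2 texture: {texture_bytes(2048, 2048) / MB:.1f} MB")
```

This is why "dial back the settings, dial back the VRAM" holds: the biggest VRAM consumers (render targets and textures) scale directly with the resolution and texture-quality sliders.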
 

Whilst I pretty much agree with everything you've said there, I strongly predict there will be plenty of 3080 buyer's remorse going around in a few months when the bigger-VRAM cards drop. You watch those 3070 Tis with 16GB of VRAM fly out of stock quicker than the 3080!

If Nvidia had given the 3080 just a bump to 12GB of VRAM, none of this discussion would be happening. Going with 10GB was a bit of a 'psychological fail' in my book.
 

I actually agree that much of it is psychological, but Nvidia have to balance hype and marketing as well as costs: more memory on the board means more cost, which makes the cards less attractive versus, say, AMD.

There will no doubt be buyer's remorse; in the tech sector, epeen only lasts six months tops anyway. And on top of that general remorse, there will be people salty about the new cards plausibly having more memory. But the real question is whether that remorse is merely emotional or actually rational: is there some new-fangled game out there that the 3080 could make look prettier but can't due to VRAM limitations? I doubt it very much. What little evidence we have now shows that by the time you're close to 10GB, you're deep into unplayable territory.
 
I got my 3080 on Friday, and I don't have any remorse now and won't in the future. 10GB of VRAM is fine now, and for as long as I keep my 3080 it will stay fine. A 16GB or 20GB card with similar performance won't make me covet it, as I know the extra memory won't be useful for what I use the card for, i.e. gaming at 4K. By the way, it's an amazing card, so I hope you all can enjoy one sooner rather than later.
 

I agree with you completely, thanks for voicing my opinion so well. It's nice to know others think the same way lol
 
Only way I would feel some buyer's remorse is if Nvidia release a 20GB 3080 for £649 this year. As that is highly unlikely to happen, I am unlikely to feel remorse :D
 
Granted you won't have buyer's remorse yet. You just wait :p

Thing is, with actual VRAM usage vs allocation, no one seems to really know, do they?

I use HWiNFO64 in widgets on a separate mini screen to monitor all my system stats when gaming (sad, I know, but I'm in good company on here :D).

For the GPU it shows the 'amount' of VRAM allocated in GB, but also memory 'usage' as a separate percentage figure. The thing is, doing the basic maths, the percentage it says is in actual use pretty much matches the amount it says is allocated. So either:

1. the figures tally and games really are using the amount allocated, or
2. the usage percentage and the allocated amount are basically the same reading, taken from the same point of data?
 
Look at a 2080 Ti vs 3080 video on YouTube. The 2080 Ti is "using" more VRAM than the 3080, but its FPS is much lower.
 

A game modding tool called Special K has proven able to make this distinction accurately in many games; beyond that, getting more detail usually requires lower-level access via engine tools. FS2020 has dev tools that display actual usage, and both they and Special K show FS2020 at 9.5GB of usage at 4K Ultra, whereas the regular GPU measurement tools all report about 12.5GB.

You simply cannot play these games at those settings on a 3080; the GPU blows out way, way before the VRAM does.
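The over-allocation described here is easy to quantify from the figures quoted (9.5GB actual vs roughly 12.5GB reported), and it lines up with the ~20% overestimate mentioned earlier in the thread:

```python
# Figures quoted above for FS2020 at 4K Ultra.
reported_allocation_gb = 12.5   # what the usual GPU monitoring tools show
actual_usage_gb = 9.5           # what Special K / the FS2020 dev tools show

overhead_gb = reported_allocation_gb - actual_usage_gb
overhead_pct = overhead_gb / reported_allocation_gb * 100
print(f"Over-allocation: {overhead_gb:.1f} GB "
      f"({overhead_pct:.0f}% of the reported figure)")
# -> Over-allocation: 3.0 GB (24% of the reported figure)
```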
 
I would have no remorse about saving at least £150 and experiencing no drawback from 10GB of VRAM. I would gloat that you wasted all that money for no reason.

Exactly, the 1080 Ti is a good example of this. Owners had that extra 3GB of VRAM for years, and many looking to upgrade now will never have benefited from it, yet they paid a premium for those memory chips on the card.

That said, I don't think anyone was really interested in that card for the VRAM; it was the GPU horsepower people were after, would be my guess. So I'm not taking a dig at those people. Given the odd figure of 11GB, my suspicion is that the architecture didn't allow for many good VRAM configurations on the board, and they chose to overshoot rather than undershoot. But the principle of having paid for hardware you never used is itself a bitter pill, or should be.
 

You would be able to use all of it if you didn't throw things away every time something new comes out.

As for whether 10GB is enough?

Ask yourself this. If it's enough then why were cards with double the VRAM being rumoured before the card even launched?
 