
10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
You would be able to use all of it if you didn't throw things away every time something new comes out.

As for is 10gb enough?

Ask yourself this. If it's enough then why were cards with double the VRAM being rumoured before the card even launched?

10GB is plenty for 1080P and 1440P. These are the most popular resolutions used by the 3080 buyers from what I've seen over the last few days.

It's just enough for 4K now, though it's unknown how long it will continue to be enough. I'm personally not risking it; it will be a 3090 for me, as long as it's at least 15% faster than a 3080.
 
You would be able to use all of it if you didn't throw things away every time something new comes out.

As for is 10gb enough?

Ask yourself this. If it's enough then why were cards with double the VRAM being rumoured before the card even launched?
Lol. Silly question.
 
A game-modding tool called Special K has proven able to make this distinction accurately in many games; often, to get more detail than that, you need lower-level access to engine tools. FS2020 has dev tools that display actual usage: both they and Special K show FS2020 at 9.5GB usage at 4K ultra, while all the regular GPU measurement tools report about 12.5GB.

You simply cannot play these games at those settings on a 3080 anyway; the GPU blows out way before the vRAM does.

Where does the extra 3GB come from? Is this the game engine requesting it as reserve, or is it a bug in Windows that assigns more than is needed?
Let's assume that the game engine is requesting 12.5GB as reserve. Are you saying that game developers don't know how to do their job and are requesting an excessive amount of VRAM that they do not need?

Has someone played through the whole of FS2020 to confirm that 9.5GB is the maximum it will ever need? Or did they benchmark a section and think that they can extrapolate it to apply to the rest of the game?
What about other games, have people benchmarked the entire game to check how VRAM requirements fluctuate?

Edit: As a sidenote, it is kind of funny to watch some people go from saying we should listen to the Nvidia engineer because they know more than us, to claiming to know the requirements of a game engine better than the developer of said engine.
 
I can't wait to see all you lot who say 10GB is fine double-dipping and selling your 10GB 3080s for the larger-VRAM versions in a few months. A master stroke by Nvidia... ;)
 
Exactly, the 1080 Ti is a good example of this. The owners had that extra 3GB of VRAM for years, and many looking to upgrade now will never have benefited from it, yet they paid a premium for those memory chips on the card.

That said, I don't think anyone was really interested in that card for the VRAM; it was the GPU horsepower people were after, would be my guess. So I'm not taking a dig at those people. Because of that odd number of 11GB, my suspicion is the architecture didn't allow for many good vRAM configs on the board, and they chose to overshoot rather than undershoot. But the principle of having hardware you paid for but never used is itself a bitter pill, or should be.

It's not just about capacity. The memory bus used has a direct correlation to performance. Higher memory bus sizes need more physical memory chips on the PCB. Depending on what size modules are available, this may simply require them to have more memory chips (and thus capacity) on the card.
 
Where does the extra 3GB come from? Is this the game engine requesting it as reserve, or is it a bug in Windows that assigns more than is needed?
Let's assume that the game engine is requesting 12.5GB as reserve. Are you saying that game developers don't know how to do their job and are requesting an excessive amount of VRAM that they do not need?

Has someone played through the whole of FS2020 to confirm that 9.5GB is the maximum it will ever need? Or did they benchmark a section and think that they can extrapolate it to apply to the rest of the game?
What about other games, have people benchmarked the entire game to check how VRAM requirements fluctuate?

Edit: As a sidenote, it is kind of funny to watch some people go from saying we should listen to the Nvidia engineer because they know more than us, to claiming to know the requirements of a game engine better than the developer of said engine.

I do agree. Unfortunately Nvidia have a long-term strategy which will only play itself out over time.

Like a chess player who knows they will play a certain move but chooses to play it out of order.

We can only speculate and make educated guesses.
 
Has someone played through the whole of FS2020 to confirm that 9.5GB is the maximum it will ever need?

I haven't yet played all the maps in FS2020, but I did have an extremely interesting map sent to me during testing. Up until the point I was sent this map, 9.5GB was roughly the most VRAM used on my 1080 Ti. The map I was sent was DX12-enabled; I know it was because all 32 threads of my 3950X were used, as was just over 26GB of system RAM. The interesting thing, though, was the amount of VRAM used: all 11GB. As yet FS2020 isn't DX12-enabled, but it will be at some point, that's for sure. I'm speculating that a DX12-enabled FS2020 will use as much VRAM as a card has. If my 1080 Ti's VRAM was maxed out, I'm guessing that a card with 20GB of VRAM could certainly be using most of it.
 
The subject that just won't die!

FS2020, it should be said again, is a bit of a freak when it comes to games. I'll say it again: Horizon Zero Dawn on my 1080 Ti @ 3440x1440 allocates 10.5GB of my 11GB VRAM, but runs like a dog compared to the 10GB 3080. It's not even close in performance.
 
Latest version of MSI Afterburner can also show dedicated memory per process instead of allocated memory.

Good news everyone, MSI Afterburner developer Unwinder has finally added a way to see per process VRAM in the current beta!

  1. Install MSI Afterburner 4.6.3 Beta 2 Build 15840 from https://www.guru3d.com/files-details/msi-afterburner-beta-download.html
  2. Enter the MSI Afterburner settings/properties menu
  3. Click the monitoring tab (should be 3rd from the left)
  4. Near the top and next to "Active Hardware Monitoring Graphs" click the "..."
  5. Click the Checkmark next to "GPU.dll", and hit OK
  6. Scroll down the list until you see "GPU Dedicated Memory Usage", "GPU Shared Memory Usage", "GPU Dedicated Memory Usage \ Process", "GPU Shared Memory Usage \ Process"
  7. Pick and choose what you want to be tracked using the checkmarks next to them. "GPU Dedicated Memory Usage \ Process" is the # that most closely reflects the # we find in FS2020 Developer Overlay and Special K (DXGI_Budget, except Unwinder uses D3DKMT api)
  8. Click show in On-Screen Display, and customize as desired.
  9. ???
  10. Profit

Source: Darktalon @ ResetEra
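For those who'd rather check from the command line on an NVIDIA card, `nvidia-smi` also exposes a per-process memory query. A minimal sketch, assuming its `--query-compute-apps` CSV interface; note that this lists compute processes, so games (graphics processes) may not appear on Windows, and the sample values below are made up:

```python
import subprocess

def per_process_vram(sample_output=None):
    """Return {pid: MiB used} from nvidia-smi's per-process query.

    Pass sample_output to parse a captured string instead of
    shelling out (useful on machines without an NVIDIA GPU).
    """
    if sample_output is None:
        sample_output = subprocess.check_output(
            ["nvidia-smi",
             "--query-compute-apps=pid,used_memory",
             "--format=csv,noheader,nounits"],
            text=True)
    usage = {}
    for line in sample_output.strip().splitlines():
        pid, mib = (field.strip() for field in line.split(","))
        usage[int(pid)] = int(mib)
    return usage

# Parsing captured sample output (values are illustrative):
sample = "1234, 9728\n5678, 512\n"
print(per_process_vram(sample))  # {1234: 9728, 5678: 512}
```

Afterburner's per-process counter remains the better option for games, since it reads the D3DKMT counters that cover graphics workloads.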
 
For 4K it might be OK for 12-18 months, if that, but for 1080p/1440p, yeah, fine. Christ, I play games on my 1080 with 8GB and only Red Dead comes close at 1440p with everything on.
 
Whilst I do pretty much agree with everything you've said there, I strongly predict there will be plenty of 3080 buyer's remorse going around in a few months when the bigger-VRAM cards drop. You watch those 3070 Tis with 16GB of VRAM fly out of stock quicker than the 3080!

If Nvidia had made the 3080 with just a bump to 12GB of VRAM, none of this discussion would be happening. It's a bit of a 'psychological fail' in my book to have gone with 10.
The 3070 Ti with 16GB of memory needs more because its memory is slower. It's like comparing a bog-standard old 1.6 petrol with a brand-new 1-litre turbo. Apples and oranges.
 
Where does the extra 3GB come from? Is this the game engine requesting it as reserve, or is it a bug in Windows that assigns more than is needed?
Let's assume that the game engine is requesting 12.5GB as reserve. Are you saying that game developers don't know how to do their job and are requesting an excessive amount of VRAM that they do not need?

Has someone played through the whole of FS2020 to confirm that 9.5GB is the maximum it will ever need? Or did they benchmark a section and think that they can extrapolate it to apply to the rest of the game?
What about other games, have people benchmarked the entire game to check how VRAM requirements fluctuate?

Edit: As a sidenote, it is kind of funny to watch some people go from saying we should listen to the Nvidia engineer because they know more than us, to claiming to know the requirements of a game engine better than the developer of said engine.

The extra memory allocation comes from developers writing the engine so it reserves an estimated block of vRAM larger than what it knows it needs; the game engine itself then internally manages what is put into that vRAM, so it's abstracted away from the hardware. The engine just sees a big list of memory addresses it can use that don't conflict with other processes using the GPU, and from the GPU's standpoint, once it's assigned to an app it's "in use" and unavailable to any other process unless later released.

How each developer uses that vRAM in their engine is going to be unique to them. In the old days you'd just throw the "level" assets in there, everything required for the game space you're in, and game spaces were separated by levels, between which the "loading" process flushed vRAM and then loaded the next lot. As engines became more sophisticated and game installs (the assets) grew far larger than vRAM could hold, it became more of a buffer into which you do predictive streaming of assets as players pass between zones. Once that concept was mastered you could in theory have infinite content and just stream what you need, which is why games, especially open-world ones, went from 4-5GB to 100+GB in a few short years.
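The predictive-streaming idea above can be sketched as a toy VRAM budget with least-recently-used eviction. Everything here (the class name, the sizes, the zone names) is hypothetical; real engines are vastly more sophisticated:

```python
from collections import OrderedDict

class StreamingCache:
    """Toy model of an engine's VRAM asset streamer: assets are
    streamed into a fixed budget, and the least-recently-used
    ones are evicted when a new request would exceed it."""

    def __init__(self, budget_mib):
        self.budget = budget_mib
        self.resident = OrderedDict()  # asset -> size in MiB

    def request(self, asset, size_mib):
        if asset in self.resident:            # already streamed in
            self.resident.move_to_end(asset)  # mark as recently used
            return
        # Evict oldest assets until the new one fits the budget
        while sum(self.resident.values()) + size_mib > self.budget:
            self.resident.popitem(last=False)
        self.resident[asset] = size_mib

cache = StreamingCache(budget_mib=10240)    # a 10GB card
for zone in ["town", "forest", "mountain"]:
    cache.request(zone, 4096)               # 4GB of assets per zone
print(list(cache.resident))  # ['forest', 'mountain'] ('town' was evicted)
```

The point the sketch makes is that the budget is a cache, not a hard requirement: exceeding it triggers eviction and re-streaming, not a crash.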

But the fact is no one other than the engine engineers really knows how this works deep down (lots of trade secrets, I'm sure); even the game devs don't really know. They have abstracted tools that let them zone the world and put in loading/streaming areas, but almost certainly have no idea what the engine is actually doing in vRAM. The point is that vRAM usage and game assets kind of just became decoupled, and more and more of that vRAM is now dedicated purely to what the GPU needs to render the next frame. The better that prediction gets, the more vRAM is spent on that rather than on a large dumb cache. idTech 5+ made use of this last gen, and the next-gen consoles (and Microsoft's DirectStorage) will continue to capitalize on it into the next generation (in the case of Nvidia, they've integrated this as RTX IO).

On FS2020: the benchmarks were pulled from a generic list of benchmarks, so you'd assume they're representative, but I don't know that for sure. What I do know is that if they're not, and you fly over some area that hypothetically needs 14GB of vRAM, then the vRAM won't crap out first; the GPU will, choking while trying to deliver a playable frame rate.

It's not just about capacity. The memory bus used has a direct correlation to performance. Higher memory bus sizes need more physical memory chips on the PCB. Depending on what size modules are available, this may simply require them to have more memory chips (and thus capacity) on the card.

Yeah, this is what I alluded to earlier when I said architecture. Fundamentally, if you pick some bus width for your GPU/memory, and chips are available in only certain sizes, then you end up with a fixed list of candidate memory configs for the card. My bet is that whatever the next config below 11GB for the 1080 Ti would have been was too small, and it's better to over-provision the memory than under-provision it. The result is a more expensive card, because at the end of the day the extra memory costs money, so it adds to the already heavy premium of the card. But what are you gonna do? It's an architecture limitation that's different for each card, depending on what RAM/bus width you pick and what chips are available at the time.
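The bus-width arithmetic can be made concrete: each GDDR chip typically has a 32-bit interface, so the bus width fixes the chip count, and the chip density then fixes the capacity. A small sketch (the function name is mine; the densities are the commonly cited ones):

```python
def candidate_capacities(bus_width_bits, chip_sizes_gb, chip_bus_bits=32):
    """VRAM capacities a given bus width allows, assuming one chip
    per 32-bit memory channel and a single chip density per card."""
    chips = bus_width_bits // chip_bus_bits
    return {size: chips * size for size in chip_sizes_gb}

# 1080 Ti: 352-bit bus -> 11 chips; with 1GB GDDR5X chips that's 11GB
print(candidate_capacities(352, [1]))      # {1: 11}
# 3080: 320-bit bus -> 10 chips; 1GB or 2GB chips give 10GB or 20GB
print(candidate_capacities(320, [1, 2]))   # {1: 10, 2: 20}
```

Which is exactly why the rumoured bigger-VRAM 3080 lands on 20GB rather than, say, 12GB or 16GB: the bus width only permits multiples of the chip count.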
 
As I understand it ...

With a high-level API (OpenGL & DirectX 11) the developer has limited control over how much GPU memory will actually be used. You might ask for the memory for a texture of a particular size, but the drivers might just allocate extra for their behind-the-scenes work, and that amount could change with the user's control panel settings too. Overall, more VRAM would be allocated than actually used anyway. The developer might still be able to tally the memory they explicitly allocated, but as users we can basically only see the total allocated by application + OS + drivers.

There might also be caching involved at the driver level, as it very much seems like a GPU with more memory will use more memory even at the same settings; it still doesn't mean that VRAM is needed.

With a low level API the developer has to be explicit about everything as you do your own memory management. You wouldn't just know how much memory your textures need, but all your buffers etc, and you can present these figures more directly to the user.
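The allocated-versus-used gap described above can be illustrated with a toy allocator that rounds every request up to a page multiple, the way drivers reserve more than the app asked for. The 64MiB granularity and all names here are invented purely for illustration:

```python
PAGE_MIB = 64  # hypothetical driver allocation granularity

class ToyVramAllocator:
    """Toy model of driver-side allocation: every request is rounded
    up to a page multiple, so 'allocated' exceeds what the app asked for."""

    def __init__(self):
        self.requested = 0   # what the engine asked for, MiB
        self.allocated = 0   # what the driver reserved, MiB

    def alloc(self, mib):
        pages = -(-mib // PAGE_MIB)          # ceiling division
        self.requested += mib
        self.allocated += pages * PAGE_MIB

a = ToyVramAllocator()
for texture_mib in [100, 33, 250]:   # three texture uploads
    a.alloc(texture_mib)
print(a.requested, a.allocated)      # 383 448
```

Monitoring tools that read the driver's total see the larger number; only the engine knows the smaller one, which is the gap Special K and the FS2020 dev overlay expose.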
 
Honestly, unless your card is **** I don't know why anyone would buy a 3080; you're just going to get mugged off when the 16GB or 20GB 3080s come out.

Nah, honestly, that 10GB is plenty. You'll run out of grunt before you run out of VRAM.

Particularly getting a 3090 is like inviting the burglars into your house and then waving them off and wishing them well as they take off with your precious money.
 
As I understand it ...

With a high-level API (OpenGL & DirectX 11) the developer has limited control over how much GPU memory will actually be used. You might ask for the memory for a texture of a particular size, but the drivers might just allocate extra for their behind-the-scenes work, and that amount could change with the user's control panel settings too. Overall, more VRAM would be allocated than actually used anyway. The developer might still be able to tally the memory they explicitly allocated, but as users we can basically only see the total allocated by application + OS + drivers.

There might also be caching involved at the driver level, as it very much seems like a GPU with more memory will use more memory even at the same settings; it still doesn't mean that VRAM is needed.

With a low level API the developer has to be explicit about everything as you do your own memory management. You wouldn't just know how much memory your textures need, but all your buffers etc, and you can present these figures more directly to the user.

It's sort of a limit and sort of not. I mean, the engine can request low-level things to be done by the GPU, which may result in the GPU just managing the memory for it, but for something like caching in game assets such as textures and models, the engine can just tell the GPU what it needs and then internally manage that memory. Truly low-level access to the GPU, while more efficient, can't be done with DirectX as an API sitting over lots of different hardware. It's the downside of PCs, and one benefit of consoles: consoles are fixed hardware, so devs can tinker directly and get more performance.

But the OS and drivers and whatnot abstract the engine's use away from the rest of the system. Modding tools that hook the game, though, can inspect memory usage, and that's how we can use tools like Special K to see what engines are actually doing inside their allocated vRAM; often lots of it is left simply unused, or used by pointless assets that aren't part of active rendering.
 
Nah, honestly, that 10GB is plenty. You'll run out of grunt before you run out of VRAM.

I'm surprised more people haven't mentioned this. Vram isn't the only way a new game can stop you from reaching xx settings and xx frame rates. MSFS 2020 is here, now, and it can't run at 60fps on anything, with any amount of vram.

Do people really think the 3080 would be able to max every new game over the next two years...if it only had more vram?
 
Do people really think the 3080 would be able to max every new game over the next two years...if it only had more vram?

Very much seems like some people think that, yes. I was initially worried about it too (it sounded like it wouldn't be enough) ... but have come to my senses since.
 
You would be able to use all of it if you didn't throw things away every time something new comes out.

As for is 10gb enough?

Ask yourself this. If it's enough then why were cards with double the VRAM being rumoured before the card even launched?

Because people simply don't understand RAM allocation and follow 'more is better', while at the same time AMD look as though they are releasing 12GB and 16GB cards.
 