
10GB vram enough for the 3080? Discuss..

No issue with Cold War @ 4K with the HQ pack, Ultra RT and Quality DLSS; no stuttering or hitching, and using just over 8GB of VRAM.
Thanks.

Is that running single player or Multi player?

Curious what happens if you disable DLSS and run native with regards to usage.

Feel free to disable RT if you need to get playable FPS with DLSS off; just trying to mimic the settings I used in single player.

I see lower usage in MP for some reason.
 
The biggest "threat" to vram buffers appears to be "texture packs".

The problem with trying to buy enough vram for texture packs is that it sounds like they eat up as much ram as they are coded for.

Pick a vram buffer size....any size....and a texture pack can be made that saturates it.
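For a sense of scale, here's a rough back-of-envelope sketch (assuming BC7 block compression at 1 byte per texel and a full mip chain; the numbers are illustrative, not from any real game):

```python
# Back-of-envelope VRAM footprint for a hypothetical texture pack.
# Assumes BC7 block compression (~1 byte per texel) and a full mip
# chain (~1.33x the base level). Purely illustrative numbers.

def texture_mb(width, height, bytes_per_texel=1.0, mip_overhead=4/3):
    """Approximate VRAM size of one texture in megabytes."""
    return width * height * bytes_per_texel * mip_overhead / (1024 ** 2)

per_texture = texture_mb(4096, 4096)   # one 4K BC7 texture, ~21.3 MB
budget_gb = 10                         # e.g. a 3080's buffer
count = budget_gb * 1024 / per_texture
print(f"One 4096x4096 texture: {per_texture:.1f} MB")
print(f"Textures to fill {budget_gb} GB: ~{count:.0f}")
```

Roughly 480 unique 4K textures fill a 10GB buffer, and a pack author can always ship more.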

In the meantime, the 3080 still lacks the GPU horsepower needed to max out my favorite sim from a few years ago. The maximum amount of VRAM I can even allocate is under 8GB, and that's with the 3080 limping along in a stuttery, unplayable puke-fest in VR at max settings.

Smooth, playable settings use between 5 and 6GB of VRAM with the GPU running in the high 90s load most of the time.

GPU horsepower is the bottleneck in my use case... not VRAM.
 
@TNA Sweet, I have just managed to buy a 3080 FE!

Can you tell me if the card comes with the right power connector dongle or do I need to order one?

---

Well it's not confirmed yet, but I ordered it and it went through, but it's not a home run just yet.

Well done! You were obviously quicker than me!
 
The biggest "threat" to vram buffers appears to be "texture packs".

The problem with trying to buy enough vram for texture packs is that it sounds like they eat up as much ram as they are coded for.

Pick a vram buffer size....any size....and a texture pack can be made that saturates it.

This should improve this year as MS DirectStorage and Resizable BAR support are implemented. We'll end up with multiple smaller texture files and better compression.
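For what "better compression" can mean in practice, here's an illustrative comparison of per-texel costs for common GPU texture formats (the ratios are fixed by the formats themselves; this is just arithmetic, not a claim about any particular game):

```python
# Per-texel cost of common GPU texture formats, applied to one
# 4096x4096 texture (base mip only). Block-compressed formats store
# 4x4 texel blocks in 8 or 16 bytes, hence the fractional costs.

FORMATS = {
    "RGBA8 (uncompressed)": 4.0,  # bytes per texel
    "BC7":                  1.0,  # 16 bytes per 4x4 block
    "BC1":                  0.5,  # 8 bytes per 4x4 block
}

texels = 4096 * 4096
for name, bytes_per_texel in FORMATS.items():
    print(f"{name:22s}: {texels * bytes_per_texel / 1024**2:6.1f} MB")
```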
 
The graphics in single player are much, much better than multiplayer; still no VRAM issues here though.
Fair enough, could you share the settings you used?
I assume you see a green tick in-game by the HQ Texture pack?

I mention it as I see videos on YT of people claiming to use the HQ Texture pack who don't actually have it installed, despite being able to select Ultra Textures from the menu. :p
The biggest "threat" to vram buffers appears to be "texture packs".

The problem with trying to buy enough vram for texture packs is that it sounds like they eat up as much ram as they are coded for.

Pick a vram buffer size....any size....and a texture pack can be made that saturates it.
The tricky part is discovering at what point utilisation starts degrading the experience, and that's the difficult part to test and monitor.

Some people can't detect minor hitches or stuttering, and some won't notice if the game is lowering texture quality so as not to degrade performance.

It's pretty difficult to test and monitor this, it seems. I guess we could monitor 0.1% lows, but the game changing texture quality on the fly is probably hard to detect unless it's blatantly obvious.

I never realised until recently that games have started doing this. As if anyone thought the waters could get even muddier, but here we are. :D
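For anyone who wants to try the 0.1% lows idea, here's a minimal sketch that computes them from a frametime log (the file name and one-value-per-line layout are hypothetical; adapt it to whatever your capture tool exports):

```python
# Compute average, 1% low and 0.1% low FPS from a list of frametimes
# in milliseconds. "X% low" here means the average FPS over the worst
# X% of frames, a common way these figures are reported.

def percentile_low(frametimes_ms, pct):
    """Average FPS of the worst pct% of frames (by frametime)."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, int(len(worst) * pct / 100))
    return 1000.0 / (sum(worst[:n]) / n)

with open("frametimes.csv") as f:  # hypothetical capture log
    times = [float(line) for line in f if line.strip()]

print(f"avg FPS : {1000 * len(times) / sum(times):.1f}")
print(f"1% low  : {percentile_low(times, 1):.1f}")
print(f"0.1% low: {percentile_low(times, 0.1):.1f}")
```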
 
This should improve this year as MS DirectStorage and Resizable BAR support are implemented. We'll end up with multiple smaller texture files and better compression.

It will get worse over time, not better. As new games developed for the new consoles are released, the VRAM required will inevitably increase. The 3080 is a ticking time bomb @ 4K.
 
Fair enough, could you share the settings you used?
I assume you see a green tick in-game by the HQ Texture pack?

I mention it as I see videos on YT of people claiming to use the HQ Texture pack who don't actually have it installed, despite being able to select Ultra Textures from the menu. :p

The tricky part is discovering at what point utilisation starts degrading the experience, and that's the difficult part to test and monitor.

Some people can't detect minor hitches or stuttering, and some won't notice if the game is lowering texture quality so as not to degrade performance.

It's pretty difficult to test and monitor this, it seems. I guess we could monitor 0.1% lows, but the game changing texture quality on the fly is probably hard to detect unless it's blatantly obvious.

I never realised until recently that games have started doing this. As if anyone thought the waters could get even muddier, but here we are.


I'm not talking about allocation. I mean that a programmer could actually overload any VRAM buffer size they want... they could create a texture pack that is unplayable with any hardware on the market today if they chose to.
 
@Dave2150 - Perhaps you can clarify if you saw any issues and how you measured usage?

Btw, you don't need per-process monitoring; you can just note how much video memory is in use prior to launching the game, then subtract that amount from what the allocation gets up to in-game. Doing this gives the exact same value as MSI Afterburner's per-process monitoring - I've tested it.

The menu option says that if you notice any stuttering or hitching, reduce from Ultra to High, or reduce the allocated VRAM from 90% to 80% - but doing the latter can result in downgraded textures or pop-in.
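For the curious, here's a sketch of that subtraction method using NVML via the pynvml package (NVIDIA cards only): take one reading before launching the game as the baseline, another in-game, and subtract.

```python
# Read total VRAM currently in use on the first GPU via NVML.
# Run once before launching the game (baseline) and again in-game;
# the difference approximates the game's own usage, per the method above.

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
used_mb = pynvml.nvmlDeviceGetMemoryInfo(handle).used / 1024**2
pynvml.nvmlShutdown()

print(f"VRAM in use: {used_mb:.0f} MB")
```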

Well so far it seems people are OK. Let's see what imginy reports back, but I would be surprised if he finds anything.

As for Dave's comment about the 3080 being a ticking time bomb, well, I agree. But that is more to do with a lack of raster and RT grunt than anything else. The power required to run 4K changes all the time as new games come out. The 3080 bloody can't even run all of today's games at 4K 60fps. Neither can a 3090. Hence people who want 4K 60fps will no doubt be upgrading to next-gen cards, which will have more VRAM by default anyway.


@TNA Sweet, I have just managed to buy a 3080 FE!

Can you tell me if the card comes with the right power connector dongle or do I need to order one?

---

Well it's not confirmed yet, but I ordered it and it went through, but it's not a home run just yet.
Congratz man! You can finally join the club :D

But are you sure 10gb is enough though? :p

Yeah it comes with the cable you need. You only need to buy another cable purely for aesthetics. I did not bother as I likely won’t even have this card in 12 months anyway.
 
It will get worse over time, not better. As new games developed for the new consoles are released, the VRAM required will inevitably increase. The 3080 is a ticking time bomb @ 4K.

Then you don't understand the technology, which both consoles are adopting to improve performance and longevity. Assets are swapped out more frequently and therefore don't need to be kept in VRAM for later operations.

I'm not saying that the 3080 is the right choice for high-end 4K gaming, but it's likely to improve as the technology becomes mainstream.
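To illustrate the residency idea, here's a toy sketch (not how any real engine implements it): a streamer keeps only recently used assets inside a fixed VRAM budget and evicts the least recently used ones, so demand stays bounded even as the asset library grows.

```python
# Toy LRU texture residency pool. Sizes in MB; all names made up.
from collections import OrderedDict

class TextureResidency:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()  # name -> size, oldest first

    def touch(self, name, size_mb):
        """Mark a texture as needed now, evicting LRU entries if over budget."""
        if name in self.resident:
            self.resident.move_to_end(name)
            return
        while self.resident and sum(self.resident.values()) + size_mb > self.budget:
            evicted, _ = self.resident.popitem(last=False)
            print(f"evict {evicted}")
        self.resident[name] = size_mb

pool = TextureResidency(budget_mb=64)
for tex in ["rock", "grass", "rock", "sky", "water", "rock", "lava"]:
    pool.touch(tex, size_mb=20)
print("resident:", list(pool.resident))  # only the most recent fit the budget
```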
 
It will get worse over time, not better. As new games developed for the new consoles are released, the VRAM required will inevitably increase. The 3080 is a ticking time bomb @ 4K.

I agree. I have already had warnings on my 2070S, so I don't suppose it will be long before folks start getting warnings on the 3080.
 
The settings I use for Call of Duty are everything maxed at 4K, texture pack installed, the three ray tracing settings at max/max/medium, and DLSS on Balanced.

The campaign is stunning and locked to 60fps at 4K.


As TNA said, the performance of this card is what limits it, not the VRAM.

Obviously my next screen will be 4K 120Hz; this card will run out of grunt long before VRAM.

I will be swapping over to a 4080 or whatever next-gen card is good value.
 
It will get worse over time, not better. As new games developed for the new consoles are released, the VRAM required will inevitably increase. The 3080 is a ticking time bomb @ 4K.

By new consoles, do you mean the recently released ones? They don't offer better hardware than the 3080, so I don't see how games developed for the XBX and PS5 are going to be too much for a 3080 at any stage.
 
The 3080 is a ticking time bomb @ 4K.

This seems rather dramatic. If the bomb exploding means turning down a few settings, well, we are already there... NOW.

The 3080 can't max out Project Cars 2 on the HP Reverb now. I have to turn down settings now. The GPU itself is the bottleneck... right freaking now.

AMS2? Same.

ACC? Same. (A lot worse, actually.)

iRacing... the title that's easier to run than most other sims... still bottlenecked by the GPU itself rather than the VRAM buffer.

So while people are focusing on isolated instances taxing the VRAM buffer on the 3080, I have already run out of GPU horsepower in my use case.

Maybe this is why I don't have this fear of turning down settings in the future. I have already lived through the lowered-settings apocalypse with a 3080, and I don't feel particularly traumatized by the experience.

The 3080 I installed in the PC I just built (for someone else) is still a solid upgrade over the 1080Ti I currently run in my own rig.
 
I just downloaded Watch Dogs: Legion for a very quick spin, whacked it on ultra, no RT, at 1440p, and saw it was over the 8GB VRAM budget on my 1080.

So for sure the 3070 will suffer with VRAM.
 
I just downloaded Watch Dogs: Legion for a very quick spin, whacked it on ultra, no RT, at 1440p, and saw it was over the 8GB VRAM budget on my 1080.

So for sure the 3070 will suffer with VRAM.

The 3070 can't run Watch Dogs with RT on; it runs out of VRAM, which causes the game to run at 10fps.
 