NVIDIA RTX 50 SERIES - Technical/General Discussion

You didn't play Far Cry 6 then on your 3080? I can tell you it was a stuttering mess on my old 3080 at 4K when it bounced around 10GB VRAM usage, and it was unplayable, and that was a 2021 title. It's only going to get worse as GPU performance increases and higher-quality textures are used.

Played it for 15 minutes or so on a £1 trial Ubisoft pass. Uninstalled and never touched it again. Lol.

That was the only game as I recall that had vram issues back then if you installed the optional textures.

In my experience, by the time a card is out of VRAM, it is way out of performance. Case in point: 3090 users who are selling and upgrading to 16GB cards.

All those games I listed above. Absolutely zero vram issues.
 
Yup VRAM ‘needs’ beyond 16gb are totally overhyped / overplayed. Yet to hear of anyone who actually struggles with this issue in a way that’s ‘game breaking’.

I guess it's more about longevity

Because of high prices, people don't want to buy a new GPU every generation anymore; they want to hold on for 2-3 generations and not have their GPU run out of memory in the meantime.
 
I guess it's more about longevity

Because of high prices, people don't want to buy a new GPU every generation anymore; they want to hold on for 2-3 generations and not have their GPU run out of memory in the meantime.

They will need to drop settings down due to performance anyway, so I guess it would hurt even more going down a texture setting too. Lol.

Still though, 3090 users upgrading to 16GB cards. Lol.
 
Played it for 15 minutes or so on a £1 trial Ubisoft pass. Uninstalled and never touched it again. Lol.

That was the only game as I recall that had vram issues back then if you installed the optional textures.

In my experience, by the time a card is out of VRAM, it is way out of performance. Case in point: 3090 users who are selling and upgrading to 16GB cards.

All those games I listed above. Absolutely zero vram issues.
There are a few, e.g. as mentioned above: COD I believe does, Star Citizen, Indiana Jones. Not sure about Alan Wake, I've heard it does but I haven't played it so I don't know. But yes, I do remember regretting the 3080 at the time because of Nvidia going backwards from the 1080 Ti. While it was only one title at the time, it did annoy me that Nvidia cheaped out on VRAM, as they always tend to do lately on lower-tier GPUs.

There is also a case that Nvidia is holding back development of high-quality-looking games because they keep skimping on VRAM, so not as many titles require lots of it.

But I didn't want to make that mistake again, hence why I went for the 4090 last gen, as I kind of thought I wouldn't be upgrading again this gen. Glad I did now, as it looks like I'll be keeping it for a couple more years at least.
 
There are a few, e.g. as mentioned above: COD I believe does, Star Citizen, Indiana Jones. Not sure about Alan Wake, I've heard it does but I haven't played it so I don't know. But yes, I do remember regretting the 3080 at the time because of Nvidia going backwards from the 1080 Ti. While it was only one title at the time, it did annoy me that Nvidia cheaped out on VRAM, as they always tend to do lately on lower-tier GPUs.

There is also a case that Nvidia is holding back development of high-quality-looking games because they keep skimping on VRAM, so not as many titles require lots of it.

But I didn't want to make that mistake again, hence why I went for the 4090 last gen, as I kind of thought I wouldn't be upgrading again this gen. Glad I did now, as it looks like I'll be keeping it for a couple more years at least.

COD does not. It has always allocated more VRAM than it needed. It goes back all the way to the Titan days, as I recall. Kaapstad always showed it used up all his VRAM, yet it worked perfectly fine on half the VRAM most others had.

Many games are like that. There are even games that don't just allocate but actually use more VRAM if available, yet manage to work fine on less.

In my experience when you truly run out of vram it becomes super obvious.
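
For what it's worth, the figure most overlays and monitoring tools show is just how much VRAM is currently allocated on the card, not how much a game strictly needs to run well. A minimal sketch of reading that number with the nvidia-ml-py (pynvml) bindings, assuming the package and an NVIDIA driver are installed:

    # Query how much VRAM is allocated on GPU 0 (pip install nvidia-ml-py).
    # Note: this is total allocation across all processes, not the amount any
    # one game actually needs to avoid stuttering.
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # used / free / total, in bytes
    print(f"VRAM used:  {mem.used / 2**30:.1f} GiB")
    print(f"VRAM total: {mem.total / 2**30:.1f} GiB")
    pynvml.nvmlShutdown()

On a game that caches aggressively, that number will happily sit near the card's maximum, which is exactly why it can't tell you on its own that a card is short on VRAM.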
 
Yes, for a while I couldn't get video output, but I have a spare graphics card so I slotted that in. Completely nuked the Nvidia install. Disconnected the PC from the internet so Windows couldn't redownload a driver, as you need 572.16 to recognise and fire up 5xxx series cards. Used the command line to reset all the clocks, as my 5080 got stuck at 2287MHz, DDU'd everything, installed my 5080 back in and installed 572.16 fresh.

Why would you need to get into the BIOS? The PCIe 5.0 bug? A few peeps have had this.
If I had an issue once the water-cooled card was installed, then I imagine I would need to get into the BIOS and be unable to easily swap the GPU out. Makes me worried to ever update the driver again! Hopefully Nvidia puts out some sort of statement, as I imagine what you did shouldn't necessarily have caused such major issues.
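
On the clock reset in the quoted fix: the poster doesn't say exactly which commands they ran, but the usual route is nvidia-smi (for example "nvidia-smi --reset-gpu-clocks" run with admin rights), and the same reset is exposed through NVML. A rough pynvml sketch, purely as an illustration and not the poster's actual steps:

    # Check the current graphics clock and clear any locked clock range.
    # Needs admin/root and the nvidia-ml-py package; this mirrors
    # "nvidia-smi --reset-gpu-clocks" rather than any specific fix.
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"Current graphics clock: {mhz} MHz")
    pynvml.nvmlDeviceResetGpuLockedClocks(gpu)      # remove any locked min/max clocks
    pynvml.nvmlShutdown()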
 
Still though, 3090 users upgrading to 16GB cards. Lol

The only game that I own that has a setting requiring more than 16gb is Indiana Jones. With a 5080, you need to reduce the texture pool setting to ultra (reduce to ultra... read that again lol) which just means the cache size is smaller... but it's still a massive cache size. In reality this just means higher res textures are drawn in closer to the camera but this has been tested by reviewers and they all say that you can't see the difference. The textures still look identical, it doesn't change the texture quality or anything like that. It's just a cache thing. You could argue that the supreme cache setting is actually completely pointless.

A 3090 can't run Indiana Jones at an acceptable frame rate at the highest settings, even with Lossless Scaling, especially in the Thailand level where it's just unplayable. A 5080 can.

Another use case might be a heavily modded Skyrim. Pretty niche I would say.

The whole 16GB VRAM thing is massively overblown, especially when you look at the reality of it in games.
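
To put rough numbers on the "it's just a cache" point from the post above: block-compressed textures are small per texel, so even a reduced pool setting keeps an enormous number of full-resolution textures resident. A back-of-envelope sketch (the pool sizes are made-up round figures, not the game's actual settings):

    # Rough texture-pool maths. BC7 block compression stores 8 bits per texel,
    # and a full mip chain adds roughly a third on top.
    BYTES_PER_TEXEL = 1          # BC7: 8 bits per texel
    MIP_OVERHEAD = 4 / 3         # full mip chain ~ +33%

    def texture_mib(width, height):
        return width * height * BYTES_PER_TEXEL * MIP_OVERHEAD / 2**20

    per_4k_texture = texture_mib(4096, 4096)        # ~21 MiB each
    for pool_gib in (8, 12, 16):                    # hypothetical pool sizes
        count = pool_gib * 1024 / per_4k_texture
        print(f"{pool_gib} GiB pool ~ {count:.0f} 4K textures resident at once")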
 
There are a few, e.g. as mentioned above: COD I believe does, Star Citizen, Indiana Jones. Not sure about Alan Wake, I've heard it does but I haven't played it so I don't know. But yes, I do remember regretting the 3080 at the time

I've played Star Citizen, COD and Alan Wake on my 3080 10GB with zero VRAM issues. They will load up if you have more VRAM available, but there are no performance or texture pop-in issues if you have less VRAM. Actual VRAM issues are usually super obvious and bring a card to its knees, like single-digit FPS.

A little counter in the corner saying it's used up all the VRAM just means the game isn't clearing out as much as it can; it doesn't actually mean you need that much.

Your performance will drop off due to the GPU not being fast enough way before VRAM actually becomes the bottleneck, outside of a few really niche cases of heavily modded games or MSFS.
 
The only game that I own that has a setting requiring more than 16gb is Indiana Jones. With a 5080, you need to reduce the texture pool setting to ultra (reduce to ultra... read that again lol) which just means the cache size is smaller... but it's still a massive cache size. In reality this just means higher res textures are drawn in closer to the camera but this has been tested by reviewers and they all say that you can't see the difference. The textures still look identical, it doesn't change the texture quality or anything like that. It's just a cache thing. You could argue that the supreme cache setting is actually completely pointless in reality.

A 3090 can't run Indiana Jones at an acceptable frame rate at the highest settings, even with Lossless Scaling, especially in the Thailand level where it's just unplayable. A 5080 can.

Another use case might be a heavily modded Skyrim. Pretty niche I would say.

The whole 16GB VRAM thing is massively overblown, especially when you look at the reality of it in games.

Nope. 16GB is not enough. COD and Indiana Jones mate. Oh and maybe Far Cry 6 with its optional texture pack...

:p
 
Don't forget to mention it causes a noticeable reduction in image quality and a significant loss of performance
It depends on which mode and what resolution:

"NTC transcoded to BCn" mode showed a negligible reduction in average FPS compared to NTC off, though 1% FPS lows were noticeably better than regular texture compression with NTC disabled.

This mode still reduced vram usage by 64%. And looking at the images in the article I can't see any difference, or any comment that it reduces image quality significantly?

Also bear in mind that actual VRAM issues usually exhibit as drops to single-digit FPS and massive texture pop-in issues, so even if it does reduce performance (as the article says, from 1800 to 1500 FPS) it may still be preferable to single-digit FPS with massive corruption on screen.
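
Just applying the article's quoted 64% figure to an illustrative texture budget (the 6 GiB number here is a made-up example, not anything from the article):

    # Illustrative effect of the quoted 64% VRAM reduction on a texture budget.
    texture_budget_gib = 6.0                  # hypothetical BCn texture budget
    with_ntc = texture_budget_gib * (1 - 0.64)
    print(f"{texture_budget_gib:.1f} GiB of BCn textures -> {with_ntc:.1f} GiB with NTC transcoded to BCn")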
 
Tbh if you don’t have a card with 32gb vram, can you even call yourself a ‘gamer’?

If you’re not deliberately crippling your card for something you need to record, zoom in 4x and view in slow motion to see, it’s not really ‘true gaming’.

I take the same approach with magic tricks and guitar solos. I like watching them slowed down to really savour how they are achieved. A true fan, I guess.
 
COD does not. It has always allocated more VRAM than it needed. It goes back all the way to the Titan days, as I recall. Kaapstad always showed it used up all his VRAM, yet it worked perfectly fine on half the VRAM most others had.

Many games are like that. There are even games that don't just allocate but actually use more VRAM if available, yet manage to work fine on less.

In my experience when you truly run out of vram it becomes super obvious.
Can't remember about COD, just what I read somewhere. Yeah, I am aware some games just allocate more than is needed for the immediate forefront visuals.

But yes, it is obvious, as the FPS just tanks and stutters. Whether it's needed or not for the immediate future remains to be seen, but obviously it is holding back devs from using high-quality textures at 4K. GPUs don't even need more performance for high-quality textures, it's VRAM that is needed; it's nothing to do with performance. If the VRAM capacity isn't there, they can't implement high-quality texture packs, because of the mainstream tiers. But I personally think any GPU in 2025, at least an 80-class, should be released with 16GB; it's a bit of a joke tbh on Nvidia's part.
 
It depends on which mode and what resolution:

"NTC transcoded to BCn" mode showed a negligible reduction in average FPS compared to NTC off, though 1% FPS lows were noticeably better than regular texture compression with NTC disabled.

This mode still reduced vram usage by 64%. And looking at the images in the article I can't see any difference, or any comment that it reduces image quality significantly?

If memory compression is possible with zero overhead and saves a lot of RAM, it could be interesting. A bit like compressing an Excel spreadsheet: they compress to an insane level, from say 10MB to 500KB or something like that.
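
The spreadsheet analogy is easy to demonstrate with an ordinary lossless compressor: highly repetitive data really does shrink by that sort of factor. This is plain zlib on the CPU, just to illustrate the ratio, and not the same thing as GPU texture or memory compression:

    # Show how repetitive data (like a mostly-empty spreadsheet) compresses massively.
    import zlib

    # ~10 MB of very repetitive rows, standing in for a sparse spreadsheet export.
    row = b"0.00;0.00;0.00;0.00;0.00;0.00;0.00;0.00\r\n"
    data = row * 250_000                      # ~10.5 MB raw

    packed = zlib.compress(data, 9)
    print(f"raw:        {len(data) / 1e6:.1f} MB")
    print(f"compressed: {len(packed) / 1e3:.0f} KB")
    print(f"ratio:      {len(data) / len(packed):.0f}x")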
 