Nvidia recognises 10GB isn't enough - RTX 3080 gets a VRAM upgrade

Soldato
Joined
16 Jul 2010
Posts
5,897
Glad I got a 3090FE now, despite the ludicrous price. It's almost worth it just to not have to worry about VRAM issues, or the problem of getting hold of a decent card at all. I feel sorry for those who can't get a decent GPU at an affordable price.

Plus I'm playing high-end VR which requires a lot of VRAM and GPU grunt.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
GTA6 and ES6 might come out in 2023-2024 :D, so by that time I presume they will need 10GB+ for textures at 4K resolution.
Horizon Zero Dawn already consumes around 10.7GB of memory for me at Ultimate settings :D
Yeah, and by then I might be on an RTX 5070, which will no doubt have 16GB minimum anyway :p
 
Associate
Joined
20 Aug 2020
Posts
2,038
Location
South Wales
Been playing the Crysis 2 remaster, and even that ancient game somehow managed to hit between 10GB and 12GB of dedicated VRAM used, with 14GB total allocated. This was at 4K, fully maxed with ray tracing. Even if this game has a bug with VRAM usage, I would not be comfortable with only 10GB of VRAM for 4K gaming with high-res textures in AAA games going forward; the GPU may be more than powerful enough, but it's useless if the VRAM can't keep up.
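
If you'd rather log the numbers than eyeball an overlay, a rough sketch using NVIDIA's NVML Python bindings (pynvml / nvidia-ml-py) is below - my addition, not what the screenshot used. It reports whole-GPU used/total, not the per-process "dedicated" figure an overlay like Afterburner shows, so treat it as a ballpark check rather than the same counter.
Code:
# Rough sketch: poll device-wide VRAM usage via NVML (pip install nvidia-ml-py).
# This is whole-GPU memory in use, not the per-process "dedicated" number.
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)        # first GPU in the system
info = nvmlDeviceGetMemoryInfo(handle)        # .used / .total / .free, in bytes
print(f"VRAM: {info.used / 1024**3:.1f} GiB used of {info.total / 1024**3:.1f} GiB")
nvmlShutdown()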

 
Soldato
Joined
21 Jul 2005
Posts
20,039
Location
Officially least sunny location -Ronskistats
Been playing the Crysis 2 remaster, and even that ancient game somehow managed to hit between 10GB and 12GB of dedicated VRAM used, with 14GB total allocated. This was at 4K, fully maxed with ray tracing. Even if this game has a bug with VRAM usage, I would not be comfortable with only 10GB of VRAM for 4K gaming with high-res textures in AAA games going forward; the GPU may be more than powerful enough, but it's useless if the VRAM can't keep up.

I think when Nvidia are in control (a sponsored game) or the game is not demanding, you will be fine on a card like the 3080. What we have is enthusiasts who are on the cusp of that safe zone (discounting the custom mods people want to add to their games). We are now one year on from the 3080 release and are finally seeing how much VRAM games are likely to need.

A small minority of games will require a large amount of VRAM, or will play more smoothly when you have it. According to some, not many people play games at 4K. While this may be somewhat true today, I can see more gamers buying TVs or panels that can do 4K. Don't forget Jensen and the boys said these are 4K gaming cards! I don't expect 3070s and lower to be considered in this bracket as they were always touted as 1440p cards, and only the reviewers/tech sites throw these resolutions in there 'cos they do.
 
Caporegime
Joined
4 Jun 2009
Posts
31,044
Been playing the Crysis 2 remaster, and even that ancient game somehow managed to hit between 10GB and 12GB of dedicated VRAM used, with 14GB total allocated. This was at 4K, fully maxed with ray tracing. Even if this game has a bug with VRAM usage, I would not be comfortable with only 10GB of VRAM for 4K gaming with high-res textures in AAA games going forward; the GPU may be more than powerful enough, but it's useless if the VRAM can't keep up.


Is there any footage showing a 3080 having issues at 4K because of a lack of VRAM? Or footage of GPUs with equivalent raw performance but different amounts of VRAM, where the larger card performs considerably better because of the VRAM?

We have seen plenty of games where more VRAM will be used if the card has it, but in reality it made little to no difference - Resident Evil Village, HZD etc.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
I'm sure I played both Crysis and Far Cry 2 at launch on a 128MB 9800 PRO with a 19" Dell 1600x1200 CRT.
Code:
128 Megabytes =    134,217,728 Bytes
  1 Gigabyte  =  1,073,741,824 Bytes
  2 Gigabytes =  2,147,483,648 Bytes
  4 Gigabytes =  4,294,967,296 Bytes
  8 Gigabytes =  8,589,934,592 Bytes
 10 Gigabytes = 10,737,418,240 Bytes
 16 Gigabytes = 17,179,869,184 Bytes


I feel I should point out that 8GB cards were happily running 4K last year, and that 10GB is a 25% increase over 8GB. Which leads me to ask: what the hell happened to make 10GB not enough, other than a console GPU appearing on a smaller node with clocks up to 50% higher, trying to cling to a diminishing market share by leveraging its only plus, 16GB of VRAM?
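
For anyone who wants to sanity-check those figures, here's a quick Python scratchpad (just my own arithmetic, nothing measured) that reproduces the byte counts above and the 8GB-to-10GB percentage:
Code:
# Reproduce the GiB-to-byte table and the 8GB -> 10GB comparison.
for gib in (0.125, 1, 2, 4, 8, 10, 16):
    print(f"{gib:>6} GiB = {int(gib * 1024**3):>14,} bytes")

# 10GB over 8GB:
print(f"{(10 - 8) / 8:.0%} increase")   # -> 25% increase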

I've also read comments that you need more VRAM for modern complex systems, yet FC2 proves otherwise...


I'd also suggest that an excess of VRAM doesn't stop LOD issues, which can easily be seen in FC6 videos by watching the trees.
 
Caporegime
Joined
4 Jun 2009
Posts
31,044
Just had a quick google for footage; the 3080's 10GB of VRAM doesn't appear to be causing any issues that I can see in Crysis 2 at 4K compared to the 3080 Ti, though sadly the comparisons use different CPUs:



Given the price difference, it really is mind-boggling how people can justify double the cost if they're just gaming :p

Unless there is something showing different?
 
Soldato
Joined
30 Jun 2019
Posts
7,875
I suspect most would want 16GB on the RTX 3070, given the over-the-top requirements of Far Cry 6, but maybe the game is just poorly optimised with the texture pack.

With cheaper GDDR6, I can't see why not... 8GB of VRAM was the standard for a long time, so I'd like to see an improvement here.
 
Caporegime
Joined
4 Jun 2009
Posts
31,044
I think when Nvidia are in control (a sponsored game) or the game is not demanding, you will be fine on a card like the 3080. What we have is enthusiasts who are on the cusp of that safe zone (discounting the custom mods people want to add to their games). We are now one year on from the 3080 release and are finally seeing how much VRAM games are likely to need.

A small minority of games will require a large amount of VRAM, or will play more smoothly when you have it. According to some, not many people play games at 4K. While this may be somewhat true today, I can see more gamers buying TVs or panels that can do 4K. Don't forget Jensen and the boys said these are 4K gaming cards! I don't expect 3070s and lower to be considered in this bracket as they were always touted as 1440p cards, and only the reviewers/tech sites throw these resolutions in there 'cos they do.

You can use custom mods without issues too. I have been playing Skyrim and Fallout 4 with a good 30 mods or so (including texture packs) and have had no issues at 4K/3440x1440. There was only one person who mentioned he had issues with some indie game, and he was using like 100 mods IIRC - not exactly the norm even for PC gamers....

It could be argued that no GPU other than maybe a 3090 is really a "4K" gaming card, since people are having to make sacrifices for either ray tracing or texture packs (just the one game so far, which may not even be an issue if it gets fixed/improved). And then the question becomes: was it really worth paying the premium, given that the next lot of GPUs are around the corner and their mid-range versions will probably match/beat it for half the price? Then of course, if you really are a PC enthusiast/gamer, chances are you'll be switching said current GPUs out anyway, to the point where the lack of ray tracing and/or rasterisation grunt and VRAM won't even be an issue then....
 
Soldato
Joined
21 Jul 2005
Posts
20,039
Location
Officially least sunny location -Ronskistats
You can use custom mods without issues too. I have been playing Skyrim and Fallout 4 with a good 30 mods or so (including texture packs) and have had no issues at 4K/3440x1440. There was only one person who mentioned he had issues with some indie game, and he was using like 100 mods IIRC - not exactly the norm even for PC gamers....

It could be argued that no GPU other than maybe a 3090 is really a "4K" gaming card, since people are having to make sacrifices for either ray tracing or texture packs (just the one game so far, which may not even be an issue if it gets fixed/improved). And then the question becomes: was it really worth paying the premium, given that the next lot of GPUs are around the corner and their mid-range versions will probably match/beat it for half the price? Then of course, if you really are a PC enthusiast/gamer, chances are you'll be switching said current GPUs out anyway, to the point where the lack of ray tracing and/or rasterisation grunt and VRAM won't even be an issue then....

Next gen is only round the corner - yet some have had this gen for a year.. don't be at it. 2022 looks strongly like the same short supply and slow supply lines. A really strange observation, other than having a dig at those that bought a 3090, I feel.

Games like Skyrim, which has been out since 2011, should quite happily play on most GPUs on the market, so I'm not really sure of the point of that statement. It was Jensen himself who said the flagship (3080) was a 4K card. Excuse it all you like, it is targeted to play at that - they even showcased gamers playing at 8K in that reveal!

Guys like @TNA played the game well. He will upgrade to a decent 40 series card, I'm sure of it, and he made excellent use of the codes etc. to make the 3080 as cheap as possible. He even bought his 3070 with money in the bank - we get this *applauds*.

I'm not really out of pocket myself, mind you. My card has cost me under £400 thanks to mining when not using it. I also trade the scraps I make, so it's more than just mining and selling each week. If you had said to me last November "would you take a 3090FE for £350?", I wouldn't have believed it.

Don't forget every day that passes is another £5 off my 3090.. so soon it will have cost me nothing!
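
Rough back-of-the-envelope check, assuming roughly £400 still to recoup and a steady £5/day from mining - both just the figures quoted above, not anything independently verified:
Code:
# Days until the card has effectively paid for itself, at the quoted rates.
remaining_cost_gbp = 400   # "under £400" net cost so far, per the post above
mining_per_day_gbp = 5     # "£5 off my 3090" per day
print(remaining_cost_gbp / mining_per_day_gbp, "days")   # -> 80.0 days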

@Nexus18

Feels worth it to me as a 3090FE owner. After all, I’ve had it for almost a year already and the new cards are probably still a year away.

Somehow I don't think it's sinking in...
 
Caporegime
Joined
4 Jun 2009
Posts
31,044
Next gen is only round the corner - yet some have had this gen for a year.. don't be at it. 2022 looks strongly like the same short supply and slow supply lines. A really strange observation, other than having a dig at those that bought a 3090, I feel.

Games like Skyrim, which has been out since 2011, should quite happily play on most GPUs on the market, so I'm not really sure of the point of that statement. It was Jensen himself who said the flagship (3080) was a 4K card. Excuse it all you like, it is targeted to play at that - they even showcased gamers playing at 8K in that reveal!

Guys like @TNA played the game well. He will upgrade to a decent 40 series card, I'm sure of it, and he made excellent use of the codes etc. to make the 3080 as cheap as possible. He even bought his 3070 with money in the bank - we get this *applauds*.

I'm not really out of pocket myself, mind you. My card has cost me under £400 thanks to mining when not using it. I also trade the scraps I make, so it's more than just mining and selling each week. If you had said to me last November "would you take a 3090FE for £350?", I wouldn't have believed it.

Don't forget every day that passes is another £5 off my 3090.. so soon it will have cost me nothing!



Somehow I don't think it's sinking in...

You mentioned GPUs being on the cusp of VRAM limits as of right now and said "discounting the custom mods people want to add to their games", implying that you can't use mods on lower-VRAM cards. I'm quite into modding games on the PC, and other than Bethesda open-world games I'm not sure what other games you can mod to the same extent? GTA 5 Redux etc. (which I did, with no issues here), not tried RDR2, anything else?

I would still classify a 3080 as a 4K card, the same way I would classify a 6800 XT as a 4K card, even though for both of them you have to reduce some settings (and use DLSS/FSR where possible) if you want a good 4K experience. Of course a 3090 and 6900 XT are even better for 4K, but again, chances are that if you are reducing settings on a 3080/6800 XT (and enabling DLSS/FSR) to achieve high fps on a high refresh rate 4K display, you will be doing exactly the same on a 3090/6900 XT too - it's not really rocket science....

If the likes of you and hrl don't intend on buying another GPU for a good 2-3+ years, then no doubt the 3090 is probably the best choice out of all the GPUs, especially when you have mined to recoup some of the cost. But if you only game and intend on getting the next one when it's out, then it is a bit daft imo. Of course stock and prices may be an issue again, but given most people's experience it's really not that hard to get an FE - the likes of myself and TNA got one within a couple of weeks of signing up to DC alerts. Heck, look at the owners' threads on here to see how many people are rocking FE cards....
 
Soldato
Joined
21 Jul 2005
Posts
20,039
Location
Officially least sunny location -Ronskistats
There are plenty of users who still post that they can't get one, and plenty have been shown the alerts. I would say most of that is true, except the 3080s seem to be the hardest to attain - they either don't ship enough or too many want them, so they last seconds.

Regarding the 4K stuff, I don't see why people must turn on RT. Yes, it's a new feature, but the 6800 XT and 3080 will play awesome without RT... it's only those who insist it must be on that gimp their own fps (unless you can run DLSS).
 
Caporegime
Joined
4 Jun 2009
Posts
31,044
There are plenty of users who still post that they can't get one, and plenty have been shown the alerts. I would say most of that is true, except the 3080s seem to be the hardest to attain - they either don't ship enough or too many want them, so they last seconds.

Regarding the 4K stuff, I don't see why people must turn on RT. Yes, it's a new feature, but the 6800 XT and 3080 will play awesome without RT... it's only those who insist it must be on that gimp their own fps (unless you can run DLSS).

Not doubting it is still hard to get one, but it's not exactly that hard if you have prepped and got everything ready.


Maybe because people who like better/next-gen visuals see the benefit of turning ray tracing on?

https://www.youtube.com/c/RayTracingRevolution

If we're using that kind of argument, we could also say the same for the "ultra" settings preset: why use it when we could all just set "high", see little to no difference in visuals, and often get double the performance back?
 