10GB VRAM enough for the 3080? Discuss..

Soldato · Joined: 8 Jun 2005 · Posts: 5,193
I’m only running at 3440x1440, and I’m using more than 10GB of VRAM as I write this, so I definitely won’t be buying a card with less VRAM than I have.
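
For anyone who wants to sanity-check their own numbers, here is a minimal sketch using the pynvml bindings for NVIDIA's management library (assuming an NVIDIA card with the nvidia-ml-py package installed). One caveat: monitoring tools like this report VRAM that is allocated, which is not necessarily VRAM the game actually needs.

[CODE=python]
# Minimal sketch: report current VRAM usage on the first NVIDIA GPU.
# Assumes the nvidia-ml-py package is installed (pip install nvidia-ml-py).
# Note: "used" is allocated memory, not necessarily required memory.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used:  {mem.used / 1024**3:.2f} GiB")
print(f"VRAM total: {mem.total / 1024**3:.2f} GiB")
pynvml.nvmlShutdown()
[/CODE]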
 
Associate · Joined: 29 Sep 2011 · Posts: 43 · Location: UK, Kent
From recent benchmarks shown on the Videocardz website, the 3080 can sustain 60fps+ at 4K. It would be nice to see what the VRAM usage is during the review process. Before the announcements, I saw on WCCFTech and on Moore's Law Is Dead that there is a rumoured 3080 SKU with 20GB. I suspect that when AMD announces Big Navi with more VRAM than the 3080's, Nvidia will launch this card. Hence why I am waiting for AMD's cards to launch before I pull the trigger.
 

TNA · Caporegime · Joined: 13 Mar 2008 · Posts: 27,559 · Location: Greater London
I suspect that when AMD announces Big Navi with more VRAM than the 3080's, Nvidia will launch this card.
Thing is, they will, but they will also want more money for it. If I knew waiting meant I would get the 20GB version just before Cyberpunk 2077 is out for £649, I would wait. But that almost certainly won’t happen.
 

TNA · Caporegime · Joined: 13 Mar 2008 · Posts: 27,559 · Location: Greater London
Yeah, a 20GB version would be around £800, I reckon.
That is just too much. And all for what? The possibility that I may need to use the Nightmare instead of the Ultra Nightmare texture setting? Yeah, I think not, especially when I will only upgrade again once the next gen is out, so VRAM won’t be an issue by then. Oh, and in the majority of cases one needs a magnifying glass on a still shot to see the difference between the two texture settings.

Each to their own though; for some people, maxing everything out is a must, whether or not it makes any difference to image quality. I don’t suffer from that problem. I enjoy tinkering and am happy to use one setting lower if I can’t see a difference between the two in-game.
 
Associate · Joined: 1 Jun 2015 · Posts: 74
Each to their own though; for some people, maxing everything out is a must, whether or not it makes any difference to image quality.

Isn’t that why we’re all here though?
 
Soldato · Joined: 18 May 2010 · Posts: 22,376 · Location: London
That is just too much. And all for what?

The only thing to consider is the rumours are that the new AMD cards are indeed 16GB cards.
 

TNA · Caporegime · Joined: 13 Mar 2008 · Posts: 27,559 · Location: Greater London
Isn’t that why we’re all here though?
Not sure what you mean, elaborate please.

I game at 4K, which already looks much better than what the 90% of gamers on lower resolutions are seeing. That makes a MUCH bigger difference to IQ, in my opinion, than choosing Nightmare instead of Ultra Nightmare textures. If you don’t believe me and you have Doom, go try it, then come back and show us the difference between the two; you will struggle, I reckon.
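
If anyone wants to put a number on that rather than eyeball it, here is a rough sketch (Pillow + NumPy, both file names purely hypothetical) that compares two screenshots of the same still scene taken at each texture setting:

[CODE=python]
# Toy sketch: quantify the pixel-level difference between two screenshots
# of the same scene at different texture settings. File names are
# hypothetical; both images must have the same resolution.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("nightmare.png").convert("RGB"), dtype=np.float64)
b = np.asarray(Image.open("ultra_nightmare.png").convert("RGB"), dtype=np.float64)

diff = np.abs(a - b)
print(f"mean absolute difference per channel: {diff.mean():.2f} / 255")
print(f"share of pixels that differ at all: {100 * (diff.sum(axis=2) > 0).mean():.1f}%")
[/CODE]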


The only thing to consider is the rumours are that the new AMD cards are indeed 16GB cards.
Not my concern; I have two G-Sync panels here and want to go Nvidia on this occasion. I doubt Nvidia will lower prices much below £649 unless AMD really try to go competitive and offer 3080 performance with 16GB for, say, £499, which I can’t see them doing. It is possible, but AMD these days don’t go down that route with their GPUs.

I will order the FE on release and take my chances.


You can use whatever justification you like, but I'm not moving down to a 10GB card after having 11GB for over 3 years.
That’s your call; no one is trying to convince you to do otherwise. I recommend buying a 24GB 3090! ;)

Just ask Kaapstad, who has had 24GB Titan RTXs for around 2 years, how useful that has been :p
 

Stu · Soldato · Joined: 19 Oct 2002 · Posts: 2,739 · Location: Wirral
20GB of GDDR6X will really push the price up! Surely that would be approaching 3090 money? I'm not convinced the performance increase would justify the cost.
 
Associate · Joined: 1 Jun 2015 · Posts: 74
Not sure what you mean, elaborate please.

I may have misunderstood; I meant that the whole purpose of this forum is to get the best graphical quality we can, whether that be FPS or resolution.
 
Associate · Joined: 12 Sep 2020 · Posts: 2
With Ampere's tensor memory compression, the 3080 will use 20-40% less VRAM, so it effectively has anywhere between 12 and 14GB of Turing-equivalent VRAM. Then we have RTX IO with Microsoft's DirectStorage API, which will allow the GPU to request data directly from the storage device several times faster than is currently possible, so less VRAM caching is needed: the GPU can just drop what it doesn't need and pull what it does need into its buffer near-instantly. All of these systems working together mean that 10GB of VRAM is more than enough for 4K gaming.
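
To illustrate just the caching argument (to be clear, this is not the RTX IO or DirectStorage API; it is a toy model and every name in it is made up), here is a sketch of an LRU texture cache: the smaller the resident VRAM budget, the more often assets must be re-read from storage, and the pitch of direct storage access is that those re-reads become cheap enough not to matter.

[CODE=python]
# Toy model only -- NOT the RTX IO / DirectStorage API. It shows how a
# smaller VRAM budget trades resident memory for extra storage reads.
from collections import OrderedDict

class TextureCache:
    """LRU cache standing in for textures kept resident in VRAM."""
    def __init__(self, budget_mb, fetch):
        self.budget_mb = budget_mb
        self.fetch = fetch          # callable simulating a storage read
        self.cache = OrderedDict()  # texture name -> size in MB
        self.used_mb = 0

    def get(self, name, size_mb):
        if name in self.cache:
            self.cache.move_to_end(name)  # hit: no storage read needed
            return
        # Miss: evict least-recently-used textures until the new one fits.
        while self.used_mb + size_mb > self.budget_mb and self.cache:
            _, evicted_mb = self.cache.popitem(last=False)
            self.used_mb -= evicted_mb
        self.fetch(name)                  # "read" the texture from storage
        self.cache[name] = size_mb
        self.used_mb += size_mb

for budget in (1024, 2048):  # resident VRAM budget in MB
    reads = []
    cache = TextureCache(budget_mb=budget, fetch=reads.append)
    for frame in range(3):                        # render three frames
        for tex in ("rock", "grass", "skybox"):   # each needs a 512MB texture
            cache.get(tex, size_mb=512)
    print(f"{budget}MB budget -> {len(reads)} storage reads over 3 frames")
[/CODE]

With the 1GB budget the cache thrashes (9 reads over 3 frames); with 2GB everything stays resident after the first frame (3 reads). Whether real 4K games sit closer to one case or the other is exactly what the reviews will need to show.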

I really don't think NVIDIA's engineers would purposely gimp their flagship GPU with only 10GB if they didn't know what they were doing. People worry too much. Numbers sell because too many people are ignorant of how the technology works.
 