10GB VRAM enough for the 3080? Discuss..

With Ampere's tensor memory compression, the 3080 will use 20-40% less VRAM, so its 10GB is effectively anywhere between 12 and 14GB of Turing VRAM. Then we have the RTX I/O system with the DirectStorage API from Microsoft, which will allow the GPU to request data directly from the storage device at several times the speed currently possible. That means less VRAM caching is needed, since the GPU can just forget what it doesn't need and pull what it does need into its buffer near-instantly. All of these systems working together mean that 10GB of VRAM is more than enough for 4K gaming.
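
(For the curious: RTX I/O sits on top of Microsoft's DirectStorage API, which wasn't public when this thread was live. As a rough illustration only, a GPU-direct read against the DirectStorage SDK that Microsoft later shipped on PC looks something like the sketch below. The names come from the public dstorage.h header; the file path, byte count, and the pre-created D3D12 device, destination buffer, and fence are placeholders, and error handling is omitted.)

```cpp
#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Sketch: enqueue one read that lands straight in a GPU buffer,
// bypassing the normal CPU-side file/caching path.
void LoadAssetDirect(ID3D12Device* device, ID3D12Resource* destBuffer,
                     ID3D12Fence* fence, UINT64 fenceValue, UINT32 byteCount)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // Open the asset through DirectStorage rather than the normal file stack.
    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"textures/asset.bin", IID_PPV_ARGS(&file)); // placeholder path

    // A queue of read requests tied to the GPU's D3D12 device.
    DSTORAGE_QUEUE_DESC queueDesc = {};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // One request: read byteCount bytes from the file into the GPU buffer.
    DSTORAGE_REQUEST request = {};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source      = file.Get();
    request.Source.File.Offset      = 0;
    request.Source.File.Size        = byteCount;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = byteCount;
    request.UncompressedSize            = byteCount;

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence, fenceValue); // fence signals when the data has landed
    queue->Submit();
}
```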

I really don't think NVIDIA's engineers would purposely gimp their flagship GPU with only 10GB; they know what they're doing. People worry too much. Big numbers sell because too many people are ignorant of how the technology works.


... Only if developers use this tech in their games. The games have to support it, I believe.
 
Are there actual numbers? I would guess 99% of gamers have 8GB or less. Look at the Steam hardware survey... The vast majority of gamers have budget cards.

Would be nice to be able to filter by country and disregard laptops and systems older than a decade. It's somewhat skewed.

I have Steam on 3 PCs, for example: one with a Vega 56, one with a GTX 980 and one with an integrated laptop GPU. It's not an accurate assessment at all.
 
Are there actual numbers? I would guess 99% of gamers have 8GB or less. Look at the Steam hardware survey... The vast majority of gamers have budget cards.
Exactly. What will happen, though, is some developers will use a slightly less compressed version of the highest texture tier that looks 1% better and needs more than 10GB to work properly, and people will start saying “see, told you 10GB is not enough”. Lol. Then I will say drop textures one notch and show me the difference ;)

I am not saying the above will always be the case, but it will be like that for the most part imo.
 
If VRAM isn't an issue, why did AMD sell, say, the 580 in both 4GB and 8GB versions? Are they providing a useless extra 4GB of unneeded VRAM, or are they seriously gimping the 4GB variant?

The 8GB cards are faster than the 4GB cards, and the gap does grow.
The bus width fixes the options: divide the 320-bit bus by the capacity, and the resulting bits per GB has to be a width the chips can actually provide (32-bit, or 16-bit in clamshell mode):

320 / 10(GB) = 32
320 / 20(GB) = 16

320 / 16(GB) = 20 << not a multiple of 16

Micron are only doing 1GB chips at present; for something close to 16GB on a 3080 you would need 768MB chips, and then you would end up with 15.3GB (20 × 768MB).
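
(To make the arithmetic concrete, here's a trivial sketch. The 320-bit bus and the 32-bit GDDR6X chip interface are the real figures; everything else is just division.)

```cpp
#include <cstdio>

int main() {
    const int busWidthBits  = 320; // RTX 3080 memory bus
    const int chipWidthBits = 32;  // a GDDR6X chip has a 32-bit interface
    const int chips = busWidthBits / chipWidthBits;  // = 10 chips
    const int clamshellChips = 2 * chips;            // = 20 chips at x16 each

    // Densities Micron has shipped or announced: 1GB now, 2GB later.
    const int densitiesGB[] = {1, 2};
    for (int gb : densitiesGB) {
        std::printf("%2d chips x %dGB = %2dGB total\n", chips, gb, chips * gb);
    }

    // 16GB would need 1.6GB chips (10 of them) or 0.8GB chips (20 in
    // clamshell); neither density exists. The closest, 768MB, gives:
    std::printf("clamshell: %d chips x 768MB = %.2fGB\n",
                clamshellChips, clamshellChips * 0.768);
    return 0;
}
```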

Maybe they can do a 970 and add some slower VRAM (like the 970's 3.5GB + 0.5GB split).
10GB is fine for 1080p and 1440p. It's absolutely not fine for 4K. Even today, several titles need more than 10GB for consistently smooth gameplay. The consoles are getting an upgrade from 8GB to 16GB of shared RAM/VRAM this year, so it's common sense, even to small children, that VRAM requirements will increase going forward.

This is why games will need more VRAM in future. If you're going to upgrade next year then it's no problem, but for people who keep their cards for a few years it could be a problem.

EDIT: Jesus, I didn't mean to quote all of that. :rolleyes::p:rolleyes:
 
I don't see a problem for people who keep their cards for a few years if they also keep their monitors for a few years, except possibly the minority who already have a top-of-the-line 4K monitor, and those people tend to upgrade graphics cards more often anyway.

I've got a 1440p UW with G-Sync that I'm not upgrading for a LONG time, and I think a 3080 will easily keep up for 3-4 years.
 

That is precisely what I'm hoping for. I knew I was sort of sealing my fate 3 1/2 years ago when I spent £900 on an X34A, but I'm an open NVIDIA fanboi :D On paper the 3080 is perfect to keep me going a while longer. Even a 3070 would be a huge upgrade over my 980 Ti.
 
Yes, but a lot of people are, or will be, at 4K.
Sorry, but I don't agree. High-Hz 4K monitors are still super expensive and niche. There might be a higher number using 4K TVs, but they will most likely be sat on a sofa 6 feet away, unable to tell the difference between ultra and ultra-1 settings.
 

I should have said 4K TVs.
 
What people don't understand about the Steam survey is that it's not representative of the market for new GPU tech (nor of the display market). People gaming on laptop integrated graphics, old Polaris/Pascal cards, etc. don't care about new GPUs: they're out of their budget AND irrelevant to their games (Dota 2, Path of Exile, CS:GO, etc.), where what they already have is more than enough, and likewise for VRAM.

So it's erroneous to look at Steam's total market and draw conclusions about advanced GPU tech from it.
 

Not really, because the mass market drives implementation. Very few developers prioritise the high end, as those users are of marginal value in overall sales.
 