
10GB vram enough for the 3080? Discuss..

Th0nt, think logically and rationally. Are you actually saying that you think having 10GB of GDDR6X VRAM can completely compensate for a theoretical situation where you need 12GB of VRAM (whether GDDR6 or GDDR6X) to run some super duper fancy Ultra mode? Your reasoning being... because Nvidia have engineers that 'know their stuff', which somehow means that all of their decisions, in conjunction with the bean counters who have the final say on costing, must be great decisions? Eek.

Nvidia already said, in a public blog, that while more VRAM is always nice, they knowingly compromised on the VRAM amount of the 3080 due to cost factors. They then tried to justify it by saying they had spoken to devs and come up with 10GB as a good average that they felt was fine for this generation. https://www.nvidia.com/en-us/geforce/news/rtx-30-series-community-qa/



Did they speak to all devs? Can they predict all the visual settings that some devs may want to implement next year and beyond? No, of course they can't. They also used current and even previous generation games as examples of VRAM usage, and mentioned nothing about what they thought would happen in 2021/2022. It's nice that you have blind trust in corporations, but now and then it's worth engaging a little independent, critical thought.



"The only reason AMD can use GDDR6 is because they use less memory bandwidth because they have infinity cache" is a rather dismissive and blasé way of saying that AMD implemented some very smart engineering in their cards this generation. That is, objectively speaking, a real accomplishment that they could engineer a card with GDDR6 and have it perform as good or better as Nvidias cards with GDDR6X. It makes Nvidias decision to go with newer and more expensive GDDR6X ultimately look like a bad call.
Good find Rich with that Nvidia quote about keeping costs down.

I remember when it was said that the 4GB Fury X could overcome its VRAM limitation thanks to high bandwidth, and we all know how that worked out. :p
 
Good find Rich with that Nvidia quote about keeping costs down.

I remember when it was said that the 4GB Fury X could overcome its VRAM limitation thanks to high bandwidth, and we all know how that worked out. :p
That's the thing Matt, that quote and the blog it comes from have been there the whole time, straight from Nvidia, and have been referenced multiple times in this very thread... but it never ceases to amaze me how far people will still go to defend and rationalise something, even when it often doesn't make logical sense. For example, PrincessFrosty has written literally thousands of words across hundreds of posts in this thread trying to convince people, with an intensity that borders on the manic (by which I mean someone so seemingly obsessed with this topic that they would put that much energy and effort into it), that they are crazy for thinking 10GB of VRAM may not be a smart decision for Nvidia's newest flagship. That's the kind of tunnel-visioned, relentless, borderline obsessive mentality we have today among some individuals in the PC hardware enthusiast community, who leave no room in their minds for error or compromise and will just engage in incessant, overly lengthy circular arguments until other people give up trying to have a sensible two-way discussion.
 
The Watch Dogs: Legion HD texture pack requires 11GB of memory but runs fine on a 3080. The texture pack itself is over 20GB, which would only fit entirely in the RAM of a 3090, yet it works fine on the 3080. Physical memory size is not the only factor.

4K / Ultra Settings
  • CPU: Intel Core i7-9700K / AMD Ryzen 7 3700X
  • GPU: NVIDIA GeForce RTX 2080 Ti or AMD Radeon VII
  • VRAM: 11 GB
  • RAM: 16 GB (Dual-channel setup)
  • Storage Space: 45 GB (+ 20 GB HD Textures Pack)
  • Operating System: Windows 10 (x64)

The information showing that most released games have enough vRAM has already been posted: they really use between 3-8GB. Yet people still argue there is an issue with vRAM size. The only evidence is an unreleased, AMD-sponsored game that claims 12GB for an extreme texture pack. That's cherry-picking at take-the-**** levels; in a debate you'd call it clutching at straws.

This is important because a card with Turing-level RT performance (see the link already posted, and note why there are no RT benchmarks from AMD), which won't be able to run 4K in an RT-heavy game like Control without upscaling (a huge quality loss), has more vRAM for more textures whose quality can only be seen at 4K or above.

This is based on the information we have at the moment. Information could change later, but the '10GB is not enough' position left the evidence behind right from the start. 8GB is enough at the moment, and 10GB is fine even in a game with 20GB+ of HD textures.

Yet quality at 4K is somehow important to people with a 1660 Super with 6GB of VRAM.

https://www.guru3d.com/articles-pag...-graphics-performance-benchmark-review,9.html



As you can see, without the HQ texture pack in ultra HD, we're now at 7GB, closing in at 8 GB graphics memory usage. Ergo this is why I call 8GB the minimum these days.



https://www.overclock3d.net/reviews/software/watch_dogs_2_pc_performance_review/11

When using the game's high-resolution texture pack players will be able to use the game's ultra texture setting, allowing the game to use up to 6GB of VRAM at 4K.

Ray Tracing On - 4K / Ultra Settings
  • CPU: Intel Core i7-9900K / AMD Ryzen 7 3700X
  • GPU: Nvidia Geforce RTX 3080
  • VRAM: 10GB
  • RAM: 16GB
  • Storage: 45GB (+ 20GB HD texture pack)
  • OS: Windows 10

https://www.gfinityesports.com/arti...ystem-requirements-minimum-max-settings-uplay

Literally maxed @4K w/ HD texture pack (motion blur off, as always):


Literally maxed @4K with DLSS set to Quality w/ HD texture pack (I can't tell the difference, other than FPS):


DLSS reduces memory usage.
 
Th0nt, are you saying that you think having 10GB of faster GDDR6X VRAM can completely compensate for a theoretical situation (let's for a moment assume that the Godfall devs did actually implement it) where 12GB or more of VRAM (whether GDDR6 or GDDR6X) is required to run some super duper fancy Ultra mode? Your reasoning being... because Nvidia have engineers that 'know their stuff', which somehow means that all of their decisions, even though it's the management bean counters who actually have the final say on how generous the specs can be, must be great decisions? That's a lot of blind faith to have.

Be mindful of cherry-picking points, dog. I have said many times in multiple threads that it makes no sense for any GPU company to be scrimping on VRAM when 4/8GB has been the norm for AMD and 3/6GB the norm for the mainstream stacks since around 2015. We are now at the end of 2020 and games seem to bloat their file sizes year on year. I own both a Vega 56 (8GB VRAM) and a 1060 (6GB), and as I also keep informed on the mining scene, the more VRAM the better when it comes to the longevity of algorithms and tweaking (so not just games).

That being said, my point was that the 10GB of the 3080 is right on the edge of what they can get away with, and you have skimmed over the bandwidth and focused on the VRAM size. Whilst I agree that others have posted walls of text on the subject (which instantly makes me leave the pages), some of them are more knowledgeable than me on the topic. So my point? Just because a card has less VRAM doesn't mean it's an instant fail; you need to check how fast the memory can move the data before jumping to conclusions - architecture 101.

The same will apply to the 3070 when it comes to the top-end 1440p games and at 4K. Most will work, but there will have to be some cheats, hacks or tricks to get Nvidia cards around the pitfalls in some of these demanding titles - or they will be put to the sword; something AMD don't have to worry about as they went the generous route.
 
So guys, if the developer claims Godfall will use 12GB at 4K with 4K textures, is it reasonable to assume that if I get the 3080 10GB for 1440p and don't install the 4K texture pack, my usage will stay below 10GB?

I think if it is 8GB-10GB, then 10GB at 1440p is good enough for cross-gen and next-gen (3+ years). I wish AMD had a DLSS alternative; I would go with the 6800 XT if they did, but they have no answer as of now. I am waiting until January 2021 anyway, so let's see what they come up with.
 
Be mindful of cherry-picking points, dog. I have said many times in multiple threads that it makes no sense for any GPU company to be scrimping on VRAM when 4/8GB has been the norm for AMD and 3/6GB the norm for the mainstream stacks since around 2015. We are now at the end of 2020 and games seem to bloat their file sizes year on year. I own both a Vega 56 (8GB VRAM) and a 1060 (6GB), and as I also keep informed on the mining scene, the more VRAM the better when it comes to the longevity of algorithms and tweaking (so not just games).

That being said, my point was that the 10GB of the 3080 is right on the edge of what they can get away with, and you have skimmed over the bandwidth and focused on the VRAM size. Whilst I agree that others have posted walls of text on the subject (which instantly makes me leave the pages), some of them are more knowledgeable than me on the topic. So my point? Just because a card has less VRAM doesn't mean it's an instant fail; you need to check how fast the memory can move the data before jumping to conclusions - architecture 101.

The same will apply to the 3070 when it comes to the top-end 1440p games and at 4K. Most will work, but there will have to be some cheats, hacks or tricks to get Nvidia cards around the pitfalls in some of these demanding titles - or they will be put to the sword; something AMD don't have to worry about as they went the generous route.
No one has said that having 10GB of VRAM is an "instant fail"; that wouldn't even make logical sense, as it is a powerful GPU that handles all current games. The overwhelming message from the community is that there are clear concerns as to whether the 10GB of VRAM that Nvidia assigned to their new flagship card will be sufficient for this entire generation of high-end gaming. This isn't just a few people on this forum; the concern is strongly represented across the entire tech community, consumers and reviewers alike, and it was voiced from the moment the card specs were known. It's a valid potential limitation with a plausible outcome, and over the next 12-24 months we will see whether or not it materialises in any impactful way.
 

Yeah, I mean those reviewers are doing the same old wrong thing, which is telling you how much memory is requested to be allocated, not how much is in use. And now that people have worked out how to get around the DRM by disabling the BattlEye launcher, they've run tests for real vRAM usage. Just like every other game tested, it's lower than the requested amount, by about 1-2GB. So 10GB is absolutely fine for the 3080; that's why when you play and/or benchmark the cards the apparent lack of vRAM doesn't manifest in frame rate drops. The fact is that the game at 4K Ultra is hammering the GPU into unplayable frame rates; it looks pretty as hell and it sits inside the 3080's vRAM budget, but the GPU gives out first.

All those reviewers and the people writing the minimum game specs - none of them are accounting for the fact that the modern tools used to measure this are simply reporting an irrelevant number. I know these people are supposed to be industry experts, but the point is they're slow on the uptake here: these are recent changes, the tools for measuring this are relatively new and not even out of beta yet, and measuring memory usage is a subtle and complex topic.
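To illustrate that allocated-versus-used distinction in the abstract, here is a minimal toy sketch in Python. It is not how any real overlay or engine works; `TexturePool`, `allocate` and `sample` are made-up names and the sizes are arbitrary. The only point is that the number an overlay typically reports (what the game has reserved) can sit well above the working set a frame actually touches.

```python
# Toy illustration (not any real tool or engine): the figure most overlays
# report is the size of allocations the game has requested, while the
# working set actually touched each frame can be noticeably smaller.

class TexturePool:
    def __init__(self):
        self.allocated = {}   # name -> bytes the engine reserved
        self.touched = set()  # names actually sampled this frame

    def allocate(self, name, size_bytes):
        self.allocated[name] = size_bytes

    def sample(self, name):
        self.touched.add(name)

    def allocated_gb(self):
        return sum(self.allocated.values()) / 2**30

    def working_set_gb(self):
        return sum(self.allocated[n] for n in self.touched) / 2**30


pool = TexturePool()
for i in range(100):
    pool.allocate(f"tex_{i}", 100 * 2**20)   # engine reserves 100 textures of 100 MB
for i in range(0, 100, 2):
    pool.sample(f"tex_{i}")                  # but only half are sampled this frame

print(f"allocated:   {pool.allocated_gb():.1f} GB")    # ~9.8 GB, the "usage" an overlay might show
print(f"working set: {pool.working_set_gb():.1f} GB")  # ~4.9 GB actually needed right now
```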

WDL, being about as close to a next-gen title as we have, tells us three things:
1) Games in future require more vRAM.
2) Games in future require more GPU horsepower.
3) The 3080's GPU gives out before its vRAM budget is exceeded.

FS2020, Avengers and Crysis Remastered all tell the same story.
 
I wouldn't bet on bandwidth being a replacement for storage at all. There's a finite amount of data you can fit in VRAM regardless of how fast it is. More bandwidth is always nice, but you still end up having to swap out to RAM, and that will always be the bottleneck.

That being said, my point was that the 10GB of the 3080 is right on the edge of what they can get away with, and you have skimmed over the bandwidth and focused on the VRAM size. Whilst I agree that others have posted walls of text on the subject (which instantly makes me leave the pages), some of them are more knowledgeable than me on the topic. So my point? Just because a card has less VRAM doesn't mean it's an instant fail; you need to check how fast the memory can move the data before jumping to conclusions - architecture 101.

It seems like the vague definition of the phrase 'wall of text' is forever shrinking, and I'm sure mobile devices are to blame for it. There's some really good stuff in this thread, you shouldn't be put off because it's more than a couple of lines of text.
 
So guys, if the developer claims Godfall will use 12GB at 4K with 4K textures, is it reasonable to assume that if I get the 3080 10GB for 1440p and don't install the 4K texture pack, my usage will stay below 10GB?

I think if it is 8GB-10GB, then 10GB at 1440p is good enough for cross-gen and next-gen (3+ years). I wish AMD had a DLSS alternative; I would go with the 6800 XT if they did, but they have no answer as of now. I am waiting until January 2021 anyway, so let's see what they come up with.

The 3080 10GB will be fine for 1440p for years, IMO. There's a huge jump from 1440p to 4K in terms of VRAM.
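A rough back-of-envelope sketch of why the jump matters, assuming only resolution-dependent render targets scale with pixel count while texture memory stays roughly constant; the 1.5GB and 6GB figures below are placeholders, not measurements from any game.

```python
# Back-of-envelope: only resolution-dependent buffers scale with pixel count;
# texture memory is largely resolution-independent.
px_1440p = 2560 * 1440
px_4k = 3840 * 2160
scale = px_4k / px_1440p          # 2.25x more pixels at 4K

buffers_1440p_gb = 1.5            # placeholder figure for render targets at 1440p
textures_gb = 6.0                 # placeholder figure, unchanged by resolution

total_1440p = buffers_1440p_gb + textures_gb
total_4k = buffers_1440p_gb * scale + textures_gb
print(f"4K has {scale:.2f}x the pixels of 1440p")
print(f"example totals: {total_1440p:.1f} GB at 1440p vs {total_4k:.1f} GB at 4K")
```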
 
The Watch Dogs: Legion HD texture pack requires 11GB of memory but runs fine on a 3080. The texture pack itself is over 20GB, which would only fit entirely in the RAM of a 3090, yet it works fine on the 3080. Physical memory size is not the only factor.

4K / Ultra Settings
  • CPU: Intel Core i7-9700K / AMD Ryzen 7 3700X
  • GPU: NVIDIA GeForce RTX 2080 Ti or AMD Radeon VII
  • VRAM: 11 GB
  • RAM: 16 GB (Dual-channel setup)
  • Storage Space: 45 GB (+ 20 GB HD Textures Pack)
  • Operating System: Windows 10 (x64)

The information showing that most released games have enough vRAM has already been posted: they really use between 3-8GB. Yet people still argue there is an issue with vRAM size. The only evidence is an unreleased, AMD-sponsored game that claims 12GB for an extreme texture pack. That's cherry-picking at take-the-**** levels; in a debate you'd call it clutching at straws.

This is important because a card with Turing-level RT performance (see the link already posted, and note why there are no RT benchmarks from AMD), which won't be able to run 4K in an RT-heavy game like Control without upscaling (a huge quality loss), has more vRAM for more textures whose quality can only be seen at 4K or above.

This is based on the information we have at the moment. Information could change later, but the '10GB is not enough' position left the evidence behind right from the start. 8GB is enough at the moment, and 10GB is fine even in a game with 20GB+ of HD textures.

Yet quality at 4K is somehow important to people with a 1660 Super with 6GB of VRAM.

https://www.guru3d.com/articles-pag...-graphics-performance-benchmark-review,9.html







https://www.overclock3d.net/reviews/software/watch_dogs_2_pc_performance_review/11



Ray Tracing On - 4K / Ultra Settings
  • CPU: Intel Core i7-9900K / AMD Ryzen 7 3700X
  • GPU: Nvidia Geforce RTX 3080
  • VRAM: 10GB
  • RAM: 16GB
  • Storage: 45GB (+ 20GB HD texture pack)
  • OS: Windows 10

https://www.gfinityesports.com/arti...ystem-requirements-minimum-max-settings-uplay

Literally maxed @4K w/ HD texture pack (motion blur off, as always):


Literally maxed @4K with DLSS set to Quality w/ HD texture pack (I can't tell the difference, other than FPS):


DLSS reduces memory usage.

You sound extremely desperate to convince yourself that 10GB is ample on a brand new flagship 3080 for 4K. It's not, and there's no excuse.

The flagship shouldn't have any questions hanging over whether its VRAM is enough for now, or for one year's time. The very fact that it does is simply not good enough. Don't worry though, Nvidia will quickly rectify it with a 20GB 3080 Ti and all will be quickly forgotten and forgiven, leaving the four people who managed to get a 3080 shafted.

Then again, I highly doubt the resale value of the 3080 10GB cripple edition will be affected, as Nvidia fans will buy a polished turd if it has Nvidia's logo on it.
 
I wouldn't bet on bandwidth being a replacement for storage at all. There's a finite amount of data you can fit in VRAM regardless of how fast it is. More bandwidth is always nice, but you still end up having to swap out to RAM, and that will always be the bottleneck.

Yes, memory bandwidth is objectively not the same as VRAM capacity, and I neglected to reply to that part of Th0nt's post. The specific topic here is VRAM capacity, and unless someone can show me that GDDR6X bandwidth can actually compensate for not having enough VRAM to run a specific graphical profile or setting, then as nice as it is to have that extra bandwidth, it is not that relevant to the discussion for me.

It seems like the vague definition of the phrase 'wall of text' is forever shrinking, and I'm sure mobile devices are to blame for it. There's some really good stuff in this thread, you shouldn't be put off because it's more than a couple of lines of text.
By "walls of text", I mean some people are writing way more than they actually need to, using excessive amounts of blah and filler, in order to get their points across. Some people have been doing it habitually (and maybe unconsciously) within this thread and it a great way of making people zone out of whatever message they are trying to convey and killing discussion in the process.
 
You sound extremely desperate to convince yourself that 10GB is ample on a brand new flagship 3080 for 4K. It's not, and there's no excuse.

The flagship shouldn't have any questions hanging over whether its VRAM is enough for now, or for one year's time. The very fact that it does is simply not good enough. Don't worry though, Nvidia will quickly rectify it with a 20GB 3080 Ti and all will be quickly forgotten and forgiven, leaving the four people who managed to get a 3080 shafted.

Then again, I highly doubt the resale value of the 3080 10GB cripple edition will be affected, as Nvidia fans will buy a polished turd if it has Nvidia's logo on it.

No evidence, so just a personal attack. There's also no evidence of a 20GB 3080 Ti, just the rumour mill. With standards this low for evidence, anything can be true.
 
Yes, memory bandwidth is objectively not the same as VRAM capacity, and I neglected to reply to that part of Th0nt's post. The specific topic here is VRAM capacity, and unless someone can show me that GDDR6X bandwidth can actually compensate for not having enough VRAM to run a specific graphical profile or setting, then as nice as it is to have that extra bandwidth, it is not that relevant to the discussion for me.

By being able to load textures and data into vRAM faster, you don't need to keep everything in vRAM. You can stream data into vRAM as you get close to new areas, then delete the data you don't need. That's the whole point of the RTX IO feature: it saves vRAM space.
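As a rough illustration of that streaming idea only - this is not the actual RTX IO or DirectStorage API, and `StreamingBudget`, `request` and the sizes are invented for the sketch - the pattern is basically a resident set with eviction:

```python
from collections import OrderedDict

# Toy sketch of streaming with LRU eviction: keep only nearby assets resident
# and evict the least recently used ones when the VRAM budget would be exceeded.
# Illustrative only; RTX IO itself is a driver/DirectStorage-level feature,
# not an API you call like this.

class StreamingBudget:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()  # asset -> size, ordered by last use

    def request(self, asset, size_bytes):
        if asset in self.resident:
            self.resident.move_to_end(asset)                # already in VRAM, mark as recently used
            return
        while sum(self.resident.values()) + size_bytes > self.budget:
            evicted, _ = self.resident.popitem(last=False)  # drop least recently used
            print(f"evict {evicted}")
        self.resident[asset] = size_bytes                   # "stream in" from SSD
        print(f"load {asset}")


vram = StreamingBudget(budget_bytes=10 * 2**30)             # pretend 10 GB budget
for area in ["district_a", "district_b", "district_c"]:
    vram.request(f"{area}_textures", 4 * 2**30)             # 4 GB of textures per area
```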

Object pop-in and stutter can be reduced, and high-quality textures can be streamed at incredible rates, so even if you’re speeding through a world, everything runs and looks great. In addition, with lossless compression, game download and install sizes can be reduced, allowing gamers to store more games on their SSD while also improving their performance.

This begs the question: do you actually know what you are talking about? https://www.nvidia.com/en-gb/geforce/news/rtx-io-gpu-accelerated-storage-technology/
 
By "walls of text" mean some people writing way more than they actually need to, using excessive amounts of blah and filler, in order to get their points across. Some people have been doing it habitually (and maybe unconsciously) within this thread and it a great way of making people zone out of whatever message they are trying to convey and killing discussion in the process.
It really does have that effect, doesn't it?

Makes me want to take a long dirt nap. :p
 
No evidence, so just a personal attack. There's also no evidence of a 20GB 3080 Ti, just the rumour mill. With standards this low for evidence, anything can be true.

You're aware that up until Nvidia's 3000-series announcement, all news of the 3070, 3080 and 3090 was also 'rumours'? We already knew most of the specs and details before they launched; it's the same now with the 3080 Ti news. It's obvious, even to a small child, that a 3080 Ti is coming, with much more than the 10GB of VRAM the 3080 has.

Also if you consider my post a 'personal attack', you have issues.

~90% of your posts from the last few weeks appear to have been in this thread, discussing the paltry 10GB on the 3080. I didn't count exactly, but it appears to be 50+ posts in this thread. You have, or had, a 3080 on pre-order, so it's logical that you're now concerned about it not being enough. Otherwise, why would you have so many posts in this thread, over so many weeks?
 
By being able to load textures and data into vRAM faster, you don't need to keep everything in vRAM. You can stream data into vRAM as you get close to new areas, then delete the data you don't need. That's the whole point of the RTX IO feature: it saves vRAM space.

It can reduce the VRAM required, yes. It's no replacement for a lack of VRAM though. Remember, the fastest consumer SSDs are approaching what, 8GB/sec read/write? Now look at the bandwidth of a 3080: almost 100 times more. When the 3080 runs out of VRAM - and that's a 'when', not an 'if', as we've been saying from the start - RTX IO isn't going to save it.
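Rough numbers behind that ratio, assuming 19 Gbps GDDR6X on the 3080's 320-bit bus and a ~7 GB/s PCIe 4.0 NVMe drive; both figures are ballpark assumptions, not measurements.

```python
# Rough comparison of on-card memory bandwidth vs what an SSD can feed it.
gddr6x_gbps_per_pin = 19           # assumed effective data rate per pin (Gbit/s)
bus_width_bits = 320               # 3080 memory bus
vram_bandwidth_gb_s = gddr6x_gbps_per_pin * bus_width_bits / 8   # = 760 GB/s

ssd_gb_s = 7                       # assumed fast PCIe 4.0 NVMe drive, sequential
print(f"VRAM: {vram_bandwidth_gb_s:.0f} GB/s, SSD: {ssd_gb_s} GB/s, "
      f"ratio ~{vram_bandwidth_gb_s / ssd_gb_s:.0f}x")
```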


By "walls of text", I mean some people are writing way more than they actually need to, using excessive amounts of blah and filler, in order to get their points across. Some people have been doing it habitually (and maybe unconsciously) within this thread and it a great way of making people zone out of whatever message they are trying to convey and killing discussion in the process.

Well, all I can say is that it doesn't affect me. Creating an argument and then dismissing counters because they're too long - that's not cricket in my book.
 
It can reduce the VRAM required, yes. It's no replacement for a lack of VRAM though. Remember, the fastest consumer SSDs are approaching what, 8GB/sec read/write? Now look at the bandwidth of a 3080: almost 100 times more. When the 3080 runs out of VRAM - and that's a 'when', not an 'if', as we've been saying from the start - RTX IO isn't going to save it.

Not to mention that this tech is also used in the new consoles. If it were so great, why would the new consoles have doubled their total memory from 8GB to 16GB? Clearly the VRAM is still required.
 
It seems like the vague definition of the phrase 'wall of text' is forever shrinking, and I'm sure mobile devices are to blame for it. There's some really good stuff in this thread, you shouldn't be put off because it's more than a couple of lines of text.

I have to do a lot of reading in my day job - sysadmin, helpdesks, etc. I have to absorb key details and weed out the chaff. A lot of posts on here could be more to the point, or even link to resources for deeper reading; you don't need reams of paragraphs to get your point across. :)

I acknowledge his statement because having a degree in Computer Science (or similar) will give you enough of the fundamentals. Not everyone will have this, nor need it, but that's just my black-and-white response for when dog or anyone else asks me a question, paraphrases or strawmans - it works both ways. Happy to discuss.

Yes, memory bandwidth is objectively not the same as VRAM capacity, and I neglected to reply to that part of Th0nt's post.

I just pointed it out as it's valid, regardless of your sidestep there. I'm sure @PrincessFrosty would meet me halfway on this one!
 
A 3080 Ti coming = pretty much guaranteed. Much more RAM? That's speculation.

I guess it's conceivable that the 3080 Ti will use 12x1GB modules on a 384-bit bus, though that would involve expensive re-engineering of the PCB. I think it's more likely we see a 20GB version using 2GB modules (10x2GB), or 24GB (24x1GB).
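A quick sketch of the constraint behind those configurations, assuming each GDDR6X module sits on its own 32-bit channel (and ignoring clamshell/double-sided layouts, which is how a 24x1GB configuration on a 384-bit bus works); `capacity_options` is just an illustrative helper, not a real tool.

```python
# Capacity options follow from bus width: each GDDR6X module is 32 bits wide,
# so a 320-bit bus takes 10 modules and a 384-bit bus takes 12 (single-sided).
def capacity_options(bus_width_bits, densities_gb=(1, 2)):
    modules = bus_width_bits // 32
    return {f"{modules} x {d}GB": modules * d for d in densities_gb}

print(capacity_options(320))   # {'10 x 1GB': 10, '10 x 2GB': 20}  -> 3080-style options
print(capacity_options(384))   # {'12 x 1GB': 12, '12 x 2GB': 24}  -> 384-bit options
```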
 