10GB VRAM enough for the 3080? Discuss..

For AI and ML workloads, which require huge amounts of memory. Like the Titan before it, the 3090 is not just a gaming card.

The Death Stranding metrics Poneros posted are really interesting. At 8K native a Titan RTX uses 11.2 GB. At 8K DLSS (with better image quality) it uses 8.7 GB (so 15.3 GB of VRAM sat doing nothing).
Cool, a converted PS4 game.
We won't really know until new games get released that are designed around better hardware.
 
How does DLSS affect things? If people are rendering at lower resolutions and upscaling, does that require less VRAM?
That is what I am thinking. The games most likely to be VRAM-hungry will be triple-A titles, which will likely also have DLSS 2.0. So in the potential handful of games released between now and the 4000 series, which is when I will upgrade, I have the option of either using DLSS 2.0 or lowering textures by one notch. No biggie.
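For a rough sense of why rendering at a lower internal resolution saves memory, here is a hypothetical back-of-envelope sketch. The RGBA16F format, the resolutions compared, and the target_mb helper are my own assumptions for illustration, not anything from the thread:

```cpp
#include <cstdio>

// Approximate size of one render target: width * height * bytes per pixel.
// 8 bytes/pixel assumes an RGBA16F buffer; real pipelines hold several such
// targets (G-buffer layers, depth, DLSS history), so multiply accordingly.
double target_mb(int width, int height, int bytes_per_pixel) {
    return static_cast<double>(width) * height * bytes_per_pixel
           / (1024.0 * 1024.0);
}

int main() {
    std::printf("4K native target:      %.0f MB\n", target_mb(3840, 2160, 8));
    std::printf("1440p internal target: %.0f MB\n", target_mb(2560, 1440, 8));
    // Prints roughly 63 MB vs 28 MB: every intermediate buffer that scales
    // with the internal resolution shrinks by the same ratio under DLSS.
}
```

That is loosely consistent with the Death Stranding numbers earlier in the thread: the total saving is bigger than one buffer because several intermediate targets scale with internal resolution, while the texture pool does not.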


Surely game developers will just use lower quality textures etc. to ensure they stay below 10GB.

Just like when CPUs stagnated, game developers held back until core counts increased.

Game developers have been adjusting to the changing specs of home computers for the last 30 years. Don't worry.
Exactly. Some make it sound like the game won't work; it will, just go down one setting on textures if ever needed.


I find it weird that a new flagship card in 2020 has less VRAM than a flagship card from 2017. For me it comes across as a cost-cutting move, and I can't help but feel that in 1-2 years' time it won't be enough.
That's just marketing, mate. To me the 3090 is the flagship, and even if that's not the case, a 3080 Ti won't be long away.

It is simple: they wanted to keep the cost of the 3080 down and offer that level of performance cheaper.

End of the day, if you intend to keep the card for 4-5 years, then just wait a bit and go for the 16 or 20GB variants.
 
Why do we use caches? Because loading information from disk is typically very slow and we need the information readily available on a high-speed interface, so we put stuff in cache. Now, the faster you can get information into that cache, the less your total cache size matters, as long as you have strategies for feeding the cache in advance of needing it.

The thinking from NVIDIA is that with RTX IO (and DirectStorage) they can better optimise the loading of information from disk to the GPU. Their claim is that they can move something like 24GB/second of compressed data from an NVMe SSD over the PCI-E bus into VRAM. If that's true, we're talking about being able to fill a 10GB cache in under half a second (10 ÷ 24 ≈ 0.42s) with minimal CPU overhead.

So the question is, how much memory does a game actually need at a given moment? From my understanding, even in very large game worlds, the drawable assets on screen represent a tiny percentage of the total cache. Game developers have gotten used to just loading everything they need into VRAM, because why not? In reality, they may only need to hold the assets for the next 5-10 seconds of gameplay, because by the time the gamer starts to move to another area, the data can be streamed in at high speed with minimal impact on performance.

This is a much more efficient use of VRAM, and there's no reason why you wouldn't do it. The faster your persistent storage gets and the better your strategies for loading the cache, the smaller your cache needs to be.

Nvidia know what they are doing, so I'm willing to bet they have data showing that 10GB is more than capable for modern games at 4K and won't be an issue for the life cycle of the card, assuming games take advantage of the new technology (which they will be forced to do...).
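To make that streaming idea concrete, here is a minimal, hypothetical sketch of a VRAM budget with prefetch and eviction. Everything in it (the class, the fixed 64 MB asset size, the blocking load) is illustrative only; a real engine would stream asynchronously through DirectStorage/RTX IO rather than load on the spot.

```cpp
#include <cstddef>
#include <deque>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical sketch: keep only the assets predicted to be needed in the
// next few seconds resident, evicting the least-recently-needed ones once
// the VRAM budget is exceeded.
class StreamingCache {
public:
    explicit StreamingCache(std::size_t budget_bytes)
        : budget_(budget_bytes), used_(0) {}

    // Called each frame with the asset ids predicted to be visible soon.
    void prefetch(const std::vector<std::string>& upcoming) {
        for (const auto& id : upcoming) {
            if (resident_.count(id)) { touch(id); continue; }  // already in VRAM
            const std::size_t size = load_to_vram(id);  // stand-in for a fast
            evict_until_fits(size);                     // NVMe -> VRAM upload
            resident_[id] = size;
            used_ += size;
            lru_.push_back(id);
        }
    }

private:
    // Move a re-requested asset to the back of the eviction queue.
    void touch(const std::string& id) {
        for (auto it = lru_.begin(); it != lru_.end(); ++it) {
            if (*it == id) { lru_.erase(it); break; }
        }
        lru_.push_back(id);
    }

    // Drop the oldest-needed assets until the incoming one fits the budget.
    void evict_until_fits(std::size_t incoming) {
        while (used_ + incoming > budget_ && !lru_.empty()) {
            const std::string victim = lru_.front();
            lru_.pop_front();
            used_ -= resident_[victim];
            resident_.erase(victim);
        }
    }

    // Placeholder: pretend every asset decompresses to 64 MB in VRAM.
    std::size_t load_to_vram(const std::string&) { return 64u * 1024 * 1024; }

    std::size_t budget_;
    std::size_t used_;
    std::unordered_map<std::string, std::size_t> resident_;
    std::deque<std::string> lru_;
};
```

Usage would just be a per-frame cache.prefetch(predicted_assets) call; the point is the same as the post above: with fast enough refill, the resident budget can be much smaller than "everything the level might ever use".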
 
Lol "midrange" here we go... It's not midrange. It's the second top card (thinking 3080Ti being top). 3090 is a titan and therefore is not a "consumer" card. an 80 class card is and always has been a high end card. not top. but high end.

Did/does the 10GB annoy me? Yup. Do I wish it was more? Yup. Would I pay more for more? Yup.
However, I am gaming at 1440p, and I keep telling myself... it's a £650 card, which really is not bad. I'm happy enough to buy.

The 3090 most definitely won't be the Titan; it does not even use the full die. The Titan will surface in a few months' time when 3090 sales start to dip.

As to the 3080, it is mid-range, as above it you will eventually have the 3080 Ti, the 3090 (and possibly a 3090 Ti using the full die) and the Titan in some form. And that does not include any Super cards later on.

NVidia will not talk about a possible Titan at the moment because they know the people who buy them are also likely to buy the 3090 in the short term.

As usual NVidia will milk their loyal followers for every cent they can get.

I built a Ryzen 3950X setup the other week; it makes all my Intel stuff look very poor for various reasons. NVidia should pay more attention to the CPU market. Intel could never have imagined Robert Noyce buying a Ryzen setup, but I did lol.
 
Eh? The speed of a GPU core and the amount of memory are independent of each other.

As for any talk about consoles and comparisons to this gen of video cards - again it's irrelevant. Consoles have a unified memory architecture - they're trading one off against the other. PCs with these cards have separate system memory and VRAM.
Consoles dictate AAA game development, so they are relevant to the conversation. Consoles have just doubled the amount of RAM they have. As the new generation progresses, developers will build more and more complex games to push the consoles to the limit, which will involve filling the RAM available to them. They will allocate the shared pool as they wish, but they will most likely assign most of it as VRAM (for example, the Series X reportedly reserves 2.5GB of its 16GB for the OS, leaving 13.5GB for games to split between CPU and GPU use).
 
Surely game developers will just use lower quality textures etc. to ensure they stay below 10GB.

Just like when CPUs stagnated, game developers held back until core counts increased.

Game developers have been adjusting to the changing specs of home computers for the last 30 years. Don't worry.
Who wants to buy a 3080 to use textures that are lower quality than a console's? :confused:

The assumption I am making here is that the new generation of consoles will be pushing higher quality textures due to the increase in VRAM they have over the previous generation.
 
The 3090 most definitely won't be the Titan; it does not even use the full die. The Titan will surface in a few months' time when 3090 sales start to dip.

As to the 3080, it is mid-range, as above it you will eventually have the 3080 Ti, the 3090 (and possibly a 3090 Ti using the full die) and the Titan in some form. And that does not include any Super cards later on.

NVidia will not talk about a possible Titan at the moment because they know the people who buy them are also likely to buy the 3090 in the short term.

As usual NVidia will milk their loyal followers for every cent they can get.

I built a Ryzen 3950X setup the other week; it makes all my Intel stuff look very poor for various reasons. NVidia should pay more attention to the CPU market. Intel could never have imagined Robert Noyce buying a Ryzen setup, but I did lol.
So based on that, is the 3070 low-end, Kaaps? :p

I mean, the 3070 is GA104 and the 3080 is GA102, right?
 
Both mid-range. :)

In the same way as the 3090 and Titan will be high end.

This message was brought to you on my mid-range Turing Titan. :D
Lol :D

Yea, I am not bothered what it is classified as in the end. As always, I am concerned with price for performance. If the 3090 "high end" is only around 20% faster for more than double the money, then I am happy even if the 3080 is considered "low end" :p:D
 
Fine in the short term, but as games developed solely for next-gen systems are released, it could reach its capacity alarmingly fast. I'm not buying until they release a 20GB version, either the 3080 or 3080 Ti.
 
A full 10GB devoted to the game, as opposed to having to share it with Windows. ;)
Also, people keep trotting out that number as if developers can't choose to assign more to VRAM if they feel the need to.

Given the OS and game will use some of that RAM, there may be 10-12GB left usable as VRAM.

Will have to wait and see if the VRAM whinging actually turns into fact or not when some more detailed analysis hits. :)
 
Anyone remember the heated arguments about 4GB being enough? No one is arguing for less than 8GB now. Given the power of these cards, I'd say 12-16GB would be appropriate if you intend to run them for a few years.
 
Anyone remember the heated arguments about 4GB being enough? No one is arguing for less than 8GB now. Given the power of these cards, I'd say 12-16GB would be appropriate if you intend to run them for a few years.

I can remember when the argument was whether 2GB was enough.

I can also remember that at the time 2GB was enough even at high resolutions like 1440p or 1600p, and the graphics actually looked quite good. Eight years later 8GB really is not enough, but the graphics have improved very little. What is going on?
 