10GB VRAM enough for the 3080? Discuss...

RTX 3080 10GB vs RTX 3080 10GB (at half the RAM speed).

Would the halved RAM speed cause a game to run out of VRAM quicker than with the full-speed RAM?

RAM speed contributes to the memory bandwidth between the GPU and the vRAM. If that is too low, the GPU becomes memory-starved and won't be able to operate at full speed: it will have moments where it sits idle waiting for data, and that causes performance to drop.
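Rough numbers, if it helps, assuming the 3080 10GB's published specs (19 Gbps GDDR6X on a 320-bit bus); a back-of-envelope sketch, not a benchmark:

```python
# Back-of-envelope only: peak bandwidth = effective data rate x bus width.
# Figures assume the RTX 3080 10GB's published specs (19 Gbps GDDR6X, 320-bit bus).

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s (data rate per pin times number of pins, in bytes)."""
    return data_rate_gbps * bus_width_bits / 8

full_speed = bandwidth_gb_s(19.0, 320)  # ~760 GB/s
half_speed = bandwidth_gb_s(9.5, 320)   # ~380 GB/s

print(f"Full-speed VRAM: {full_speed:.0f} GB/s, half-speed: {half_speed:.0f} GB/s")
# The capacity is still 10GB either way; only how fast the GPU can read/write it changes.
```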
 
Think about it like copying a file to a 1GB SSD and copying the same file to a 1GB HD. The file will take the same space on both, but the slower HD will take longer to store the information.
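To put that analogy into numbers (the file size and write speeds below are made up purely for illustration):

```python
# The analogy in numbers: the file occupies the same space on both drives,
# only the time to write it differs. Sizes and speeds are assumptions for illustration.

file_gb = 0.5
for drive, write_mb_s in (("SSD", 500), ("HDD", 100)):
    seconds = file_gb * 1024 / write_mb_s
    print(f"{drive}: stores {file_gb}GB either way, ~{seconds:.1f}s to write it")
# Same idea with VRAM: halving the memory speed doesn't make a game run out of the
# 10GB any sooner, it just changes how quickly data moves in and out of it.
```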
 
https://www.youtube.com/watch?v=_L_dAynbKbo&t=240s

Turns out tensor memory compression really works...

6800XT vs 3090: the 3090 uses 10%-20% less VRAM (they promised 20%-40%, but eh, it's never what they promise).

So, an Nvidia card will work a lot better when hitting a VRAM limit.

Does this mean the 3080's 10GB can be said to be the same as a hypothetical 6700XT's 12GB?

The 6800XT at one point was nearly at 12GB while the 3090 had not gone over 10GB.
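Taking the quoted 10-20% figure at face value, the rough equivalence works out like this (my arithmetic, not a measurement):

```python
# Pure arithmetic on the quoted figures: if compression means the same scene needs
# 10-20% less VRAM, how much "uncompressed-equivalent" data fits in a 10GB card?

capacity_gb = 10.0
for savings in (0.10, 0.20):
    effective = capacity_gb / (1 - savings)
    print(f"{savings:.0%} savings -> 10GB holds roughly {effective:.1f}GB worth of data")
# ~11.1GB at 10%, ~12.5GB at 20% -- in the right ballpark for the hypothetical
# "6700XT 12GB" comparison, assuming the savings actually hold across games.
```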
 
I assume DLSS would have less of an impact if the tensors are doing memory compression too, unless they're just doing that all the time anyway?
 
The architecture is normally designed to cover multiple cards at multiple performance levels, with multiple variants of the GPU and different memory capacities. It's all a big trade-off. You can see that AMD targeted 16GB, for example, which is more than 10GB, but they've also probably over-provisioned their cards: certainly the lower models, but probably the 6900XT as well. That thing will never use 16GB while keeping playable frame rates; they could have got away with 12GB or maybe even 10GB, but odds are their architecture would only allow 8GB as the next-lowest vRAM config, and that would not have been enough for the 6800XT or 6900XT for sure.

So you're paying more for memory you can't use. Does that mean it's an idiotic design decision? No, it's a trade-off of the architecture, and these kinds of trade-offs are made with every architecture in both camps.
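To illustrate why the capacity steps are so coarse (the bus widths are the published ones; the chip densities are the common GDDR6/6X options, so treat this as a simplified sketch):

```python
# Why capacities come in such coarse steps: each GDDR6/6X chip occupies a 32-bit
# slice of the bus, so the chip count is fixed by bus width and total VRAM only
# moves in chip-sized jumps. Bus widths are published specs; the 1GB/2GB chip
# densities are the common options, so treat this as a simplified sketch.

CHIP_BUS_BITS = 32
CHIP_DENSITIES_GB = (1, 2)

def possible_capacities(bus_width_bits: int) -> list[int]:
    chips = bus_width_bits // CHIP_BUS_BITS
    return [chips * density for density in CHIP_DENSITIES_GB]

print("RTX 3080, 320-bit bus:", possible_capacities(320), "GB")   # [10, 20]
print("RX 6800XT, 256-bit bus:", possible_capacities(256), "GB")  # [8, 16]
# So on a 256-bit bus the realistic options really are 8GB or 16GB, not 10GB or 12GB.
```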

Games like CoD Cold War have high-resolution texture packs which, when installed, push the install to 130GB on disk, and it runs in 10GB of vRAM just fine even with ray tracing and all effects on Ultra. In fact it's a good example of a modern game which is GPU/compute-bottlenecked: you can't run that game maxed out at 4K on a 6800XT, for example, you get about 20fps, and even on the 3080 with its better RT performance you still only get about 40fps. Yet we're not exceeding 10GB of vRAM. This follows the trend established by most of the very newest AAA titles: the GPU bottlenecks before vRAM does.

What's interesting is that we now have a 16GB VRAM GPU in the 6800XT, which has comparable performance to the 3080 at 4K.
We can now easily test the 4K hypothesis about VRAM limitations. Has anyone yet provided an example or benchmark where the 6800XT vastly outperforms the 3080 at 4K due to a VRAM bottleneck?

If not, then it seems as if the two cards are going to become outdated at around the same time for 4K (which I personally think is the resolution where VRAM becomes an issue), and if that is the case then... well, I just don't see it as much of a problem.

I can see the 3080, 24 months from now, struggling in 2-5 titles due to a VRAM limitation which requires NVIDIA users to use high rather than ultra texture packs.
In the same breath, I can see AMD users having to drop RT from high to low, or not use it at all, in (potentially a lot of) titles due to AMD's inferior RT architecture.

So both cards are potentially (as you'd expect) going to have some form of visual compromise as they age, but I think both will eventually be outdated at the same time.

I wasn't totally sold on RT adoption, but seeing PS5 games, including Sony first-party titles, shipping with it, it's pretty clear it will be a widely adopted feature from this year onwards.
 
As much as I agree with what you have written, the RT the consoles are running is on AMD hardware.
 
There's nothing so far which uses over 10GB of vRAM that I've seen/tested. I've tested every claim in this thread of a game needing more than 10GB, and none of them holds up when you measure the vRAM actually in use.

We don't even have to wait for games that will be GPU-bottlenecked at 4K; we already have a load. FS2020, Avengers, Crysis Remastered, Watch Dogs Legion and even CoD Cold War are all unplayable at 4K Ultra right now (assuming no DLSS usage), yet none of them exceeds a vRAM budget of 10GB. As always, I remain open-minded about testing new things.

Tangentially, the new MSI Afterburner beta has been updated and now enables per-process memory usage in the performance list by default, so there's no more messing about to get it working: just grab the latest beta and you can enable it with one click. https://download-eu2.guru3d.com/afterburner/[Guru3D.com]-MSIAfterburnerSetup463Beta4Build15910.rar
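If anyone wants to sanity-check the same thing without Afterburner, here's a rough sketch using NVIDIA's NVML Python bindings; the per-process figure isn't always available on every driver/OS combination, so take it as an alternative rather than a guaranteed method:

```python
# Alternative to the Afterburner counter, using NVIDIA's NVML bindings
# (pip install nvidia-ml-py). This reports per-process usage rather than only the
# board-wide total; usedGpuMemory can come back as None on some driver/OS combos,
# so treat this as a sketch rather than a guaranteed one-liner.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo, nvmlDeviceGetGraphicsRunningProcesses)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)

mem = nvmlDeviceGetMemoryInfo(gpu)
print(f"Board-wide: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GB in use")

for proc in nvmlDeviceGetGraphicsRunningProcesses(gpu):
    used = proc.usedGpuMemory
    print(f"  pid {proc.pid}: " + (f"{used / 2**30:.1f} GB" if used else "n/a"))

nvmlShutdown()
```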
 
None of those games are next gen though. Next gen means PS5 games made, designed and released after 2020, so from Q1 2021 we will start to see new games designed for PC and the next-gen consoles. What are the actual numbers for the games you mention? Is "under 10GB" 9GB?

And what if someone wants a locked 8K 30fps experience? If the consoles can do 4K 30fps modes, a few niche users will want 8K 30fps. Some might want 4K 60fps on low, and others will want max textures with RT off in next-gen titles. 10GB does not inspire me with confidence that it will last 24 months. And saying CoD at max only runs at 20fps and never uses over 10GB totally skews the argument; who runs maxed out at 20fps? The settings are there so a user can pick and choose.

Next gen, people will decide what they want: effects, textures, ray tracing. For 10GB to be workable we need to see proof of a very good-looking game with the highest textures and see how much VRAM is being used. Hardly any of these games exist yet; nearly all current games were in development before the new consoles arrived. Cyberpunk will probably be the first next-gen game, and I have yet to see any VRAM numbers.
 
I agree with this also, but I think Cyberpunk is a bad game to use as an example, as it is an Nvidia-sponsored title and they would have made sure it worked well on the 3080 'flagship'.

It would be a PR disaster if it was breaching 10GB of VRAM.
 
Sony were shipping out devkits last year, so why are we waiting some arbitrary length of time before we call games 'next gen'? Are we expecting some random jump in game-engine tech down the line, or are we just hopeful?

And what if someone wants a locked 8k 30fps experience? If the console can do 4k 30fps modes a few niche users will want 8k 30fps.

8K30 = 4x the fillrate vs 4K30. It's not happening without copious (possibly unachievable) amounts of DLSS. People who want 8K30 will be left waiting for a long time.
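The "4x" spelled out, just basic pixel arithmetic:

```python
# Pixels per frame and per second at each target resolution.
resolutions = {"4K": (3840, 2160), "8K": (7680, 4320)}

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP per frame, {pixels * 30 / 1e6:.0f} MP/s at 30fps")
# 4K30: ~8.3 MP/frame (~249 MP/s). 8K30: ~33.2 MP/frame (~995 MP/s) -- exactly 4x the
# pixel throughput, before counting the larger render targets that also eat VRAM.
```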
 
Which is why I am willing to stake a claim on the following theory: Cyberpunk 2077 will not, at release, have the highest 4K/8K textures, because I feel they would take a hammering over performance like Crysis did. I think after they sort out the obvious DLC, they will then release a texture pack which will exceed 10GB. Nvidia will likely be putting pressure on them not to exceed 10GB.



@james.miller
It takes more than a year to make a game; even Cyberpunk was not developed for the new consoles. We are waiting for next gen that is made for PC in 2021 and ported to PS5, or vice versa. The games these GPUs need will come, and the GPUs are expected to last a minimum of 24 months. They are the highest-end GPUs apart from the 3090, which costs £1700. Let that sink in.
 
The 3090 already can't do 8K30 without DLSS. Let that sink in.

Yeah, because the 3090 is not even much faster yet costs double the price, because Nvidia know that, even without trying, fools will buy one. Also, nobody plays at ultra at 8K. The last time I checked 8K, I ran For Honor on a 1080, both from the same generation, and I got 20fps.

So I do not know how you are claiming this, unless you expect 8K users to want shadows, RT and effects at max, rofl. They would simply want the 8K resolution and tweak whatever is needed for 30fps, and it would still look great. You can see details at 8K that are simply a blur at 1080p and 4K, chainmail being an example: you can see every round chainmail link at 8K on a Vanguard's armour.
 
I think at 4K the point is that the 6800XT is already bottlenecked on rasterisation in the current generation of titles, rarely, if ever, beating the 3080 at 4K.

Therefore, I'm not entirely sure how useful the extra VRAM will be for these titles when the card is ALREADY slower than the 3080.
 
Is that with or without Tensor compression?


I'm going to assume even with. I'm sure video games will progress to the point where a 3080 can no longer play everything at ultra, textures included. To a degree this is the natural evolution of GPUs, generation to generation, and definitely console generation to console generation. If anyone is expecting their 3080 to do 4K/60 ultra in every single title from now until the end of the PS5's lifespan, they're going to be disappointed.
 
I predict the most common new card in 18 months' time will be the 3060 Ti, which has 8GB of VRAM. I also don't plan on keeping my 3080 (when it arrives) past Hopper's launch. RDNA3 will be considered if AMD do some proper work on their RT performance.
 