
10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
Fingers crossed Microsoft includes DLSS with the ray tracing update.

Apparently Microsoft has just released DirectML as a standalone API, which is great news for both Nvidia and AMD users once developers fully use it in Windows games as well. It will be interesting to see how the 3000 and 6000 series compare in reviews of games that use it to upscale resolution.

https://youtu.be/W-c0warlQto
 
Perhaps not, but VRAM limitations are certainly at the forefront of the discussion when it comes to limiting factors on these GPUs. We've had people questioning the logic of dropping 700 quid on a GPU when the RAM might limit your ability to run everything maxed out, when there are games out there now in which the 3080 already can't maintain 4K60. The latter seems to be continually brushed aside, like it's fine to be slow... just not slow because of the VRAM buffer :o
The reason it is at the forefront is because that is what the thread is about :|

What else would be at the forefront of discussion on a thread about VRAM?

Secondly, it wasn't brushed aside; it has been discussed. Something along the lines of: resolution is not the biggest consumer of VRAM, and the items that consume a lot of VRAM (textures) do not have a huge effect on framerates. Also, people don't want the GPU die to be bottlenecked by VRAM.
Feel free to grab some popcorn and reread the thread to find where this stuff was mentioned. I may have also written this in the CP2077 thread.
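To put an illustrative number on the "textures are the big consumer" point, here is a back-of-the-envelope sketch (my own figures, not anything from the thread; the compression format, mip overhead, and texture count are all assumptions):

```python
# Rough VRAM use for block-compressed textures (BC7 is 1 byte per texel),
# with a ~33% allowance for mipmap chains. Real games vary widely.

def texture_mb(resolution, count, bytes_per_texel=1.0, mip_overhead=4/3):
    """Approximate megabytes for `count` square textures at a given resolution."""
    texels = resolution * resolution
    return count * texels * bytes_per_texel * mip_overhead / (1024 ** 2)

# A scene streaming 500 material textures at 2048x2048:
print(f"{texture_mb(2048, 500):.0f} MB")  # → 2667 MB
```

A few gigabytes of textures dwarfs the framebuffer, which is why texture quality settings, not output resolution, are usually the first thing to blow a 10GB budget.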
 
The reason it is at the forefront is because that is what the thread is about :|

What else would be at the forefront of discussion on a thread about VRAM?
I think that's far too logical for threads like this. In the end, what we have is a serious case of people wilfully engaging in cognitive dissonance by shoehorning GPU horsepower limitations into a thread specifically about VRAM limitations, despite no-one at any point having said that GPU horsepower was not also an important factor. The fact that the thread is obviously working on a potential scenario in which GPU horsepower is not a limitation, but VRAM is, seems to be too much of a stretch for some members' intellects.
 
In an ideal world you want GPU horsepower to be closely matched to video memory capacity.

No point having 16GB of video memory on a GPU with the performance of a 5600 XT. Likewise, no point having 8GB on a 3070 when you have the performance of a 2080 Ti.

I think the flagship 3080 is also lacking video memory, given that the 2080 Ti had more and that's a flagship from two years ago.

Tell me I'm wrong and why. :)
 
In an ideal world you want GPU horsepower to be closely matched to video memory capacity.

No point having 16GB of video memory on a GPU with the performance of a 5600 XT. Likewise, no point having 8GB on a 3070 when you have the performance of a 2080 Ti.

I think the flagship 3080 is also lacking video memory, given that the 2080 Ti had more and that's a flagship from two years ago.

Tell me I'm wrong and why. :)

I agree that the 3070 is lacking with 8GB when the consoles will be targeting 10GB.

The 3080 I think is a sensible card for the moment, based on price and performance, with leading AI-based upscaling and ray tracing. More VRAM isn't going to help with either. Then of course there is the upcoming DirectStorage / RTX IO, again reducing the need for VRAM.

I had no issue retiring my 1080 Ti with 11GB of VRAM. People need to stop thinking about legacy games and move forward.
 
I agree that the 3070 is lacking with 8GB when the consoles will be targeting 10GB.

The 3080 I think is a sensible card for the moment, based on price and performance, with leading AI-based upscaling and ray tracing. More VRAM isn't going to help with either. Then of course there is the upcoming DirectStorage / RTX IO, again reducing the need for VRAM.

I don't understand why you are saying more VRAM won't "help with upscaling or ray tracing". I mean seriously, what are you talking about?

Also, RTX IO (DirectStorage), which as a minimum requires an NVMe SSD, will not magically fully compensate for not having enough VRAM. And that is only once games start to actually use it in 2022/2023, by which time we will of course be on the next generation of GPUs with higher minimum VRAM. https://wccftech.com/rtx-io-and-directstorage-are-coming-but-itll-be-a-while-yet/
 
We won't see any games using DirectStorage this year; I am not sure Microsoft will even fully release it this year.
But with upscaling it is true that you won't need as much VRAM: it is like rendering the game at 1080p, or at most 1440p, instead of 4K. The downside is that you need to wait for the upscaling to be implemented.
But then, if DirectML super resolution becomes a thing, we can even say the 1080 or the 5600 XT have enough VRAM, since they can run games at "4K". And they also have enough horsepower. :D
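The upscaling arithmetic above can be sketched roughly (my own illustration, not from any poster; the buffer layout is an assumption and real engines differ): the resolution-dependent slice of VRAM scales with pixel count, so rendering internally at 1080p instead of 4K cuts it by about 4x.

```python
# Rough memory for the main render targets at a given internal resolution.
# Assumes a deferred-style setup: 4 colour buffers at RGBA16F (8 bytes/px)
# plus a 4-byte depth buffer. Textures, geometry, and driver overhead
# (usually the bulk of VRAM use) are deliberately ignored.

def render_target_mb(width, height, colour_buffers=4,
                     bytes_per_colour_px=8, bytes_per_depth_px=4):
    pixels = width * height
    total = pixels * (colour_buffers * bytes_per_colour_px + bytes_per_depth_px)
    return total / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: {render_target_mb(w, h):.0f} MB")
# → 1080p: 71 MB, 1440p: 127 MB, 4K: 285 MB
```

So even native 4K render targets come to a few hundred MB; upscaling shrinks that resolution-dependent slice, while the texture pool is what actually fills a 10GB card.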
 
The reason it is at the forefront is because that is what the thread is about
Nope, you've got that backwards.

Feel free to grab some popcorn and reread the thread to find where this stuff was mentioned

Oh so I didn't directly quote and respond to Richdog making that exact statement about dropping 700 quid in this exact thread then? It isn't me who needs to reread it.
 
I don't understand why you are saying more VRAM won't "help with upscaling or ray tracing". I mean seriously, what are you talking about?

Also, RTX IO (DirectStorage), which as a minimum requires an NVMe SSD, will not magically fully compensate for not having enough VRAM. And that is only once games start to actually use it in 2022/2023, by which time we will of course be on the next generation of GPUs with higher minimum VRAM. https://wccftech.com/rtx-io-and-directstorage-are-coming-but-itll-be-a-while-yet/

More VRAM won't improve the 3080's ray tracing performance. More VRAM won't improve DLSS overhead.

In 2022/2023 we will be using 4080's and RDNA3.
 
More VRAM won't improve the 3080's ray tracing performance. More VRAM won't improve DLSS overhead.
Your post makes no logical sense because your redundant statements about VRAM helping DLSS or RT performance have nothing to do with a card running out of VRAM.

In 2022/2023 we will be using 4080's and RDNA3.

Which makes your comment about RTX IO helping with VRAM in current generation games even more silly than it initially was.
 
I still find it quite funny

At launch, most were saying the 3090 is a waste of time, get the 3080 instead, it's nearly the same, and 10GB of VRAM is more than enough

Now that the 3090 is like a Ferrari starting to pull away in 5th, they're saying 10GB of VRAM is still fine LMAO

So what happened, guys? Why is the 3080 getting slower relative to the 3090? Maybe we should try blaming Nvidia drivers; Nvidia must be nerfing the 3080, right?
 
I still find it quite funny

At launch, most were saying the 3090 is a waste of time, get the 3080 instead, it's nearly the same, and 10GB of VRAM is more than enough

Now that the 3090 is like a Ferrari starting to pull away in 5th, they're saying 10GB of VRAM is still fine LMAO

So what happened, guys? Why is the 3080 getting slower relative to the 3090? Maybe we should try blaming Nvidia drivers; Nvidia must be nerfing the 3080, right?

I haven’t noticed a drop in performance for my 3080, I must be really lucky :D
 
The latter seems to be continually brushed aside, like it's fine to be slow... just not slow because of the VRAM buffer :o

This is what I don't understand about the VRAM prepper position.

The fact that the thread is obviously working on a potential scenario in which GPU horsepower is not a limitation, but VRAM is, seems to be too much of a stretch for some members' intellects.

A limitation is a limitation.

The need to lower settings is inevitable. (In fact, it's here now.) I have yet to see a compelling argument for why one specific scenario requires its own thread.
 
I haven’t noticed a drop in performance for my 3080, I must be really lucky :D

Mine seems to be doing fine too. A 3090 *might* allow me to turn up one single setting...one notch...not even to ultra.

I'm not interested in spending more than twice as much money so I can take "detailed shadows" from low to medium.
 
I don't understand why you are saying more VRAM won't "help with upscaling or ray tracing". I mean seriously, what are you talking about?

Also, RTX IO (DirectStorage), which as a minimum requires an NVMe SSD, will not magically fully compensate for not having enough VRAM. And that is only once games start to actually use it in 2022/2023, by which time we will of course be on the next generation of GPUs with higher minimum VRAM. https://wccftech.com/rtx-io-and-directstorage-are-coming-but-itll-be-a-while-yet/

It'll provide additional longevity to the current generation, as not everyone upgrades every two years. I'd also imagine that once it becomes more mainstream, and if it has a meaningful impact on VRAM usage, we won't see cards with oversized VRAM capacity, which is a plus for lower GPU pricing (memory is expensive).
 
Mine seems to be doing fine too. A 3090 *might* allow me to turn up one single setting...one notch...not even to ultra.

I'm not interested in spending more than twice as much money so I can take "detailed shadows" from low to medium.

You're not interested in spending twice the money for 37% better minimums in Godfall? Madness :)
 
Nope, you've got that backwards.
I look forward to your explanation on why in a thread about VRAM, VRAM shouldn't be the forefront topic. The stage is all yours.

Oh so I didn't directly quote and respond to Richdog making that exact statement about dropping 700 quid in this exact thread then? It isn't me who needs to reread it.
So you snipped out the rest of the post that the quote was a part of, thereby removing context. I'll play along.

What statement did Richdog make that you are referring to and what did you say?

Most importantly, out of all the posters on these forums why would I remember exactly what you have posted?
 
Same here... seems to deal with everything I throw at it at 4K, no wonder everyone in the world wants one ;)

Mine seems to be doing fine too. A 3090 *might* allow me to turn up one single setting...one notch...not even to ultra.

I'm not interested in spending more than twice as much money so I can take "detailed shadows" from low to medium.

The GPU shortage made the 3090 artificially popular. Had there been sufficient 3080 stock, people would have been all over those.

The 3080 will last a good few years.
 