
10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
Here's something no one has mentioned yet.

Games need RAM as well. Not just VRAM.

Games nowadays on the PC use how much RAM? 8, 9, 10GB?

So the consoles have to share their 16GB of RAM between system RAM and VRAM.

A gaming PC with a 3080 would have 10GB of DEDICATED super-fast VRAM and usually 16GB of system RAM.

So your PC suddenly looks like it has a lot more memory than a next-gen console.

That's true, but it doesn't matter so much, because consoles only do one thing: game. A lot of efficiency gets added by virtue of them being closed systems with a limited number of specs (let's say three this gen: PS5, XSX and XSS). There are also different builds and APIs for each, and then for PC as well, so I wouldn't count 10GB in a console as 10GB in a PC - it's going to depend a lot on the devs to reach that parity.
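The back-of-envelope sums in this exchange can be sketched out. A rough comparison, assuming ~2.5GB of console memory reserved for the OS (a figure Microsoft has quoted for the Series X; the PS5 reservation is assumed similar here) and ~3GB of PC system RAM taken by Windows and background apps - both reservations are assumptions, not measured figures:

```python
# Back-of-envelope memory budgets (all figures in GB).
# OS reservations below are assumptions for illustration, not measured values.
CONSOLE_TOTAL = 16.0        # Series X / PS5 unified memory pool
CONSOLE_OS_RESERVED = 2.5   # Microsoft's quoted Series X figure; PS5 assumed similar
console_game_budget = CONSOLE_TOTAL - CONSOLE_OS_RESERVED  # shared between "RAM" and "VRAM"

PC_VRAM = 10.0              # RTX 3080 dedicated VRAM
PC_SYSTEM_RAM = 16.0        # typical gaming PC
PC_OS_RESERVED = 3.0        # assumed Windows + background apps
pc_game_budget = PC_VRAM + (PC_SYSTEM_RAM - PC_OS_RESERVED)

print(f"Console game budget: {console_game_budget:.1f} GB (one shared pool)")
print(f"PC game budget:      {pc_game_budget:.1f} GB "
      f"({PC_VRAM:.0f} GB dedicated VRAM + {PC_SYSTEM_RAM - PC_OS_RESERVED:.0f} GB RAM)")
```

Of course a single shared pool is more flexible than a split one, and the closed-platform efficiencies mentioned above make a straight GB-for-GB comparison shaky.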
 
Here's something no one has mentioned yet.

Games need RAM as well. Not just VRAM.

Games nowadays on the PC use how much RAM? 8, 9, 10GB?

So the consoles have to share their 16GB of RAM between system RAM and VRAM.

A gaming PC with a 3080 would have 10GB of DEDICATED super-fast VRAM and usually 16GB of system RAM.

So your PC suddenly looks like it has a lot more memory than a next-gen console.

No, I definitely mentioned that lol.
 
Speaking of 4K, we did notice some VRAM limitations when using the game's HD Texture Pack. On Ultra Settings/Ultra Textures, our performance went downhill in the following scene. As you can see, our RTX 2080 Ti was used to its fullest and pushed 17fps.
However, when we used Ultra Settings/High Textures, our performance skyrocketed to 42fps. This appears to be a VRAM limitation. As we can see, the game's Ultra textures used 10.5GB of VRAM, whereas High textures used 8GB of VRAM. Thus, it will be interesting to see whether the NVIDIA RTX 3080 will be able to handle this game in 4K with its 10GB of VRAM.

https://www.dsogaming.com/pc-performance-analyses/marvels-avengers-pc-performance-analysis/
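For anyone who wants to check figures like the ones in that article on their own card, `nvidia-smi` can report VRAM usage in CSV form. A minimal sketch that shells out to it and parses the result; the helper name and the sample line are mine, purely for illustration:

```python
import subprocess

def gpu_memory_gb(sample_output=None):
    """Return (used_gb, total_gb) parsed from nvidia-smi's CSV output.

    Pass sample_output to parse a captured string instead of running the tool
    (useful on machines without an NVIDIA GPU)."""
    if sample_output is None:
        sample_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"], text=True)
    # nvidia-smi reports MiB with these flags, e.g. "10752, 11264"
    used_mib, total_mib = (int(x) for x in sample_output.strip().split(","))
    return used_mib / 1024, total_mib / 1024

# Illustrative sample: 10.5 GB in use on an 11 GB card (2080 Ti-like)
used, total = gpu_memory_gb("10752, 11264")
print(f"{used:.1f} GB / {total:.1f} GB")
```

Note this reports what the game has *allocated*, which, as posts below point out, is not the same as what it actually *needs*.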
 
The 3GB 780 Ti was enough in 2013 when the consoles launched with 8GB. That's 3GB of VRAM vs 8GB of shared console memory. It wasn't a problem.

Now in 2020, 10GB of VRAM is suddenly a problem vs the consoles' 16GB? It's a smaller gap than in 2013.

Doesn't anybody remember this? It wasn't a problem. Stop panicking.
 
The 3GB 780 Ti was enough in 2013 when the consoles launched with 8GB. That's 3GB of VRAM vs 8GB of shared console memory. It wasn't a problem.

Now in 2020, 10GB of VRAM is suddenly a problem vs the consoles' 16GB? It's a smaller gap than in 2013.

Doesn't anybody remember this? It wasn't a problem. Stop panicking.

Things change. Ray tracing is VRAM-hungry, so if you don't want to game at the console minimum, you risk running into VRAM issues.
 
The 3GB 780 Ti was enough in 2013 when the consoles launched with 8GB. That's 3GB of VRAM vs 8GB of shared console memory. It wasn't a problem.

Now in 2020, 10GB of VRAM is suddenly a problem vs the consoles' 16GB? It's a smaller gap than in 2013.

Doesn't anybody remember this? It wasn't a problem. Stop panicking.
That's why we are still using graphics cards with 3GB of VRAM in 2020. Oh wait.

If we were talking about the 3060, or maybe even the 3070, with 10GB of VRAM, I would agree that it's enough (for now). But we are talking about the flagship card for PC gaming (assuming AMD drops the ball).
You're paying a premium so that you can have the same graphical fidelity as a console, admittedly with higher frame rates. (I think we all agree that an increase in graphical fidelity, compared to the console, will require more VRAM, right? Does anyone disagree?) Like I said on the first page, for those who buy a new graphics card every generation this isn't going to be a problem. For anyone planning on skipping a generation, good luck with that.

To those who are saying that 10GB is enough, and RTX IO will make up the difference, etc...

You will all be singing a different tune when Nvidia launches their next generation of cards with ~16GB VRAM on the top end:p.
And none of you will be complaining about having to pay more for an extra 6GB of VRAM you "don't" need;).
 
You will all be singing a different tune when Nvidia launches their next generation of cards with ~16GB VRAM on the top end:p.
And none of you will be complaining about having to pay more for an extra 6GB of VRAM you "don't" need;).

Don't even need to wait that long, just a few months and a variant like that will be out. :D
 
I never really rated my 780 Ti; it wasn't much of a jump over the 4GB 680 I had previously. But the jump from the 780 Ti to the Titan X (the first black-coloured full-fat Titan X, so 2015) was decent.
 
Have we got any evidence of an 8GB 2080 or 5700XT being hindered by VRAM?
I don't mean "ThIs GaMe UsED aLL 8GigS" I mean actually hitting the limit and performance suffering as a result (mass stuttering and/or massive drop in performance etc).
 
Have we got any evidence of an 8GB 2080 or 5700XT being hindered by VRAM?
I don't mean "ThIs GaMe UsED aLL 8GigS" I mean actually hitting the limit and performance suffering as a result (mass stuttering and/or massive drop in performance etc).
This is worth a watch. Skip to 9:55 for the biggest difference.
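One way to quantify "actually hitting the limit" rather than "ThIs GaMe UsED aLL 8GigS" is to compare average FPS against the 1% lows: VRAM swapping shows up as occasional very long frames that crater the lows while barely moving the average. A toy sketch (the function name and the frame times are made up for illustration):

```python
def one_percent_low(frametimes_ms):
    """Average FPS and '1% low' FPS from a list of per-frame times in ms.

    '1% low' here = the FPS implied by the mean of the slowest 1% of frames,
    one common way benchmark tools summarise stutter."""
    fps_avg = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # slowest 1% (at least one frame)
    fps_low = 1000 / (sum(worst[:n]) / n)
    return fps_avg, fps_low

# Illustrative run: mostly smooth 16 ms frames with one 100 ms hitch,
# the kind of spike you'd expect if textures were being swapped in over PCIe.
frames = [16.0] * 99 + [100.0]
avg, low = one_percent_low(frames)
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")
```

A healthy card shows the two numbers close together; a big gap like the one above is the "mass stuttering" signature the post asks about.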
 
8GB is still enough; I've still not seen it filled, even when using VR. 6GB on the 1060 was a bit borderline in some things.

If you were running 4K in VR then you might fill it. But that's going to run like crap no matter what you have.
 
Well, I was set on this 3080 coming from a 1080 Ti, but all this VRAM talk has pushed me over the edge. I'll sit it out for a little while longer and see how things fall... until tomorrow, when I change my mind again.
 