
10GB VRAM enough for the 3080? Discuss...

I think that if WDL requires DLSS at 1080p on a card with 8GB VRAM then it is an absolutely unoptimised POS. I say that not to defend any lack of VRAM in the 3070, but considering that WDL hardly looks like the most amazing of next-gen games vs something like CP2077, this suggests to me that it probably uses more resources than it needs. I would also be interested to see how it runs without RT at the same settings.

The sad fact is that it's not going to be the only unoptimised POS coming out in 2020/2021, and this will have an effect on cards with lower amounts of VRAM that may not otherwise deserve to run into performance issues. I mean, I am all for accepting more VRAM usage if a developer actually justifies needing it and you see the evidence with your own eyes, but I resent it when companies like Ubisoft make it necessary through bad development and you end up with a mediocre-looking game with bad textures and tons of RT reflections that cause performance to drop off a cliff.
Yeah, optimisation is probably part of it. 8GB is the bare minimum you would want in a card, and I'm sure for 90% of the games coming out in the next two years it shouldn't pose huge problems at 1080p or maybe even 1440p.


https://www.youtube.com/watch?v=NEoggn32eag&list=LL&index=1&t=132s

It uses around 7GB-7.5GB without ray tracing and without DLSS, and runs without any stuttering, but it's at the absolute edge. I think the main culprit here is the 20GB HD texture pack (ultra settings), which the devs recommended for 4K but which every benchmarker and reviewer uses at all settings.

Without that texture pack the game uses barely 6GB at 1440p with DLSS + ray tracing.
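To put those texture-pack numbers in perspective, here's a rough back-of-the-envelope sketch of why stepping up texture resolution eats VRAM so quickly. The 1 byte per texel figure assumes BC7-style block compression, and the texture counts and resolutions are purely illustrative assumptions, not Watch Dogs Legion's actual asset data.

# Rough texture-memory estimate (illustrative numbers, not any game's real asset data).
# BC7-compressed colour textures use ~1 byte per texel; a full mip chain adds ~33%.

def texture_mb(width, height, bytes_per_texel=1.0, mip_overhead=1.33):
    return width * height * bytes_per_texel * mip_overhead / (1024 ** 2)

print(f"2048x2048: {texture_mb(2048, 2048):5.1f} MB")  # ~5.3 MB
print(f"4096x4096: {texture_mb(4096, 4096):5.1f} MB")  # ~21.3 MB

# A hypothetical scene streaming 300 unique material textures:
for res in (2048, 4096):
    total_gb = 300 * texture_mb(res, res) / 1024
    print(f"300 x {res}x{res}: ~{total_gb:.1f} GB")     # ~1.6 GB vs ~6.2 GB

Doubling the texture side length quadruples the memory per texture, which is why an HD texture pack can swing usage by several gigabytes while barely touching the frame rate, as long as it all still fits in VRAM.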
 
Seems we have two different narratives between the 3070 8GB thread and this one.

In the 3070 8GB thread they are all saying 8GB is not enough.

Step up a class to the 3080 and we are all trying to justify that 10GB is fine.

:D

I'm really in two minds what to do myself.

I only game at 1440p, but also by the time I'm spending £700 on a GPU I don't want to hit any VRAM limitations... I shouldn't have to at this price point.
 
Normally I'd say the truth is somewhere in the middle, but this time I can't. :D
 

Do what I'm doing and boycott the VRAM gimped 3070 and 3080.

I'd be going AMD if not for my Gsync panel.

I'll just wait either for the absolutely inevitable upgraded 3070/80 or a bit longer until Hoppit comes out.

I will not and shall not buy gimped graphics cards.

I still feel bad for recommending the 970 to my brother before the 3.5GB shenanigans.
 

Is turning down settings the issue, or is turning down settings acceptable... as long as it's not because of VRAM?

I could buy a 3090 and still have to turn down settings in my favorite sim. (And the sim uses less than 8GB of VRAM.)
 

The irony is the 970 still turned out to be an absolutely fantastic card at its price point. It had unrivalled performance per watt and soundly beat the AMD competition at the time. It also lasted a very long time for many gamers.
 
I'd be going AMD if not for my Gsync panel.
Same, and my current screen is good, with little/no backlight bleed, so it's also good for dark games. I'd really rather keep my current monitor, so it's probably a matter of either skipping these cards or waiting for the Ti model, or buying one anyway and altering settings to suit until something better comes along.
I only game at 1440p, but also by the time I'm spending £700 on a GPU I don't want to hit any vram limitations.... I shouldn't have to at this price point.
It seems like it will in the odd title or so; VRAM usage only seems to be a bit lower at 1440p than at 4K anyway.
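Which makes sense if you do the arithmetic: only the render targets scale with resolution, and they're a small slice of total VRAM. Here's a minimal sketch with made-up but plausible numbers; the 32 bytes-per-pixel G-buffer figure and the 5GB resolution-independent texture pool are assumptions for illustration, not measurements from any particular game.

# Why VRAM barely drops from 4K to 1440p: render targets scale with resolution,
# but the texture/mesh pool does not. Illustrative numbers only.

def render_targets_mb(width, height, bytes_per_pixel_total=32):
    # e.g. several G-buffer layers + HDR colour + depth, very roughly 32 B/pixel
    return width * height * bytes_per_pixel_total / (1024 ** 2)

texture_pool_gb = 5.0  # assumed resolution-independent pool (textures, meshes, BVH, ...)

for name, w, h in (("1440p", 2560, 1440), ("4K", 3840, 2160)):
    rt = render_targets_mb(w, h)
    print(f"{name}: render targets ~{rt:.0f} MB, total ~{texture_pool_gb + rt / 1024:.1f} GB")

So dropping from 4K to 1440p only claws back a hundred-odd megabytes in this toy model; the big VRAM consumers stay put unless you also drop the texture setting.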
 

What card are you running? Because I set myself a target to upgrade to a 3000 series card for Cyberpunk.
 
I'm running a 1070 at 3440x1440 which still handles it surprisingly well.

I cap my FPS at 60, mainly run FXAA, which is fine to my old eyes, and turn down the odd setting to maintain near 60 in most games.

The 1070 runs at 2000+MHz and having a Gsync panel really helps with smoothness.
 
I don't mind turning down the odd setting here and there, but if I'm paying six to seven hundred pounds for a new card I'd sooner have more than eight or ten GB, especially when I've been running 8GB for three years plus.

AMD are showing the way forward with their new range of cards.

Nvidia are doing their usual VRAM-skimping shenanigans.
 
Your response? Buy another Nvidia card, that will teach them!

Just kidding :p;)
 
That's what I do, and I'm not kidding :p

Got my 3070 and I'm waiting for the VRAM to run oot! :p
Lol :D

As long as you are happy turning the texture setting down one notch in future titles, you will be fine for the most part. The problem of running out only affects those who are not flexible and must run everything on maximum no matter what, even if it makes no difference to image quality. A bit like in Doom Eternal, where the actual texture quality is exactly the same between the top two settings. Lol :D
 

Would a $700 20GB card appeal to you? I would rather take my chances with 10GB of VRAM and more GPU horsepower.

I would pay a *little* more money for more VRAM, but not much. If more VRAM costs too much money, I'll take my chances. Like if the $500 3070 cost $700 with 16GB of GDDR6X, or a 20GB 3080 cost $900... no thanks.
 
It's not the VRAM causing the low FPS.

The HD texture pack is a 20GB+ download and needs 6GB of VRAM, so 8GB cards should be okay. The 3070 is not meant to be a 4K card. The game will allocate as much VRAM as possible; you need to look at used VRAM to see what is actually happening. There is also a mod for Watch Dogs Legion with 8K textures.


With a 2080 Ti the game uses 8228MB with RT off and DLSS off at 4K (https://youtu.be/U_mO8ImU5sU?t=352, 5:52). With RT on and DLSS off (TAA) at 4K the 2080 Ti uses 8944MB (https://youtu.be/U_mO8ImU5sU?t=171, 2:51).

By this thread's logic the 3070 should be out of VRAM, yet it runs the game fine.
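On the allocated-versus-used point: most overlays just report whatever the driver says the device or a process has claimed, which already includes caching. Here is a minimal sketch of how you could read both figures yourself with NVML's Python bindings (assuming the pynvml / nvidia-ml-py package); per-process figures aren't exposed on every driver or platform, and even they can't separate "cached because the space was free" from "actually needed this frame".

# Minimal sketch: device-level and per-process VRAM figures via pynvml.
# Both numbers still include memory a game has merely allocated or cached.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device: {mem.used / 2**20:.0f} MB used of {mem.total / 2**20:.0f} MB")

# Per-process breakdown (may be unavailable on some drivers/platforms).
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    print(f"PID {proc.pid}: " + (f"{used / 2**20:.0f} MB" if used is not None else "n/a"))

pynvml.nvmlShutdown()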
 
https://www.youtube.com/watch?v=0LkqqgvMyFI

At 1080p ultra (HD texture pack) with ray tracing on and without DLSS, the 3070 struggles; with DLSS, though, it runs great.
 
I don't get why people are so eager to reduce texture quality in favour of fancy lighting, especially when that fancy lighting now highlights your reduced texture quality. Using higher-res textures was always a gimme, given that the performance hit was minimal provided you had the VRAM for it. Now we are at a point where we supposedly don't need the VRAM because Nvidia decided you didn't need it.

I'm interested to see how this plays out over the next few months as new games release; still on the fence until we see some next-gen games.
 
No one is eager. However, there is plenty of evidence out there that dropping one notch down does not make a huge difference to image quality when playing. Not saying this is the case in every game. There is also the price-for-performance perspective: if you know you will only keep the card you just bought for one generation, is it so mad to make a big saving by taking the lower amount of VRAM in Nvidia's line-up, then putting the money saved towards a next-gen card that will no doubt have 16GB as standard? Also, how many games right now can one not run at maximum textures on a 3080 10GB without running out of VRAM? Zero.

Good thing is there are options; each to their own. Even if you do not want an AMD card, there will no doubt soon be a 3080 with more VRAM. Pay the extra for the VRAM if it makes you feel better, but not everyone feels this way.
 