10GB vram enough for the 3080? Discuss..

The whole point was to run maximum settings at 4K (as the card has the grunt to do it) to see the frame rate collapse to single digits, as pointed out by PCGamesHardware and ComputerBase. If you run FSR you may not see the FPS collapse, because the game is then rendering at 1440p rather than 4K. I would have thought this was obvious, but apparently it's not. Those tech results were deemed to be false, so we asked actual owners to test with the same settings (4K etc.), and we then found out the tech results were true and the people claiming otherwise since the launch of the game were wrong. This is the part that stings, I guess. Also, no one has ever said you cannot enable FSR, unless you are trying to test the exact scenario reported by tech outlets and now by owners, which is what this thread has largely centred around.
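For anyone unsure why FSR muddies that comparison: the upscaler renders internally at a lower resolution and then reconstructs the output. A quick back-of-envelope sketch (assuming AMD's published per-axis scale factors of 1.5x/1.7x/2.0x for Quality/Balanced/Performance, purely illustrative arithmetic):

[CODE]
# Illustrative arithmetic only: internal render resolution under FSR upscaling,
# assuming AMD's published per-axis scale factors for each quality mode.
FSR_SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output resolution."""
    scale = FSR_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

# A 4K output with FSR Quality renders at roughly 2560x1440, not 3840x2160.
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
[/CODE]

So an "FSR at 4K" run is really a ~1440p render being upscaled, which is exactly why it sidesteps the scenario the tech outlets tested.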

Remind me again, who said they were false or lying? The only thing I recall is someone insinuating that their 6800 XT performance results weren't correct (yet the 3080 results are gospel :cry:), where they show a 6800 XT lacking the grunt for 4K, i.e. dips into the 40s with averages in the 50s, which falls in line with other users' footage from YouTube and from other reviewers. But somehow... your 6800 XT is the de facto standard for how all 6800 XTs will perform :cry: Can't blame not having SAM on for those ones ;)

The only thing I and others pointed out was: why aren't any of the other sites showing the same issues as they are? I also brought up how they forced ReBAR on, which from my testing caused issues too, but as mentioned, HUB also had ReBAR on yet didn't have the issues.

Crazy how tribal VRAM capacity is.

This thread (and same on TPU).

I would summarise it as:

3080 grunt-to-VRAM balance isn't right if you want fidelity; for some games it's under-spec'd depending on the settings used.
3080 FE is good value for money in the current market, so still a good buy regardless of the VRAM issue.
People going backwards and forwards showing examples and arguments for both sides.
If you're a high-FPS, low-fidelity gamer, e.g. competitive shooters, the VRAM is likely to be fine.

Personally I disagree with that bit in bold and underlined. A 3080 achieves both fantastically; aside from the tiers above the 3080 10GB (which we have all agreed don't provide worthwhile value) and maybe a 3070/Ti, no other GPU can provide the same visual fidelity when it comes to RT. Also, a 3080 performs a good bit better than its competitor the 6800 XT at 4K, as Ampere scales better at higher resolutions. On average a 3080 is anywhere between 2-10% faster, and that's not factoring in many RT titles as per HUB's and TPU's testing, nor how many more titles have DLSS compared to FSR 1/2 (and DLSS generally improves visual quality on the whole), but I digress on that last bit...

Of course, if you play games like your FF7, or icdp's DCS Normandy map in VR, then having more VRAM will provide more visual fidelity there.
 
Make up your mind, as you seemed to have no issue with how I did my diagnosis before :)

Bear in mind that when VRAM is saturated the symptoms can be really nasty, whereas when rasterisation grunt is maxed out it's usually just a few fewer frames. I also play at 60 or 30 fps capped, which the 3080 is easily capable of.

We haven't all agreed on anything in this thread, hence my last comment and why it's over 300 pages. Earlier in the thread someone posted some texture quality shots and said he couldn't tell the difference, but I, as someone who is very sensitive to texture quality, noticed immediately. I tend to max out texture quality in any game I play (including using 4K textures if available). I was unhappy having to drop shadows to low in FF7 Remake to mitigate VRAM capacity issues.

For me, none of the tech reviewers' reviews are credible regarding VRAM, except maybe Digital Foundry. DF actually play the games they review for several hours instead of doing short bench runs like GN, HUB, TPU etc. They also detect things like texture streaming quality issues, pop-in and so on, again unlike GN, HUB, TPU etc. They also discovered that traditional FPS monitoring is now unreliable, and were the only reviewer I can recall that detected the stutter fest in FF7 Remake, with other reviewers just relying on their monitoring logs. The review industry, as I said in earlier posts, is below par when it comes to analysing VRAM and they need to get their act together. It isn't good enough to plug in a card and just fire up a bench run. Remember, UE4 texture quality is dynamic, not fixed: if it's low on VRAM it will downgrade the textures to try and keep the game running.
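To illustrate that UE4 point, here's a toy model only (not the engine's actual algorithm, and the numbers are made up) of a streaming pool quietly downgrading mips when VRAM runs short instead of erroring out:

[CODE]
# Toy illustration only: a texture streamer dropping mip levels to fit a fixed
# pool, rather than erroring out, when VRAM runs short. Not UE4's real logic.
def mip_bytes(full_bytes: int, mips_dropped: int) -> int:
    # Each mip dropped halves width and height, so the footprint quarters.
    return full_bytes // (4 ** mips_dropped)

def fit_textures(texture_bytes: list[int], pool_bytes: int) -> list[int]:
    """Return how many mips each texture loses so the set fits the pool."""
    dropped = [0] * len(texture_bytes)
    while sum(mip_bytes(b, d) for b, d in zip(texture_bytes, dropped)) > pool_bytes:
        # Downgrade whichever texture currently costs the most memory.
        i = max(range(len(texture_bytes)),
                key=lambda k: mip_bytes(texture_bytes[k], dropped[k]))
        dropped[i] += 1
    return dropped

# Three ~85 MiB textures against a 128 MiB pool: the streamer silently drops
# mips, so a short bench run shows "normal" FPS while the textures look worse.
print(fit_textures([89_478_485] * 3, 128 * 1024 * 1024))
[/CODE]

That silent degradation is exactly why an FPS graph alone doesn't tell you whether a 10GB card is coping.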

None of them picked up on the Far Cry 6 VRAM issues, none of them picked up on the now-infamous GTX 970 Shadow of Mordor VRAM issues, and the same goes for FF15. It was amusing reading all their excuses when they all missed it on Mordor.
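On the "traditional FPS monitoring is unreliable" point: an average from a short bench run can look fine while VRAM saturation shows up as frametime spikes. A rough sketch of pulling 1% lows and a p99 frametime out of a capture log; the CSV layout and the "MsBetweenPresents" column name assume a PresentMon-style capture, so adjust for whatever tool actually produced the log:

[CODE]
# Rough sketch: summarise a frametime log so stutter is visible, not just the
# average. Assumes a PresentMon-style CSV with an "MsBetweenPresents" column.
import csv
import statistics

def summarise(path: str, column: str = "MsBetweenPresents") -> dict[str, float]:
    with open(path, newline="") as f:
        frametimes = sorted(float(row[column]) for row in csv.DictReader(f))
    n = len(frametimes)
    worst_1pct = frametimes[int(n * 0.99):]  # the slowest 1% of frames
    return {
        "avg_fps": 1000.0 / statistics.fmean(frametimes),
        "1%_low_fps": 1000.0 / statistics.fmean(worst_1pct),
        "p99_frametime_ms": frametimes[min(n - 1, int(n * 0.99))],
    }

# A run can average 60+ fps and still stutter horribly if the 1% low collapses,
# which is the sort of thing a 30-second canned benchmark never surfaces.
print(summarise("capture.csv"))
[/CODE]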
 
Having toyed around with something as silly expensive and overkill as a 3090 Ti (not mine, I have a 2080 Super), I have seen many games easily use/allocate upwards of 16GB. Namely The Division 2: I saw that using/allocating just over 16GB at 1440p max settings, and even at a frame cap of 60 FPS it felt a hell of a lot smoother than on my 2080S at the exact same 60 FPS cap.

I know that VRAM-wise my 2080S, a 3080 and a 3090 Ti aren't comparable at all, with different memory capacities and speeds, but seeing VRAM usage way beyond what I've seen before, combined with a very smooth gaming experience, was interesting.

Allocated is not the same as what is actually needed from my experience.

Like, I remember playing Final Fantasy 15 on my Titan XP some years ago and it would allocate almost the whole 12GB. But people on 11GB 1080 Tis were able to play the game just fine and did not get any performance issues from lacking that extra VRAM. Not long after, I started finding out more about VRAM, and how what is allocated and what is actually used (before you hit an actual problem where FPS tanks) are two separate things.

As another example, you can get old Call of Duty games to use up loads of VRAM. I remember Kaapstad demonstrating this ages ago. Yet they worked fine on 8GB cards.
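Worth noting too that the numbers an overlay shows you usually come from the driver's memory counters, which report what has been allocated/resident, not what the game actually needs each frame. A quick sketch of reading those counters via NVML (assuming the pynvml bindings are installed and an NVIDIA card is present):

[CODE]
# Sketch: read driver-reported VRAM figures via NVML (pynvml bindings).
# "used" here means memory allocated/resident on the GPU, which is not the
# same thing as the working set a game actually needs per frame.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo)

def vram_report(gpu_index: int = 0) -> None:
    nvmlInit()
    try:
        mem = nvmlDeviceGetMemoryInfo(nvmlDeviceGetHandleByIndex(gpu_index))
        gib = 1024 ** 3
        print(f"total {mem.total / gib:.1f} GiB | "
              f"allocated {mem.used / gib:.1f} GiB | "
              f"free {mem.free / gib:.1f} GiB")
    finally:
        nvmlShutdown()

vram_report()
[/CODE]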

Don't get me wrong, I am a firm believer that more is better, and given that by the time the new cards are out we will be in 2023, I feel the 4080 should have no less than 16GB. But the whole debate here is about the 3080.

I don't think anyone here is defending less VRAM as better. What people are saying is let's look at the price and lifespan of the card. Most enthusiasts that visit and debate in this thread are the types that upgrade every gen, hell, some many times per gen. What I said back in 2020 was that 10GB would be fine in all but a handful of games by the time the next-gen cards were out, and that is exactly what has happened. Now we will be upgrading to next-gen cards which will have more VRAM anyway.

The question I ask people is: what game have you missed out on that you really wanted to play, due to only having 10GB? :p

Anyway, it is just a fun debate that keeps going round and round. At the end of the day if having a 3090 or 3090Ti makes someone happy, more power to them. As the saying goes, it's their money and they can do what they want with it :D
 

I agreed with your diagnosis, hence why I listed your FF7 example :p My point is that those kinds of scenarios are few and far between, say 1%, whereas for the majority of other games, especially with RT, it's the 99% where a 3080, despite having less VRAM, does better for both visuals and performance. It's why you never see DF mentioning VRAM, i.e. it hasn't been a problem for them (I haven't watched the FF7 video as I'm not really interested in the game, so perhaps they mentioned it there?). The only example I recall was Deathloop, where some textures pop in on cards with less VRAM (Bethesda titles are notorious for this too). Sadly, as we both pointed out last time, DF aren't credible because "nvidia shills", but glad you concur that they are the best in the business for this kind of thing :)

PS.

You should have a watch of their spiderman video, it's great at looking into stutter/performance and even factors in the hard drive speed/utilisation :cool:


If the Fury X was OK with 4GB according to some... ;) :cry: :D then I think the 3080 will be OK with 10GB :p
 
I've been saying the part in bold for a while now, and as you pointed out, saturation can cause various symptoms depending on how the game engine handles the situation.
 
Is it "make up your mind" or "move the goalposts", though? I don't think we will ever see the anti-knitting crowd accept a grown-up debate.
 
earlier in the thread someone posted some texture quality shots and said he couldn't tell the difference

Was it LtMatt by any chance? As I recall, he also could not see the difference RT made in Dying Light 2, when the difference was night and day. Makes one wonder, that being the case, why he goes on so much about textures :p :cry: :p
 
They won't, they'll just keep their fingers in their ears as per usual :cry: :D

Yeah, it's what the "grown ups" do in debates apparently :p

“Grown up debates” also means bumping a thread that has not been active for over a month with a post to stir **** up :cry:
 
I spent £1300 on two 3080s this gen, so I could easily have afforded a 3090 and probably would have bought one had it offered a noticeable bump in performance, like 40%, but 15% isn't even noticeable outside of having an FPS counter running, and it's not even the "best card", just an overpriced runner-up.

So let's drill into this, Joxeon. You're happy to buy two of the same card, £1300+, yet you cannot SLI them; if they are in the same rig it will be a hotbox with one card rather wasted, unless you were into mining, which again is a bad idea with horrible airflow. Mining in separate rigs is also a bad idea, as each rig uses power for its other components and is therefore inefficient. Are you just not letting on how many cards you bought during a gamer shortage? You come across as exceptionally bitter talking about the 10GB model specifically, yet in random threads you acknowledge the talking point.

Jensen is loving guys like you and nexus; after all, you will also buy an Ada card in a few months. We also see in the other thread that you are aware of the VRAM issue, hence, and I quote:

the flagship will need more VRAM this time around.

I've been saying the part in bold for a while now, and as you pointed out saturation can cause various different symptoms depending on how the game engine handles the situation.

Yes you have, and it was the debunkers' swiss army knife of avoidance tactics to "use reputable sources", so we tried that, and then he throws in a shotgun blast of garbage referencing game forums of all sorts, where 95% of it is people complaining who can barely configure a PC setting. ;)
 
Was it LtMatt by any chance? As I recall, he also could not see the difference RT made in Dying Light 2, when the difference was night and day. Makes one wonder, that being the case, why he goes on so much about textures :p :cry: :p
Once I saw the performance cost for the improvement in image quality by enabling RT, I was hooked. :cry:


Talk about a must have feature! :D
 
Once I saw the performance cost for the improvement in image quality by enabling RT, I was hooked. :cry:


Talk about a must have feature! :D

Kind of like the price difference we saw between the 3080 and 3090? :cry:

Also, seems you have gone and done it again. That is not a Dying Light 2 comparison ;)
 
Also, seems you have gone and done it again. That is not a Dying Light 2 comparison ;)
I know, but I also never said I couldn't tell the difference between normal textures and HQ ones, and you know that too. ;)

In Dying Light 2 I also said that the reason RT looks good is that no effort was put into the game's lighting without it; the game looks totally different with it off. In most other games the difference is not that pronounced.

And also, parts of the game even with RT look terrible IMO. Low quality textures, dark and dingy. It might be 'realistic' but it looks awful in parts. And I've put quite a bit of time into the game now with AMD and Nvidia cards.
And yeah, the first Dying Light is much better. Dying Light 2 is a big let-down in all aspects bar some nice RT effects. You could say the same about Far Cry 5 vs 6, tbf, as I preferred 5 despite the graphics (and textures) being far superior in 6. The story and gameplay were worse IMO. YMMV.
 
Once I saw the performance cost for the improvement in image quality by enabling RT, I was hooked. :cry:


Talk about a must have feature! :D

Given you don't know when, where or what RT is being used, based on that image and also the DL2 video (which I see has now been removed :cry:), you are the last person to listen to when it comes to RT :p :D :cry:

@TNA


Wise move in all seriousness though as it was misleading ;)

You know what else "grown ups" have? Cool sigs!! Just check his out. It has our usernames in it and has been his sig for some months now :cry:

An unhealthy obsession with us it seems ;)




I see we are now back to the "RT is crap" arguments once again, which pretty much sums up this thread well and answers the question of whether 10GB is enough... ;) :D
 