10GB vram enough for the 3080? Discuss..

I just changed from a 3080 10GB to a 4080 16GB. The Witcher 3 on Ultra with RTing maxed out (DLSS3 on) is using just over 10GB at 3440x1440. I guess you could say that isn't really relevant to the 3080 because, as I found out myself, RTing was nigh unplayable on a 3080 in TW3, unless you enjoyed FPS below 60 even on performance settings.

So without RTing at 3440x1440 I found most games worked fine within the 10GB budget (albeit under 4K res). So the arguing can continue lol.
 
It will no doubt, but I think the 12GB thread will get busier as the 4070 Ti costs more than the 3080 10GB.
12GB thread got locked. This will too, soon, at this rate :cry:

Not a problem for me as my prediction was right in the end, plus I am on 12GB for the moment (probably until next gen cards) :D
 
It's ok chaps, Hogwarts magic is able to magically fix your PC hardware issues too:

LMAO!

This is actually true:

Sometimes doing ‘revelio’ will reset fps back to normal (if going in/out of menus doesn’t help).

[in-game screenshots]

:cry:


Washed out because of HDR

"repairo FPS!"
 

Guess folks are gonna have to be happy accepting it's both not enough and possibly enough, depending on your uses.

I do generally agree the 3080 should never have launched with 10GB after the 2080Ti settled on 11GB, hence NVIDIA quite quickly relaunching the 3080 with 12GB. But for anyone using a 3080 10GB, just carry on enjoying using it if it meets your needs.

Especially now that NVIDIA have gone off the deep end with the value proposition below the 4090. Don't get me wrong, DLSS3 is a game changer for me in the single-player games I've used it in so far, but the cost of entry right now for a 4080, let alone a 4090, is steep.
 
Haha, yup, magic :D

So given this game’s performance is buggered I wonder if channels like HUB will go back and re-do their testing (and change their clickbait titles) once the game is patched?

The 10GB being too little - yeah, you can argue that one way or the other - but I’m certain this game needs some work before we start using it to write off the 3080 10GB as obsolete. I do like HUB, but Steve comes across as so arrogant sometimes I bet he doesn’t do a follow up and doubles down on the 3080 bashing.
 


:cry: :D

Guaranteed they won't do a follow-up. I do like HUB and their content, but they are very clickbaity and it often feels like they just say/do things because they know it will get clicks and people commenting/linking to their stuff because of the fanboys (goes for both AMD and Nvidia) e.g. all of a sudden they are now saying that the 3080 10GB is causing issues in loads of games, yet they haven't shown any evidence of these issues in their videos until Hogwarts :confused:

Wish DF/Alex would do a video on the game though, as they are far more technical/thorough when it comes to looking into why performance is the way it is and what might be causing issues; it also gives feedback to the devs to try and resolve the issues rather than just simply saying "you need more VRAM, better CPU, more RAM" etc.



I have got the game running very well now for Hogsmeade and Hogwarts castle (the most demanding areas); I simply dropped settings to high and turned RT off (except shadows) with DLSS balanced, and if FPS drops, I simply cast the "repairo fps" spell :cry: It looks great and HDR is spot on as well :)
 
Your fight is over :cry:

I've turned reflections and AO off, more because they're "broken", i.e. they don't look good/work properly (in most scenarios). This is the first game with RT (of which there have been many) where I've "chosen" to do this, so as per usual, let's not blame the tech, but blame the developers for a half-assed job of implementing it.
 
Mine was correct as well, with it not being enough for those into it as a long-term card. The 10GB, like you said, was enough until the new gen came along. It does look like developers are starting to push VRAM up now though.
 

The problem with this argument is this:

This game seems to be all over the place. Some sites are saying it runs great and others the opposite. Could be down to hardware configurations at a guess.

All of a sudden Hogwarts, a game that is very clearly broken, is now going to be the de facto standard for new games going forward with regards to VRAM? Even though:

- GPUs with plenty of VRAM are also ******** the bed at 1080p e.g. the 7900 XTX
- in cut scenes, FPS plummets, but in gameplay it goes back up
- casting a spell causes the FPS to shoot straight back up even though nothing else has changed/happened in the frame
- VRAM usage isn't any different between 1080p and 4K (also, just to add, even with DLSS performance mode enabled, VRAM usage didn't drop either, another thing which goes against what you would normally expect to happen)
- people even at 720p are getting drops to 10-20 FPS with everything on low

It reminds me a bit of Dead Space, where Alex/DF mentioned an issue with FPS plummeting and at first thought it might be down to VRAM, but then the same issue was also there on a 4090, which rules out the "not enough vram" argument.

Maybe in another year or 2, 10GB will be a real killer, but as we are already seeing, GPUs released at the same time with more VRAM are also ******** the bed because of a lack of grunt (especially RDNA 2 where RT is concerned), and I wouldn't expect any GPU that is 3/4/5 years old to be able to keep settings maxed, especially at high res. or/and high FPS, especially when the current new £1300+ cards are already having to sacrifice settings because of grunt or/and **** optimised games. As said, I have a feeling current top-end flagships are going to age far worse and quicker now because of devs getting even lazier and using DLSS and now FG to "optimise" their games for them.

It's a sad state the PC gaming scene is in when a £1300/£1600+ new GPU is no longer enough.
 
I guess we will find out when the game receives patches. The difference between a 7900 XTX and a 3080 is that it's clearly not a VRAM issue in the 7900 XTX's case, but it clearly is with the 3080. Watched a few videos, and at 1440p the game was using 11,500MB but needing 10,500MB, so still over 10GB on a 4090. This was using MSI Afterburner for both VRAM readings.

It does appear that there is some kind of weird CPU bottleneck going on in most videos. CPU load is low but the GPU is not maxed.
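For anyone who wants to sanity-check that kind of reading themselves without Afterburner, below is a minimal sketch (my own, not something anyone in the thread used) that polls the GPU once a second using NVML via the nvidia-ml-py Python bindings. Note that NVML reports device-wide memory in use, which is closer to the "allocated" figure than to what a game actually needs, so treat it the same way as the allocated reading above.

```python
# Minimal VRAM / GPU-load logger using NVML (pip install nvidia-ml-py).
# NVML reports device-wide memory in use, i.e. closer to an "allocated"
# figure than to what the game actually touches frame to frame.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)         # bytes
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu)  # percent
        print(f"VRAM used: {mem.used / 1024**2:7.0f} MB / "
              f"{mem.total / 1024**2:.0f} MB | GPU load: {util.gpu:3d}%")
        time.sleep(1)  # sample once a second while the game runs
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Watching the two columns together is a quick way to spot the pattern described here: VRAM near the card's limit points one way, low GPU load with low FPS points at a CPU or engine bottleneck instead.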
 

That is true, but a 7900 XTX should be getting far better FPS than it is at 1080p, and arguably so should a 4090 (although that's probably more down to Nvidia's DX12 driver overhead).

The thing I would love to know is why the FPS jumps back up when casting a spell in game. I literally did nothing other than cast a spell, no opening/exiting a menu:


And it isn't "clearing" vram out as you can see they are both the same before and after, although look at the gpu usage, higher fps one has lower gpu usage and also look at the power consumption.....

A few others have also said the same, something is not right.
 
Yes, I think we agree this game needs patching, as something ain't right. It is funny that casting magic does magical things to the frame rate.
 
A good video proving even further how broken the game is; this is 3070 8GB footage:


Got to say, you can see a difference between the low and ultra texture settings, but damn... the difference is small.
 
I see the new narrative is already being pushed that the game is "broken" and it's nothing to do with VRAM.

So you also think this is all perfectly normal and that there's nothing wrong at all with the game?

- GPUs with plenty of VRAM are also ******** the bed at 1080p e.g. the 7900 XTX
- in cut scenes, FPS plummets, but in gameplay it goes back up
- casting a spell causes the FPS to shoot straight back up even though nothing else has changed/happened in the frame
- VRAM usage isn't any different between 1080p and 4K (also, just to add, even with DLSS performance mode enabled, VRAM usage didn't drop either, another thing which goes against what you would normally expect to happen)
- people even at 720p are getting drops to 10-20 FPS with everything on low
- setting textures to "low" does not fix the FPS drops either

:cry:
 
I did some testing with my 3080 back in Sept/Oct 2020, when I was silly enough to listen to "da internet". At the time the Afterburner beta had just been released; it shows actual VRAM usage compared to allocated. As I remember, actual was more than 20% less than allocated, and using any upscaling dropped that by more than 20% too. This was at 4K.

The conclusion I came to in 2020 is the same as now: the 3080 will run out of grunt before it runs out of VRAM, i.e. you will be looking at upgrading for better performance before you start worrying about VRAM in the vast majority of use cases. I know that is certainly where I am at. At RRP in 2020 the 3080 was amazing, but I can only hope the 4080 Ti is released at the current 4080 price, or I'll sit and wait it out. I was spoilt by what the 3080 gave me cost/performance-wise, so I'm not interested in getting any post-buyer's remorse.
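To put rough numbers on the allocated-vs-actual gap described above, here is a tiny sketch using only the ballpark ~20% figures from that post; the 10GB starting point is hypothetical, not a measurement from any particular game:

```python
# Illustrative arithmetic only, using the ~20% figures from the post above;
# the 10 GB starting point is hypothetical, not a measurement.
allocated_gb = 10.0                         # "allocated" reading at 4K
actual_gb = allocated_gb * (1 - 0.20)       # actual use ~20% below allocated
upscaled_gb = actual_gb * (1 - 0.20)        # upscaling knocks off another ~20%

print(f"allocated: {allocated_gb:.1f} GB")  # 10.0 GB
print(f"actual:    {actual_gb:.1f} GB")     #  8.0 GB
print(f"upscaled:  {upscaled_gb:.1f} GB")   #  6.4 GB
```

On those numbers, actual use sits comfortably inside a 10GB budget, which is the point being made about grunt running out before VRAM does.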
 