10GB VRAM enough for the 3080? Discuss..

I did some testing with my 3080 back in Sept/Oct 2020 when I was silly enough to listen to "da internet". At the time the Afterburner beta had just been released, which shows actual VRAM usage compared to allocated. As I remember, actual was more than 20% less than allocated, and using any upscaling dropped that by more than 20% too. This was at 4K.

The conclusion I came to in 2020 is the same as now: the 3080 will run out of grunt before it runs out of VRAM, i.e. you will be looking at upgrading for better performance before you start worrying about VRAM in the vast majority of use cases. I know that is certainly where I am at. At RRP in 2020 the 3080 was amazing, but I can only hope the 4080 Ti is released at the current 4080 price, or I'll sit and wait it out. I was spoilt by what the 3080 gave me cost/performance-wise, so I'm not interested in getting any buyer's remorse.

Few understand.
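
If anyone wants to log VRAM headroom themselves rather than eyeballing an overlay, here is a minimal sketch (my own, not from the post above) that polls nvidia-smi once a second. Note that it reports memory allocated across the whole GPU, not the per-process figure the Afterburner beta exposes, and it assumes an NVIDIA card with nvidia-smi on the PATH.

import subprocess
import time

# Ask nvidia-smi for used/total VRAM in MiB, without headers or units.
QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

for _ in range(10):  # sample for roughly 10 seconds
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True).stdout
    used, total = (int(x) for x in out.splitlines()[0].split(", "))  # first GPU only
    print(f"VRAM: {used} / {total} MiB ({100 * used / total:.0f}%)")
    time.sleep(1)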
 
Posting this here too; not tried it myself as I've already got the game running extremely well, and I won't be using RT reflections anyway due to how awful they look in this.

Well, this 'fix' was a game changer for me: locked 60fps even with RT on in Hogsmeade (3080 10GB). Prior to this, fps would drop to 30-40 big time.


Navigate to "AppData\Local\Hogwarts Legacy\Saved\Config\WindowsNoEditor" and back up "Engine.ini". Add the following to the bottom of the file and save it:

[SystemSettings]
r.bForceCPUAccessToGPUSkinVerts=True
r.GTSyncType=1
r.OneFrameThreadLag=1
r.FinishCurrentFrame=0
r.TextureStreaming=1
r.Streaming.PoolSize=3072
r.Streaming.LimitPoolSizeToVRAM=1

[ConsoleVariables]
AllowAsyncRenderThreadUpdates=1
AllowAsyncRenderThreadUpdatesDuringGamethreadUpdates=1
AllowAsyncRenderThreadUpdatesEditor=1
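
If you would rather script the backup and edit than do it by hand, here is a minimal Python sketch (my own, assuming the game is closed and the path above exists; the settings block is copied verbatim from the post):

import shutil
from pathlib import Path

# Engine.ini lives under the user's local AppData, per the path given above.
config_dir = (Path.home() / "AppData" / "Local" / "Hogwarts Legacy"
              / "Saved" / "Config" / "WindowsNoEditor")
engine_ini = config_dir / "Engine.ini"

TWEAKS = """
[SystemSettings]
r.bForceCPUAccessToGPUSkinVerts=True
r.GTSyncType=1
r.OneFrameThreadLag=1
r.FinishCurrentFrame=0
r.TextureStreaming=1
r.Streaming.PoolSize=3072
r.Streaming.LimitPoolSizeToVRAM=1

[ConsoleVariables]
AllowAsyncRenderThreadUpdates=1
AllowAsyncRenderThreadUpdatesDuringGamethreadUpdates=1
AllowAsyncRenderThreadUpdatesEditor=1
"""

# Keep a copy of the original file before touching it.
shutil.copy2(engine_ini, engine_ini.with_name("Engine.ini.bak"))

# Append the tweaks to the end of the file, as the instructions describe.
with engine_ini.open("a", encoding="utf-8") as f:
    f.write(TWEAKS)

print("Backed up Engine.ini and appended the tweaks.")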


[screenshot]

But nah, not the game at fault folks....
 
Yet there are many stating that "fix" did nothing for them.

It may indeed be the game, and it may get a patch. It just seems pretty routine for you to jump straight into "defend Nvidia mode" with no actual evidence other than "feels".
 
Personally I think FreeSync/VESA Adaptive-Sync coming to consumer monitors is one of the worst things to happen - though I do like having G-Sync Compatible on my 436M6. We seem to have settled for an inferior standard because it is "good enough", one largely based on hacking features like panel self-refresh, which were never designed for the task, into doing a form of VRR, rather than a ground-up best effort.
The thing is, it is good enough for most people, as the market has shown us. It is not just about cost either, but also consumer choice; the G-Sync monitors were gimped in their inputs.

G-Sync still exists if you want it.
 
The thing is, it is good enough for most people, as the market has shown us. It is not just about cost either, but also consumer choice; the G-Sync monitors were gimped in their inputs.

G-Sync still exists if you want it.

Maybe now, but not when G-Sync and FreeSync displays first launched. It took a good 1-2 years for FreeSync, or rather "Adaptive-Sync", to get on par with G-Sync's quality. Even now Adaptive-Sync/FreeSync displays still seem to have some issues, e.g. the AW QD-OLED FreeSync version vs the G-Sync Ultimate version, with users posting about flickering, black screens and HDR not working as well.
 
Yet there are many stating that "fix" did nothing for them.

It may indeed be the game, and it may get a patch. It just seems pretty routine for you to jump straight into "defend Nvidia mode" with no actual evidence other than "feels".

No different than people jumping in to defend AMD with the "VRAM is not enough" stuff ;)


Mine was correct as well: it is not enough for those keeping it as a long-term card. The 10GB, like you said, was enough until the new gen came along. It does look like developers are starting to push VRAM requirements up now though.

Not enough grunt in the long term either, though. Just look at Harry Potter: it seems like you need a 4090 to max it out, and even then you still need to turn down settings. Lol
 
I did some testing with my 3080 back in Sept/Oct 2020 when I was silly enough to listen to "da internet". At the time the Afterburner beta had just been released, which shows actual VRAM usage compared to allocated. As I remember, actual was more than 20% less than allocated, and using any upscaling dropped that by more than 20% too. This was at 4K.

The conclusion I came to in 2020 is the same as now: the 3080 will run out of grunt before it runs out of VRAM, i.e. you will be looking at upgrading for better performance before you start worrying about VRAM in the vast majority of use cases. I know that is certainly where I am at. At RRP in 2020 the 3080 was amazing, but I can only hope the 4080 Ti is released at the current 4080 price, or I'll sit and wait it out. I was spoilt by what the 3080 gave me cost/performance-wise, so I'm not interested in getting any buyer's remorse.
It cannot be overcommitted, meaning allocated is effectively the same as used.

You also failed to describe the nature of your testing. It's a very vague post.
 
I see the new narrative is already being pushed that the game is "broken" and it's nothing to do with VRAM.
The whole VRAM issue is completely overblown. Can an almost 3-year-old card like the 3080 max out new games with RT? No. Could it do that if it had 50 terabytes of VRAM? Still no. So does it matter? No again. Not even a 4090 can (and let's not talk about AMD's newest cards, lol).

You know games don't have to be played maxed out at native 4K + RT. Right now I'm playing on a 3060 Ti at 3440x1440, DLSS Quality, everything ultra with shadows low and RT reflections on. It's a locked 60 fps with GPU usage between 75 and 85% (because of vsync). That's on a PC with 16GB of RAM btw. VRAM nonsense...
 
The whole VRAM issue is completely overblown. Can an almost 3-year-old card like the 3080 max out new games with RT? No. Could it do that if it had 50 terabytes of VRAM? Still no. So does it matter? No again. Not even a 4090 can (and let's not talk about AMD's newest cards, lol).

You know games don't have to be played maxed out at native 4K + RT. Right now I'm playing on a 3060 Ti at 3440x1440, DLSS Quality, everything ultra with shadows low and RT reflections on. It's a locked 60 fps with GPU usage between 75 and 85% (because of vsync). That's on a PC with 16GB of RAM btw. VRAM nonsense...
The way you have described the situation means you don't understand the problem.

When you run out of grunt, you get a steady drop in performance; it typically doesn't cause stuttering or a sudden large drop in performance.

When you run out of VRAM it's entirely different. Typically there is a reasonable ratio to be kept between teraflops and VRAM capacity, and comparing Nvidia to AMD and Intel it is lopsided on Nvidia's part.

Thinking about it rationally, it's quite obvious; the only question is why we still have people defending it and trying to pretend it's something else.

It's a bit like buying a 13900K or 7950X and putting 8 gig of RAM in the thing.

It's OK saying "it doesn't affect me on the games I play and my preferences", but saying "it's overblown" or "it's a narrative" is being disrespectful and shows a lack of understanding.
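
To put the teraflops-to-VRAM ratio point in rough numbers, here is an illustrative sketch (the FP32 and capacity figures are commonly quoted approximations I've added, not anything measured in this thread):

# Approximate FP32 throughput (TFLOPS) and VRAM (GB) for a few cards from the same era.
cards = {
    "RTX 3080 10GB": (29.8, 10),
    "RX 6800 XT":    (20.7, 16),
    "Arc A770":      (19.7, 16),
}

for name, (tflops, vram_gb) in cards.items():
    # GB of VRAM per teraflop of shader grunt - higher means more memory headroom per unit of performance.
    print(f"{name}: {vram_gb / tflops:.2f} GB per TFLOP")

# Approximate output:
#   RTX 3080 10GB: 0.34 GB per TFLOP
#   RX 6800 XT:    0.77 GB per TFLOP
#   Arc A770:      0.81 GB per TFLOP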
 
Yup, made a significant improvement for me, especially in Hogsmeade. Well worth a try.

Just tried it out and it's a huge boost/improvement! Still get fps drops (to the 40s, no longer the 5-20 fps range) and general stutters, but nowhere near as bad nor as often; this applies in Hogsmeade and the Hogwarts grounds too. But nah, remember, it's not the game......

Personally I'm going to keep to the "high" preset with RT AO and shadows turned on and RT reflections off (RT preset set to high), DLSS Quality @ 3440x1440, as this nets me 80-100+ fps.

I have seen a post about improving the RT image quality, especially the resolution of the RT reflections, but obviously this hammers performance, and it still won't look quite right; dry, brick-like material should not have a reflective surface. It just doesn't look "right" or feel natural in this imo:

[screenshots of the RT reflections]
 
The way you have described the situation means you don't understand the problem.

When you run out of grunt, you get a steady drop in performance; it typically doesn't cause stuttering or a sudden large drop in performance.

When you run out of VRAM it's entirely different. Typically there is a reasonable ratio to be kept between teraflops and VRAM capacity, and comparing Nvidia to AMD and Intel it is lopsided on Nvidia's part.

Thinking about it rationally, it's quite obvious; the only question is why we still have people defending it and trying to pretend it's something else.

It's a bit like buying a 13900K or 7950X and putting 8 gig of RAM in the thing.

It's OK saying "it doesn't affect me on the games I play and my preferences", but saying "it's overblown" or "it's a narrative" is being disrespectful and shows a lack of understanding.
There is no difference between VRAM not being enough and RT performance not being enough. In both cases you have to turn down the equivalent settings. Textures when the VRAM isn't enough, RT quality when RT performance isn't enough.
 
There is no difference between VRAM not being enough and RT performance not being enough. In both cases you have to turn down the equivalent settings. Textures when the VRAM isn't enough, RT quality when RT performance isn't enough.
There is a big difference. If you run out of VRAM, a game typically becomes unplayable even if it's a very slight deficit.

If you have a slight deficit of grunt, you get a proportional drop in framerate.

Have you ever run out of VRAM?
 
And computerbase.de have now tested more GPUs:

[benchmark charts from computerbase.de]

There are bigger problems with streaming

The frame output in Hogwarts Legacy is actually quite decent. Without ray tracing, the Radeon RX 7900 XTX and GeForce RTX 4080 perform well and there are only minor outliers in the frame times. The AMD GPU is slightly ahead in this discipline, but that is hardly worth mentioning.


With ray tracing, the image output becomes a bit more restless, a little more so on the Radeon than on the GeForce. There are a few medium-sized outliers on the AMD graphics card - nothing wild, but not optimal either. If the frame rate is high enough there are no problems in the game, but at low FPS you can already tell that something is not running optimally. On the Nvidia counterpart the outliers also get bigger but are a bit more contained, with one exception, which is also the biggest spike that occurs again and again in the benchmark runs.

However, despite the decent frame times, Hogwarts Legacy has a pretty big problem with frame pacing. Regardless of the hardware, i.e. even on a Core i9-12900K and a GeForce RTX 4090, the game keeps hitching. This is particularly extreme in Hogwarts when ray tracing is activated, where there can be severe stuttering. This is probably not down to shader-compilation stutter; rather, the game seems to be reloading data at these points. The phenomenon occurs reproducibly, for example, when moving to a different part of the castle or changing floors. The slower the hardware, the longer the hitches become and the more frequently they occur.

I still haven't really noticed any texture problems either, but I have only played at 3440x1440 whilst using DLSS, so it will probably be more of a problem at 4K.
 