
10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
If only it had 24GB VRAM, you could be flying about on your broomstick with a chubby at a whopping 36 fps, with lows of 24 fps too :cry:

Didn't we all say grunt would be the primary issue before VRAM... ;) :p


It wasn't though, as at 1080p/1440p in Hogsmeade the grunt was there but 10GB VRAM was the limiter when RT was on. This could be a game issue, but for now 10GB is not enough in this game.
 
Generally, a 3080/6800 XT comparison is fairly similar, but in this example at 4K the 3080 is a good chunk down - its average is only just in excess of the 6800 XT's fps lows. The 3060 is right up there with a 3080 too, with fps lows double those of a 3080! VRAM capacity definitely seems to be the limiter here at 10GB.

However, full bells and whistles is unplayable on all of these examples.

Really interesting to see the A770 in there mixing it up though.
 
I noticed some VRAM issues at 2160p DLSS Performance in Dead Space, however - fps dropped to the 40s, but GPU power was about 240W instead of the expected 300W+.
Could be room for improvement with patches there, as it seems inconsistent.
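That power-draw observation is actually a decent tell. A minimal sketch of how you might flag VRAM-bound moments from nvidia-smi samples - the 320W power limit, the CSV format, and the `looks_vram_bound` thresholds are all assumptions made up for illustration, not measured values:

```python
# Rough heuristic for spotting VRAM-bound frame drops from nvidia-smi logs.
# Assumes lines produced by:
#   nvidia-smi --query-gpu=power.draw,memory.used,memory.total --format=csv,noheader,nounits
# The 320 W limit and the 240 W sample below are illustrative, not measured.

def parse_smi_line(line: str):
    """Parse 'power_w, mem_used_mib, mem_total_mib' into floats."""
    power, used, total = (float(x) for x in line.split(","))
    return power, used, total

def looks_vram_bound(power_w, mem_used_mib, mem_total_mib,
                     power_limit_w=320.0):
    """A GPU sitting well under its power limit while VRAM is nearly
    full is a hint the bottleneck is memory, not shader grunt."""
    under_powered = power_w < 0.8 * power_limit_w
    vram_full = mem_used_mib > 0.95 * mem_total_mib
    return under_powered and vram_full

power, used, total = parse_smi_line("240.1, 9850, 10240")
print(looks_vram_bound(power, used, total))  # this sample trips both checks
```

The thresholds (80% of power limit, 95% of VRAM) are arbitrary starting points; the point is the pattern of low power draw coinciding with a full memory pool.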

In Hogwarts with RT you could argue perf is already really low, so who really cares about more VRAM. With RT off, performance looks pretty good across the board, to be fair :)

Haven't tried on my 4K OLED, playing on my AW 3440x1440 QD-OLED instead, so maybe there are issues at higher res. I'd be surprised if your issues are 100% down to VRAM if you're using DLSS Performance, but as per Alex's video and what you said there, it does seem like there are some optimisation issues regardless of VRAM amount. And looking at TPU's results, it's not exactly a great performer on any GPU without upscaling, again:


All in all, 2023 is looking to be a horrible year for PC releases so far, no matter what GPU you have :o

I'm definitely getting Hogwarts, but waiting for a patch or 2 first - want to play this with RT goodness :D

lol, at 1080p maxed with RT the 3080 is below the 6650 XT 8GB. If that doesn't tell you the game's broken, I don't know what will.

Don't go against the narrative :p


Given 1080p is using more or less the same VRAM as 4K, there are clearly some improvements to be made.
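Rough arithmetic on why flat VRAM use across resolutions looks wrong: any buffer sized to the render resolution should grow roughly 4x from 1080p to 4K, even if resolution-independent texture pools stop the total from quadrupling. The bytes-per-pixel figures below are hypothetical, not the game's real formats:

```python
# Memory for resolution-sized buffers (G-buffer, depth, HDR colour, etc.)
# grows with pixel count, so 4K should cost ~4x 1080p for those buffers.
# Byte counts here are illustrative assumptions, not real render formats.

def rt_mem_mib(width: int, height: int, bytes_per_pixel: int) -> float:
    """Memory for render targets at a given size, in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

# A hypothetical deferred setup: 16 B/px G-buffer + 8 B/px HDR + 4 B/px depth.
per_pixel_bytes = 16 + 8 + 4
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: {rt_mem_mib(w, h, per_pixel_bytes):.0f} MiB of render targets")
```

Textures dominate total VRAM, so usage won't scale linearly with resolution - but it shouldn't be dead flat either, which is the oddity in the chart below.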

[image: Hogwarts Legacy VRAM usage chart, 1080p vs 4K]

Just for comparison, here are a few other titles where VRAM usage properly scales with the res.:

[images: VRAM scaling charts from other titles]

Even Forspoken looks to get it somewhat better:

[image: Forspoken VRAM usage chart]

Maybe they won't fix it, in which case 10GB won't be enough for this game, but alas, I think it's safe to say this is more down to poor optimisation in this game than anything else.
 
This could be a game issue but for now 10gb is not enough in this game.

^^ There 'tis :eek:


24GB isn't enough, let alone 10. Don't wait for a fix, folks - save yourself, pony up and get a 4090, or start saving for a 5090 with 48GB on it so you can play Harry Potter.

Does look nice though - R2D2 levels of gorgeousness.

You can spend pages second-guessing and taking tech sites as gospel, when they'll post up anything for clicks - if they don't, they won't look current.

Guaranteed, in a few months the performance of this game will be much better after a patch or 3, as I see nothing in the graphics that warrants such performance over games that look as good, or better. If you think the game has something we can't see because the hardware isn't available yet, then fair enough - but a very odd way to make a game.

Buy now, PLAY later.......... Higher Purchase games.....
 
It wasn't just HUB that was suspicious of 10GB; plenty of 3080 owners have been getting stick in here for saying it can struggle/run out.

I felt that 10GB was going to be an issue for my 3080 at 4K if I planned to hold on another year. There were a few games where it caused issues; FPS was 50-60+ then tanked due to hitting VRAM limits (DCS and FC6) for me. Yet it was still argued that the 3080 would run out of rendering power before the smallish VRAM became a problem at 4K.

So despite these issues many of us 3080 owners running 4K were told our GPU was fine, that no matter what game we had VRAM issues with, it was the game that was the problem. Or even more laughably we were told “that game is crap anyway”, or “nobody plays that so it doesn’t count”. Yet we were trolled over and over with shill posts about Cyberpunk with RT and DLSS being all that mattered.
 
As HUB suspected previously, the 3080 didn't age well.

Quite a few suspected. Gave evidence, but nah, it was only one game!


Not as much stick as the people that didn't own them got. If you were not part of the select club, you had no opinion.


Yep. I think it was yourself, Bill and a couple of others that didn't, Borg-like, get the pitchforks and torches out because they took offence at a piece of hardware they owned being analysed on a hardware enthusiast forum that's been doing this for decades.
 

At 4K it uses the same amount of VRAM it does at 1080p, which surely means the game is broken - why would 1080p use as much?

[image: VRAM usage comparison, 1080p vs 4K]
 

Which is all valid enough if you are running out of VRAM before grunt, but with regards to DCS, when you go on to say that even a 3090 will struggle and has to have settings reduced, it comes down to another case of the grunt not being there in the first place, so settings have to be reduced anyway.

Time will tell, but my own experience in limited scenarios is that the GPU grunt is there but the VRAM isn't. For reference, my Pimax 8KX is being run in normal or small mode, nowhere near the 4K x2 resolution. I also have to undersample to 60% SS to keep performance respectable in some games. So even a 3090 will need to run at lower settings on a Pimax 8KX.
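For anyone curious what 60% SS actually works out to: assuming the percentage scales total pixel count (as SteamVR's slider does - an assumption, headset software may differ), each axis scales by the square root of the fraction. The per-eye base resolution below is a placeholder, not the 8KX's actual recommended render target:

```python
import math

def ss_resolution(width: int, height: int, pct: float):
    """Per-axis dimensions after scaling total pixel count by pct (0-1).
    Assumes SteamVR-style behaviour: the slider scales pixel COUNT,
    so each axis scales by sqrt(pct)."""
    axis = math.sqrt(pct)
    return round(width * axis), round(height * axis)

base = (3840, 2160)  # hypothetical per-eye render target
w, h = ss_resolution(*base, 0.60)
print(f"60% SS: {w}x{h} "
      f"({w * h / (base[0] * base[1]):.0%} of base pixel count)")
```

So 60% SS is a much smaller drop per axis (~77%) than the number suggests, which is why it can stay usable visually while cutting the render cost substantially.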

So when I run free flight or a small DCS Normandy mission, performance is perfectly fine at 60 FPS+ with 9.8GB VRAM used. If it is a larger mission, VRAM usage goes over 10GB and performance tanks.

If we want a simple yes or no answer, then the answer is NO, because you have absolutely no idea what games/sims or uses other people have. So if someone were to ask me whether a 3080 is good enough for high-res VR in DCS, my answer would be NO.

As LTMatt has said, is 10GB enough? For most scenarios yes, for all things no.

As it is, if we ignore the fact that Hogwarts is a mess in terms of optimisation on all GPUs, we could say that GPUs with less than 16GB VRAM do indeed not have enough VRAM. But here's the problem: even GPUs with 24GB VRAM don't have the grunt for a good PC gaming experience, so they are also having to reduce settings and/or use DLSS and/or FG to overcome the lack of hardware power.

Again, this is all a case of people pointing fingers at AMD and Nvidia when the ones to blame are game developers; they need to start optimising their games and stop relying on DLSS/FSR and FG as their "optimisation".

The fact that pretty brand new £1300 and £1600+ top dog GPUs are having to reduce settings and are now only good for 1080p and 1440p gaming says it all.

Also, so far HUB are the outlier in terms of their performance results:

[embedded benchmark videos and screenshots]


Seems that ultra performance mode makes it somewhat more playable; thank god Nvidia have drastically improved the IQ of DLSS Performance/UP recently!

[images: DLSS ultra performance comparison screenshots]
 

resident evil village - debunked
msfs - debunked (unless you add mods)
halo infinite - debunked
godfall - debunked
re 2 remake - debunked
doom eternal - debunked
forza - debunked

:cry:

The only one which had some weight to it was FC6, but in my case I could only replicate it by intentionally sabotaging myself :D The only 100% genuine evidence showing a VRAM issue was from myself, with several 4-8K texture packs in CP2077 :cry:
 