
10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
I felt that 10GB was going to be an issue for my 3080 at 4K if I had planned to hold on another year. There were a few games it caused issues with; FPS was 50-60+ and then tanked from hitting VRAM limits (DCS and FC6 for me). Yet it was still argued that the 3080 would run out of rendering power before the smallish VRAM became a problem at 4K.
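To put rough numbers on why 4K pushes against a 10GB card, here's an illustrative back-of-the-envelope budget. Every buffer count and the texture pool size below are assumptions for the sake of the arithmetic, not measurements from any particular game:

```python
# Rough, illustrative VRAM budget for a hypothetical 4K game.
# All sizes are assumptions chosen for illustration, not measured figures.

WIDTH, HEIGHT = 3840, 2160  # 4K output resolution
GIB = 1024 ** 3

def target_bytes(bytes_per_pixel: int, count: int = 1) -> int:
    """Size of `count` full-resolution render targets."""
    return WIDTH * HEIGHT * bytes_per_pixel * count

budget = {
    # A deferred renderer might keep several G-buffer targets around.
    "g-buffer (4x RGBA16F)": target_bytes(8, count=4),
    "depth/stencil (D32S8)": target_bytes(5),
    "HDR colour + history (2x RGBA16F)": target_bytes(8, count=2),
    # The big consumer: the streamed texture pool (assumed size).
    "texture pool (assumed)": 6 * GIB,
    "geometry + misc (assumed)": 2 * GIB,
}

total = sum(budget.values())
for name, size in budget.items():
    print(f"{name:<36} {size / GIB:5.2f} GiB")
print(f"{'total':<36} {total / GIB:5.2f} GiB of a 10 GiB card")
```

Under these assumptions the budget lands around 9 GiB before the OS, driver, and other apps take their slice, which is why a big texture pack or a leaky game tips a 10GB card over the edge while a 12-16GB card shrugs it off.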

So despite these issues many of us 3080 owners running 4K were told our GPU was fine, that no matter what game we had VRAM issues with, it was the game that was the problem. Or even more laughably we were told “that game is crap anyway”, or “nobody plays that so it doesn’t count”. Yet we were trolled over and over with shill posts about Cyberpunk with RT and DLSS being all that mattered.
It's as if you've never used a PC before in your life and you don't have a clue.:o


In no particular order, these spring to mind for the top 10:

nobody plays that so it doesn't count
that's not how running out of vram works
system error/user error
you need a better cpu/you need more system ram
you'd run DLSS anyway
game is crap anyway
HD pack makes no difference
But you use addons/mods
you are intentionally running Rebar
but AMD's rubbish at RT'ing


Well, lazy devs are definitely an issue, but of course we all know we're not going to get the industry to change. For a long time now the fact is that optimising a game comes at the end of development, as it's considered low priority, and by the time they get to that point they seem to just make sure it at least runs OK on the top-end hardware available at the time.

As I often get quoted from developer friends "upgrading the hardware is the fix".
For those of us who play RPGs, who like textures that don't look like they're from the PS2 era, we knew the VRAM was an issue. I remember thinking it over when I got my 3080, but I went ahead, as the price point at the time for anything better was ridiculous, and I would only have had 1 extra gig keeping my 1080ti. I don't regret it, but I have always felt 10 gigs of VRAM with the power of a 3080 was a massive mismatch.

Even on the 4000 series, the 4070ti really should be a 16 gig card. Nvidia are still being tight, just not as extreme.

Meanwhile Intel managed to supply 8 gigs on a £250 card, so VRAM is definitely not as expensive as Nvidia give the impression it is. I think what's hindering Nvidia is the dedicated hardware for RT; that's clearly adding to the cost of the cards, and they are mitigating it by slicing away VRAM. That's why I want them to release non-RT variants of their cards with higher VRAM at the same price point. But they are all in now on their current idea.

Crap launch-day performance isn't new; we've been beta testing titles ever since they stopped doing demos way back when. Batman: Arkham Knight was pulled and refunded, it was that bad at launch, and then there's Ark, NMS, AC, anything Bethesda, BF, CODs, CS, HL2.

Considering the record profits, as GPUs were a licence to print money for the past 3 years, the VRAM cost argument is hilarious. Watch how Nv/AMD keep that profit going despite selling fewer GPUs due to this gen's massive price hike.
 
 
Nexus do you think Nvidia can do no wrong?
:)


I didn't say GSYNC had no benefit, I said AMD proved it wasn't required, because before AMD released VRR, Nvidia only allowed VRR with a GSYNC module. They only widened their support when forced to by their main competitor.

Will we see Nvidia release a 16 gig card for under £800?

Personally I think FreeSync/VESA Adaptive Sync coming to consumer monitors is one of the worst things to happen - though I do like having G-Sync compatible on my 436M6 - we seem to have settled for an inferior standard because it is "good enough" which is largely based on hacking things like panel self-refresh features which were never designed for the task into doing a form of VRR rather than a ground up best effort.
 
People still holding on to the "fact" 10GB VRAM is enough...



12GB VRAM is next in line for this treatment, and it's already showing signs it may even happen before the next generation of cards is out. Seriously, anyone who thought 10GB VRAM was enough clearly didn't realise 1080ti's had 11GB, and the scam card after that with no memory increase was the 2080ti, again with 11GB (for £1,000-£1,200, really sold for £1.2k+), when it should have had 16GB. Then they gave the 3080 10GB, a VRAM reduction from an over-4-year-old 1080ti, and people thought that was right? Come on, really. Even Nvidia had to bring out a 12GB 3080 later, which again should have been 16GB, and then rubbed more salt into people's wounds with a 3080ti with 12GB again.


Real tech enthusiasts saw the scam from day one and called Nvidia out on it, but the problem was the market was a mess, and mining/lockdowns made the cards sell out, leaving many who bought at extortionate prices regretting it later. No, most people could not buy the FE; many paid over double for a 3080 10GB, and that was the real street price. This is why Nvidia and AMD are taking advantage now with the new gen pricing: they saw that behaviour from buyers. This is why I keep saying this behaviour needs to stop if you want really good hardware at sensible pricing again, or we are stuck in this mess forever.
 
3080 gamers cannot wait to upgrade, although the members market is not being flooded; perhaps the delusion has not worn off yet.

There is a lack of upgrade options without having to spend more: the 4070ti has the same problems as the 3080 and the 4080 is overpriced, so 3080 owners are between a rock and a hard place. The best upgrade in terms of value for a 3080 owner would be the Radeon 7900xt or 7900xtx.
 
People still holding on to the "fact" 10GB VRAM is enough...

3080 gamers cannot wait to upgrade, although the members market is not being flooded; perhaps the delusion has not worn off yet.

There is a lack of upgrade options without having to spend more: the 4070ti has the same problems as the 3080 and the 4080 is overpriced, so 3080 owners are between a rock and a hard place.

Only thing to add: it's not so bad if you bought the 12GB version, which was released just over a year ago. Since its release it can be had for the same price as the 10GB if you shop around, so there is no reason to deprive oneself if you were in that boat.
 
The only game I’ve had issues with when using the 3080 was FFVII, as the game had hardly any settings to tweak.
Everything else I was dropping settings in anyway to get a higher framerate, so I never reached the VRAM limit.
For £650 it was by far the best option at the time, not to mention the card actually earned me money through mining to fund the 4090.
As said before, most of the people whining about it are the ones who couldn’t get one at MSRP, so they either bought one and overpaid by a lot, bought an AMD card, or a 3090… :P
 
Plays fine on my end.......... :cry: Get the popcorn @TNA


4k still processing, probably be done later

But in all seriousness, when out and about, performance as shown does go to **** using max settings with native/DLSS Quality. However, Performance mode and even Ultra Performance mode with DLSS are seriously good in this (using DLSS 2.5.1), so essentially the game is certainly very playable/performing well in this case and still looks great. I haven't really noticed any serious texture loading/rendering issues; some pop-in, but nothing like the bug FC 6 had at the start, which then got fixed in a game patch. Happy to have paid the premium for a quality feature that just works :p ;) :D :cool: Did try FSR and, well, it's a mess as per usual with anything except the top quality preset.

My observations so far:

- RT reflections are overall very poor; it reminds me of Hitman. There is a mishmash of both SSR and RT reflections happening at the same time (which could explain many of the issues), as you can see SSR artefacts even though there are RT reflections. Not to mention RT reflections have quite a few issues with artefacts of their own; you can also see in my video there is an artefact in some of the reflections, though I haven't seen this in other 3080 footage, so it's likely something wrong on my end. The problem is that whilst some RT reflections look bad, SSR-only looks considerably worse, so you can't win here. Also, it's almost as if they haven't activated denoising for the RT reflections, as they are very shimmery, with an ant-crawling effect.

That and sadly SSR awfulness for bodies of water :( Look at all the artifacting on the pillar edges too, talk about immersion breaking :o


- RT shadows are actually great in this, not perfect... but non-RT shadows look awful; everything is hard and sharp when shadows should be soft, especially with dim, weak light sources

- RT AO seems good too, but there's not a huge difference with it on/off
- fps in certain cutscenes is ****; sometimes it can be 60+ and then, even though nothing has changed, it can drop to 5 fps

So I'm in two minds what to do; at the minute I'm thinking of just leaving RT off, or at the very least only using RT shadows, since they have the biggest impact.

Overall though, game play is very good and a stunning game world :)

In no particular order, these spring to mind for the top 10:

nobody plays that so it doesn't count
that's not how running out of vram works
system error/user error
you need a better cpu/you need more system ram
you'd run DLSS anyway
game is crap anyway
HD pack makes no difference
But you use addons/mods
you are intentionally running Rebar
but AMD's rubbish at RT'ing

nobody plays that so it doesn't count

No one has really said that; it's more a case of: just because it's happening in a very niche scenario/game doesn't mean it will apply to every other game, like some made out.....

that's not how running out of vram works

When there is a 100% genuine VRAM shortage, it usually doesn't mean FPS dropping to 5 and STAYING there forever, as I demonstrated in the examples provided many times, including my very own scenario. If the FPS drops to single digits and stays there until a game restart, that indicates a bigger problem with poor optimisation and/or intentional sabotage, just like how Portal RTX is a mess on AMD GPUs.
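A rough way to check whether a slowdown actually lines up with VRAM filling is to log usage with NVML's Python bindings while playing. A minimal sketch, assuming the `nvidia-ml-py` package and an Nvidia GPU; the 95% "pressure" threshold is an arbitrary assumption, not a documented driver behaviour:

```python
# Minimal VRAM-pressure logger using NVML's Python bindings.
# The pynvml calls are the real NVML API; the 0.95 threshold is an
# arbitrary assumption for flagging "close to full".

def vram_pressure(used_bytes: int, total_bytes: int, threshold: float = 0.95) -> bool:
    """True when the card is close enough to full that the driver may
    start evicting or spilling allocations to system RAM."""
    return used_bytes / total_bytes >= threshold

def log_vram(samples: int = 10, interval_s: float = 1.0) -> None:
    """Print VRAM usage for GPU 0 every `interval_s` seconds."""
    import time
    import pynvml  # pip install nvidia-ml-py

    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)
        for _ in range(samples):
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            flag = "PRESSURE" if vram_pressure(mem.used, mem.total) else "ok"
            print(f"{mem.used / 2**30:5.2f} / {mem.total / 2**30:5.2f} GiB  {flag}")
            time.sleep(interval_s)
    finally:
        pynvml.nvmlShutdown()
```

Running `log_vram()` in the background during a stutter would show whether the drop coincides with usage pinned at the card's limit (a real shortage) or with plenty of VRAM free (something else, such as the optimisation issues described above).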

system error/user error

Apparently that only works for one side? E.g. when I highlighted jansn's RDNA 2 footage stuttering in Doom Eternal it was a "local system error" from matt, and likewise when I highlighted how bang4bucks' 3090 had stutters in Resident Evil Village yet on my 3080 it was flat frame latency the whole way through; in fact, no one came back to that post :D

you need a better cpu/you need more system ram

As we have seen, if there is not enough VRAM then system RAM will be used (hence why Hogwarts is even worse with 16GB of RAM). The FC 6 patch which fixed the rendering/texture-loading issues saw system RAM usage increase by 4GB.
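The penalty for that spillover can be sketched with published peak bandwidth figures. These are theoretical numbers, and the "everything over capacity is touched over PCIe each frame" model is a deliberate worst-case assumption, not how real drivers behave:

```python
# Back-of-the-envelope: why spilling into system RAM hurts so much.
# Bandwidth figures are published theoretical peaks, not measurements.

GDDR6X_3080_GBPS = 760.0   # RTX 3080 memory bandwidth (19 Gbps x 320-bit bus)
PCIE4_X16_GBPS = 31.5      # PCIe 4.0 x16 theoretical peak, one direction

slowdown = GDDR6X_3080_GBPS / PCIE4_X16_GBPS
print(f"Touching spilled data is roughly {slowdown:.0f}x slower than local VRAM.")

def spilled_gib(required_gib: float, vram_gib: float = 10.0) -> float:
    """GiB of working set that overflows into system RAM (worst case:
    assumes everything over capacity is accessed over PCIe)."""
    return max(0.0, required_gib - vram_gib)

# e.g. an 11 GiB working set on a 10 GiB card spills ~1 GiB:
print(f"{spilled_gib(11.0):.1f} GiB has to come over PCIe in the worst case")
```

Even a small overflow therefore costs far more than its size suggests, which matches the pattern above: fine framerates until the working set crosses the limit, then a sudden cliff.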

you'd run DLSS anyway

Well yeah, most people are using it, so again it's a bit like with FC 6: by not using it, you are just sabotaging your experience. If people with the new current £1,300+ cards are having to use DLSS/FSR, of course someone with a 2-year-old £650 GPU is going to have to use it, which will likely resolve any possible issues.
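Part of why DLSS eases VRAM pressure is that the internal render targets shrink with the preset. Rough numbers for a 4K output, using the standard published per-axis scale factors (the 0.58 Balanced factor is approximate):

```python
# DLSS internal render resolutions at a 4K output.
# Per-axis scale factors are the standard published DLSS presets
# (Balanced is approximate).

SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_res(scale: float, out_w: int = 3840, out_h: int = 2160):
    """Internal render resolution for a given per-axis scale factor."""
    return round(out_w * scale), round(out_h * scale)

for mode, s in SCALES.items():
    w, h = internal_res(s)
    print(f"{mode:<17} {w}x{h}  ({s * s:.0%} of the output pixel count)")
```

Performance mode at 4K renders at 1920x1080, a quarter of the pixels, so every full-resolution render target shrinks accordingly; the streamed texture pool is mostly unaffected, which is why DLSS helps with VRAM but doesn't eliminate shortages.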

game is crap anyway

Yup! :cry:

HD pack makes no difference

Arguable, I noticed it made a nice difference tbf but nothing groundbreaking

But you use addons/mods

This has always been valid, I even showed the issue myself

you are intentionally running Rebar

Very valid, given this was the only way I could sabotage my performance in FC 6; you could even see how FPS dropped by 20 for no reason at the end of one of my videos. An Nvidia engineer recently said this isn't a silver bullet, as whilst it might boost performance in some areas it could also cause a regression in performance elsewhere, which is also backed up by HUB's video.

but AMD's rubbish at RT'ing

Yup! :cry:

There is a lack of upgrade options without having to spend more: the 4070ti has the same problems as the 3080 and the 4080 is overpriced, so 3080 owners are between a rock and a hard place. The best upgrade in terms of value for a 3080 owner would be the Radeon 7900xt or 7900xtx.


Personally not for me:

- I've had its RT perf. for the past 2 years and, as shown, they are already ******** the bed with the newest and greatest RT games; imagine what they'll be like in 2 years, i.e. a bit like RDNA 2
- Nvidia works better/has fewer issues on my AW GSYNC Ultimate screen (which I won't be changing anytime soon)
- FSR is still too hit and miss
- power efficiency/consumption is a no-no, unless they have fixed it?
Only upgrade paths are 4080 and 4090 :)

The only game I’ve had issues with when using the 3080 was FFVII, as the game had hardly any settings to tweak.
Everything else I was dropping settings in anyway to get a higher framerate, so I never reached the VRAM limit.
For £650 it was by far the best option at the time, not to mention the card actually earned me money through mining to fund the 4090.
As said before, most of the people whining about it are the ones who couldn’t get one at MSRP, so they either bought one and overpaid by a lot, bought an AMD card, or a 3090… :p

Few understand.....

Hogwarts is the first game to have caused serious performance issues for me, but looking at benchmarks for other GPUs it appears to be the case for most; even the best of the best are arguably ******** the bed to some extent.
 
The only game I’ve had issues with when using the 3080 was FFVII, as the game had hardly any settings to tweak.
Everything else I was dropping settings in anyway to get a higher framerate, so I never reached the VRAM limit.
For £650 it was by far the best option at the time, not to mention the card actually earned me money through mining to fund the 4090.
As said before, most of the people whining about it are the ones who couldn’t get one at MSRP, so they either bought one and overpaid by a lot, bought an AMD card, or a 3090… :p

Really odd, as others also sold game keys, shifted old hardware, mined and funded what they could at the time. So your statement there about funding an expensive £1,700 card is trying to justify what you did. If anything it supports that you a) know it was expensive and b) ditched it because of the very topic at hand. Over time it has proved contentious enough that you can either bury your head in the sand or just acknowledge it. As we said many times before, if you're a serial upgrader around the bi-annual gen changes it's not really affecting you much. The people who bought one expecting it to last 4 or 5 years will be experiencing it, particularly if they buy a good display.
 
Really odd, as others also sold game keys, shifted old hardware, mined and funded what they could at the time. So your statement there about funding an expensive £1,700 card is trying to justify what you did. If anything it supports that you a) know it was expensive and b) ditched it because of the very topic at hand. Over time it has proved contentious enough that you can either bury your head in the sand or just acknowledge it. As we said many times before, if you're a serial upgrader around the bi-annual gen changes it's not really affecting you much. The people who bought one expecting it to last 4 or 5 years will be experiencing it, particularly if they buy a good display.
I’m not trying to justify anything, just stating facts.
I justified the purchase of the 4090 when I bought it, so I don’t need to worry about it now, and yes it was expensive, too expensive for what it is; if the 4080 cost less I would probably have gone for that and not worried about 16GB being the limit.
I upgraded to the 4090 not because the 3080 was struggling with VRAM but because performance in new games wasn’t great anymore.
As an example, the new Plague Tale wasn’t getting anywhere near the VRAM limit but was dropping to about 40fps at times, so not great.
I could drop more settings and set DLSS to Performance mode, but that was too much of a visual sacrifice, as I’m sitting close to my 48” screen and DLSS in that game isn’t as good as in some others.
Now I’m playing it maxed out with DLSS Quality and frame generation enabled at a constant ~120fps, and the card is just chilling, not getting anywhere near max usage.
All things considered, the 3080 FE was one of the best computer hardware purchases I’ve ever made.
It will take the top spot in my hardware Hall Of Fame… ;)
 
Not sure what they are doing, but looking at HUB's results in comparison, something is way wrong either in those results or in HUB's. In HUB's results AMD had better 1% lows than Nvidia.

Yup, TBF TPU's results are definitely not right for AMD GPUs.

Jansn also shows 7900xtx ******** the bed somewhat at 1920x1080


And pcgamershardware: [benchmark screenshot]

Probably down to where they are testing.


Essentially the game is a mess with RT turned on :)
 
This game seems to be all over the place. Some sites say it runs great and others the opposite. Could be down to hardware configurations, at a guess.
 