
10GB VRAM enough for the 3080? Discuss...

Yeah, remember when it started off with "but, but, 4K max settings"? Now 1440p is saying hello. Not an issue though.

I'm guessing you aren't sacrificing settings on your 3090 then?

 
the Game Ready stuff is usually bollocks anyhow tbf.

Could be worse, could be an RDNA 2 owner who hasn't received any support/drivers since the RDNA 3 release, planned obsolescence..... or at least that's what some would be saying if this were Nvidia :p :D :cry:

But agreed, generally they don't make a great deal of difference in terms of performance improvements.
 
and those that said the 3080 "was fine" were ignoring the simple criterion we stated: playing at 4K high settings it will end up choking (some of these deniers bought cards with more VRAM). Someone earlier listed the bullet points, and again, you should brush up on them.
Of course playing at 4K would at some point end up choking. That would absolutely be the case regardless of the actual VRAM of the 3080. Actually, every single card in existence chokes in Hogwarts at 4K.

My 3090 was choking on day one; I tried to play Cyberpunk at 4K native + RT and ended up at 15-20 fps, lol. Guess it was okay to drop settings on a just-released €2k GPU, but it's horrible to drop settings on a 2.5-year-old 700 euro card.

I would consider the whole VRAM thing an actual issue when you have to drop textures to a level where they look like crap. Has that happened in a single game yet? Probably not.

Looking at those results, and realising that the game plays perfectly fine with textures at high instead of ultra, makes me realise how stupid I was for buying a 3090 instead of a 3080. The 3090 gives you marginally better-looking textures and 10% more performance for an extra k. Much WOW
 
I did. Proves my point. At 4K, DLSS Quality (which is basically his 1440p results) is faster than a 6950 XT, which was much more expensive and has 16GB. The 6800 XT, which was its actual competitor, gets blasted. None of these can do 4K natively anyway, whether they have 1GB or 1TB of VRAM, because they run out of grunt. Unless you are going to play the game at 21 fps with 14 fps minimums at 4K native with your 6800 XT, I don't see wtf your point is. I'd much, much rather have the 3080 than the 6800 XT for Hogwarts. Not even a contest.
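(For reference, DLSS Quality renders internally at two thirds of the output resolution per axis, which is why 4K DLSS Quality numbers roughly track native 1440p results. A quick Python sketch of the arithmetic, using the standard published DLSS scale factors:)

    # Per-axis render scale for the standard DLSS 2.x modes.
    DLSS_SCALE = {
        "Quality": 2 / 3,            # ~0.667x
        "Balanced": 0.58,
        "Performance": 0.5,
        "Ultra Performance": 1 / 3,
    }

    def internal_resolution(out_w, out_h, mode):
        s = DLSS_SCALE[mode]
        return round(out_w * s), round(out_h * s)

    # 4K output with DLSS Quality renders at roughly 2560x1440,
    # i.e. a native 1440p workload on the GPU.
    print(internal_resolution(3840, 2160, "Quality"))  # (2560, 1440)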

When I bought the 3080 10GB I did so knowing the VRAM was going to be more of an issue before GPU grunt was.

Let me give you two scenarios: one is due to lack of GPU grunt and the other is due to lack of VRAM. Let's assume the rest of the PC consists of a good CPU, RAM etc.
  1. You are getting 35 FPS average with minimums of 30 and highs of 40.
  2. You are getting 35 FPS average with minimums of 5 and highs of 80.
In order to fix both scenarios you need to lower in-game settings, but scenario 1 means your GPU is running out of grunt. Scenario 2 means your GPU clearly has enough power to reach much better FPS but is held back by VRAM limits. Scenario 2 is down to poor GPU design balance; it's like designing a sports car with a great engine and handling, and putting an old 4-speed gearbox from a 1980s Ford Fiesta in it. You have made a design choice that cripples performance in a way that could easily have been avoided.
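(To make the two cases concrete, here's a minimal Python sketch that classifies a run from its FPS stats; the thresholds are illustrative only, not taken from any benchmarking tool:)

    def classify_bottleneck(avg_fps, min_fps, max_fps):
        """Rough heuristic: steady-but-low FPS points at raw GPU grunt,
        a wide spread with deep minimums points at VRAM spillover.
        Thresholds are illustrative only."""
        spread = (max_fps - min_fps) / avg_fps
        if min_fps < 0.5 * avg_fps and spread > 1.0:
            return "VRAM-limited: plenty of grunt, but assets spill into system RAM"
        return "grunt-limited: the GPU is simply out of raw power"

    print(classify_bottleneck(35, 30, 40))  # scenario 1 -> grunt-limited
    print(classify_bottleneck(35, 5, 80))   # scenario 2 -> VRAM-limited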

You stated that at 1440p (4K with DLSS Quality) the 3080 was killing the 6950 XT. You are wrong, and clearly looked at the 3080 Ti 12GB numbers, because the 3080 is getting killed at 1440p with RT on (at least according to HUB) and is barely any faster on some other sites.
 
Clue...

A card that was designed for 4K can't do 1080p now because of the VRAM amount... The rest of what you state is obvious and not the topic... We all know settings can be dropped etc., but when a 3060 12GB beats a 3080 10GB at the same settings, there is something very wrong.
Yes, the problem is the VRAM. I don't get this discussion; is it to get us all riled up at tech companies, or to prove posters in a 2-year-old thread wrong? The 3080 destroyed the 3060 in basically every other game. If I had a £700 3080 and couldn't play Hogwarts after owning it for 2 years, I'd say oh well, refund, and go play the other 99% of games out there.
 
When I bought the 3080 10GB I did so knowing the VRAM was going to be more of an issue before GPU grunt was. [...] Scenario 2 means your GPU clearly has enough power to reach much better FPS but is held back by VRAM limits.
Scenario 2 means you lower textures from ultra uber high to just high and get 35 fps average with minimums of 30. Big whoop.

If I had known 2.5 years ago that spending 2.5 times the money of a 3080 on my 3090 would give me marginally better-looking textures, I would have bought a 3080 instead :D
Because basically that's their difference, even in Hogwarts. They can run every other setting at the exact same option, but you need to drop textures 1 or 2 clicks on the 3080. That's DEFINITELY not worth 1200€ extra to me. I bought into the "10GB not enough" hype, but now I know better
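(For a sense of what one texture notch is worth: texture memory grows with the square of resolution. A back-of-the-envelope Python sketch, assuming BC7-style block compression at roughly 1 byte per texel and a ~1/3 mipmap overhead; the 500-texture scene is a made-up illustration, not data from Hogwarts:)

    def texture_vram_mb(size_px, bytes_per_texel=1.0, mip_overhead=4/3):
        # One square texture plus its mip chain; BC7 is ~1 byte/texel.
        return size_px * size_px * bytes_per_texel * mip_overhead / 2**20

    # Hypothetical scene streaming 500 unique textures:
    for label, size in [("ultra (4K textures)", 4096), ("high (2K textures)", 2048)]:
        print(f"{label}: ~{500 * texture_vram_mb(size) / 1024:.1f} GB")
    # ultra: ~10.4 GB, high: ~2.6 GB -- dropping one notch can free gigabytes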
 
Yes, the problem is the VRAM. [...] The 3080 destroyed the 3060 in basically every other game.
The 3080 destroys the 3060 in Hogwarts as well; whoever says or thinks otherwise is delusional.
 
Especially considering a 3090 24GB couldn't maintain that on day one (Cyberpunk, AC Odyssey etc.). Weird times we live in, friend.
Could be worse, could have spent £1300 or £1600+ on the latest and greatest and have to drop the same settings as on a 2+ year old £650 GPU, including at 1440p and 1080p depending on the GPU..... :D :cry:
 
You might be laughing, but it's really not funny. If I hadn't had the spare income to upgrade to a 4090, buying that 3090 instead of a 3080 would have been in the top 5 stupidest things I've ever done in my life. And don't get me wrong, I like my 3090: I've kept it, I've benched the crap out of it (it was a particularly good overclocker), and I don't plan on selling it. Still, it was a stupendously stupid buy over a 3080.
I'll swap you my 3080 since it doesn't matter?
 