
10GB VRAM enough for the 3080? Discuss..

Even more importantly, it helps hit the $699 price point.

Somewhere, in an alternate version of reality, the Internet is outraged as the 3000-series cards launched at the same high price points as the 2000-series, while packing "unnecessary" amounts of VRAM :p
 
Yeah, I doubt the 10GB of VRAM is going to be an issue now, but I'm not sure about the next year or two.

Of course, if a "higher" 3080 variant does come at some point, then it could merely be a 3080 with more memory or it could be a "Ti" or "Super" with better processor specs too.

Basically I'll be getting a 3080; I just have three options:

1. Buy an FE (I like the look of them) blind on launch day, before seeing reviews
2. Don't buy at launch; wait a week or two for the dust to settle and reviews to come out, then decide which card to buy
3. Wait longer (months) to see what RDNA2 looks like and whether there are any rumours of a better 3080 model coming

If you need a card, just pre-order; this will be an excellent card. A lot of people with 900- and 10-series cards passed on Turing and are eager to upgrade to the 30 series. We also have a global pandemic, and Samsung's wafer yields are usually poorer than TSMC's and will take time to mature. So possible high demand and a shortage of cards may leave you waiting a while after reviews.

If you don't need a card and can get by for a while, i.e. you didn't sell your 1080 Ti/2080 Ti or you have a lower-end spare card, then it's absolutely worth waiting to see what Big Navi brings on its superior node versus Samsung 8nm. If it's competitive, we'll almost certainly get a 3090 with, say, 12GB of VRAM, or a 3080 with 16GB-20GB, given the price gap between the 3080 and 3090, but who knows when. The 1080 Ti dropped in the following year, like many of its predecessors; the 2080 Ti, though, arrived at launch. So timing is anyone's guess.
 

Define "need" :)

I am still leaning towards pre-ordering and giving the 1080 Ti to the missus; it will be a nice upgrade for her from a 1060. I was lining up a move from my 7700K to Zen 3 when it becomes available anyway, so the 3090 price point is out of the question.
 
Define "need" :)

I am still leaning towards pre-ordering and giving the 1080ti to the missus, will be a nice upgrade for her from a 1060. I was lining up a move to Zen 3 when it becomes available from my 7700K anyway, so the 3090 price point is out of the question.
I sold my 1080 Ti before launch and I don't own a spare, so I only have a low-powered Chromebook and my phone. So I "need" a card to use my PC. :)

2080 Ti and 1080 Ti prices are in free fall right now, so it makes sense for you to recycle them.

The 3090 is essentially a cross between a Ti-class GPU and a Titan-level amount of RAM, relabelled as the 3090. It's a beast: if you have a 4K high-refresh monitor and use Blender, Maya, etc., it's perfect. For everyone else, I'd say go with the 3080 for 4K or the 3070 for 1440p. Yes, you may have to turn settings down from Ultra to High in time, but I'd be surprised if most folks would notice.
 
Yeah, I'm on 3440x1440 @ 120fps; I should think the 3080 would be fine for some time. I'll keep an eye on reviews; hopefully they will drop a day earlier.
 
At this point, the stupid, impatient, greedy part of me is thinking sod it, I'll just get the 3090 and be done with it. I'd not be buying anything for at least half a decade after that...

I just don't trust that game devs are going to get on board immediately with this new SSD PCIe streaming tech, and I tend to play older titles like Mass Effect and Skyrim, though of course I have TES VI and the Mass Effect remaster in mind too. For FS 2020, I want 40+ fps on high/ultra everywhere. It may be that I have to get the KY Jelly out and pay up for that 3090.

Over a 10-year period you are better off getting a 3070 16GB, then a 4070, then a 5070, then a 6070, then a 7070 with the same money, instead of buying a 3090 now. Obviously the above includes you selling the old card and adding £200 or whatever it is to upgrade each time; the rough sums are sketched below. This method means you will always be included in the new shiny fun, and you will have a card no less than treble the performance of your 3090 by the time the 10 years are up.
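As a rough back-of-the-envelope on that ladder (a minimal sketch; the £500 xx70 price, £200 per-generation top-up, and £1,400 3090 price are all illustrative assumptions, not real figures):

```python
# Back-of-the-envelope cost comparison: one 3090 now vs an xx70 every generation.
# All prices are illustrative assumptions.
XX70_PRICE = 500    # assumed price of the initial xx70-class card (GBP)
TOP_UP = 200        # assumed cost per upgrade after selling the old card (GBP)
GENERATIONS = 4     # 4070, 5070, 6070, 7070 after the initial 3070
RTX_3090 = 1400     # assumed 3090 price (GBP)

ladder_cost = XX70_PRICE + GENERATIONS * TOP_UP
print(f"xx70 ladder over ~10 years: £{ladder_cost}")  # £1300
print(f"single 3090 now:            £{RTX_3090}")     # £1400
```

On those assumed numbers the ladder costs about the same as the single 3090, but you end the decade on a current-generation card rather than a four-generation-old one.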

The first Titan came out only 7 years ago, and just look how **** it is now. It's been **** for like 4-5 years too. Lol

Oops, it seems I missed the "half" when reading; I read it as a decade. Lol. I only caught it after reading the post again after posting myself, and was thinking wtf, a decade? Lol.

Lol :D
 
Hardware Canucks (no idea how reputable) on YouTube mentioned the NDA for the 3080 FE will lift on the 14th, and for AIBs on the 17th.
 
A 3070 or 3080 will both be fine for some time. If the 3090 was £1,000 it would be a practical waste, as you'd be leaving performance on the table and paying more, but you'd have bragging rights I suppose. I'd practise your F5-pressing skills and pre-order from someone with a good returns policy; if reviews aren't solid, you can return it.
 
The issue here is that we all assume we know what we are talking about, as if we had been part of the development process with Nvidia and partaken in the discussions between Nvidia, Microsoft and game devs. I assure you 99.999999999999999999% of us have not been privy to these discussions and do not know a Scooby Doo about how this will work in next-gen games.

We are also probably thinking about this in terms of the old generation of games. Next-gen games will be pulling in more data from NVMe on the fly, so perhaps this is why 10GB is OK for a high-end gaming card for the next generation; a toy sketch of the idea follows.
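To illustrate what "pulling in more data from NVMe" could mean (this is a toy sketch, not Nvidia's actual RTX IO/DirectStorage API, which none of us have seen; all names are hypothetical):

```python
from collections import OrderedDict

class TextureStreamer:
    """Toy LRU texture cache: keep recently used textures resident in VRAM
    and stream the rest from NVMe on demand. Hypothetical sketch only."""

    def __init__(self, budget_mb=10_240):  # assume a 10GB card
        self.budget_mb = budget_mb
        self.resident = OrderedDict()      # texture_id -> size in MB
        self.used_mb = 0

    def request(self, texture_id, size_mb):
        if texture_id in self.resident:
            self.resident.move_to_end(texture_id)  # mark as recently used
            return
        # Evict least-recently-used textures until the new one fits.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, freed = self.resident.popitem(last=False)
            self.used_mb -= freed
        # A real engine would kick off an async NVMe -> VRAM copy here.
        self.resident[texture_id] = size_mb
        self.used_mb += size_mb

streamer = TextureStreamer()
streamer.request("rock_albedo_4k", 85)  # streamed in from NVMe
streamer.request("rock_albedo_4k", 85)  # already resident: no I/O needed
```

The point of the sketch is that with fast enough storage, VRAM only has to hold what is visible soon, not the whole level.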

Remember this: if games start coming out needing more than 10GB of VRAM, and the new AMD cards have 12-20GB on a card, then sales will start going towards AMD, and Nvidia would not be putting themselves in that position.

So I am fairly sure I trust them that 10GB will be enough.

I also think 10GB will be enough. It's like the PS5: they put so much time and money into that SSD setup, and into getting the most out of how games utilise it.
 
I suspect too much is being made of Doom Eternal at Nightmare quality. I've been looking at screenshots comparing Nightmare to Ultra, and while there are differences, they're not massive: primarily, Nightmare seems a bit sharper and the shadows a bit darker. I would love to see native 4K screenshots of Nightmare vs Ultra with some sharpening applied, maybe even... gasp... DLSS. I really get the feeling this is one of those fringe situations that doesn't represent the current state of gaming or anything in the near future. In other words, is D:E's Nightmare preset just using VRAM for the sake of it? Is this similar to situations we've seen before, such as running uncompressed textures when it's almost completely pointless, with consumers latching on to it a little too much?
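For what it's worth, the eyeballing could be replaced with a quick per-pixel diff of two same-frame captures; a minimal sketch using Pillow and NumPy, with hypothetical filenames:

```python
# pip install Pillow numpy
from PIL import Image
import numpy as np

# Hypothetical same-frame 4K captures at the two presets.
nightmare = np.asarray(Image.open("de_nightmare.png").convert("RGB"), dtype=np.int16)
ultra     = np.asarray(Image.open("de_ultra.png").convert("RGB"), dtype=np.int16)

diff = np.abs(nightmare - ultra)
print(f"mean per-channel difference: {diff.mean():.2f} / 255")
print(f"pixels differing at all:     {(diff.sum(axis=-1) > 0).mean():.1%}")
```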
 
Ah, I thought it was all on the 17th. That's fantastic news, especially as I rather like the Founders. Thanks.

I like the Founders too, although the design doesn't particularly lend itself to vertical mounting. I'm eyeing the Zotac but not confident my F5 skills will land me one.
 
Those are my thoughts exactly on the Doom Eternal comparison. If in the future I have to go from Nightmare to one setting lower, that is not a problem at all for me; better that than paying £100 or more for extra RAM on a card I will replace next generation anyway. It will be interesting to see how it goes.
 
Q. Is 10GB of VRAM enough?

A. Yes *(but). At the moment 10GB is perfectly fine for 1080p and 1440p, and 'just' about passable for 2K UW and 4K.

But in a year or two from now, when you are eyeing up that new 4K or 5K UW monitor upgrade, you may wish you had more.
And that is the whole point really: future-proofing, making the best upgrade you possibly can, here and now, at a reasonable price point.

Just look at how long some of us chaps have held on to our 1080 Tis... that is an ownership experience we would all ultimately like to repeat.
And the truth is, 16GB or 20GB is a better capacity than 8GB or 10GB going forward.

Anyway, that's my 2 cents on the subject. :)
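To put some rough numbers on the resolution side of that argument, here is the raw arithmetic for render-target memory at a few resolutions (a minimal sketch; the 16 bytes per pixel figure is an illustrative assumption, since real engines use many buffers in varying formats):

```python
# Rough render-target memory at various resolutions (illustrative only).
BYTES_PER_PIXEL = 16  # assumed: e.g. a couple of RGBA16F targets
resolutions = {
    "1080p":   (1920, 1080),
    "1440p":   (2560, 1440),
    "4K":      (3840, 2160),
    "5K2K UW": (5120, 2160),
}
for name, (w, h) in resolutions.items():
    mb = w * h * BYTES_PER_PIXEL / 1024**2
    print(f"{name:>8}: {mb:6.1f} MB per set of targets")
```

Render targets roughly double from 1440p to 4K, but even at 4K they are a small slice of 10GB; textures dominate, which is why the texture-pool settings discussed above matter most.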
 
If you don't plan on upgrading again until Hopper, then yes, defo get a minimum of 16GB to future-proof. If you do, I can't see it being a huge issue. People make it sound like the Nightmare texture setting is the only option; there are many others, and there is not a night-and-day difference from the one below it. Most people need a screenshot to spot the difference.
 
Hopefully it's OK to share this. I'd be interested in whether anyone on here can support or rebut it:

https://www.resetera.com/threads/vram-in-2020-2024-why-10gb-is-enough.280976/

I'm interested in the 3080 to play at 4K, but would like to be able to keep it for at least 3-4 years without having to turn settings down too much.
As long as you are happy to drop the texture setting down one, maybe two, notches for the most demanding games, you will be fine. There is also DLSS, where you can render at 1440p or lower if needed. That said, at 3-4 years I would be looking at 16GB if you do not like the idea of lowering textures; I would not be as comfortable as I am now if I knew I would be keeping it much more than 2 years, for example. But each to their own.
 

They are saying what other people in this thread have been saying, and my response was as follows:

snip

If we were talking about the 3060, or maybe even the 3070, with 10GB of VRAM, I would agree that it is enough (for now). But we are talking about the flagship card for PC gaming (assuming AMD drops the ball).
You're paying a premium so that you can have the same graphical fidelity as a console, admittedly with higher frame rates. (I think we all agree that an increase in graphical fidelity compared to the console will require more VRAM, right? Does anyone disagree?) Like I said on the first page, for those who buy a new graphics card every generation this isn't going to be a problem. For anyone planning on skipping a generation, good luck with that.
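To put a rough number on that, higher-fidelity texture sets get big quickly. A minimal sketch of compressed texture sizes, assuming BC7 (a common GPU format at 1 byte per texel) and a full mip chain adding about a third:

```python
# Approximate VRAM per texture: BC7 stores 1 byte per texel, and a full
# mip chain adds roughly 1/3 on top of the base level.
def texture_mb(side_px):
    base_bytes = side_px * side_px        # 1 byte per texel (BC7)
    return base_bytes * 4 / 3 / 1024**2   # include mips, convert to MB

for side in (2048, 4096, 8192):
    print(f"{side}x{side}: ~{texture_mb(side):.0f} MB each")
# A few hundred resident 4K textures alone can approach a 10GB budget.
```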

To those who are saying that 10GB is enough, and RTX IO will make up the difference, etc...

You will all be singing a different tune when Nvidia launches their next generation of cards with ~16GB of VRAM on the top end :p.
And none of you will be complaining about having to pay more for an extra 6GB of VRAM you "don't" need ;).
 