"And all of that will happen before you get your 3080"

lul
"Honestly, unless your card is ****, I don't know why anyone would buy a 3080; you're just going to get mugged off when the 16GB or 20GB 3080s come out."

10GB of VRAM is fine if you play at 1440p; 20GB would just be overkill and a waste, especially if it ends up costing you more cash.
"Honestly, unless your card is ****, I don't know why anyone would buy a 3080; you're just going to get mugged off when the 16GB or 20GB 3080s come out."

Get a 3090 then; we will see who gets mugged off when the 4080 comes out and the value of the cards plummets. Lol.
"Honestly, unless your card is ****, I don't know why anyone would buy a 3080; you're just going to get mugged off when the 16GB or 20GB 3080s come out."
If you want 16GB or 20GB you will have to pay serious money for it, and if you do, you may as well go all the way and get the 3090.
Highly relevant article about Nvidia cards getting more RAM, not because they need it, but because AMD is reportedly using 12GB and 16GB and Nvidia doesn't want to look second best.
Is 10GB enough? Not if you want to posture with marketing, it isn't!
https://www.techradar.com/uk/news/a...uld-both-have-more-vram-than-nvidias-rtx-3080
I'm surprised more people haven't mentioned this. VRAM isn't the only thing that can stop a new game from reaching xx settings and xx frame rates. MSFS 2020 is here, now, and it can't run at 60fps on anything, with any amount of VRAM.
Do people really think the 3080 would be able to max every new game over the next two years... if only it had more VRAM?
Nah, honestly, that 10GB is plenty. You'll run out of grunt before you run out of VRAM.
Getting a 3090 in particular is like inviting burglars into your house and then waving them off and wishing them well as they make off with your precious money.
DLSS will make sure you have plenty of grunt.
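To put a rough number on how much grunt DLSS buys back, here's a back-of-envelope sketch. The per-axis scale factors are the commonly cited ones for DLSS 2.x modes and should be treated as assumptions, not anything official from this thread:

```python
# Approximate per-axis render-scale factors for DLSS 2.x modes (assumed).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(width, height, mode):
    """Internal resolution the GPU actually renders before DLSS upscales."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

target_w, target_h = 3840, 2160  # 4K output
for mode in DLSS_SCALE:
    w, h = internal_resolution(target_w, target_h, mode)
    saved = 1 - (w * h) / (target_w * target_h)
    print(f"{mode}: renders {w}x{h} internally, ~{saved:.0%} fewer pixels shaded")
```

Roughly half to three quarters of the shading work disappears before the upscale, which is where the "plenty of grunt" comes from, provided the game actually implements DLSS.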
10GB is clearly going to be skating on the edge for 4K gaming over the next few years. I get why they’ve gone with that to keep the costs down for the 3080, but...
I can’t help thinking that the logical config lineup should have simply been:
3070 8GB (aimed at 1080p and 1440p gaming, a spot of light 4K gaming, ie most people)
3080 16GB (aimed at 4K and more enthusiast / super high FPS gamers)
3090 24GB (aimed at workstation type people, and small appendiged people who just need to have the ‘best’)
The anomaly in that above is the 3080.
Also, good call above on the Fury X, was trying to remember which one it was. Wasn't it the 4GB of HBM memory that was supposed to have magic properties? All the other cards were releasing with 6GB+ and it all turned out to be ********* did it not?
You can't have a 16GB 3080, though. It can only be 10GB or 20GB because of the 320-bit bus.
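For anyone wondering where that constraint comes from: each GDDR6X chip exposes a 32-bit interface, so a 320-bit bus means exactly ten chips, and the chips come in 1GB or 2GB densities. A quick sketch of the arithmetic (the helper is mine, just for illustration):

```python
# Bus-width arithmetic behind "10GB or 20GB only" on a 320-bit bus.
BITS_PER_CHIP = 32        # each GDDR6X chip has a 32-bit interface
DENSITIES_GB = (1, 2)     # per-chip capacities: 8Gb (1GB) and 16Gb (2GB)

def capacity_options(bus_width_bits):
    """VRAM sizes reachable with one chip per 32-bit channel."""
    chips = bus_width_bits // BITS_PER_CHIP
    return [chips * d for d in DENSITIES_GB]

print(capacity_options(320))  # 3080: [10, 20] -> no 16GB config possible
print(capacity_options(256))  # 3070: [8, 16]
print(capacity_options(384))  # [12, 24]; the 3090 gets 24GB via 24 x 1GB
                              # chips in clamshell mode, two per channel
```

A 16GB card needs a 256-bit or 512-bit bus, which would be a different GPU configuration entirely.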
Thing is, DLSS has to be explicitly implemented, and I'm sure I run plenty of things where it's not (at least not yet) been implemented. MSFS and X-Plane, for example.
I'm with you on the Fury X nonsense regarding VRAM, never believed it for a second. However, I do think 10GB is enough to drive 4K just fine (some benchmarks show that having more VRAM barely boosts performance). Maybe it won't be enough five years from now, but by then I would have upgraded anyway, and if I'd gone for an option at least twice as expensive, that upgrade might not even be feasible as quickly.
Yeah I would've liked a little bit more (12GB would've been good) but I'd rather turn down a setting or two than pay twice the price.
As soon as the 16 GB & 20 GB cards hit, all the people defending the 10 GB models will go silent because now they don't have to rationalise to themselves that 10 GB is enough because they actually have the option to get more vram without stretching all the way to a 3090. It's gonna be all "oh having extra vram is so nice, peace of mind" etc. Happens every time the consoles change, exactly the same.
This is why I stopped arguing with people about it, it's so stupid & pointless. The new cards will automagically change their minds without any of the facts themselves actually changing.
"Thanks for the explanation. All of this is sound reasoning. Unfortunately, it won't take for some, IMO, until Nvidia releases those 16/20GB cards."

I tried pointing this out in the Ampere 8nm thread for about 30-40 pages of back and forth prior to launch, but it didn't really go anywhere. There's a lot of old-school thinking that the vRAM is somehow just a big cache for all of the assets in the game, and they sit in there in case they're needed. That was true 15+ years ago, but engines have moved on since then to dynamic, just-in-time streaming of assets, especially in open-world games. So vRAM is much less about what the game needs as a whole: GTA V, my go-to example, is a 90GB install for me, but I have 8GB of vRAM, and it streams in all those assets just fine as you zip about the island in a heli or whatever.
vRAM today is far less a dumb cache and far more a buffer for just what is needed to render the next frame. From an architecture point of view this makes sense: the powerhouse in a video card is the GPU; it is literally what does the work to calculate the next output frame from all the inputs. The vRAM's only job is to feed the GPU the assets it needs for those calculations, that's it. Using it as a dumb cache is expensive, because vRAM modules are expensive.
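As a toy illustration of that "working set, not dumb cache" idea (all names and numbers here are made up, not taken from any real engine): a streamer keeps only what recent frames touched resident, and evicts the least-recently-used assets when the budget is hit, which is how a 90GB install runs happily in 8GB of vRAM.

```python
from collections import OrderedDict

class AssetStreamer:
    """Toy just-in-time asset streamer: vRAM holds the working set for
    recent frames, not the whole install. Least-recently-used assets
    are evicted when the budget is exceeded."""

    def __init__(self, vram_budget_mb):
        self.budget = vram_budget_mb
        self.resident = OrderedDict()  # asset name -> size in MB
        self.used = 0

    def request(self, name, size_mb):
        """Called for every asset the next frame needs."""
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return
        while self.used + size_mb > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)  # evict LRU
            self.used -= evicted_size
        self.resident[name] = size_mb  # "stream in" from disk/system RAM
        self.used += size_mb

streamer = AssetStreamer(vram_budget_mb=8192)  # an 8GB card
for frame in range(1000):                      # fly across 100 x 512MB tiles
    streamer.request(f"tile_{frame // 10}", size_mb=512)
print(f"resident: {streamer.used} MB of {streamer.budget} MB budget")
```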
So there's this intrinsic connection between vRAM size (not just bandwidth) and GPU speed. You can think about it like this: every asset you're putting into vRAM is just more stuff the GPU will be doing work on, and as you demand more work from your GPU the frame rate goes down. In modern game implementations, if you're putting a crap-ton of stuff into vRAM then you're also putting more load on the GPU. And all attempts so far to find a game you can load up with 10GB+ of vRAM and go "AH HAH! 10GB is not enough!!" on the 3080 have resulted in unplayable frame rates. And lo and behold, once you drop all those settings to make it playable, your vRAM usage is back below 10GB.
It's also why we get all these erroneous arguments about future-proofing, about the games in 2+ years that will use more vRAM. Sure they will. But a 3080 won't be playable maxed out by then; you'll have long since run into that GPU bottleneck. And that's why you shouldn't think about vRAM in terms of future-proofing, or game performance, or what you feel you deserve, or what looks nice, or what is the biggest number, but rather "How much can this particular GPU realistically use and still maintain playable frame rates?", which on some rough aggregate is pretty much game-agnostic.
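One way to put a rough ceiling on "how much can this GPU realistically use per frame" is bandwidth: even if the 3080 did nothing but read its vRAM flat out at its ~760GB/s, the data it can touch in one frame at a playable frame rate is bounded, and real frames read each byte more than once. This is my own back-of-envelope model, and the reuse factor is an outright assumption:

```python
# Upper bound on distinct vRAM a GPU can make use of per frame:
# bytes touched per frame <= bandwidth / fps, and each byte is
# typically read several times per frame (reuse factor assumed).
BANDWIDTH_GBPS = 760   # RTX 3080 memory bandwidth in GB/s
REUSE_FACTOR = 3       # assumed average reads per byte per frame

for fps in (30, 60, 120):
    touched_gb = BANDWIDTH_GBPS / fps           # max GB readable per frame
    working_set_gb = touched_gb / REUSE_FACTOR  # distinct data, roughly
    print(f"{fps} fps: <= {touched_gb:.1f} GB touched, "
          f"~{working_set_gb:.1f} GB distinct working set")
```

At 60fps that's under 13GB touched even in the theoretical best case, and a far smaller distinct working set once you account for reuse, which is the rough, game-agnostic aggregate the post is talking about.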