10GB VRAM enough for the 3080? Discuss...

Yes, but remember that what we call next gen is still using old engines. We don't know how much VRAM a good UE5 game will use.
It is true that 10GB looks like a sweet spot for 4K gaming right now, and the GPU won't have the power to run future games at 4K ultra anyway, so more VRAM may be useless on this generation of GPUs. But I think we will see more VRAM needed for UE5 games.
 
I am sure the 4080 will have at least 11GB. That should be enough for another year. :)
But if 10GB is enough for 4K gaming on ultra, why bring out the 3080Ti with 20GB of VRAM? Isn't that wasted resources and a bigger price?

It would be to offer more SKUs. Clearly there are many who do not like the idea of going 10GB and also do not like the idea of paying £1400. Shave £400 off that and provide something between the 3080 and 3090 and, well, it just works :D

It will also be another weapon to combat the 6900XT, giving them something at the same $999 price point.

It has a lot more to do with that than what you are saying anyway, imo.

The 4080 will have no less than 16GB. Put it this way: we can do a poll when the 4080 (Hopper) comes out and ask, if you were given a GPU for free, would you prefer a 4080 or a 3090? I bet the majority will want the 4080, as it will have much better performance and also be the latest shiny. It is obvious anyway, imo; once Nvidia make the jump to 5nm there will be a big leap in performance, not to mention that if Hopper is a chiplet design it will likely be a lot better.

@Th0nt can bookmark this post if he wants and we can visit it again in approximately 2 years' time :D
 

How much memory these cards are fitted with isn't something you can choose arbitrarily; it's constrained by several factors. The architecture of the video card dictates a certain memory bus width, which means sticking to certain multiples of memory chips, which in turn are only available in certain sizes and speeds. You don't want to create more memory bandwidth than you need (VRAM speed × bus width), as that would be a waste, and you don't want to put more memory on the card than you need, as that would also be a waste. It's a careful balance between all these factors.

I'm not 100% sure on this, but it looks to me like, with a bus width of 320 bit on the 3080, they were stuck with multiples of 10GB, so 10GB is enough and 20GB would have been way too much. With the 3090 you have a bus width of 384 bit, and so likely multiples of 12GB, so either 12GB or 24GB, and 12GB wouldn't have been enough for a prosumer card, which is clearly what they were targeting with that memory config.

If I had to bet right now, given that all of what we know are rumours, I'd guess that a 3080Ti would switch to the 384-bit bus of the 3090 with a GPU almost as fast as the 3090 (maybe a few fewer CUDA cores), and stick with a 12GB memory config. That keeps the costs down, as that extra 10-12GB of GDDR6X would be very expensive, and they're going to need to get close to the $999 of the 6900XT. 12GB is the perfect amount of memory for a card like this targeted at gaming only.
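To put rough numbers on that reasoning, here's a back-of-the-envelope sketch (assuming the launch figures of 19 Gbps GDDR6X on the 3080 and 19.5 Gbps on the 3090, and the 1GB/2GB-per-channel chip densities on offer at the time):

```python
# Rough VRAM arithmetic: each GDDR6X chip hangs off a 32-bit slice of the
# bus, so bus width fixes the channel count, and chip density (1GB or 2GB
# per channel) fixes the capacity options.
def memory_options(bus_width_bits, speed_gbps, densities_gb=(1, 2)):
    channels = bus_width_bits // 32                  # one 32-bit slice per chip
    bandwidth_gbs = bus_width_bits * speed_gbps / 8  # Gbit/s -> GB/s
    capacities = [channels * d for d in densities_gb]
    return channels, capacities, bandwidth_gbs

for name, bus, speed in [("3080", 320, 19.0), ("3090", 384, 19.5)]:
    channels, caps, bw = memory_options(bus, speed)
    print(f"{name}: {channels} channels -> {caps} GB options, {bw:.0f} GB/s")

# 3080: 10 channels -> [10, 20] GB options, 760 GB/s
# 3090: 12 channels -> [12, 24] GB options, 936 GB/s
```

Which is exactly why the choice was 10GB or 20GB on a 320-bit bus, and 12GB or 24GB on a 384-bit one.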
 
The truth is nobody knows if 10GB is enough. I imagine it will be for at least the next 2 years, but 10GB is right on the edge.

I assume Nvidia could have added 2GB (a 20% boost) and still comfortably come in under £700, but that's admittedly a wild guess based on older VRAM cost per gig, so who knows.
 
I was playing Zombies on Cold War last night, on ultra settings at 1440p/144Hz. I noticed when I pulled up the overlay that it was using about 11GB of VRAM, which is pretty surprising. I wouldn't be comfortable with just 10GB of VRAM going forwards.
 
It will be useless, because in 2 years' time you won't be able to visit the forum. You will need more VRAM for that.
Haha. Some would have us believe that yes :p

The adamant naysayers have all gone back under their rocks, for now anyway :D


My... my straw man Derek is going to welcome the sword thrusting!!! :D
:)


I was playing Zombies on Cold War last night, on ultra settings at 1440p/144Hz. I noticed when I pulled up the overlay that it was using about 11GB of VRAM, which is pretty surprising. I wouldn't be comfortable with just 10GB of VRAM going forwards.
That's not how it works. That's allocated VRAM, not what is needed before the fps tanks. Hell, even I had nearly the whole 12GB used in Final Fantasy XV a couple of years ago on my Titan Xp. Did that mean a 1080Ti could not run the same settings? Nope.
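For anyone who wants to check this themselves: overlays display the memory the driver has allocated, and you can read the same counter directly. A minimal sketch using the pynvml bindings (assumes the nvidia-ml-py/pynvml package is installed and you want the first GPU; note it reports allocation, not what a game actually needs each frame):

```python
# Reads the driver's view of GPU memory: what is ALLOCATED, which is the
# number overlays show. Engines routinely reserve more than they touch
# per frame, so treat this as an upper bound, not a requirement.
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    mem = nvmlDeviceGetMemoryInfo(handle)
    print(f"used {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
finally:
    nvmlShutdown()
```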
 
Going from 11GB to 10GB for their high-end graphics card was a foolish move by Nvidia, regardless of whether the latest gen of games needs it. Especially with AMD coming up strong from the rear.

No one knows what future games will need, especially with such advances in resolution, textures, lighting, drive space, ray tracing, direct VRAM access etc. Anyone claiming they do is just speculating/lying/fanboying.
 

I would hope that Nvidia themselves would be working closely with developers though.
 
The 3080 with 10GB was not foolish; it allowed them to hit a decent price point, which is why it is so damn popular.

What was maybe foolish is not also releasing a 3080 with 20GB, or a 3080Ti with 12GB (a slightly cut-down 3090), at launch. Thing is, we will no doubt be getting one of those soon anyway.

I agree no one knows what is coming with regards to future games; you can only make educated guesses and a choice that works for you personally. That is what I did.


I would hope that Nvidia themselves would be working closely with developers though.
They likely are on triple-A games. At least on all the ones I am interested in playing anyway. I am sure Cyberpunk, Dying Light 2 and Bloodlines 2 will all work fine with 10GB on maximum textures. Those are the 3 games I am interested in by far, and all seem to be working closely with Nvidia from what I can see.

But let's not kid ourselves. As soon as their next-gen stuff is out with Hopper, they won't give a rat's ass about Ampere.
 
In terms of short term monetary gain, sure, it was a smart move.

But it was foolish for mindshare, with AMD finally stepping up to the plate. As someone who hasn't bought an AMD/ATi card since the 9700 Pro nearly two decades ago, I am questioning my brand loyalty now. And with the stock shortages, it's allowing me to have a long, hard think.

I dropped over a grand on an Nvidia card two years ago; I might not be doing that again for quite some time.
 
I am the opposite. I have no brand loyalty, and I am very happy they hit that price point. Had they stuck 20GB on it and wanted more money, I would not have a 3080 now, and it may have pushed me to sell my G-Sync monitor for a VRR one and go 6800XT, even though I really want to play Cyberpunk with all the bells and whistles.

I would rather go for 10GB now and upgrade again next gen than pay a lot more and be stuck with a card for multiple gens. I mean, we all enjoy this hobby, and you have to admit that when a new gen of cards comes along it is fun most of the time. Why not join in the fun each gen if you are able to :)

Plus as I said, there will be an option to cater for people who cannot afford/justify £1400 soon anyway.


You must be crazy! ;) :p
Lol ;)
 
I assume Nvidia could have added 2GB (a 20% boost) and still comfortably come in under £700, but that's admittedly a wild guess based on older VRAM cost per gig, so who knows.

Except that, as pointed out several times, you can't add an arbitrary amount of memory to a card. Every time you get a card with different memory options, they're always a multiple of each other (1060 with 3GB/6GB, RX580 with 4GB/8GB, etc.). Nvidia were seemingly left with a choice of 10GB or 20GB. 20GB is obviously better, but would have come in at a much higher price point, and they probably had some idea what was coming down the track from AMD and realised that wouldn't be competitive.
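The same channel arithmetic from earlier in the thread reproduces those pairs; a quick sketch (assuming the 0.5GB and 1GB GDDR5 chip densities that were common for those cards):

```python
# Capacities land on multiples of the channel count (bus width / 32),
# which is why each card's two memory options are exact doubles.
for name, bus_bits in [("1060", 192), ("RX580", 256)]:
    channels = bus_bits // 32
    print(f"{name}: {channels} channels -> {channels * 0.5:g} GB or {channels} GB")

# 1060: 6 channels -> 3 GB or 6 GB
# RX580: 8 channels -> 4 GB or 8 GB
```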
 
You must be crazy! ;) :p

Yeah, a little, but none of us know how much time we've got left, and 4K gaming was just that good. Still is.

But that was two years ago; at no point did I think I was getting good value, and a lot has changed since then. Chiefly, the amazing value of the next-gen consoles. The gaming landscape has changed; hopefully Nvidia don't see themselves as too big to change with it.
 