
10GB VRAM enough for the 3080? Discuss..

Status
Not open for further replies.
Now I kind of don't want to buy the 3080 10GB. Buying by March; gonna either get a 3070 Ti 12GB/16GB (won't get it if it's 10GB) or a 6800 XT now...

I never cared that much about ray tracing anyway... I just wanted a DLSS competitor, for which I am going to wait and see if they can get something out before March...
 

I suspect AMD will try to do what Nvidia is doing with DLSS 3.0, getting it to work with any game that uses TAA.
 
I see... If it (AMD's DLSS competitor) works at 80% of what DLSS 2.0 is, that's good enough for me... DLSS was what was holding me off AMD, but I have been thinking about it. I want to keep my card for 4-5 years, and I think the 16GB of VRAM might help.
I am sure 10GB is enough right now, but I would like to not reduce textures in the future if possible. I'll just do medium settings + ultra textures :p
I decided I care more about textures than ray tracing
 
Except that, as pointed out several times, you can't add an arbitrary amount of memory to a card. Whenever a card comes with different memory options, they're always a multiple of each other (1060 with 3GB/6GB, RX 580 with 4GB/8GB etc.). Nvidia were seemingly left with a choice of 10GB or 20GB. 20GB is obviously better but would have come in at a much higher price point. They probably had some idea what was coming down the track from AMD and realised that price wouldn't be competitive.
Except you can, at the design stage, but Nvidia chose to design for 10GB.

I suspect GA102 wasn't supposed to make it into the 3080 and, as you alluded, they got wind that AMD was going to be competitive and went with a cut-down version with half the VRAM. Then again, this may have been their plan all along.

My only argument is that we really don't know if 10GB is enough, but it is cutting it close. I feel they could have added more at the design stage without adding too much to the price, but then again I don't know what they paid, so I could be wrong.
 

Ray tracing isn't just reflections; it can also be used to get accurate indoor/outdoor dynamic lighting. War Thunder uses it for global illumination.

Also, if you use DLSS at 4K you'll effectively be rendering at 1440p (which DLSS then upscales to 4K).

I think within 4-5 years the bottleneck will be the GPU's raw power rather than the VRAM.
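
The DLSS arithmetic above can be sketched in a few lines of Python. This is a rough illustration, not an official formula: the function name is mine, and the preset scale factors are the commonly cited DLSS 2.0 ones, so treat them as assumptions.

```python
# Commonly cited DLSS 2.0 preset scale factors (assumed, not from an official spec).
SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(width: int, height: int, preset: str) -> tuple[int, int]:
    """Approximate internal render resolution DLSS upscales from."""
    s = SCALE[preset]
    return round(width * s), round(height * s)

# 4K output in Quality mode renders internally at roughly 1440p:
print(internal_res(3840, 2160, "quality"))  # (2560, 1440)
```

So the "1440p" figure in the post corresponds to DLSS Quality mode; Performance mode would drop the internal resolution to 1080p.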
 

You can, but it's not a straight win; it's always some kind of trade-off, with the end result usually being a compromise between different factors.

Typically you're trying to balance multiple things like:
1) The amount of memory
2) The memory speed (the effective data rate, usually quoted in Gbps)
3) The memory bus width
4) How fast/demanding the GPU is
5) Power/thermal limits

You're not only targeting a certain memory capacity but also a certain total memory bandwidth. If you lack the bandwidth to keep the GPU fed with data, you bottleneck it and performance drops. Memory bandwidth is the total bus width multiplied by the memory speed; however, each individual memory chip has a narrower interface of its own, in most cases 32-bit.

The 3080, for example, has 10GB of VRAM as 10×1GB chips, each with a 32-bit interface, giving you 10×32-bit for a total bus width of 320 bits. The memory speed is 19Gbps per pin, which in bytes is /8, so 2.375GB/s, multiplied by the 320-bit bus width for 760GB/s of total memory bandwidth.

The 6800 XT opted for 16GB of total memory using 2GB GDDR6 chips, but those chips, while larger, still have the same 32-bit interface: 8 chips of 2GB each give the total 16GB but leave a bus width of only 8×32-bit, or 256 bits. Combined with the fact that the chips are slower at only 16Gbps (2GB/s), that means only 512GB/sec of total memory bandwidth.
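
The arithmetic in the two paragraphs above can be sketched in a few lines of Python. A rough illustration only: the function name is mine, and the chip counts and per-pin speeds are the figures quoted above.

```python
def bandwidth_gb_s(num_chips: int, chip_bus_bits: int, gbps_per_pin: float) -> float:
    """Total memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbit/s) / 8."""
    bus_width = num_chips * chip_bus_bits   # e.g. 10 chips x 32-bit = 320-bit bus
    return bus_width * gbps_per_pin / 8     # /8 converts Gbit/s to GB/s

# Figures quoted in the thread:
rtx_3080  = bandwidth_gb_s(num_chips=10, chip_bus_bits=32, gbps_per_pin=19)  # 760.0
rx_6800xt = bandwidth_gb_s(num_chips=8,  chip_bus_bits=32, gbps_per_pin=16)  # 512.0
print(rtx_3080, rx_6800xt)
```

Note how moving to double-density chips for a given capacity halves the chip count, and with it the bus width and bandwidth.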

Normally this little memory bandwidth would bottleneck such a powerful GPU, so AMD spent a lot of on-chip silicon area on Infinity Cache, essentially a very large L3 cache, to reduce demand on VRAM. But that has a trade-off, because more silicon area spent on cache means less for the transistors doing calculations, meaning less performance. And then on top of all of that there's cost. Faster memory costs more (GDDR6X is more expensive than GDDR6), and higher-capacity memory costs more too; since each chip keeps the same 32-bit interface, using double-density chips for a given capacity halves the chip count, and therefore the bus width and effective bandwidth.

It's all one giant trade off, you can target one thing to get perfect but often to the detriment of other things.
 
Yea but ray tracing is in its infancy right now.
I see...

I mean, in 4-5 years you can do medium settings + ultra textures on the 6800 XT, but on the 3080 you'll have to do medium settings + medium textures... or something like that

I might still go nvidia not sure yet
 

It's actually been out for quite a while now; the problem is the lack of suitable hardware, so devs can't crank it up (full path tracing etc.) beyond the level of detail in games like Quake II RTX/Minecraft RTX. That's why they focus on showing off RT reflections: it's the most visually obvious example of RT to the average Joe.

Not necessarily; it depends on the type of game. Those most likely to be affected are those with very large open environments where you can see many places at once, like FS2020.

So while you may not be bothered about reflections, if developers start using other RT features like GI/shadows then you may find an AMD card struggles. Waiting for the 3080 Ti may be a wise move atm.
 

It's more tricky than that. The long-term trend is that games' VRAM demands increase over time, but so do their demands on the GPU itself. Once the demand on the GPU becomes too large and your frame rate becomes unplayable, you're forced to lower visual settings to bring the frame rate back to an acceptable level. As you lower visual settings to free up GPU cycles, the demand on VRAM also goes down, and you can spend that reclaimed VRAM on whatever you like, such as a bigger texture pool.
 
Best sell that gimped card and buy a 3090 then. Even the 6900XT will be gimped. 24GB should be enough though :p




16GB obviously :p

Nah, I think this is the best piece of hardware right now. It won the award as well: RTX 3080 wins Best Gaming Hardware at the Golden Joystick Awards.
 
So with all said and done we’re now all in agreement that we all need a 3090 and that if you don’t have one you might as well take a hammer to your pc and jump off a bridge. Preferably a high one :p
 
:D ROFL :D

By the time any of the ones arguing about 10GB of VRAM actually get to use their cards, they really will need more VRAM. I see this thread running till the year 2050: "but will my 10TB card limit me in Windows Solitaire?"
 
Nah, I think this is the best piece of hardware right now. It won the award as well: RTX 3080 wins Best Gaming Hardware at the Golden Joystick Awards.
Well I was being sarcastic, I do have a 3080 after all :D

So with all said and done we’re now all in agreement that we all need a 3090 and that if you don’t have one you might as well take a hammer to your pc and jump off a bridge. Preferably a high one :p

Haha. Some would like to make you feel that way, yes :D
 
It's the natural progression we expect; the Supers did it to Turing, offering better performance and price. In the current climate, unless they can flood the market in a few weeks, I am happy owning an FE card now instead of having the same card in six months' time (after battling short stock and gouged prices on the AIBs). I can foresee the Ti also being sought after, but probably a smidge easier to obtain; you're looking at 2021 for that.
 