
10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
One of the biggest hurdles for DirectStorage on PC is selecting a minimum specification for storage speed. It works on consoles because they are a standard spec, but what about on PC?

What should a dev mandate as the storage device?
SATA SSD?
NVMe PCIe 3?
NVMe PCIe 4?

Okay, now what speed should they target on this storage medium?

Or will they test your device and gate off certain settings because of your storage device? "Sorry sir, you can't use the ultra preset because you need a PCIe 4 NVMe drive," even though your 3090/6900 XT is fast enough to run the game.
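Back-of-the-envelope, that kind of gating could be done on measured throughput rather than drive type. A minimal sketch, with entirely made-up tier cutoffs (the rough SATA/Gen3/Gen4 ballparks are assumptions, not anything a real engine mandates):

```python
# Hypothetical sketch: instead of gating settings on drive *type* (SATA vs
# NVMe Gen3/Gen4), a game could benchmark sustained read throughput once at
# install time and pick a streaming tier from that. Cutoffs are invented.

def pick_streaming_tier(read_mb_s: float) -> str:
    """Map measured sequential read speed (MB/s) to a streaming preset."""
    if read_mb_s >= 5000:   # roughly PCIe 4.0 NVMe territory
        return "ultra"
    if read_mb_s >= 2500:   # roughly PCIe 3.0 NVMe territory
        return "high"
    if read_mb_s >= 450:    # roughly SATA SSD territory
        return "medium"
    return "low"            # hard drives / slow media

print(pick_streaming_tier(7000))  # -> ultra
print(pick_streaming_tier(3500))  # -> high
print(pick_streaming_tier(120))   # -> low
```

The upside of benchmarking over whitelisting is that a fast drive behind a weird controller still qualifies on its merits.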

Hardware support. I've been trying to work out if my system supports it. So far it looks like NVMe Gen 3 drives could work, but there is no information at all. You need PCIe P2P support, and there is mention of other features. Intel server CPUs have no issues running GPUDirect.
 

I sort of expect that, because the SSD in the PS5 is quite fast at 5.5 GB/s while the SSD in the Xbox Series X is a lot slower at 2.4 GB/s, and on PC it could be anything from below that up to 7.5 GB/s, game developers will be forced to do one of two things. Either target the lowest common denominator and make that the min spec on PC, which is not ideal. Or they'll design games to make use of the speed in different ways: we might still see loading sections in games, say an elevator ride that masks the transition, but the length of that ride would vary for everyone based on your loading speed. For open-world games, where streaming is more regular and dynamic, I can imagine view distances being varied to mask loading in, or more aggressive LODs and things like that.
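The variable-length elevator ride idea is easy to put numbers on. A minimal sketch, assuming a hypothetical 8 GiB of assets for the next area and a scripted minimum ride length:

```python
def transition_seconds(asset_bytes: int, throughput_mb_s: float,
                       min_seconds: float = 2.0) -> float:
    """Length of a masking transition (e.g. an elevator ride) needed to
    stream in the next area, never shorter than the scripted minimum."""
    load_time = asset_bytes / (throughput_mb_s * 1024 * 1024)
    return max(load_time, min_seconds)

eight_gib = 8 * 1024**3
print(transition_seconds(eight_gib, 5500))  # PS5-class: ~1.5s, clamped to 2.0
print(transition_seconds(eight_gib, 2400))  # Series X-class: ~3.4s
print(transition_seconds(eight_gib, 550))   # SATA SSD: ~14.9s
```

Same scripted moment, three very different ride lengths — which is exactly the design problem described above.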
 

The Xbox Series X is 2.4 GB/s raw (uncompressed) and up to 4.8 GB/s with compression. https://www.psu.com/news/ps5-ssd-vs-xbox-series-x-ssd-which-is-better/
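Worth spelling out where those two numbers come from: the "compressed" figure is just the raw drive speed multiplied by the average compression ratio of the assets, since the decompressor turns each raw byte into more usable bytes. A quick sketch (the PS5 ratio here is an assumption backed by Sony's quoted "8-9 GB/s typical"):

```python
def effective_gb_s(raw_gb_s: float, compression_ratio: float) -> float:
    """Usable (decompressed) gigabytes delivered per second, given the
    drive's raw speed and the average asset compression ratio."""
    return raw_gb_s * compression_ratio

# Series X: 2.4 GB/s raw at roughly 2:1 for BCPack-compressed textures.
print(effective_gb_s(2.4, 2.0))   # -> 4.8
# PS5: 5.5 GB/s raw; Sony quotes 8-9 GB/s typical with Kraken.
print(effective_gb_s(5.5, 1.55))  # ~8.5, mid-range of the quoted figure
```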

 
I have a fairly unique setup: 2x Samsung 960 Pros in RAID 0, i.e. two fast SSDs, each with sequential read speeds of 3,500 MB/s, set up to be read/written in parallel, which roughly doubles the speed (though not actually doubled, because the PCIe x4 bandwidth is a bottleneck). In addition, an overclocked [email protected], which is no slouch for gaming and, more specifically, for uncompressing texture data on the fly. I suspect this has something to do with my game being hitch-free. These are horrible bottlenecks to have to overcome with expensive components, however, and I think DirectStorage, RTX IO for Nvidia, and the AMD equivalent (does this have a name?) will make this a thing of the past.
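The "not actually doubled" caveat is just a min() over the drives and the shared upstream link. A sketch, assuming the array sits behind a PCIe 3.0 x4 link, whose usable ceiling is about 3,940 MB/s (8 GT/s x 4 lanes x 128b/130b encoding):

```python
def raid0_read_mb_s(per_drive_mb_s: float, drives: int,
                    link_cap_mb_s: float) -> float:
    """Theoretical striped read speed, capped by the shared upstream link."""
    return min(per_drive_mb_s * drives, link_cap_mb_s)

# Two 3500 MB/s drives would read ~7000 MB/s in parallel, but a PCIe 3.0 x4
# link (~3940 MB/s usable) becomes the ceiling for the array.
print(raid0_read_mb_s(3500, 2, 3940))  # -> 3940
```

So the second drive buys well under 2x in practice unless each drive gets its own CPU-attached x4 link.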

I wonder why they went PCIe at all. Imagine what AMD could do if they stuck an I/O die on there and two DDR4 slots: a cheap 32/64 GB cache with eight times the bandwidth and no need for the GPU to use the PCIe bus to access it.
 
I will have to turn down settings on my favorite sim due to a lack of GPU horsepower, not a lack of VRAM.

Are we going to see a thread titled "Is xx TFLOPS enough for 4k gaming?"

Because it isn't right now. It's not enough to max all games at 4k. Not even the 3090.

No one seems to care, though, unless they have to turn down settings for a lack of VRAM.

Lack of GPU horsepower? Crickets...
 

High five!

The 3080 already can't hit a 60 fps average in a number of titles at 4k if you max everything out, and of all those we've seen so far, none has been the result of insufficient VRAM, not even Watch Dogs. People buying a new card expecting everything to run flawlessly at ultra, over-the-top settings need a reality check, I think.
 

Maybe I am missing the point of your post, but a 3080-level and above card is generally good for gaming at 4k; the reviews and benchmarks show that. It's mainly when heavy ray tracing is added that things start to slow down below 60 fps. Or when the game is coded like a POS, e.g. Watch Dogs.

According to your sig you run a 1080 Ti, and as respectable as it is, it is still two generations old, so you can't expect it to rip through 4k on your favourite sim without turning down settings.
 

+1

Because it does not fit what they are pushing :D

What makes me laugh is, why can't people understand that everyone's needs are different? I came out and said 10GB on the 3080 is fine for me: it plays every game made to date without running out of VRAM, and likely only a handful of games I'm interested in playing will want more than that between now and the launch of Hopper, which I will upgrade to, so no big deal. That then gets twisted into something completely different, and I am put in the box of defending 10GB and called silly for doing so, etc.

Mark Twain comes to mind “Never argue with an idiot. They will drag you down to their level and beat you with experience.” :p
 
I'm looking at 28 fps when I max out Project Cars 2. Since I'm on a Reverb in VR, I need 3x the performance of my 1080 Ti. I'm also only allocating a little over 7 GB of VRAM with everything maxed.

So, I don't need more vram. I need 3 times the horsepower.

I have seen zero benchmarks suggesting that a 3090 is 3 times as fast as a 1080 Ti.

Of course, after testing, I have turned my settings back down to potato and I'm holding 90fps most of the time with my 1080Ti.

When I get my 3080 or 6800XT, I will turn up the settings, but I will not be able to max the settings.
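That "3x" figure falls straight out of the frame rates quoted above: the Reverb wants 90 fps and the card delivers 28 when maxed. A trivial sketch, assuming performance scales roughly linearly and the GPU is the limiting factor:

```python
def speedup_needed(current_fps: float, target_fps: float) -> float:
    """GPU speedup factor required to move from one frame rate to another,
    assuming roughly linear scaling and a GPU-bound workload."""
    return target_fps / current_fps

# 28 fps maxed out vs the Reverb's 90 Hz refresh:
print(round(speedup_needed(28, 90), 2))  # -> 3.21
```

No released generation-over-two-generations jump has delivered that, which is why turning settings down is the only real option.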
 

Their point is that the discussion is one-sided: a lot of people are super concerned about whether a video card's vRAM is future proof, but not whether the GPU is, and the truth is you need a balance of both, otherwise you get no benefit. There's zero benefit in a video card that has enough vRAM to run games at max settings in four years' time but a GPU so weak that those games run at 7 fps. The reverse is also true: you can have all the GPU power in the world, but if the vRAM runs out there's no real benefit.

This is why we can look at more demanding games that are new and have lots of next-gen features: what happens when we crank up the settings? Do we run out of vRAM first, or does our frame rate tank because the GPU isn't fast enough? And we've built a body of evidence looking at game performance showing, first, that no games we've tested exceed the vRAM on these cards, and second, that we can push the GPUs to their limit with modern games.
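The test described above boils down to walking up the preset ladder and noting which limit bites first. A minimal sketch with invented numbers for illustration (the preset names, vRAM figures, and frame rates are all hypothetical):

```python
# Hypothetical per-preset measurements: (name, vRAM used in GB, avg fps).
PRESETS = [
    ("high",   6.5, 82),
    ("ultra",  8.2, 61),
    ("max+RT", 9.4, 43),
]

def first_bottleneck(vram_capacity_gb: float, target_fps: float):
    """Walk presets from lightest to heaviest and report which limit is
    hit first: running out of vRAM, or the GPU missing the fps target."""
    for name, vram, fps in PRESETS:
        if vram > vram_capacity_gb:
            return (name, "vram")
        if fps < target_fps:
            return (name, "gpu")
    return (None, "none")

# On a hypothetical 10 GB card targeting 60 fps, the GPU gives out first:
print(first_bottleneck(10.0, 60))  # -> ('max+RT', 'gpu')
```

With these (made-up) numbers the frame rate collapses while a gigabyte of vRAM still sits free, which mirrors the body of evidence the post describes.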
 
3080 is just a midrange card anyway so 10gb is probably a good match for it.

The RTX 3080 is not midrange, thanks to the very strong competitive pressure from AMD forcing Nvidia to use the big GA102 die for it (628 sq. mm, albeit cut down from the full chip).

The GTX 1080 was a midrange card because its GP104 die measured only 314 sq. mm.
 