
10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
Don't get me wrong, I'm not chasing crazy FPS. It's just that at 1440p everything runs much cooler and quieter, and there's no need to overclock. I enjoyed CP2077 maxed out at 1440p at 45-60 FPS (capped at 60) using this ancient 3770K. The other problem is that at 27" 4K the desktop appears just too small for me to read.

Yeah, 4K needs to be 32" or above or everything is too small on the desktop. 27" is the sweet spot for 1440p; above that you start to see pixels.
 
If you throw out a large texture when it's out of LOD range, then it's going to be SLOW to get back into VRAM over the PCIe bus when you need it again.
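For a rough sense of scale, a back-of-the-envelope sketch (all of these figures are assumptions for illustration, not measurements):

```python
# Back-of-the-envelope cost of re-uploading evicted textures over PCIe.
# All figures are assumptions for illustration, not measurements.
TEXTURE_MB = 21.3          # ~4096x4096 BC7 texture incl. full mip chain
PCIE_GBPS = 25.0           # rough usable PCIe 4.0 x16 throughput
FRAME_BUDGET_MS = 16.7     # 60 FPS frame time

def upload_time_ms(num_textures: int) -> float:
    total_mb = num_textures * TEXTURE_MB
    return total_mb / (PCIE_GBPS * 1024) * 1000.0

for n in (1, 8, 32):
    t = upload_time_ms(n)
    print(f"{n:>2} textures: {t:5.2f} ms ({t / FRAME_BUDGET_MS:4.0%} of a 60 FPS frame)")
```

One or two re-uploads per frame are cheap; a burst of evicted textures arriving at once is what causes the stutter.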

Sure, but you can do that predictively. That's why modern game engines divide games into zones: they store a list of what can be seen from one zone to every other zone, and at what LOD those objects would be. They look at the player's movement and position, infer what will be needed in the next zone, and pre-load that content before it's needed. If you do all of this well, you can have many times more assets in a playable area than can be held in VRAM, with assets simply swapped in and out as you play.
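A minimal sketch of that idea (the zone names, visibility table and prediction are made up for illustration, not any particular engine's API):

```python
# Toy zone-based streaming: each zone lists which assets are potentially
# visible from it, and at what LOD. When the player heads towards a zone
# boundary, we pre-load that zone's assets before they are needed.
VISIBILITY = {
    "courtyard": {"castle_wall": 0, "distant_hills": 2},
    "great_hall": {"castle_wall": 1, "throne": 0},
}

loaded = {}  # asset -> LOD currently resident in VRAM

def preload(zone: str) -> None:
    for asset, lod in VISIBILITY[zone].items():
        # Only (re)load if the asset is missing or resident at a coarser LOD.
        if loaded.get(asset, 99) > lod:
            loaded[asset] = lod        # stand-in for the real async upload
            print(f"streaming {asset} at LOD {lod}")

def predict_next_zone(position, velocity) -> str:
    # Real engines use portals/PVS and the player's heading; hard-coded here.
    return "great_hall"

preload(predict_next_zone(position=(0, 0), velocity=(1, 0)))
```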

LtMatt and I had this out over Far Cry 5 way back in the thread. I think he used a 16GB AMD card to play FC5 at the same settings as me, and I recorded myself on a 3080 in a helicopter flying around the map, jumping out, getting into fights, then getting back in and flying to other parts of the map, all in rapid succession; then doing the same in a jeep, cruising at high speed across the entire open map. You can see the memory usage graphed in my video: it rapidly swaps between 5GB and 7GB of usage as the different areas stream different textures in and out, and it's no problem.
 
Most cards can't do 4K in the latest AAA games without something like DLSS or FSR, so you are likely to be at 1440p or even 1080p internal resolution upscaling to 4K. Unreal Engine 5's temporal upscaling is designed around a 1080p image upscaled to 4K on console and PC. So the GPU is meant to render the best-quality 1080p image it can, and the upscaling tech does the rest. Textures are then streamed to the GPU via DirectStorage. Windows 11 has support for DirectStorage.
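To illustrate the ratios involved, a tiny sketch (the "screen percentage" term is borrowed from UE5's naming; the values are just examples):

```python
# Internal render resolution for temporal upscaling to a 4K output.
# A 50% screen percentage on each axis means 1080p internal -> 2160p output,
# i.e. only a quarter of the output pixels are shaded each frame.
def internal_resolution(out_w: int, out_h: int, screen_pct: float):
    return round(out_w * screen_pct), round(out_h * screen_pct)

print(internal_resolution(3840, 2160, 0.50))  # (1920, 1080)
print(internal_resolution(3840, 2160, 0.67))  # (2573, 1447), "quality"-style preset
```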

You have developers stating left and right that you need as much VRAM as possible, and MS aiming to stream textures in and out of memory from an NVMe drive. Nvidia aim to do the same via RTX IO. AMD have gone for 16GB of VRAM. So who wins?
 
They aren't suddenly increasing res; it's happening alongside the release of the Nvidia/AMD 12/16GB cards and is gradually being phased in as more people own them.
By the time, say, ten months from now, a new game is released that needs more VRAM, 12/16GB cards will be in lots of people's hands, but not right now.
Don't expect it to take too long for these games to be released, though.

AMD, quite rightly so, has little market share and the number of Nvidia cards with more than 10GB of VRAM isn't worth considering. Does it make sense that game developers would suddenly target such a niche market? Perhaps these game developers you have been talking to are fresh off the bus.
 
The only 30-series cards worth buying right now are the FE models, and for £650, 10GB of VRAM is pretty decent coupled with the raster performance available. We can all argue it's not enough, so if you feel that way you need to stump up another £400 for 12GB, or an extra £750 if you want 24GB with only slightly better raster performance... Is it worth it?
 
You're right that the situation is more complicated but the principle is sound.

I didn't comment on the principle; I said your highlighted statements were wrong.

A more accurate way of putting it would be that it's pointless to have more than one texel per pixel. If you've got 3840x2160 pixels, then you see no benefit from having more than that many texels visible on screen (exceptions for MSAA/SSAA, which in a sense take sub-pixel samples, but are almost never used today).
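That one-texel-per-pixel target is essentially what mipmapping enforces. A rough sketch of the idea (not the exact hardware formula, which uses screen-space UV derivatives):

```python
import math

# Pick the mip level whose texels map roughly 1:1 onto screen pixels.
# texels_per_pixel: how many base-level (mip 0) texels would land on one
# screen pixel for this surface at its current distance and angle.
def mip_level(texels_per_pixel: float, num_mips: int) -> int:
    level = max(0.0, math.log2(max(texels_per_pixel, 1e-6)))
    return min(int(level), num_mips - 1)

# A 4096x4096 texture (13 mips) seen so that 8 of its texels cover one
# pixel is sampled from mip 3, i.e. an effective 512x512 image.
print(mip_level(8.0, num_mips=13))  # -> 3
print(mip_level(0.5, num_mips=13))  # -> 0 (magnified: the full-res mip is used)
```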

I would say one texel should be slightly smaller than a pixel, but that is a minor detail. From the game texturing perspective, you don't have full control over the scene. Since the camera position in a video game is controlled by the player, it is not possible for the artist to know the size of the texels relative to the pixels. The artist's concern is ensuring a consistent level of texture quality throughout the scene.


But the bottom line is that I'm talking about the absolute worst-case scenario, which rarely happens. In 3D spaces, objects appear smaller at greater distances, but because the volume of a sphere grows as the cube of its radius, you end up with vastly more objects at a distance, on average, than close by, assuming a roughly uniform distribution.
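A quick sanity check of that shell argument, under the post's own uniform-distribution assumption (the density, distances and mip figures are invented purely to illustrate the trend):

```python
import math

# Under a uniform distribution, the number of objects in a thin shell at
# distance r grows like r^2, while each one appears smaller and is sampled
# from a coarser mip (roughly log2 of the distance ratio).
DENSITY = 0.001   # objects per cubic unit (made up)
R_FULL_RES = 5.0  # distance at which an object still needs mip 0

for r in (10, 50, 250):
    objects = DENSITY * 4 * math.pi * r**2          # shell of thickness 1
    mip = max(0, round(math.log2(r / R_FULL_RES)))  # coarser mips with distance
    print(f"r={r:>3}: ~{objects:5.0f} objects in the shell, sampled near mip {mip}")
```

So the bulk of objects sit in the far shells, where only the small mips are ever sampled.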

Objects aren't uniformly distributed in a game scene, so that makes this irrelevant then.

But the bottom line is that I'm talking about the absolute worst-case scenario, which rarely happens. In 3D spaces, objects appear smaller at greater distances, but because the volume of a sphere grows as the cube of its radius, you end up with vastly more objects at a distance, on average, than close by, assuming a roughly uniform distribution. Imagine you're in open-world Fallout 4 and you look out into the game world: almost all of the objects are tiny in your field of view and use the low-resolution mipmaps of their textures. Even objects quite close to you end up as a small proportion of the pixels on the screen. Which is why there are limits to the texture resolutions that actually result in improved visuals.

It's why a large percentage of gamers report no visual improvement from high-res texture packs, and why they're typically labelled or recommended for "UHD" or "4K".
Yes, there is a limit to texture resolution; I never said or implied otherwise. The limit relates more closely to the texel density (the number of texels relative to the surface area of the model). There are a number of tricks artists use to increase this as much as possible (overlapping UVs, prioritising texture surface area for certain parts of the model, using multiple textures, etc.). Without knowing how the models were unwrapped and what changes are actually in those high-res texture packs relative to the standard ones, I'm not going to speculate on why people can't see visual improvements, because there isn't enough information.
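For what it's worth, a rough sketch of the texel-density figure being described (the mesh and UV numbers are made up; real tools derive UV coverage from the actual unwrap):

```python
import math

# Texel density: texels per unit of world-space surface area.
# uv_coverage is the fraction of the texture's area the unwrap actually
# uses (overlapped UVs effectively raise this above 1.0).
def texel_density(tex_res: int, uv_coverage: float, surface_area_m2: float) -> float:
    texels_used = (tex_res ** 2) * uv_coverage
    return math.sqrt(texels_used / surface_area_m2)   # texels per metre, per axis

# A 4096 texture covering ~80% of its UV space on a 4 m^2 prop:
print(f"{texel_density(4096, 0.8, 4.0):.0f} texels/m")   # ~1832 texels/m
```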
 
The only 30-series cards worth buying right now are the FE models, and for £650, 10GB of VRAM is pretty decent coupled with the raster performance available. We can all argue it's not enough, so if you feel that way you need to stump up another £400 for 12GB, or an extra £750 if you want 24GB with only slightly better raster performance... Is it worth it?
That's just Nvidia milking customers because they know they will have people defending them for releasing a gimped card.
 
Thread can be closed now, 10gb VRAM, conclusively plenty for a 3080. ;)

You literally had people claiming the 3080 was only good for 1080p, LOL, and that 4K was all blurry; they still haven't provided examples even when called out about it.
Right now, at this point in time, a 3080 is plenty. Next year? Who bloody knows; I don't think anyone on this forum has a crystal ball.
 
Thread can be closed now, 10gb VRAM, conclusively plenty for a 3080. ;)
:D

You literally had people claiming the 3080 was only good for 1080p, LOL, and that 4K was all blurry; they still haven't provided examples even when called out about it.
Right now, at this point in time, a 3080 is plenty. Next year? Who bloody knows; I don't think anyone on this forum has a crystal ball.
Yep. All they can say is "yeah, but it may not be enough in the future", etc. Well, in the future I will be on a future graphics card :D
 
AMD, quite rightly so, has little market share and the number of Nvidia cards with more than 10GB of VRAM isn't worth considering. Does it make sense that game developers would suddenly target such a niche market? Perhaps these game developers you have been talking to are fresh off the bus.
But what if Nvidia themselves push a 16GB 4070 and 4080? In that case, wouldn't it be to Nvidia's benefit to let games run badly on 8-10GB VRAM GPUs, so that everyone is forced to buy their shiny new 16GB cards?
 
But what if Nvidia themselves push a 16GB 4070 and 4080? In that case, wouldn't it be to Nvidia's benefit to let games run badly on 8-10GB VRAM GPUs, so that everyone is forced to buy their shiny new 16GB cards?

There has to be a mainstream requirement for that much VRAM for it to make sense. So far there hasn't been one, and there is nothing on the horizon to change that. It could be that Nvidia do launch the next series with 16GB, which they could make use of for ray caching. But then the 3080 will have run out of grunt, having to do some of that in software.
 
But what if Nvidia themselves push a 16GB 4070 and 4080? In that case, wouldn't it be to Nvidia's benefit to let games run badly on 8-10GB VRAM GPUs, so that everyone is forced to buy their shiny new 16GB cards?

I think Nvidia would be pushing more VRAM now but for the shortages, just to look the same as AMD. Most people just go on one media outlet and don't care if it's objectively telling them the truth. So 16GB is best, subjectively. FSR is better than DLSS, subjectively. AMD cards are faster, subjectively. It doesn't matter if almost every game benchmarked is faster on the AMD CPU, or if SAM support in that game made AMD cards faster.

Same with FSR: you get a lot of games that no one plays, like Terminator.

People will just see 10GB versus 16GB and then make up whatever reason they can to justify 10GB < 16GB. After all, a bigger number is better, right?

No one considers AMD or Nvidia cards as a balance of performance between DXR and raster games. This is because the media turned reviews into "DLSS and DXR are irrelevant, and thus AMD win". Now that FSR is out, it's already better than DLSS and will get better in time. DLSS was ripped apart in pixel-by-pixel comparisons with native-resolution images.

With such a biased media reviewing products, 16GB will win over 10GB in the hearts and minds of customers, regardless of whether games need 10GB or more. Most customers will form their opinion from something like Hardware Unboxed or some other gaming review channel and never engage their critical thinking or seek a second, objective opinion.
 