10GB VRAM enough for the 3080? Discuss..

There is no RTX 3080 Ti; it does not exist. Nvidia have rejected all the rumours of more VRAM so far. The rumours are just that: rumours.
Why would Nvidia confirm rumours? If they did have a 3080 Ti they would have an official launch event, not release a statement confirming internet rumours.
 
I notice that people are NOT saying that the "Flagship card must be able to run all games maxed at 4K".

Instead they seem willing to accept a card that cannot max settings on all games... but if the reason they have to turn down a setting is VRAM, well, now *this* is just unacceptable!

A 3090 can't run my favorite sim maxed out in VR, and that game doesn't even use 8GB of VRAM. I'm not calling for everyone to grab their torches and pitchforks though.

I have pointed this out, but most of the people making that stupid argument have me on ignore, I think :p
 
I'm sure that if the WiFi can get you on OcUK then it can buffer a 3-minute video. :p
I can just about get on to the website. Sometimes it times out and I get booted off the WiFi.

I have, however, seen the article, and the quote is ambiguous. I am glad that we are moving to 4K textures though.
 
Even better point here: https://polycount.com/discussion/217846/can-you-tell-which-render-is-using-4k-textures Note how most people could not tell the difference.
From the first few posts
Given the fact it's a 1080p render, chances are you would never be able to see the difference, as it is pixel-bound by the resolution anyway.

But on my 4K monitor, with such a small image (1798x915), it's very hard to see any difference.

I think the main issue in identifying the 4K image is that this rendered image is 1800x900 pixels, and we're trying to identify which model uses 4K textures.

Edit. I see he did provide higher quality renders further down

Edit 2: wait, are you really trying to say there is no discernible difference between 4K and 2K textures because of this one post you found?
*Insert laughing gif here*
 
They confirmed the 3080 20GB by saying it won't exist.
Did they say this during a financial analyst day?
If not they could be lying. They could have also been lying at the time they made the announcement but then decided not to release the card a few days later.

However, let's assume that the 3080 20GB or 3080 Ti did exist. How would it benefit Nvidia to confirm that it does exist?
 
They are 4K x 4K textures.
From the first few posts

Edit. I see he did provide higher quality renders further down


From the guy that did it.

4k (image size) renders using the same assets and render settings as the 1080p (image size) renders.

Answer: In all of the images (including the first 1080p render) the 2k textures are on the left and the 4k textures are on the right.

Methodology:
All renders use the same 3D assets, lighting setup, render settings and mip correction. One model uses 4k (4096x4096) texture files and the other uses texture files that were down sampled from 4k to 2k (2048x2048).

All comparison images have a 1:1 pixel size. Renders were cropped, without resizing or sharpening, to remove empty space and place both renders side by side in one image. Uncropped renders were captured at 1920x1080 (1080p) and 3840x2160 (4k).

Other thoughts:
This was an informal survey (for fun) and the spirit of the initial poll was to get an off the cuff response based on the limited information provided by the post and the sample image. A 1080p render size was chosen for the initial poll because: it's the most common resolution reported by the STEAM hardware survey, Artstation (according to their documentation and unless you have a Pro subscription) converts images to normalized 1920xN JPGs and other social platforms have similar or smaller image size restrictions.

So far there's been some great technical discussion but looking at it from a less technical standpoint: does this result have any broader implications for situations where non-artists and non-technical people are reviewing portfolios?

So what you do in a game that uses 4K (4096x4096) texture files and 12GB of VRAM is downsample them from 4K to 2K (2048x2048). No one can tell the difference.
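To put rough numbers on why that downsample saves so much memory, here's a quick bit of illustrative arithmetic (it assumes uncompressed RGBA8 textures with a full mip chain; real games use block-compressed formats, so the absolute sizes are smaller, but the ratio is the same):

```python
# Rough, illustrative VRAM arithmetic for a single uncompressed RGBA8 texture.
# Real games use block compression (BC1/BC7 etc.), so actual sizes are much smaller,
# but the ~4:1 ratio between a 4K and a 2K texture holds either way.

def texture_megabytes(side, bytes_per_pixel=4, with_mips=True):
    total = side * side * bytes_per_pixel
    if with_mips:
        total = total * 4 // 3  # a full mip chain adds roughly one third on top
    return total / (1024 * 1024)

print(f"4096x4096: {texture_megabytes(4096):.1f} MB")  # ~85.3 MB
print(f"2048x2048: {texture_megabytes(2048):.1f} MB")  # ~21.3 MB
```

Halving the texture resolution cuts its footprint to roughly a quarter, which is why dropping the less visible assets to 2K frees so much VRAM.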
 
Did they say this during a financial analyst day?
If not they could be lying. They could have also been lying at the time they made the announcement but then decided not to release the card a few days later.

However, let's assume that the 3080 20GB or 3080 Ti did exist. How would it benefit Nvidia to confirm that it does exist?

I already posted a Gamers Nexus video in which they stated they confirmed it with Nvidia.
 
From the first few posts

Edit. I see he did provide higher quality renders further down

Edit 2: wait, are you really trying to say there is no discernible difference between 4K and 2K textures because of this one post you found?
*Insert laughing gif here*

There is really little to none if done right.

Normal maps

I would go with nearest neighbour (see the quick downsampling sketch after this post).

2k left and 4k right


4096x4096 textures are better overall. Spot the difference. 2k left and 4k right

You should start to see the difference.


The one on the left is blurred and the one on the right has more detail.
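If anyone wants to try the comparison themselves, here's a minimal downsampling sketch using Pillow. The filenames are placeholders, and the nearest neighbour vs Lanczos pair is just there to show how the filter choice affects the 2K result:

```python
# Minimal 4K -> 2K texture downsample with Pillow (pip install Pillow).
# "texture_4k.png" and the output names are placeholder filenames.
from PIL import Image

src = Image.open("texture_4k.png")            # e.g. a 4096x4096 source texture
half = (src.width // 2, src.height // 2)      # 2048x2048 target

nearest = src.resize(half, resample=Image.NEAREST)   # keeps hard texel edges
lanczos = src.resize(half, resample=Image.LANCZOS)   # smoother, filtered result

nearest.save("texture_2k_nearest.png")
lanczos.save("texture_2k_lanczos.png")
```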
 
There is really little to none.

I'm on mobile and I could see a difference in the centre thing and some other areas around it.

There is a difference between 4K textures and 2K textures (I can't wait to have this same conversation about 8K and 4K textures :p). Just because people couldn't spot the difference in this one instance doesn't mean 4K textures are pointless. This is from my own experience working with textures in Blender and seeing other people's work around the internet.

Edit: Quixel Mixer lets you flick between 2K and 4K working-resolution textures, and even on my 1080p screen at home I could notice a difference in detail while observing from a reasonable distance.
 
I'm on mobile and I could see a difference in the centre thing and some other areas around it.

There is a difference between 4K textures and 2K textures (I can't wait to have this same conversation about 8K and 4K textures :p). Just because people couldn't spot the difference in this one instance doesn't mean 4K textures are pointless. This is from my own experience working with textures in Blender and seeing other people's work around the internet.

Edit: Quixel Mixer lets you flick between 2K and 4K working-resolution textures, and even on my 1080p screen at home I could notice a difference in detail while observing from a reasonable distance.

The difference in game is not that great. You keep the textures you see the most at 4K and downsample the rest to 2K. Or you can compress the textures. I believe you can use the Tensor cores to compress the contents of VRAM, and I believe the Tensor method is lossless. Not much is known, but there are other methods like delta colour compression, which is lossless as well.
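As a back-of-the-envelope illustration of that mixed-resolution idea, here's a toy budget. Every number is invented for illustration (uncompressed RGBA8, no mips, a made-up texture count); the point is only the scale of the saving:

```python
# Toy texture budget: keep the most visible textures at 4K, drop the rest to 2K.
# All numbers here are invented for illustration; uncompressed RGBA8, no mip chains.
MB = 1024 * 1024
tex_4k_mb = 4096 * 4096 * 4 / MB   # 64 MB each
tex_2k_mb = 2048 * 2048 * 4 / MB   # 16 MB each

total_textures = 200
all_4k = total_textures * tex_4k_mb                          # everything at 4K
mixed  = 40 * tex_4k_mb + (total_textures - 40) * tex_2k_mb  # only the top 40 stay 4K

print(f"All at 4K: {all_4k / 1024:.1f} GB")   # ~12.5 GB
print(f"Mixed:     {mixed / 1024:.1f} GB")    # ~5.0 GB
```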

Memory-wise, Nvidia appears to be focusing on improvements in memory compression to deliver increased effective memory bandwidth with Ampere. This allows Nvidia to increase its memory performance without increasing the VRAM capacities, and build costs, of its next-generation graphics cards significantly. A new technology called Tensor Accelerated VRAM Compression is also said to be in the works.
https://www.overclock3d.net/news/gp...res_leak_-_rtx_speed_boost_nvcache_and_more/1

The new chips will also use Tensor Memory Compression, which allows the Tensor Cores to compress and decompress items in VRAM. This would result in a solid reduction in VRAM usage, which would allow for games with more textures to run smoother.
https://medium.com/@techtelligence/...ng-we-know-about-ampere-s-future-afb214900ba2

Turing Memory Compression
NVIDIA GPUs utilize several lossless memory compression techniques to reduce memory bandwidth demands as data is written out to frame buffer memory. The GPU’s compression engine has a variety of different algorithms which determine the most efficient way to compress the data based on its characteristics. This reduces the amount of data written out to memory and transferred from memory to the L2 cache and reduces the amount of data transferred between clients (such as the texture unit) and the frame buffer. Turing adds further improvements to Pascal’s state-of-the-art memory compression algorithms, offering a further boost in effective bandwidth beyond the raw data transfer rate increases of GDDR6. As shown in Figure 10, the combination of raw bandwidth increases, and traffic reduction translates to a 50% increase in effective bandwidth on Turing compared to Pascal, which is critical to keep the architecture balanced and support the performance offered by the new Turing SM architecture.
https://developer.nvidia.com/blog/nvidia-turing-architecture-in-depth/
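To give a feel for what lossless delta compression means in general, here's a toy sketch. This only illustrates the basic idea that neighbouring pixels are similar and so their differences are cheap to store; it has nothing to do with Nvidia's actual hardware scheme:

```python
# Toy lossless delta encoding of one scanline of pixel values.
# Neighbouring pixels are usually similar, so the deltas are small numbers that a
# later stage can pack far more tightly than the raw values. This is a general
# illustration only, not Nvidia's actual delta colour compression.

def delta_encode(pixels):
    return [pixels[0]] + [b - a for a, b in zip(pixels, pixels[1:])]

def delta_decode(deltas):
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

scanline = [120, 121, 121, 123, 122, 122, 124, 200]
encoded = delta_encode(scanline)            # [120, 1, 0, 2, -1, 0, 2, 76]
assert delta_decode(encoded) == scanline    # lossless round trip
print(encoded)
```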

Watch Dogs: Legion's HD texture pack requires 11GB of memory but runs fine on a 3080.

4K / Ultra Settings
  • CPU: Intel Core i7-9700K / AMD Ryzen 7 3700X
  • GPU: NVIDIA GeForce RTX 2080 Ti or AMD Radeon VII
  • VRAM: 11 GB
  • RAM: 16 GB (Dual-channel setup)
  • Storage Space: 45 GB (+ 20 GB HD Textures Pack)
  • Operating System: Windows 10 (x64)
 