
NVIDIA’s Neural Texture Compression - 90% Less VRAM Usage

The answer is yes.

I for one do own decent audio hardware, and I use Tidal instead of Spotify.

I also encode video and mess with settings.

Spotify generally streams at 160 kbps and then artificially ups the bitrate to 320 kbps for paid users, which makes nearly no difference because the music has already been heavily compressed. So not really a good comparison.
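For scale, here's a quick back-of-envelope sketch in Python of what those bitrates mean for a typical 4-minute track (assuming constant bitrate, which real encoders don't strictly use):

```python
# Rough file size of an audio stream at a given constant bitrate.
# Assumption: CBR; real Spotify/MP3 streams vary around the target.
def track_size_mb(bitrate_kbps: float, seconds: float) -> float:
    # kilobits/s -> bits -> bytes -> megabytes
    return bitrate_kbps * 1000 * seconds / 8 / 1_000_000

four_minutes = 4 * 60
print(track_size_mb(160, four_minutes))  # ~4.8 MB
print(track_size_mb(320, four_minutes))  # ~9.6 MB
```

Double the bits, but if the source was already lossy-encoded once, those extra bits mostly re-encode artefacts, which is the point above.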

The point of this tech is to reduce VRAM usage while not impacting visual quality, or at least to a point where you'd find it incredibly difficult to tell the difference.
 
So 7zip archives are degrading the files I back up every day? :D
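Exactly: archive formats like 7z are lossless by definition. A quick sketch using Python's zlib (a stand-in here for 7z's LZMA, same principle) shows the round trip is bit-identical:

```python
import zlib

# Lossless codecs (zip/7z/FLAC-style) reconstruct the input bit-for-bit.
data = b"the same bytes, every single time " * 1000
packed = zlib.compress(data, level=9)

assert zlib.decompress(packed) == data  # identical after the round trip
print(len(packed) < len(data))          # smaller archive, nothing lost
```

Lossy codecs (MP3, and neural texture compression) deliberately throw data away instead, which is where the whole "can you tell?" argument comes from.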



Close enough is often good enough - especially for the purpose of this topic: textures.

Does it actually matter if, when the texture is "recreated" using AI, a couple of pixels have RGB values that are off by single digits?
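To put a number on "off by single digits", here's a hypothetical stdlib-only sketch (made-up pixel values, not actual NTC output) measuring the worst per-channel error between an original texture and a reconstructed one:

```python
# Hypothetical example: worst-case per-channel error between an
# original texture and a lossily "recreated" one. Pixel values are
# invented for illustration; a real pipeline would use numpy/images.
original  = [(200, 150, 100), (10, 20, 30), (255, 0, 128)]  # RGB pixels
recreated = [(198, 151, 100), (10, 19, 32), (254, 1, 128)]

max_err = max(abs(a - b)
              for po, pr in zip(original, recreated)
              for a, b in zip(po, pr))
print(max_err)  # 2 -- a single-digit error out of a 0-255 range
```

An error of 2 out of 255 on a texture that's then lit, filtered, and in motion is exactly the kind of difference you'd find incredibly difficult to spot.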



I could almost agree with you that differences in audio files are more easily noticed, but then you rolled out the audiophile bingo card with things like "fuller" and "distinguished" :D

A cousin of mine is an audiophile. I swapped out one of the albums he was listening to regularly that was "super mega high bit rate mega super" for one at 320 kbps, and he didn't notice until I said something a year later :cry:

The amount of BS the audiophile community comes out with is quite entertaining :D
 
The thing with audiophiles is most of them are half deaf from loudspeakers. I do think I can tell MP3 from FLAC, but I've never tried a blind test to see.

I've got my own FLAC collection and there's definitely a difference between mediocre-quality MP3 and high bit rate music, but listening to my cousin come out with all the jargon he does is like listening to a salesman in the 1800s Wild West telling you why his snake venom tonic is the best... full of BS, but entertaining :D
 
I've long thought (yet have zero proof) that nVidia uses some form of image/texture compression in their general pipeline, and that this was an area that gained them a performance advantage over AMD. My opinion was originally based on side-by-side comparison examples on review sites, in games where there were leaves and similar objects. The textures on nVidia, I felt, just had a bit of softness to them which AMD's didn't, with my gut feeling being that it looked like how image compression would affect clarity.

I even feel it might be the case in the 2D desktop environment. Years ago, I commented in a post here about how I felt the picture quality from my 2400G APU had more clarity than the discrete nVidia 1080 Ti fitted in the same machine. This was using the same HDMI cable and 4K TV display, switching between each source on the computer. I know that shouldn't be the case given the digital signal path down HDMI, but I really felt there was a difference.

I still have the same 2400G and 4K TV (different motherboard though), and just last week I installed a 3060 Ti into the machine... and you know what, again, it feels like the nVidia card's image quality just lacks a little something the 2400G had in terms of clarity on the desktop.

So it really doesn't surprise me that nVidia would now be putting out some form of texture compression as a feature, because I think they've been doing it for years!

I read absolutely ages ago, around the RTX 2000 series launch, about Nvidia using some type of texture compression that was made possible with GDDR6 on the 2000 series.

I can't find anything on Google now, but I do remember reading about it.
 