No, I mean it's *using* only 6GB; it's allocating the full 8GB. Even if it required 100GB of video memory, Afterburner will not show more than 8192MB, as that's the maximum capacity of the 3070 that was tested.
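(For anyone who wants to check the allocation-versus-capacity point themselves, here is a minimal sketch assuming the pynvml Python bindings and an NVIDIA driver with NVML; it reads roughly the same counters that overlays like Afterburner display, and is only an illustration, not how Afterburner works internally.)

```python
# pip install nvidia-ml-py  (assumed dependency; needs an NVIDIA driver that exposes NVML)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU; adjust the index as needed

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total VRAM : {mem.total / 2**20:.0f} MB")  # hard ceiling, e.g. 8192 MB on a 3070
print(f"allocated  : {mem.used  / 2**20:.0f} MB")  # device-wide allocation, can never exceed total
print(f"free       : {mem.free  / 2**20:.0f} MB")

# Per-process figures are closer to what a single game is actually claiming for itself.
# usedGpuMemory can be None on some platforms / driver models.
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    print(f"pid {p.pid}: {(p.usedGpuMemory or 0) / 2**20:.0f} MB")

pynvml.nvmlShutdown()
```

Whatever a game asks for, the device-wide counter tops out at the card's physical capacity, which is why an 8GB card can never "show" a 12GB requirement.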
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Wait for Godfall, coming on 12th November. It's a true next-gen game, so it should give us a real idea of whether 10GB and 8GB are enough for 4K and 1440p respectively. Unfortunately, looking at previous-generation games will not really give us any indication of where the next gen is heading, because the new consoles have a brand new architecture with faster access to storage and double the RAM capacity.
Well, it will only use whatever video memory your GPU has (actual usage); it cannot use more video memory than is present. After that it will start to use system memory instead of video memory.
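(The reason that spill-over hurts is bandwidth: system RAM is reached over PCIe, which is far slower than the GDDR6 on the card. A rough sketch, assuming PyTorch with CUDA is installed; the 1 GiB buffer size is arbitrary and the exact numbers depend on the GPU and PCIe generation.)

```python
# Assumed dependency: PyTorch built with CUDA support.
import time
import torch

N = 1024**3 // 4                                   # ~1 GiB worth of float32 elements
gpu_src  = torch.empty(N, dtype=torch.float32, device="cuda")
gpu_dst  = torch.empty_like(gpu_src)
host_src = torch.empty(N, dtype=torch.float32, pin_memory=True)   # pinned system RAM

def copy_bandwidth(dst, src):
    """Copy src into dst and return the achieved bandwidth in GiB/s."""
    torch.cuda.synchronize()
    t0 = time.perf_counter()
    dst.copy_(src)
    torch.cuda.synchronize()
    return (N * 4 / 2**30) / (time.perf_counter() - t0)

print(f"VRAM -> VRAM       : {copy_bandwidth(gpu_dst, gpu_src):6.1f} GiB/s")
print(f"system RAM -> VRAM : {copy_bandwidth(gpu_dst, host_src):6.1f} GiB/s  (over PCIe)")
```

The on-card copy will typically land in the hundreds of GiB/s while the PCIe path is in the tens, which is why assets that no longer fit in VRAM tend to show up as sudden hitches rather than a gentle slowdown.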
By real time... you mean decompress, move and operate in one clock cycle?
Again, yes I do. Do you honestly think Nvidia would have released new cards with up to 10GB (excluding the 3090, as that is a halo card) if this was going to be a problem?
Seems you are missing the point he is making: when you run out of VRAM, the performance will drop off a cliff or there will be some other negative consequence. If none of that is happening and the game lets you run with the highest texture setting, then there is no problem and it does not NEED 12GB.
I agree mate, I think my point was perhaps lost, but this was the message I was trying to convey in the 6000 series 16GB thread.
Like a few years ago, I was running Final Fantasy 15 on my Titan XP and it was using up nearly the full 12GB. Does that mean the game was not running fine on 11GB or even 8GB cards? Nope. It ran perfectly fine, if not better, on a 1080 Ti, which had better clocks than my Titan XP.
Not long left now, we will soon find out!
Yeah, if that happens then we will know 10GB is indeed not enough for the highest texture setting in that game.
Sometimes you'll just get a hitch/stutter when you saturate video memory and use system memory instead.
Good for DLSS etc. Tensor Memory Compression
The diagram shows the data going straight to the SMs. https://www.nvidia.com/content/dam/...ter/nvidia-ampere-architecture-whitepaper.pdf page 41.
They've both done it over the years. I remember when ATi were forcing FP16 to do the same thing. I wonder if Nvidia can write a special driver 'hack' to reduce VRAM usage (and IQ) in Godfall to fit in the 3080's small 10GB buffer?
I remember they got in trouble for driver hacks in 3DMark years ago, where they'd lower IQ to score higher in the benchmark.
Again, yes I do.
https://videocardz.com/newz/godfall-requires-12gb-of-vram-for-ultrahd-textures-at-4k-resolution
Like I said before, next-gen games ported to PC will require more VRAM, especially open-world games using ray tracing.
Nvidia has no part and no say in console development, let alone console games, so they won't know.
I just checked that TweakTown article... it was supposed to be some kind of pre-release leak compilation, and most of the other points in there, like the traversal coprocessor, TDP, etc., didn't turn out to be true.
That diagram just shows the hierarchy... I will get back to you once I've been able to skim through it.
What we will see is whether a few more game titles suggest more than 10 gigabytes of VRAM.
We will see when it releases. Bet it runs on the RTX 3080.
True, I just hope all the games coming out in the next 2 years run nicely on the 3080 at 4K ultra. If that is true, that means I can make a 3080 last 3 years or so at 1440p ultra.
I remember Nvidia being caught red-handed far more often than AMD, at least.
Nowadays they wouldn't get away with it, so they tend to introduce proprietary 'features' such as GameWorks, designed to cripple AMD's performance.
These days they've moved on to DLSS, which looks absolutely terrible in some games, making huge rendering mistakes, while looking very good in others. Either way, they don't like playing by the book!
DLSS got fixed; it's now a game changer.