10GB VRAM enough for the 3080? Discuss..

Unfortunately, looking at previous-generation games will not really give us any indication of where the next gen is heading, because the next-gen consoles have a brand new architecture with much faster storage access and double the RAM capacity.
 
Unfortunately, looking at previous-generation games will not really give us any indication of where the next gen is heading, because the next-gen consoles have a brand new architecture with much faster storage access and double the RAM capacity.
Wait for Godfall, coming on 12th November; it's a true next-gen game, so it should give us a real idea of whether 10GB and 8GB are enough for 4K and 1440p respectively.
 
Well, it will only use whatever video memory your GPU has (actual usage); it cannot use more video memory than is physically present. Once that is exhausted, it will start to use system memory instead of video memory.
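
For anyone who wants to check this for themselves, here is a rough Python sketch using NVIDIA's NVML bindings (the pynvml package); the GPU index 0 is just an assumption for a single-GPU system, and the point is simply that the reported usage can never exceed the card's capacity:

```python
# Rough sketch: read how much VRAM is actually in use vs. the card's total,
# using NVIDIA's NVML bindings (pip install pynvml). GPU index 0 is an
# assumption for a single-GPU system.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are in bytes
    used_gb = mem.used / 1024**3
    total_gb = mem.total / 1024**3
    # 'used' can never exceed 'total'; once you get close, the driver starts
    # paging resources out to system RAM instead.
    print(f"VRAM in use: {used_gb:.1f} GB of {total_gb:.1f} GB")
finally:
    pynvml.nvmlShutdown()
```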

I wonder if Nvidia can write a special driver 'hack' to reduce VRAM usage (and IQ) for Godfall so it fits in the 3080's small 10GB buffer?

I remember they got in trouble for driver hacks in 3DMark years ago, where they'd lower IQ to score higher in the benchmark.
 
Do you honestly think Nvidia would have released new cards with up to 10GB (excluding the 3090, as that is a halo card) if this was going to be a problem?
Again, yes I do.
https://videocardz.com/newz/godfall-requires-12gb-of-vram-for-ultrahd-textures-at-4k-resolution

Like I said before, next-gen games ported to PC will require more VRAM, especially open-world games using ray tracing.
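
As a very rough back-of-envelope (the 4K texture size, compression rate and count below are made-up illustration numbers, not anything from Godfall's actual assets), you can see how a texture budget climbs into double-digit gigabytes:

```python
# Back-of-envelope only: estimate VRAM for a pile of block-compressed textures.
# The 4096x4096 size, ~1 byte/pixel compression and the count of 500 are
# made-up illustration numbers, not real Godfall asset data.
def texture_mb(width, height, bytes_per_pixel=1.0, mip_overhead=4 / 3):
    """BC7-style block compression is roughly 1 byte per pixel; a full mip
    chain adds about a third on top."""
    return width * height * bytes_per_pixel * mip_overhead / 1024**2

per_texture = texture_mb(4096, 4096)   # ~21 MB each with mips
total_gb = 500 * per_texture / 1024    # 500 unique 4K textures resident at once
print(f"~{per_texture:.0f} MB per texture, ~{total_gb:.1f} GB for 500 of them")
```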

Nvidia has no part or say in console development, let alone console games, so they won't know.

:D
 
Even if it required 100GB of video memory, Afterburner will not show more than 8192MB, as that is the maximum capacity of the 3070 that was tested.
Seems you are missing the point he is making: when you run out of VRAM, performance will drop off a cliff or there will be some other negative consequence. If none of that is happening and the game lets you run with the highest texture setting, then there is no problem and it does not NEED 12GB.

Like a few years ago, I was running Final Fantasy 15 on my Titan XP, and it was using up nearly the full 12GB. Does that mean the game was not running fine on 11GB or even 8GB cards? Nope. It ran perfectly fine, if not better, on a 1080 Ti, which had better clocks than my Titan XP.

Not long left now, we will soon find out :D
 
That is not the point: when you run out of VRAM, performance will drop off a cliff or there will be some other negative consequence. If none of that is happening and the game lets you run with the highest texture setting, then there is no problem and it does not NEED 12GB.

Like a few years ago, I was running Final Fantasy 15 on my Titan XP, and it was using up nearly the full 12GB. Does that mean the game was not running fine on 11GB or even 8GB cards? Nope. It ran perfectly fine, if not better, on a 1080 Ti, which had better clocks than my Titan XP.

Not long left now, we will soon find out :D
I agree mate, I think my point was perhaps lost, but this was the message I was trying to convey in the 6000 series 16GB thread. :p

Sometimes you'll just get a hitch/stutter when you saturate video memory and spill over into system memory; performance won't always drop off a cliff, as you put it.
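
If anyone wants to look for that rather than just watching average FPS, here is a small Python sketch over a captured list of frame times; the 3x spike threshold is an arbitrary choice of mine, not any standard definition of a hitch:

```python
# Sketch: spot hitches in a captured list of frame times (in milliseconds).
# Average FPS can look fine while individual frames spike as resources are
# pulled over PCIe from system RAM. The 3x threshold is an arbitrary choice.
def analyse(frame_times_ms, spike_factor=3.0):
    avg = sum(frame_times_ms) / len(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)
    one_percent = worst[: max(1, len(worst) // 100)]
    hitches = [t for t in frame_times_ms if t > spike_factor * avg]
    return {
        "avg_fps": round(1000 / avg, 1),
        "1%_low_fps": round(1000 / (sum(one_percent) / len(one_percent)), 1),
        "hitch_count": len(hitches),
    }

# Mostly 60 fps with three big spikes: the average barely moves, 1% lows tank.
print(analyse([16.7] * 500 + [120.0] * 3))
```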
 
I agree mate, I think my point was perhaps lost, but this was the message I was trying to convey in the 6000 series 16GB thread. :p

Sometimes you'll just get a hitch/stutter when you saturate video memory and spill over into system memory.
Yeah, if that happens then we will know 10GB is indeed not enough for the highest texture setting in that game :D

Will be interesting to see what happens. I can see a few people here panting already, hoping it is not enough so they can say they told us so (not talking about you or TheRealDeal, by the way) :p
 
Good for DLSS etc.: Tensor Memory Compression.

I just checked that TweakTown article. It was supposed to be some kind of pre-release leak compilation, and most of the other points in there, like the traversal coprocessor, TDP, etc., didn't turn out to be true.


That diagram just shows the hierarchy. I will get back to you once I have been able to skim through it.

Edit: seems he got the TDP right.
 
I wonder if Nvidia can write a special driver 'hack' to reduce VRAM usage (and IQ) for Godfall so it fits in the 3080's small 10GB buffer?

I remember they got in trouble for driver hacks in 3DMark years ago, where they'd lower IQ to score higher in the benchmark.
They've both done it over the years. I remember when ATi were forcing FP16 to do the same thing.
 
I just checked that TweakTown article. It was supposed to be some kind of pre-release leak compilation, and most of the other points in there, like the traversal coprocessor, TDP, etc., didn't turn out to be true.

That diagram just shows the hierarchy. I will get back to you once I have been able to skim through it.

I get the feeling you are right, but I can't prove it. I read somewhere before that the tensor cores are per clock but the rest is not. Can't remember where I got that from.
 
What we will see is whether a few more game titles start suggesting more than 10 gigabytes of VRAM.

:D

That's a title on AMD's YouTube channel as well, and it is most likely not even out yet. More RAM is better on a GPU up to a point, but the Radeon VII had lots more RAM than Nvidia's cards and there was none of this nonsense.

We are still waiting for AMD benchmarks in an RT game. I bet the performance at 4K is not there. All these 16GB cards are 1080p and 1440p cards in RT games like Control, and they will have no DLSS to turn 1080p into 4K like a 2080 Ti can. Instead AMD will upscale and have the worst image quality possible.
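
For reference, this is roughly what that upscaling works out to at a 4K output; the per-axis factors below are the commonly quoted ones for the DLSS 2 quality modes, so treat them as approximate rather than an official spec:

```python
# Commonly quoted per-axis render scales for the DLSS 2 quality modes;
# the exact factors are approximate, not an official Nvidia spec.
SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALES:
    w, h = render_resolution(3840, 2160, mode)
    print(f"{mode:>17}: renders {w}x{h}, outputs 3840x2160")
```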
 
They've both done it over the years. I remember when ATi were forcing FP16 to do the same thing.

I remember Nvidia being caught red-handed far more often than AMD, at least.

Nowadays they wouldn't get away with it, so they tend to introduce proprietary 'features' such as GameWorks, designed to cripple AMD's performance.

These days they've moved on to DLSS, which looks absolutely terrible in some games, making huge rendering mistakes, while looking very good in others. Either way, they don't like playing by the book!
 
I remember Nvidia being caught red-handed far more often than AMD, at least.

Nowadays they wouldn't get away with it, so they tend to introduce proprietary 'features' such as GameWorks, designed to cripple AMD's performance.

These days they've moved on to DLSS, which looks absolutely terrible in some games, making huge rendering mistakes, while looking very good in others. Either way, they don't like playing by the book!

DLSS got fixed; it's now a game changer.
 
DLSS got fixed; it's now a game changer.

Sadly not, mate; it looks pretty awful in the brand new Watch Dogs: Legion:

https://www.reddit.com/r/nvidia/comments/jn06hi/watch_dogs_legions_implementation_of_nvidias_dlss/

 