
12GB VRAM enough for 4K? Discuss..

but is that even theoretically possible?
here we are comparing dropping textures one notch vs. the inability to properly experience ray tracing?
isn't the choice clear.. like a simple choice between life and death :D

It is. Some games have a variety of RT presets, e.g. CP 2077 and Metro EE, and I'm sure there are a good few more with presets for RT, but in most games it's just on or off.

I'll personally always sacrifice other settings and/or use a higher DLSS preset (not FSR though, as it is **** at lower presets) in order to keep RT maxed out, as it makes a far bigger difference to "overall" IQ :cool:

And 3080s...

3080s are still kicking ass when it comes to RT; the same definitely can't be said for RDNA 2, with the exception of maybe a 6950 XT :cry:
 
but is that even theoretically possible?
here we are comparing dropping textures one notch vs. the inability to properly experience ray tracing?
isn't the choice clear.. like a simple choice between life and death :D
I think one could make settings based on the number of rays, bounces and the like.
Playable Turing-era RT could probably be achievable by the next generation, even without upscaling, even on x60 cards, at least at 1080p. Not spectacular, but better than all-or-nothing.
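As a rough illustration of what ray/bounce-based presets could look like (a hypothetical sketch; the preset names and budgets below are made up, not from any real engine):

```python
# Hypothetical RT quality presets driven by ray and bounce budgets.
# All names and numbers here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class RTPreset:
    rays_per_pixel: int        # samples traced per pixel per frame
    max_bounces: int           # path depth before rays are terminated
    reflections: bool          # trace reflection rays at all
    global_illumination: bool  # trace diffuse GI rays at all

RT_PRESETS = {
    "low":    RTPreset(rays_per_pixel=1, max_bounces=1, reflections=True, global_illumination=False),
    "medium": RTPreset(rays_per_pixel=1, max_bounces=2, reflections=True, global_illumination=True),
    "high":   RTPreset(rays_per_pixel=2, max_bounces=3, reflections=True, global_illumination=True),
    "ultra":  RTPreset(rays_per_pixel=4, max_bounces=4, reflections=True, global_illumination=True),
}
```

The appeal over a plain on/off toggle is that cutting rays_per_pixel or max_bounces scales the ray workload roughly linearly.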
 
And usually the thing that overrules those reasons when it comes to VRAM debates (basically still just FC 6.... :p) is either bad optimisation or intentional sabotage, take your pick ;) :p :D :cry:
yes, I would find it stupendously stupid if anyone recommended a 3090 over a 3080 or 4070 Ti for VRAM reasons in gaming applications.. it's the kind of stuff that lowers the average IQ of the human race, or something to that effect
 
thanks, that's the word I was looking for: command lists, not display lists..
but really, Nvidia's architecture is a lot different from AMD's; their instruction queues always run full, and they still use internal scoreboarding to manage instructions and threads once they're submitted to the GPU
it's still done in hardware.. I don't know why you think AMD's scheduler would work better for Nvidia
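For anyone unfamiliar, a scoreboard is just hardware that blocks an instruction from issuing until its operands are ready. A toy sketch of the idea (purely illustrative, my own simplification; real GPU schedulers are per-warp, multi-queue and latency-aware):

```python
# Toy in-order scoreboard: an instruction may issue only when its source
# registers aren't pending writes from earlier, still-in-flight instructions.
# Illustrative only; real hardware tracks far more state than this.

pending_writes: set[str] = set()  # destination registers still in flight

def try_issue(dst: str, srcs: list[str]) -> bool:
    """Issue if there is no RAW/WAW hazard against in-flight instructions."""
    if any(r in pending_writes for r in srcs) or dst in pending_writes:
        return False  # stall: an operand isn't ready yet
    pending_writes.add(dst)
    return True

def retire(dst: str) -> None:
    """Writeback complete: the register's value is now available."""
    pending_writes.discard(dst)

print(try_issue("r1", ["r2", "r3"]))  # True: no hazards
print(try_issue("r4", ["r1"]))        # False: r1 still in flight (RAW hazard)
retire("r1")
print(try_issue("r4", ["r1"]))        # True: r1 has retired
```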
 
It is. Some games have a variety of RT presets, e.g. CP 2077 and Metro EE, and I'm sure there are a good few more with presets for RT, but in most games it's just on or off.
yeah, that makes sense, disabling RT sub-modules, but surely the impact is going to be a lot higher than going one notch down on texture quality
I'm just comparing this in my head: one notch down on texture quality vs. disabling reflections..
and the choice becomes clear as day :D

I think one could make settings based on the number of rays, bounces and the like.
I don't think you would be able to predictably control the final image quality with those settings because of the complex non-linear relationships, but let's see... worth a try
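One source of that non-linearity is easy to show: Monte Carlo noise only falls with the square root of the sample count, so doubling the rays buys roughly a 30% noise reduction, not double the quality. A toy demo (estimating pi instead of pixel radiance, but the error behaviour is the same):

```python
# Toy Monte Carlo estimate showing why quality doesn't scale linearly
# with sample count: the error falls only as 1/sqrt(samples).
import random
import statistics

def estimate_pi(samples: int) -> float:
    """Estimate pi by sampling random points in the unit square."""
    hits = sum(1 for _ in range(samples)
               if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4.0 * hits / samples

for n in (100, 1_000, 10_000, 100_000):
    errors = [abs(estimate_pi(n) - 3.14159265) for _ in range(50)]
    print(f"{n:>7} samples -> mean abs error {statistics.mean(errors):.4f}")
```

Each 10x increase in samples only cuts the error by about 3x, which is exactly why "half the rays" doesn't mean "half the image quality".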
 
I don't think you would be able to predictably control the final image quality with those settings because of the complex non-linear relationships, but let's see... worth a try
Lower settings would likely look like crap; remember when shadows were GPU-heavy and there were several settings for them?
 
yes, I would find it stupendously stupid if anyone recommended a 3090 over a 3080 or 4070 Ti for VRAM reasons in gaming applications.. it's the kind of stuff that lowers the average IQ of the human race, or something to that effect

I and others have always said this from the get-go, but "few understand"....
:cry:
I'll never forget the guy who said his 3080 had issues with VRAM in some VR game (running at like 8K) in one particular area/map, then proceeded to say "having said that, even my 3090 struggles badly due to lack of grunt"
:cry:


At the end of the day, there will be the odd game/use case where more VRAM will be beneficial, e.g.

- a **** ton of mods in something like MSFS (but even then, your fps will still be **** due to CPU bottleneck issues, which only FG can overcome...)
- refusing to use DLSS/FSR at 4K in certain games (only a 4090 will be capable of this if you want to max out settings; with other GPUs you'll have to compromise on settings)
- playing at a res. higher than 4K (in which case, good luck achieving this with anything other than a 4090, and even then you'll still need to use DLSS and/or FG :cry:)

Given that we have FSR/DLSS, which the majority will use due to "lack of grunt", along with most likely using settings that said GPU grunt is capable of in the first place.... VRAM (within reason) is unlikely to be an issue for the foreseeable future, and certainly won't be in 95+% of use cases.

yeah, that makes sense, disabling RT sub-modules, but surely the impact is going to be a lot higher than going one notch down on texture quality
I'm just comparing this in my head: one notch down on texture quality vs. disabling reflections..
and the choice becomes clear as day
:D



I don't think you would be able to predictably control the final image quality with those settings because of the complex non-linear relationships, but let's see... worth a try

But but but think of the 4K super uber duper textures you will miss out on...

[Screenshots: texture quality comparison]

Meanwhile, RT:

[Screenshots: RT on vs. RT off comparison]

Can't see a difference....

:cry: :D
 
Let's not make this a 2nd RT thread please, especially by copy/pasting.
I do have an interesting old-ish outlier: Middle-earth: Shadow of War.
At 1080p with the high-res texture pack it uses about 10GB of VRAM; I didn't check higher resolutions, but I suspect the game gets even heavier on VRAM, possibly exceeding 12GB at 4K.

It could be a good case where you exhaust VRAM before running out of grunt.
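For a feel of why high-res texture packs eat VRAM so fast, some back-of-the-envelope arithmetic (my own numbers, not figures from the game; this assumes uncompressed RGBA8 with a full mip chain, so real block-compressed textures would be 4-8x smaller):

```python
# Back-of-the-envelope VRAM cost of an uncompressed texture with a full
# mip chain (the chain adds ~1/3 on top of the base level). Real games
# use block compression (BCn), so treat these as upper bounds.
def texture_mib(width: int, height: int, bytes_per_texel: int = 4,
                mips: bool = True) -> float:
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mips else base
    return total / (1024 ** 2)

print(f"2048x2048 RGBA8: {texture_mib(2048, 2048):6.1f} MiB")  # ~21 MiB
print(f"4096x4096 RGBA8: {texture_mib(4096, 4096):6.1f} MiB")  # ~85 MiB
# A few hundred unique 4K-class textures resident at once can plausibly
# push past 10-12 GB, before framebuffers and geometry are counted.
```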
 
I and others have always said this from the get-go, but "few understand"....
:cry:
I'll never forget the guy who said his 3080 had issues with VRAM in some VR game (running at like 8K) in one particular area/map, then proceeded to say "having said that, even my 3090 struggles badly due to lack of grunt"
:cry:


At the end of the day, there will be the odd game/use case where more VRAM will be beneficial, e.g.

- a **** ton of mods in something like MSFS (but even then, your fps will still be **** due to CPU bottleneck issues, which only FG can overcome...)
- refusing to use DLSS/FSR at 4K in certain games (only a 4090 will be capable of this if you want to max out settings; with other GPUs you'll have to compromise on settings)
- playing at a res. higher than 4K (in which case, good luck achieving this with anything other than a 4090, and even then you'll still need to use DLSS and/or FG :cry:)

Given that we have FSR/DLSS, which the majority will use due to "lack of grunt", along with most likely using settings that said GPU grunt is capable of in the first place.... VRAM (within reason) is unlikely to be an issue for the foreseeable future, and certainly won't be in 95+% of use cases.



But but but think of the 4K super uber duper textures you will miss out on...

[Screenshots: texture quality comparison]

Meanwhile, RT:

[Screenshots: RT on vs. RT off comparison]

Can't see a difference....

:cry: :D
I'd agree that once you get to a certain level, making high-res textures even more "super" has little value.

That said, it's difficult to notice ray tracing (or the lack thereof) in a still image. I know those areas in CP77, and you can 100% tell the difference between RT on and off when in game.
 
Let's not make this a 2nd RT thread please, especially by copy/pasting.
I do have an interesting old-ish outlier: Middle-earth: Shadow of War.
At 1080p with the high-res texture pack it uses about 10GB of VRAM; I didn't check higher resolutions, but I suspect the game gets even heavier on VRAM, possibly exceeding 12GB at 4K.

It could be a good case where you exhaust VRAM before running out of grunt.

IIRC, that is on Game Pass? I can check it out on my 3080 at 4K if so.
 
Lower settings would likely look like crap; remember when shadows were GPU-heavy and there were several settings for them?

When you buy a flagship card, why should you have to lower settings, when with the right hardware spec it wouldn't be a thing (i.e. hello, 3080 12GB)?

It’s almost as if Nvidia planned it :D

Worked on some! ;)

It's a list of excuses for nexus to use? You might recognise the template. Why are you asking me for a list of games?

:cry:
 
I'd agree that once you get to a certain level, making high-res textures even more "super" has little value.

That said, it's difficult to notice ray tracing (or the lack thereof) in a still image. I know those areas in CP77, and you can 100% tell the difference between RT on and off when in game.


BUT BUT 8K... :rolleyes:

I agree it's all getting silly again, with both teams now trying to push 8K while 4K is still a problem even for their new cards in new games.

But they want to push 4x the pixels of 4K, or sell half-8K monitors, to push the next generation of graphics cards that can't drive that many pixels without tricks. I actually really like the half-8K ultrawide screens, though; maybe I'll look at them when the 5090 or 6090 comes out, as I like the format, the pixel count and the very large sizes they come in.



The problem we all face right now is that the world doesn't have the money for such tech; people are more worried about paying their bills than upgrading what they currently have. This forum isn't even a drop in the ocean of real computer users out there: most people buy hardware and only think about replacing it when it dies or no longer does what they need. Here we are pretty extreme about tech and hardware, being enthusiasts, but the real world is a very different market, and right now it's pushing people to breaking point, with not one thought spared for tech upgrades. Let's hope the world improves this year and in the coming years, because this will damage the market even for us enthusiasts; sorry, but we will not be paying £3k for a GPU as Nvidia would love us to, and some GPU models are very close to that right now.
 
When you buy a flagship card, why should you have to lower settings, when with the right hardware spec it wouldn't be a thing (i.e. hello, 3080 12GB)?
We just had Nvidia showcase a game where their flagship runs at 24fps native.
Besides, some developers ship "future-proof" options in their games, so you won't be able to fully exploit them until a couple of gens later.
 
Yup, we did warn people that an xx70 (Ti) would come out matching the xx90-tier GPU for a lot cheaper, but some didn't want to hear it, right @TNA ;) :D :cry:
But but :p

Nvidia was trying to sell it as a 4080, not a 4070 Ti :p For $100 more, and we know the fake MSRP game too..

It only got renamed because of the outrage from the internet. Also look at the 4060 Ti specs: a 128-bit bus and fewer CUDA cores than the 3060 Ti it replaces (256-bit bus, more CUDA cores). What we would have called xx50 class or less back in the day...
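To put rough numbers on that bus-width point (a quick sketch using public spec-sheet figures; treat the memory clocks as approximate):

```python
# Rough GDDR6 memory-bandwidth comparison.
# Bandwidth (GB/s) = bus width in bits / 8 * data rate per pin (Gbps).
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(f"3060 Ti (256-bit @ 14 Gbps): {bandwidth_gbs(256, 14.0):.0f} GB/s")  # 448
print(f"4060 Ti (128-bit @ 18 Gbps): {bandwidth_gbs(128, 18.0):.0f} GB/s")  # 288
# Despite the faster memory, halving the bus leaves the newer card with
# far less raw bandwidth; it leans on a much larger L2 cache to compensate.
```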

 