12GB VRAM enough for 4K? Discuss..

Status
Not open for further replies.
So the game doesn't state anywhere that it requires 16GB of VRAM, does it?
Drop settings one notch? Not allowed to do that in these parts. A GPU either runs a game at maximum settings or does not run it at all!

According to a few in here anyway :cry:
This is the mentality of some people in these VRAM threads, and these people have the cheek to accuse others of being disingenuous.
 
What's huge about 16GB of RAM? Everyone has at least 16.

I have had 16GB RAM for over 10 years now. Currently 32GB. But no, that's not the issue here. These guys are having the time of their lives as finally a game that is not called Far :cry: 6 will use more than 10GB of VRAM.

Here is how I imagine Matt and his homie:

popcorn.gif
 
I have had 16GB RAM for over 10 years now. Currently 32GB. But no, that's not the issue here. These guys are having the time of their lives as finally a game that is not called Far :cry: 6 will use more than 10GB of VRAM.

Here is how I imagine Matt and his homie:

popcorn.gif
Don't hold your breath yet. Godfall made similar claims about requiring a megagazillion of VRAM for 4K (due to AMD-paid propaganda, since it was an AMD bundle game, just like Forspoken is). Fast forward to release date and even 6GB was enough for 4K :D :D :D
 
So it seems Forspoken may be the new Far Cry 6 for the guys here. I hope at least it is a good game this time.

I reckon I will be fine with 12GB as I am not on 4K.
 
Those Forspoken requirements do not state the game requires 16GB VRAM. They state it needs a 4080... which has 16GB VRAM. So if you spent all that extra on a 3090 over VRAM worries, you still can't play the game on max settings (same with Portal) - proving that the overall GPU perf would come into play long before the VRAM, which is exactly what people said at the time.
 
It's funny to see people take devs' published specs so seriously, even though they're never really accurate. Then again, if the fact that they "recommended" 24 GB of RAM didn't trigger your B.S. detector, then you were never going to figure it out anyway. :cry:
 
Those Forspoken requirements do not state the game requires 16GB VRAM. They state it needs a 4080... which has 16GB VRAM. So if you spent all that extra on a 3090 over VRAM worries, you still can't play the game on max settings (same with Portal) - proving that the overall GPU perf would come into play long before the VRAM, which is exactly what people said at the time.
That's a good point, which means one of two possibilities:
1. It's a rough draft and the requirements haven't been frozen yet.
2. The 3090 is supposedly faster than the 4080 (HAHAHAHA).
 
Those Forspoken requirements do not state the game requires 16GB VRAM. They state it needs a 4080... which has 16GB VRAM. So if you spent all that extra on a 3090 over VRAM worries, you still can't play the game on max settings (same with Portal) - proving that the overall GPU perf would come into play long before the VRAM, which is exactly what people said at the time.
Both Ultra GPUs list 16GB VRAM. As one of them is a 6800 XT and the other a much faster 4080, it looks more like a claimed VRAM minimum to me.

Whether it's BS or not, we don't have long to find out if some of us can squeeze 16 into 10 or 12.
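For a rough sense of what "squeezing" actually involves, here is a back-of-envelope sketch (my own illustration, not from anyone in the thread): only the resolution-dependent buffers grow with output resolution, and they grow linearly with pixel count, so 4K carries 2.25x the render-target cost of 1440p. The 48 bytes/pixel figure and the function name are assumptions for illustration, not measurements from any real game.

```python
# Back-of-envelope framebuffer math: render-target VRAM scales with pixel count.
# The 48 bytes/pixel total is an illustrative assumption (G-buffer layers, HDR
# colour, depth, post-processing targets combined), not a measured figure.

BYTES_PER_PIXEL = 48  # assumed total across all render targets

def render_target_mb(width: int, height: int) -> float:
    """Rough render-target footprint in MiB for a given output resolution."""
    return width * height * BYTES_PER_PIXEL / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MiB of render targets")
```

The point is the ratio, not the absolute numbers: going from 1440p to 4K multiplies resolution-dependent buffers by exactly 2.25x, while texture pools (usually the biggest VRAM consumer) stay the same size at a given texture-quality setting. That is one reason a single "needs 16GB" figure, independent of settings, looks suspicious.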
 
Those Forspoken requirements do not state the game requires 16GB VRAM. They state it needs a 4080... which has 16GB VRAM. So if you spent all that extra on a 3090 over VRAM worries, you still can't play the game on max settings (same with Portal) - proving that the overall GPU perf would come into play long before the VRAM, which is exactly what people said at the time.
It also states it requires a 6800 XT, which also has 16GB VRAM, so the requirement is obviously 16GB VRAM for 4K Ultra. Cherry picking...

Is a 6800xt faster than a 3090? :rolleyes:
 
So it seems Forspoken may be the new Far Cry 6 for the guys here. I hope at least it is a good game this time.

I reckon I will be fine with 12GB as I am not on 4K.

No doubt even if it does run fine (and on everyone's system), they'll ignore all that and go "zOMG but the requirements state otherwise!!!!" :D :cry:

I was just thinking there: I wonder if AMD could maybe make a "VRAM optimiser" (like their ray tracing analyser), given how their GPUs guzzle VRAM compared to Nvidia's :D

It's funny to see people take devs' published specs so seriously, even though they're never really accurate. Then again, if the fact that they "recommended" 24 GB of RAM didn't trigger your B.S. detector, then you were never going to figure it out anyway. :cry:

Nor the fact that apparently a 6800 XT will provide an equivalent experience to a 4080? Or how about the 3070 only being good enough for 1440p 30 fps...

:cry:

LOL at the Portal one too, needing 16GB VRAM to be playable...

[benchmark screenshots: 048v6IU.png, npjdRAB.png, qkZtIx2.png]

Even though every GPU needs to use DLSS Performance or Ultra Performance (reducing the VRAM usage) and/or FG with adjusted settings in order to get 60+ FPS, in which case GPUs with <16GB VRAM get a perfectly playable experience. Well, except for AMD GPUs :D :p
 
AMD have a hardware scheduler, and Smart Access Memory can work very well in those scenarios. Nvidia rely on software for scheduling, which has some disadvantages (and some advantages for DX11), and that is what you see here.

That doesn't seem to be the case. Anyhow, it looks like Nvidia treats its block scheduler as a trade secret, while you can find some info on their thread schedulers.
 
I would put that under speculation. Nvidia has a different architecture, that I agree, given what I have read about both, but Nvidia also uses hardware schedulers.
Edit: though I remember something about Nvidia using CPU-generated display lists.
 