12GB VRAM enough for 4K? Discuss..

No doubt even if it does run fine (and on everyone's system), they'll ignore all that and go "zOMG but the requirements state otherwise!!!!" :D :cry:

I was just thinking, I wonder if AMD could maybe make a "VRAM optimiser" (like their ray tracing analyser), given how their GPUs guzzle VRAM compared to Nvidia's :D



Nor the fact that apparently a 6800 XT will provide an equivalent experience to a 4080? Or how about the 3070 only being good enough for 1440p 30 FPS....

:cry:




LOL at the Portal one too, needing 16GB VRAM to be playable....


Even though every GPU needs to use either DLSS Performance or Ultra Performance (reducing the VRAM usage) and/or FG with some settings adjustments in order to get 60+ FPS, in which case GPUs with <16GB VRAM get a perfectly playable experience. Well, except for AMD GPUs :D :p
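As a rough illustration of why the upscaling modes cut VRAM use: the internal render resolution, and with it the render-target memory, shrinks with the DLSS scale factor. A minimal back-of-the-envelope sketch, using the publicly documented DLSS ratios and a made-up bytes-per-pixel figure (nothing here is from any specific engine):

```python
# Back-of-the-envelope: how DLSS internal resolution shrinks
# render-target memory. Scale factors are the publicly documented
# DLSS ratios; 32 bytes/pixel is an illustrative guess at G-buffer +
# HDR target cost, not any engine's real number.

MODES = {
    "Native 4K":         1.0,
    "DLSS Quality":      1 / 1.5,   # ~66.7% per axis
    "DLSS Performance":  1 / 2.0,   # 50% per axis
    "DLSS Ultra Perf":   1 / 3.0,   # ~33.3% per axis
}

OUT_W, OUT_H = 3840, 2160           # 4K output resolution
BYTES_PER_PIXEL = 32                # hypothetical render-target cost

for mode, scale in MODES.items():
    w, h = int(OUT_W * scale), int(OUT_H * scale)
    mb = w * h * BYTES_PER_PIXEL / (1024 ** 2)
    print(f"{mode:17s} {w}x{h}  ~{mb:4.0f} MB of render targets")
```

Render targets are only part of the budget though; textures dominate VRAM and don't scale with internal resolution, which is why the savings are real but nowhere near the whole requirement.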

Don't worry about what makes sense. The narrative must be maintained at all costs :p
 
This isn't going down well, seeing as Portal RTX is No. 1 on the list of games needing more than 12GB @ ultra; here's some supporting links:


:cry:
 
This isn't going down well, seeing as Portal RTX is No. 1 on the list of games needing more than 12GB @ ultra; here's some supporting links:


:cry:
Benchmark for Portal RTX on a 4090 shows 16141 MB VRAM usage at 4K https://www.techpowerup.com/review/portal-with-rtx/3.html
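For what it's worth, here's that figure next to common card capacities; trivial arithmetic, with the caveat that allocation measured on a 24GB card is an upper bound, not proof a smaller card can't cope:

```python
# Compare TechPowerUp's measured Portal RTX allocation at 4K against
# common VRAM capacities. Measured *allocation* on a 24GB card is an
# upper bound, not the same as the minimum a game strictly needs.

measured_mb = 16141                 # 4K ultra figure from the TPU review
for gb in (8, 10, 12, 16, 24):
    capacity_mb = gb * 1024
    verdict = "fits" if measured_mb <= capacity_mb else "over budget"
    print(f"{gb:2d}GB card: {verdict} ({capacity_mb - measured_mb:+d} MB headroom)")
```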
 
These VRAM threads make absolutely no sense.

The 40 series HAS more than 10GB, so no problem there.
The 30 series doesn't benefit from more VRAM, so no problem there either.


Either way, using some hideously unoptimised game (Forspoken, FC6) or a tech demo (Portal RTX) to base your GPU purchases on is asinine. Those Forspoken requirements don't even make any sense.
 
@Nexus18 Problem is, when we bring up VRAM in games at all, it always comes down to one or all of the following:

- game is old, who still plays that
- game is ****, who on earth plays that
- game is sponsored by amd, not fair
- hardly any games have high quality textures so who cares
- who on earth enables high quality textures

Doesn't matter what game it is, it will be one of the excuses above.
 
@Nexus18 Problem is, when we bring up VRAM in games at all, it always comes down to one or all of the following:

- game is old, who still plays that
- game is ****, who on earth plays that
- game is sponsored by amd, not fair
- hardly any games have high quality textures so who cares
- who on earth enables high quality textures

Doesn't matter what game it is, it will be one of the excuses above.
or drop texture settings one notch..
unfortunately in RT there aren't many notches; it's supposed to be a cliff and a mountain
 
@Nexus18 Problem is, when we bring up VRAM in games at all, it always comes down to one or all of the following:

- game is old, who still plays that
- game is ****, who on earth plays that
- game is sponsored by amd, not fair
- hardly any games have high quality textures so who cares
- who on earth enables high quality textures

Doesn't matter what game it is, it will be one of the excuses above.

Which games? You make it sound like a huge list. Tell me which games don't run at 12GB please :D
 
And yet it’s used to promote ray tracing. Not much of a promotion!

Actually more a case of promoting Remix, DLSS and frame generation ;) :D

@Nexus18 Problem is, when we bring up VRAM in games at all, it always comes down to one or all of the following:

- game is old, who still plays that
- game is ****, who on earth plays that
- game is sponsored by amd, not fair
- hardly any games have high quality textures so who cares
- who on earth enables high quality textures

Doesn't matter what game it is, it will be one of the excuses above.

And usually the thing that overrules those reasons when it comes to VRAM debates (basically still just FC6.... :p) is either bad optimisation or intentional sabotage, take your pick ;) :p :D :cry:

or drop texture settings one notch..
unfortunately in RT there aren't many notches; it's supposed to be a cliff and a mountain

Yup, that is sadly an issue with RT atm, as Alex/DF highlighted: there need to be more fine-grained controls rather than just on and off, especially when people have older/out-of-date/weaker GPUs for RT like RDNA 2.
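Purely to illustrate what "more notches" could look like, here's a sketch of per-effect RT controls; every name in it is hypothetical, not any real engine's API:

```python
# Hypothetical per-effect RT settings, sketched to show what
# finer-grained controls could look like instead of a single on/off
# switch. All names are illustrative; no real engine is being quoted.

from dataclasses import dataclass

@dataclass
class RTSettings:
    reflections: str = "off"        # off / low / medium / high
    global_illumination: str = "off"
    shadows: str = "off"
    rays_per_pixel: int = 1         # quality knob within each effect
    max_bounces: int = 1

# A middle ground a weaker RT card might manage, versus all-or-nothing:
rdna2_preset = RTSettings(reflections="medium", shadows="low",
                          rays_per_pixel=1, max_bounces=1)
print(rdna2_preset)
```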

Which games? You make it sound like a huge list. Tell me which games don't run at 12GB please :D

And back on the wheel we all go....

:cry:
 
Here's something I found: Nvidia moved to tiled rendering post-Maxwell, and they had to set up clustered data structures for geometry and visibility..
This is not software-based scheduling; it's an additional feature to extract more performance out of Nvidia hardware... Nvidia still uses hardware schedulers and instruction buffers and all the other normal stuff you'd expect.
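For anyone wondering what tiled rendering with per-tile visibility actually involves, here's a toy sketch of the binning step (screen split into fixed-size tiles, each triangle assigned to the tiles its bounding box touches). It's a generic textbook illustration with made-up numbers, not Nvidia's actual implementation:

```python
# Toy binning pass for a tiled rasteriser: split the screen into
# fixed-size tiles and record which triangles overlap each tile, so
# later shading/visibility work can run per tile out of on-chip
# memory. Generic textbook sketch, not Nvidia's implementation.

TILE = 16                                   # tile size in pixels (made up)

def bin_triangles(tris, width, height):
    """tris: list of ((x0,y0), (x1,y1), (x2,y2)) in pixel coords."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for i, tri in enumerate(tris):
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        # Conservative test: clamp the bounding box to the screen.
        x0, x1 = max(min(xs), 0), min(max(xs), width - 1)
        y0, y1 = max(min(ys), 0), min(max(ys), height - 1)
        for ty in range(int(y0) // TILE, int(y1) // TILE + 1):
            for tx in range(int(x0) // TILE, int(x1) // TILE + 1):
                bins[(tx, ty)].append(i)
    return bins

bins = bin_triangles([((5, 5), (60, 10), (20, 40))], 128, 128)
print({tile: tri_ids for tile, tri_ids in bins.items() if tri_ids})
```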
 
Yup, that is sadly an issue with RT atm, as Alex/DF highlighted: there need to be more fine-grained controls rather than just on and off, especially when people have older/out-of-date/weaker GPUs for RT like RDNA 2.
but is that even theoretically possible?
here we are comparing dropping textures one notch against the inability to properly experience ray tracing?
isn't the choice clear.. like a simple choice between life and death :D
 
@Nexus18 Problem is, when we bring up VRAM in games at all, it always comes down to one or all of the following:

- game is old, who still plays that
- game is ****, who on earth plays that
- game is sponsored by amd, not fair
- hardly any games have high quality textures so who cares
- who on earth enables high quality textures

Doesn't matter what game it is, it will be one of the excuses above.

*drops mic. Will not penetrate dense matter.
 