
10GB VRAM enough for the 3080? Discuss...

An RTX 3070 can run Godfall on epic settings @ 4K: 7.1GB VRAM.


RTX 2060 Super, 1440p epic.



Huge problem with this game in regards to the shader cache, causing deep FPS drops and stuttering.

Seems like the 12GB of VRAM at 4K max settings that the devs claimed is required is a lie.

So it seems it needs 9GB for 4K epic settings, which is nice.

RTX 3090: 8.4GB VRAM @ epic settings.


Epic settings on a 2080 Ti: 7.8GB VRAM.


So the latest "you need more than 10GB of VRAM" game does not need more than 10GB of VRAM.
 
Go RTX 3080! Go! :) I am very glad about this. I think there is now only a slim chance we might get a game which uses more than 10GB in the next two years, but who knows the future, right...
Also, there is the point that ray tracing and 4K textures will be made available via a patch later, so it could still use more.
 
https://youtu.be/XhddaznExCA

Godfall, 4K epic settings (maxed out):

6-8GB allocation, and only 5.5-6.5GB actual usage.

Sooooooo, 12GB my ass lmaaao

I am definitely buying the 3080 10GB now; it's not gonna be an issue for a while :p (1440p 144Hz)
 
Well, looks like it's not optimized nicely... but it's still next gen and uses only about ~6GB VRAM (actual usage), so it's possible that it is making use of SSDs and RAM?

That seems like the only reasonable explanation for such low VRAM usage in a next-gen game...

Also, that means 10GB has quite a bit of headroom right now, and 8GB is just enough for 90% of games; it might cause issues in the other 10% going into next gen...
 
It will be interesting to see if/when the HQ Texture pack becomes available. I am not sure if it is already included or will be added at a later date, as I've not downloaded the game yet myself. Can anyone confirm?
 
It will be interesting to see if/when the HQ Texture pack becomes available. I am not sure if it is already included or will be added at a later date, as I've not downloaded the game yet myself. Can anyone confirm?
There is probably nothing of that sort coming. Turns out it allocates 12GB when AMD FidelityFX LPM is turned on.

https://youtu.be/lY9OSdebKcc

Without that setting it's under 8GB.
 
Interesting, I wondered why all the 8GB GPUs seemed to have this option disabled. Well, no shame in lowering quality settings if it improves performance.

What does that setting do anyway? It's just an HDR gimmick... I mean, if someone really, really wants to use stuff like that, they should probably go with AMD or wait for a 3080 Ti, I suppose.

For vanilla games I don't see 10GB becoming a problem anytime soon.
 
What does that setting do anyway? It's just an HDR gimmick... I mean, if someone really, really wants to use stuff like that, they should probably go with AMD or wait for a 3080 Ti, I suppose.

For vanilla games I don't see 10GB becoming a problem anytime soon.
It looks like it turns on some of the AMD FidelityFX image quality improvements; included in that is the HDR wide gamut tone mapper. The game is AMD-sponsored, so it will make use of FidelityFX effects.

EDIT: Just checked, and it enables AMD FidelityFX Contrast Adaptive Sharpening (CAS) and the HDR Mapper (LPM). Link.
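
For anyone wondering what CAS actually does: it's a sharpening pass whose strength is limited by local contrast, so flat areas get the full effect and strong edges don't ring. Here's a rough Python/NumPy sketch of the idea; it's not AMD's shipping shader, and the constants are only roughly in line with the public source:

import numpy as np

def cas_sharpen(img: np.ndarray, sharpness: float = 0.5) -> np.ndarray:
    """Toy Contrast Adaptive Sharpening in the spirit of AMD's CAS.
    img: float32 RGB in [0, 1], shape (H, W, 3)."""
    # Pad so every pixel has its 4 plus-shaped neighbours.
    p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    n, s = p[:-2, 1:-1], p[2:, 1:-1]
    w, e = p[1:-1, :-2], p[1:-1, 2:]
    c = img
    mn = np.minimum.reduce([n, s, w, e, c])
    mx = np.maximum.reduce([n, s, w, e, c])
    # The "adaptive" part: local contrast limits the sharpening weight,
    # so already-high-contrast edges are sharpened less.
    amp = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-5), 0.0, 1.0))
    # Negative-lobe weight; -1/8 .. -1/5 roughly matches the public shader.
    peak = -1.0 / (8.0 - 3.0 * np.clip(sharpness, 0.0, 1.0))
    wgt = amp * peak
    out = (c + wgt * (n + s + w + e)) / (1.0 + 4.0 * wgt)
    return np.clip(out, 0.0, 1.0)

Nothing in there cares what brand of GPU it runs on, which fits with it working fine on Nvidia cards here.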
 
An RTX 3070 can run Godfall on epic settings @ 4K: 7.1GB VRAM.

Godfall, 4K epic settings (maxed out):

6-8GB allocation, and only 5.5-6.5GB actual usage.

Lol :D


Matt sees the posts above and scours the web looking for evidence that 16GB will be needed :p:D


It looks like it turns on some of the AMD FidelityFX image quality improvements; included in that is the HDR wide gamut tone mapper. The game is AMD-sponsored, so it will make use of FidelityFX effects.

EDIT: Just checked, and it enables AMD FidelityFX Contrast Adaptive Sharpening (CAS) and the HDR Mapper (LPM). Link.
Interesting, so would that not be a feature unique to AMD, or does FidelityFX work on Nvidia cards?

We will get a proper analysis soon enough :D
 
Matt sees the posts above and scours the web looking for evidence that 16GB will be needed :p:D
Ha, I just looked at user feedback on Reddit since it's a good place to collect customer issues; it's not my feedback though, I've not even played it yet. I'm sure there is a logical explanation. :p

FidelityFX can be used on any GPU; it is vendor-agnostic.
 
Ha, I just looked at user feedback on Reddit since it's a good place to collect customer issues; it's not my feedback though, I've not even played it yet. I'm sure there is a logical explanation. :p

FidelityFX can be used on any GPU; it is vendor-agnostic.
Awesome :D
 
Heh, so it's a big fat nothing-burger. I hate to say I told you so... but I literally called this the moment it was mentioned.

To be as thorough as possible: the one place I've seen memory exceed 10GB is one of the YouTube videos posted here, and it seems likely they're measuring VRAM allocated. I've sent the guy a message asking how he measured it and whether he can try the new beta and also measure used, so we'll see what that comes back with. If he can't or won't do it, then I might have to look at testing myself.

*edit*

He wasn't using the new Afterburner to measure real memory usage, but he's replied and says he will try it.
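
If anyone wants to sanity-check allocated vs used themselves outside of Afterburner, here's a rough Python sketch using NVML (the nvidia-ml-py / pynvml package). The device-level number is what most overlays report; the per-process number is closer to what this thread has been calling actual usage. Treat it as a sketch; on Windows under WDDM the per-process figure is often not exposed by the driver at all, which is exactly why the new Afterburner beta is interesting:

import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Device-level view: the "VRAM used" figure most overlays show, which
# includes memory games have allocated but aren't actively using.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"device: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB in use")

# Per-process view: closer to a per-application figure. On Windows
# under WDDM this may be unavailable and usedGpuMemory comes back None.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    shown = f"{used / 2**30:.1f} GiB" if used is not None else "n/a"
    print(f"pid {proc.pid}: {shown}")

pynvml.nvmlShutdown()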
 
This is the RTX 3080 at 2K epic settings in Godfall.

This is with the following turned off:

FidelityFX HDR Mapper
High-Quality HDR Gaming
Optimized for use with AMD FreeSync™ Premium Pro¹ displays, AMD's Luminance Preserving Mapper (LPM) delivers superb HDR and wide color gamut content.

1. AMD FreeSync™ technology requires AMD Radeon™ graphics and a display that supports FreeSync technology as certified by AMD. AMD FreeSync™ Premium technology adds requirements of mandatory low framerate compensation and at least 120 Hz refresh rate at minimum FHD. AMD FreeSync™ Premium Pro technology adds requirements for the display to meet AMD FreeSync Premium Pro compliance tests. See www.amd.com/freesync for complete details. Confirm capability with your system manufacturer before purchase. GD-127

https://www.displayninja.com/best-freesync-premium-pro-monitors/

Looks like most people will have to leave AMD FidelityFX LPM off. On Nvidia cards the driver also needs to support AMD FreeSync Premium Pro technology, and the right monitor must be used.

FreeSync Premium Pro does not require HDR capable monitors; driver can set monitor in native mode when FreeSync Premium Pro supported HDR content is detected. Otherwise, HDR content requires that the system be configured with a fully HDR-ready content chain, including: graphics card, graphics driver and application. Video content must be graded in HDR and viewed with an HDR-ready player. Windowed mode content requires operating system support. GD-162. https://www.amd.com/en/technologies/freesync-hdr-games

AMD FreeSync™ Premium Pro technology raises the bar to the next level for gaming displays, enabling an exceptional user experience when playing HDR games, movies and other content:

  • At least 120Hz refresh rate at minimum FHD resolution
  • Support for low framerate compensation (LFC)
  • Low latency in SDR and HDR
  • Support for HDR with meticulous color and luminance certification
https://www.amd.com/en/technologies...n=freesync&utm_medium=redirect&utm_source=301

AMD has three FreeSync tiers: the original FreeSync technology, FreeSync Premium, and FreeSync Premium Pro.

You need AMD FreeSync Premium Pro (previously branded as FreeSync 2) for AMD FidelityFX LPM.
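
To make "Luminance Preserving Mapper" a bit less abstract, here's a toy Python/NumPy sketch of the general idea: tone-map only the luminance channel, then scale RGB by the ratio so hue and saturation are preserved. This is just the concept, not AMD's actual LPM code; the extended Reinhard curve and the white point here are illustrative:

import numpy as np

LUMA = np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)  # Rec. 709 weights

def tonemap_preserve_luminance(hdr: np.ndarray, white_point: float = 4.0) -> np.ndarray:
    """Toy luminance-preserving tone map. hdr: linear float32 (H, W, 3),
    values may exceed 1.0; white_point is the luminance mapped to 1.0."""
    y = hdr @ LUMA                                      # scene luminance
    y_out = y * (1.0 + y / white_point**2) / (1.0 + y)  # extended Reinhard on Y only
    ratio = y_out / np.maximum(y, 1e-6)
    # Scale all three channels by the same ratio: brightness is compressed
    # into range while the colour of each pixel is kept.
    return np.clip(hdr * ratio[..., None], 0.0, 1.0)

The real LPM does this mapping into the display's reported HDR/wide-gamut range, which is why it leans on FreeSync Premium Pro display metadata.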
 