Slightly off topic, but I see there are 3070s going for over £7,000 on a certain foresty competitor.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
We need to see what kind of VRAM usage 3090 owners get.
Thing is, we are in a bit of a pickle. Stick with Nvidia and the 3080 but potentially have VRAM issues, or go AMD for the 16GB of VRAM but then the RT performance will be less than the RTX 3000 series.
We can't win either way.
Nvidia is saying that the 8GB 3070 is as fast as or even faster than the 2080 Ti, which has 11GB of VRAM. Is having only 8GB of VRAM going to be enough?
8GB should be used on a GPU with the performance level of the 5700 XT as an absolute maximum, in my personal opinion.
This is with gaming at 1440P and 5120x1440 resolution. Even then, there were multiple games where at maximum details the game would stutter due to video memory saturation.
This was always resolved by switching to a 16GB Radeon VII.
So in my opinion, 8GB is not sufficient for a GPU like the 3070 as it is more powerful than the 5700 XT.
+1. This is classic NV VRAM gimping.
IMO the in-game VRAM counter is right, so around 9GB, but if you have extra it will 100% help with streaming and reduce stutter. Granted, it's unplayable with RT on ultra even on a 3090 without reducing resolution (either natively or through DLSS), but even if you lower that and keep everything else maxed, it's probably still 9-10GB required.
Nah, this doesn't usually work. This is why allocated memory is a bad way to measure usage: developers will often just allocate an arbitrarily high value, such as all or most of the free VRAM, whether they can use it or not. So in many games you'll see a 3080 assign 10GB or close to it, and a 3090 assign something like 20GB.
This is why you NEED to measure how much VRAM is actually in use. We can't do that with Watch Dogs Legion yet because the DRM prevents third-party tools from inspecting the exe/process and its real memory use, but Denuvo has been cracked now, so it won't be long before we see a release we can test on, and when that happens I'll post real VRAM usage. Until my 3080 arrives I won't be able to test ray tracing, but at the very least we can prove the point again that ALLOCATED != USAGE.
I've shown that, with evidence, for Doom Eternal, Wolfenstein II, Resident Evil 3, etc. There's an entire thread on the ResetEra forums demonstrating this problem.
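For anyone who wants to poke at this themselves without waiting for my numbers, here's a rough sketch using the NVML bindings (the pynvml package) that prints the card-wide "used" figure most overlays report next to NVML's per-process figure. To be clear, this is my own illustrative example, not the counter the MSI Afterburner beta exposes, and the per-process value can come back as N/A on Windows, but it's enough to show that the card-wide number and what a single game process has committed are not the same thing.

```python
# Rough sketch: card-wide VRAM vs per-process VRAM via NVML (pip install pynvml).
# Illustrative only -- not the same counter the Afterburner beta uses.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# The figure most overlays show: total memory in use across the whole card.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Card-wide: {mem.used / 2**20:.0f} MiB used of {mem.total / 2**20:.0f} MiB")

# Per-process figure: what each process actually has committed.
# usedGpuMemory can be None (N/A) on some Windows/WDDM setups.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    print(f"PID {proc.pid}: {used / 2**20:.0f} MiB" if used is not None else f"PID {proc.pid}: N/A")

pynvml.nvmlShutdown()
```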
It's always an overestimate, but the question is by how much? According to a recent post in the VRAM thread on the ResetEra forums, using the Ultra preset with RT on, DLSS off, at 4K, they saw variable usage between 8-9GB, but they also found that disabling the BattlEye launcher allowed them to measure real per-process VRAM, and it reliably comes in at about 1GB less than the allocated value. What you're seeing in that video is not how much VRAM the game is using but rather how much it has requested the video card allocate. These are different values.
The poster does exceed 8GB of usage at those settings on what I assume is a 3080 (he doesn't say), but the killer is that while exceeding 8GB of VRAM in the screenshot he posted, he's getting 21fps.
So again, I maintain that future games will need more VRAM, but they will also need better GPUs, and that 10GB on the 3080 or 8GB on the 3070 is sufficient: the GPU gives out before the VRAM budget is reached. 21fps is not playable, let's not be stupid about this.
Source - Poster RCSI in this thread https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/page-3
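If anyone wants to reproduce that kind of number over their own play session rather than eyeballing an overlay screenshot, something like the sketch below would do it: poll NVML once a second and keep the peak per-process figure for the game's PID. The PID is just a placeholder you'd swap for your own game's process ID, and again this is NVML's per-process counter rather than the exact one Afterburner/RCSI used, so treat it as a sanity check, not gospel.

```python
# Sketch: log the peak per-process VRAM of one game over a play session.
# GAME_PID is a placeholder -- replace it with the real PID of your game process.
import time
import pynvml

GAME_PID = 12345  # hypothetical PID

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

peak = 0
try:
    while True:
        for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
            if proc.pid == GAME_PID and proc.usedGpuMemory is not None:
                peak = max(peak, proc.usedGpuMemory)
        time.sleep(1.0)  # sample once per second while you play
except KeyboardInterrupt:
    # Ctrl+C when you're done to see the peak.
    print(f"Peak per-process VRAM: {peak / 2**20:.0f} MiB")
finally:
    pynvml.nvmlShutdown()
```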
The other thing to consider is the texture packs they release to bump games up to 4K textures; I don't think they cost anything in compute to render. They are just assets that sit in VRAM taking up space. So this is where a higher VRAM budget helps a lot, and a compute bottleneck would not be reached before the VRAM limit in this situation, because the textures don't cost any compute to render.
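To put a rough number on that: a single 4K (4096x4096) texture with a full mip chain is in the region of 85 MiB uncompressed, or around 21 MiB with BC7 block compression, so a pack that swaps in a few hundred of them eats gigabytes without adding a single shader instruction. Back-of-the-envelope maths below, assuming RGBA8 vs BC7 and the usual ~4/3 mip overhead:

```python
# Back-of-the-envelope VRAM cost of one "4K" texture (assumes full mip chain, ~4/3 overhead).
width = height = 4096
texels = width * height
mip_overhead = 4 / 3

rgba8_bytes = texels * 4 * mip_overhead   # 4 bytes per texel, uncompressed
bc7_bytes   = texels * 1 * mip_overhead   # BC7: 1 byte per texel

print(f"RGBA8: {rgba8_bytes / 2**20:.0f} MiB")   # ~85 MiB
print(f"BC7:   {bc7_bytes / 2**20:.0f} MiB")     # ~21 MiB
print(f"300 BC7 textures: {300 * bc7_bytes / 2**30:.1f} GiB")  # ~6.2 GiB
```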
I'm still suspicious as to why AMD have put 16GB of RAM on their cards. Just marketing to make Nvidia look bad, or is there more to it?
No, they don't cost any compute, they just gobble up VRAM. The thing is, at 4K with large monitors you really notice low-res textures, and if a game is moddable one of the first things you do is search out high-res texture packs, because no one likes looking at poor-definition images. Then, of course, you start finding out just how much your VRAM is worth.
Yeah, I am waiting for your benchmarks. I'm sure Watch Dogs Legion is fine under 8GB too, even at 4K! Just need some hard evidence, which I hope you can provide once the game is cracked.
Because they are the underdog, they have to fight harder to prove themselves and win over users compared to Nvidia, so they're going to give us more (as usual) in order to do that. I think it's also a happy coincidence: the tech they're working on to enable an MCM future (Infinity Cache) also happens to let them do more with lower-end memory and a smaller bus than Nvidia, which needs to go GDDR6X. A sort of reversal of how it used to be, when Nvidia's design advantage let them use slower memory and still keep up with or outpace AMD, which was using HBM/HBM2 etc.
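For what it's worth, the raw bandwidth gap Infinity Cache is papering over is easy to put numbers on, assuming I've got the published specs right (3080: 320-bit GDDR6X at 19 Gbps; 6800 XT: 256-bit GDDR6 at 16 Gbps). Quick arithmetic, just to show how much slower AMD's memory subsystem would look on paper without the cache:

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

print(f"RTX 3080   (320-bit GDDR6X @ 19 Gbps): {bandwidth_gb_s(320, 19):.0f} GB/s")  # 760 GB/s
print(f"RX 6800 XT (256-bit GDDR6  @ 16 Gbps): {bandwidth_gb_s(256, 16):.0f} GB/s")  # 512 GB/s
```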
Lol!
Can you run the Final Fantasy benchmark with these settings? See how it compares to mine?
Guys, it's finally done! We finally have real benchmarks of actual VRAM usage. I asked a benchmarking YouTuber to install the new MSI beta and do his next benchmarks with the program. Here they are. 8GB should not be a *HUGE* issue at 1440p for at least a few years even if VRAM requirements increase, and if we also get optimised games, 8GB at 1440p could last 4 years if you can turn down textures in some games.
Watch Dogs Legion without the HD texture pack uses only 5GB at 1440p, lol.
Anyways, here it is!
https://www.youtube.com/watch?v=rVMbkjtY9ko&lc=z23it1gifpylhvnwi04t1aokgxlcmstpmka1w12i3zjerk0h00410