10GB vram enough for the 3080? Discuss..

Is this really 115 pages of people arguing whether vram measurements are usage or allocation?

It seems no one really knows (and answering would probably require a crystal ball), but in my 25+ years of experience with computers, it's always best to have more RAM than you need. And dropping £649 (lol) on a graphics card with potentially not enough VRAM doesn't seem like a wise move.
It's still better than dropping £1,500 on a card with enough, which has been the only choice we had up till now.
 
All claims of large VRAM requirements have so far been debunked with evidence, demonstrating that games of this generation really do not need more than 10GB; in fact, the average for AAA games over the last few years is about 6GB. I think the only people left who seriously disagree are those who don't understand how real memory usage works and base their figures on what is allocated.

But in addition to that, I think I've started to build a case for another claim: that these cards will be reasonably well future-proofed from a memory standpoint. That comes from looking at the most demanding games today at 4K Ultra, which include the likes of FS2020, Crysis Remastered, Avengers and Watch Dogs Legion, and seeing that not only is 10GB of VRAM enough for a 3080, but that the GPU struggles to run these games at 4K Ultra and the settings need to be dropped down. They're an excellent example of a GPU bottleneck. I've also done quite detailed comparisons to the console GPUs, which are a good barometer of the games to come in the next generation, both because they have the same amount of effective VRAM and because of the direction they're going with their new technology: investing big in disk speed and asset streaming, with DirectStorage, fast disks and fast memory controllers.
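For anyone who wants the rough numbers behind that console comparison, here's a back-of-the-envelope sketch. All the figures are my own assumptions based on publicly reported specs, not official memory breakdowns:

```python
# Rough console memory budget (PS5 / Series X class), using publicly
# reported figures. All numbers are assumptions, not official breakdowns.
total_unified = 16.0  # GB of unified GDDR6, shared between CPU and GPU
os_reserved   = 2.5   # GB roughly reserved by the system software
game_budget   = total_unified - os_reserved   # ~13.5 GB for the whole game

cpu_side = 3.5        # GB assumed for game code, logic, audio, streaming pools
effective_vram = game_budget - cpu_side       # ~10 GB left for GPU resources

print(f"Approximate effective console VRAM: {effective_vram:.1f} GB")
```

Which is roughly how I arrive at the consoles having about the same effective VRAM as a 10GB 3080.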

So there are speculative claims both ways about future use, but at least mine has some evidence and reasoning behind it. And I keep an open mind about future games; let's just get them, measure them and see what's what. But from all of my testing so far a quite noticeable pattern has emerged:
1) Speculative measurements of VRAM are usually wrong, because they conflate allocated with used.
2) The used figure is always less, and usually way less.
3) As games push towards the 10GB limit of real usage, the GPUs choke first.

The biggest win out of all this, in my opinion, is the revelation that we've been measuring this badly for a long time and now finally have tools that let us do it properly, and that's a good thing. We can start adapting our expectations to be more reasonable and sensible. Everyone benefits from that.
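If anyone wants to poke at the allocated-vs-used distinction themselves, below is a minimal sketch using NVML through the pynvml bindings. The per-process call is the interesting one; note it can report N/A for graphics processes on some Windows driver configurations:

```python
# Minimal sketch: device-wide memory figure (the GPU-Z style "allocated"
# number) vs per-process figures (closer to what one game really occupies).
# Requires: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Device-wide: every allocation from every process plus driver reservations.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device used: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GB")

# Per-process: a much better proxy for a single game's real footprint.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory  # may be None (N/A) on some Windows setups
    label = f"{used / 1024**3:.2f} GB" if used is not None else "N/A"
    print(f"  pid {proc.pid}: {label}")

pynvml.nvmlShutdown()
```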

This is such a nice post, can't agree more.

It is indeed enough right now, even for games that are extremely demanding, and that's a bold fact. There's also an excellent point regarding GPU power: for example, on my RTX 3080 in Watch Dogs there is indeed around 10GB of usage, but I can hardly keep it above 30fps. Was this done on purpose by Nvidia? Of course; it's a big company that likes forcing you to buy a new product in such a bad manner. I also think greed, and the DLSS they relied on, were the main factors behind why we have 10GB. DLSS definitely helps with VRAM usage as well, which I don't like at all, BTW. There's also the fact that 2GB VRAM modules were having production problems.
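On the DLSS point, part of the reason it trims VRAM is simply that many render targets get allocated at the internal resolution rather than the output resolution. A rough illustration (the RGBA16F format and the buffer counts are my assumptions; real engines keep dozens of buffers in all sorts of formats):

```python
# Back-of-the-envelope: one RGBA16F render target at native 4K vs the
# 1440p internal resolution DLSS Quality uses for 4K output.
def target_mb(width, height, bytes_per_pixel=8):  # RGBA16F = 8 bytes/pixel
    return width * height * bytes_per_pixel / 1024**2

print(f"4K native:    {target_mb(3840, 2160):.1f} MB per target")  # ~63 MB
print(f"DLSS Quality: {target_mb(2560, 1440):.1f} MB per target")  # ~28 MB
# Multiply by the dozens of G-buffer and post-processing targets an engine
# keeps resident and the saving becomes a real slice of a 10GB budget.
```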

In the end, I really think Godfall, since I do consider it a next-gen game, will show clearly whether we're good within the 10GB range, or whether high-end future games will land somewhere between 10 and 15GB. I think we will have an answer tomorrow.
There's also a point about future PS4/PS5 PC ports and their VRAM usage; who knows what will be enough. Horizon is already a crazy VRAM consumer. Who can say for sure that, if we get a Horizon Forbidden West PC port in the next 1-3 years, its VRAM usage won't be extreme?
 
So, case in point, I reached out to W1zzard, who did that article on TechPowerUp, and asked how he measured VRAM. He confirmed it was with GPU-Z, which only measures what is allocated and not what is in use. I've gone back to him again to see what he has to say about potentially measuring with Afterburner; if he replies again I'll post an update.
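For reference, the per-process dedicated memory that Task Manager (and, as far as I know, Afterburner's per-process readout) reports comes from Windows' "GPU Process Memory" performance counters. If you want to peek at the raw counters yourself, here's a quick Windows-only sketch (counter path taken from the standard Windows 10 set; the CSV output format may vary by locale):

```python
# Windows-only sketch: sample per-process dedicated VRAM from the built-in
# "GPU Process Memory" performance counters (the data Task Manager shows).
import subprocess

result = subprocess.run(
    ["typeperf", r"\GPU Process Memory(*)\Dedicated Usage", "-sc", "1"],
    capture_output=True, text=True,
)
print(result.stdout)  # one CSV sample; instance names embed the PID
```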

Yeah let us know :)


I feel that Valhalla is not a good example for this argument anyway, because I don't consider it a next-gen title.

It's using the same engine as Origins, if not an older one, is it not?

It has no RT either.

So it's a very distinctly current gen game.
Yeah. But my point was not that; it was that it looks better (according to TechPowerUp) and consumes a lot less VRAM while achieving it :D

Let's be honest, it's going to take a few years until we start to see proper next-gen titles coming out regularly, and by then Hopper will be here.


I really do hope this is true. Please be true, please.

Lol :D
 
It's AnvilNext 2.0, first used on Assassin's Creed Unity in 2014, according to Wikipedia.
I should have been more specific. There is nothing to suggest that they rewrote the engine from scratch. It seems they made updates to the existing engine, and the update was large enough to warrant a new revision. The underlying engine, however, was originally built in 2007.
 
Yeah, but that's standard operating procedure. No one re-writes engines from scratch anymore because it would make 0 business sense and also wouldn't give you that much anyway, because it's not all bad in the first place.

The more interesting part about current-gen AnvilNext is that it was built in heavy collaboration with Nvidia, and it's actually why AC & GR have been so poor on AMD and consoles alike, at least until Valhalla. Perhaps it's no coincidence that performance for AMD on PC finally improves when they jump to Vulkan or DX12 (respectively). Still, I'd say that the next version of AnvilNext is going to be radically different in its look and performance profile compared to anything you see out now from them (in current games), and that one is absolutely going to give AMD a big boost.
 
Enthusiast-level knowledge is gone. Only a handful of people remain on here who actually understand the correlation between VRAM and GPU, like frosty. This used to be common knowledge.

Now people think you can wear the enthusiast badge because you can determine that one number is bigger than another?

Just like the megapixel wars in TVs and cameras, there are many other factors and limitations at play.
 
The 3080 Ti will come in at £1,000 for the FE though, courtesy of AMD overpricing the 6900 XT, so partner models will probably start at £1,050, although it's still a deal compared to the 3090.
 
Regarding the Nvidia collaboration: the first Anvil engine was built in partnership with Nvidia, but that partnership expired with Odyssey. Odyssey and Valhalla were made in partnership with AMD, and both carry the Ryzen & Radeon logo.

Nvidia still has Watch_Dogs and occasionally Far Cry, though some of the FC games were made under the AMD banner.
 