10GB vram enough for the 3080? Discuss..

Dunno, I see graphs; it has frametime stability issues which a $400 PS5 Digital does not have.

The "it already paid for itself" or "it's already 2 years old, I'll get a new one anyway lmao" comments are telling enough. It shows that 3080 users themselves accept that it has started to become a SOL card for 4K. I don't even know why these people are here defending their GPUs if they're not going to care about their performance beyond 2 years.
 
Link? So far all I've seen is a broken game that needs patching.

Seems people see what they want to see ;)


Probably about as real as when AMD owners were saying that 4GB HBM memory equals 8GB GDDR.
Lol. Yeah, both sides are as bad as each other.
 
Seems people see what they want to see ;)

Pretty much this.

Even the consoles are having some frame latency issues, as "evidenced"; heck, the developers have even acknowledged the issues on PC and are actively working on resolving them....

Also love how certain people selectively nitpick certain figures to try and prove their point and conveniently ignore the other stats which throw their argument out the window ;)

Even though it's rather pointless, here are some more benchmarks from other sources for people who want to see the full picture....

https://www.pcgameshardware.de/Deat...rk-Test-Review-Systemanforderungen-1379400/2/

Sadly it seems they only enabled ray-traced shadows and not ray-traced ambient occlusion for that one though....

https://www.dsogaming.com/pc-perfor...p-ray-tracing-amd-fsr-benchmarks-comparisons/

They only test a 3080 and a 6900 XT.

But yeah, looks like having "only" 10GB is causing all kinds of issues eh :D :cry:

Too much entitlement from PC gamers.

They buy a game designed around the 16GB of memory in next-gen consoles, then complain when their PC runs out of VRAM on a 10GB card. Oh boo hoo.

If you can't even match the consoles' specs then you can't complain about console games not running as they do on console.

I'll bite....

Firstly, consoles don't have 16GB "dedicated" to VRAM; it is shared memory.

We still haven't seen an example where 3080 etc. owners have had to turn down settings because of "VRAM", other than a very select few trying to run a game at some ridiculous VR resolution and/or using 50+ mods (something which consoles can't handle/cope with in the same way). Any claims surrounding certain games, i.e. Resident Evil Village and Godfall, have been proven wrong, and in fact, once ray tracing is turned on, AMD cards actually suffer in performance because of their lack of RT grunt; of course, now that they have FSR, it's not as much of a problem.

As shown by Digital Foundry comparisons, consoles still rely on reduced settings and/or adaptive resolution scaling in order to hold 4K @ 60, whereas most high-end GPUs don't need to compromise to the same extent.

Throw in ray tracing (and very limited ray tracing at that) and consoles have to pick between ray tracing or high resolution + 60 fps. This is extremely evident from the Metro Exodus Enhanced comparison; even look at the best ray-traced game the PS5 has (Spider-Man) and all the compromises it still has to make to achieve ray-traced visual fidelity.

I had to turn down settings on day one of owning my 3080 whilst using less than 6GB of VRAM.

I didn't expect the GPU to be strong enough to max out all my games to begin with, so no big deal. I may someday need to lower settings due to running out of VRAM, but I have already lost my max-all-the-things virginity, so there won't be a lot of fanfare if/when that happens.

This guy gets it.

Doesn't matter what GPU it is, everyone is going to have to turn down settings at some stage, be that related to VRAM, ray tracing and/or the sheer grunt required.

The same could be argued for AMD GPUs having to turn down/off graphical features: since day 1 they have basically had to turn ray tracing off due to lack of grunt, and not having any FSR for a good 8/9 months meant it was a complete no-go in any games which had ray tracing. Now that they have FSR, ray tracing is a bit more usable, but those RT settings still have to be carefully fine-tuned on AMD GPUs; guess that isn't such a big deal eh???

I personally would sooner turn down other settings than turn off/reduce ray tracing; Cyberpunk, Control, Metro, The Ascent etc. have completely spoiled me for visuals, everything else just feels "last gen" now.
 
You hit the nail on the head, mate.

Funny seeing people come out of the woodwork only to either disappear again after looking into it further or end up embarrassing themselves :D
 

Spot on, it's no secret to anyone even remotely technical that console generations do not have fast APUs, they're sharing the same die for CPU and GPU functions and all in all they aren't that fast. When you get technical breakdowns like the ones the guys at Digital Foundry do and look at the sacrifices made to get something like RT running on these slow pieces of hardware, the list is extensive. And laughably, regarding the "dedicated" memory, the PC variants of things like Watch Dogs Legion are running high-res texture packs that the consoles simply do not use.

We've been over this before, but just to re-stress: the 16GB is shared between the OS and the graphics demands. So the actual OS and all other running background apps/tasks use that memory, an estimated 2.5GB or plausibly more today. Then the game itself would traditionally use system RAM on a PC, which has to come from the same pool on console, so maybe 4-8GB of RAM usage for the .exe itself to run, and then whatever is left over can be used for the GPU. But that's the kicker: it CAN be used, but because there's a relationship between how much VRAM you use and the demands that content in VRAM places on the GPU itself, you can't necessarily even use the 8-10GB of "VRAM" that's left over. This is something you cover in more detail.
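To put rough numbers on that shared pool, here's a quick back-of-the-envelope sketch (purely illustrative; the OS and game figures are assumptions, not measurements):

```python
# Back-of-the-envelope console memory budget (all figures are assumptions
# used for illustration, not measured values).

TOTAL_UNIFIED_MEMORY_GB = 16.0  # shared pool on current-gen consoles
OS_RESERVED_GB = 2.5            # rough OS/background reservation
GAME_CODE_AND_DATA_GB = 5.0     # roughly what would sit in system RAM on a PC (assumed 4-8 GB)

gpu_budget_gb = TOTAL_UNIFIED_MEMORY_GB - OS_RESERVED_GB - GAME_CODE_AND_DATA_GB
print(f"Left over for GPU-style data: ~{gpu_budget_gb:.1f} GB")  # ~8.5 GB with these numbers
```

Slide the game figure around between 4 and 8 and the leftover "VRAM" budget moves with it, which is the relationship being described above.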


Precisely, this is why I stopped looking at this problem as "do you have enough VRAM for game X", because that's not how games are developed these days and it's not the relationship between games and performance. What you need to care about is whether you have enough VRAM to service the GPU. That's literally 100% of what VRAM does: it feeds the GPU, and that's it. Oddly enough the title of the thread is the most appropriate question; when it asks if 10GB of VRAM is enough for the 3080, it's asking the right question. Too many people are translating that into some question of whether 10GB is enough for something else entirely, some specific game at some specific settings, like running FS2020 at 16fps or something dumb that no one ever does.

Exactly a full year on from release date and the 3080's 10GB of VRAM is holding up fine; we've had attempts at nitpicking stupid examples, but there's no clear trend. And the prediction was that consoles would use all the memory they have to run texture packs that far exceed what the PC can use, and not only is that blatantly not true, we have real-world examples of the opposite, of consoles not using high-resolution texture packs whereas the PC does. Probably in large part because you can't even appreciate packs that large unless you're in true 4K, which none of the consoles can handle. It's a case of the GPU being a limiting factor via the proxy of screen resolution.
 
Spot on, it's no secret to anyone even remotely technical that console generations do not have fast APUs, they're sharing the same die for CPU and GPU functions and all in all they aren't that fast.
Considering that the PS5 and the XSX are better than the vast majority of gaming PCs, I think they are doing pretty well for themselves.

So the actual OS and all other running background apps/tasks use that memory, an estimated 2.5GB or plausibly more today
MS has gated 2.5GB off for the OS. It is highly unlikely to be more or to ever grow, because any game that is right up against that limit would break if a future update tried to allocate more RAM to the OS. Chances are it is more likely to go down than up with future updates.

Probably in large part because you can't even appreciate packs that large unless you're in true 4K, which none of the consoles can handle
Still spouting this ********. Nice.

Maybe you should tone down the condescension, Mrs remotely technical.
 
Yeah, that's why PS4 ports require 5-7 GB of VRAM to run properly on modern GPUs, huh.

The PS4 has 8 GB of total RAM lmao.

Consoles do not run "game exes"; they're fundamentally different. You can't even know what goes on under the hood. It is entirely possible that console games only use 500 MB-1.5 GB of RAM in total. PC is weird and gimmicky: tons of stuff is rewritten into RAM unnecessarily... all usages are overblown and overused. RDR 2 uses 7 GB of RAM + 6.5 GB of VRAM at PS4-level settings. Combine that and you get 13.5 GB of total memory, which is nowhere near what the PS4/Xbox One has.
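For what it's worth, here's that sum written out (the RDR2 usage figures are the claim above, not something measured here):

```python
# Sanity-check the arithmetic in the claim above; the usage figures are the
# poster's numbers, not measurements.
rdr2_pc_system_ram_gb = 7.0   # claimed RAM use at PS4-level settings
rdr2_pc_vram_gb = 6.5         # claimed VRAM use at PS4-level settings
ps4_unified_memory_gb = 8.0   # PS4's total unified GDDR5 pool

pc_total_gb = rdr2_pc_system_ram_gb + rdr2_pc_vram_gb
print(f"PC total: {pc_total_gb} GB vs PS4 unified pool: {ps4_unified_memory_gb} GB")
```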

16 GB consoles will destroy 8/10 GB VRAM cards in terms of texture fidelity.
 
Considering that the PS5 and the XSX are better than the vast majority of gaming PCs, I think they are doing pretty well for themselves.

MS has gated 2.5GB off for the OS. It is highly unlikely to be more or to ever grow, because any game that is right up against that limit would break if a future update tried to allocate more RAM to the OS. Chances are it is more likely to go down than up with future updates.

Still spouting this ********. Nice.

Maybe you should tone down the condescension, Mrs remotely technical.

Well, the consoles are sold more or less at cost; the business model is to subsidise the manufacturing/part costs with royalties from game sales and subscriptions over a 6-7 year period, so console gamers have always got more bang for their buck with the hardware. But they're bled dry through the higher cost of games, because the royalties are passed on to the consumer by the development studios/publishers. And on the PC side it's not for lack of trying; there are obviously massive GPU shortages preventing people from upgrading to modern hardware at what would be normal prices.

OS RAM requirements almost never go down; if you look at the history of memory requirements for OSes, they go up every generation. Either way, in the case of MS we know a bit better what the engineers were thinking, because they actually spread that 16GB of memory across 6GB and a faster 10GB, and we know the faster memory is needed for GPU functions because VRAM in general is much more demanding on bandwidth than RAM for the CPU. So it's pretty obvious they're aiming at a ceiling of 10GB for the GPU. And quite frankly I'd bet money that they'd never actually use that anyway; I'd be interested in seeing tools that can tell us these numbers for real games, because my suspicion is the console APUs run out of sheer power long before they're filling 10GB of memory with GPU-related data.
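For reference, a rough sketch of that Series X split as it's usually described (the bandwidth figures are the commonly quoted ones, so treat them as assumptions):

```python
# Series X unified memory as two pools with different bandwidths
# (sizes/bandwidths are the commonly quoted figures, not measured here).
memory_pools = [
    {"size_gb": 10, "bandwidth_gb_s": 560, "intended_use": "GPU-optimal (graphics data)"},
    {"size_gb": 6,  "bandwidth_gb_s": 336, "intended_use": "standard (OS, game code and data)"},
]

for pool in memory_pools:
    print(f"{pool['size_gb']:>2} GB @ {pool['bandwidth_gb_s']} GB/s -> {pool['intended_use']}")
```

The fast pool is the one sized for graphics data, which lines up with the ~10GB GPU ceiling argument above.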

Not sure what you're on about. Are you debating if WD:L uses a high res texture pack on the PC and not on the consoles? Or are you debating whether you can see a difference unless you're in 4K? I've referenced this before, but if you look at texture packs designed for 4K, such as the Rainbow Six Siege one, and read the Steam reviews and comments specifically for that pack, you very quickly see a pattern of people saying they literally can't tell the difference at lower resolutions, which makes perfect sense. The number of texels you can display is limited by the display resolution. Texture detail can exceed your screen's ability to display it; that's a basic fact about 3D rendering and a good example of why GPU grunt actually matters, because that goes up pretty much linearly with the number of pixels you display.
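A toy illustration of the texel point (the 4096x4096 texture size is just an assumed example):

```python
# Compare the texel count of a single high-res texture with the number of
# pixels a screen can actually show (texture size is an assumed example).
texture_texels = 4096 * 4096   # ~16.8 million texels in one 4K texture
pixels_1440p = 2560 * 1440     # ~3.7 million pixels on a 1440p display
pixels_4k = 3840 * 2160        # ~8.3 million pixels on a 4K display

print(f"4K texture texels:   {texture_texels:,}")
print(f"1440p screen pixels: {pixels_1440p:,}")
print(f"4K screen pixels:    {pixels_4k:,}")
```

Even filling the whole screen with that one texture, a sub-4K display can only ever resolve a fraction of those texels, which is why the difference is hard to spot below 4K.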

A hypothesis made on these forums a decent number of times has been that the consoles will exceed the PC in terms of VRAM usage, because they can simply throw that budget at higher resolution textures and gain the benefit of better fidelity that way. So what we'd expect to see, if that were true, is devs loading way more texture data into these games. But we don't see that; in some cases we see the opposite, with the PC getting the texture packs instead. We also see games like WD:L using aggressive resolution scaling, which often doesn't let the game exceed 1440p and sometimes goes even lower, and I think that creates a nice, internally consistent explanation of what is happening with respect to what I'm saying.
 

Er, yes they do run game executables. With MS it's basically just custom Windows, and with the PS it's a custom FreeBSD-based OS; they run compiled libraries with the game code built into them. They may not have a file extension of .exe in some cases, but that's neither here nor there from a technical perspective.

PS ports require more RAM because the code is not run natively on the hardware. These consoles aren't backwards compatible and cannot natively run code designed for another version of the console; aspects of the hardware have to be partially emulated, which is normally prohibitively expensive. It's why unofficial emulators on the PC can emulate something like the early consoles but can't emulate modern consoles: that extra layer of abstraction is just too demanding from a hardware perspective.

The console variants of these games are woefully underpowered compared to the PC variants. I believe you're the person I discussed WD:L with, and the differences between what the console and PC versions were doing, in terms of how cut down the console version needs to be in order to actually run at a respectable frame rate. If memory serves me correctly you have an extremely over-inflated view of just what consoles are doing; you seem to think the PC and console variants of games are basically identical, when there's typically a huge number of corners cut in order to get console variants playable. That's precisely what I've predicted all along: that console variants simply won't require the amount of memory that PC variants do. And when the WD:L settings files were found and analysed, this was actually vindicated. Whereas I don't believe there are any examples of consoles "destroying" the PC in terms of texture fidelity. Do you have any examples of this to date? We do have examples of the opposite with WD:L...
 
Well, the consoles are sold more or less at cost; the business model is to subsidise the manufacturing/part costs with royalties from game sales and subscriptions over a 6-7 year period, so console gamers have always got more bang for their buck with the hardware...
Irrelevant to my post. I never said anything about console cost.

OS RAM requirements almost never go down... they actually spread that 16GB of memory across 6GB and a faster 10GB... So it's pretty obvious they're aiming at a ceiling of 10GB for the GPU.
Talks about console prices and their cost, then speculates that MS and Sony don't know how to build consoles and have decided to pay extra to put too much VRAM in them. Okay.
Maybe you should apply to work there, so you can show them how it's done, Mrs remotely technical.

Also, people are still bringing up the 6GB of "slow" memory like it is literally unusable for graphics. Yeah, there is no way GDDR6 running at 336 GB/s can be used for storing graphics data. :rolleyes:
Oh wait, what's that, the 6700 XT runs at 384 GB/s? Literally unusable, what was AMD thinking.

*cough* PS5: 448 GB/s *cough*

Are you debating if WD:L uses a high res texture pack on the PC and not on the consoles?
Did I mention WD:L in my post?

Or are you debating whether you can see a difference unless you're in 4K? ... The number of texels you can display is limited by the display resolution. Texture detail can exceed your screen's ability to display it...
It really depends on a few more things and should be evaluated on a game-by-game (arguably even model-by-model) basis. You can't make any blanket statements about it. At least you didn't say 4K textures should only be used when gaming at 4K.

A hypothesis made on these forums a decent number of times has been that the consoles will exceed the PC in terms of VRAM usage, because they can simply throw that budget at higher resolution textures and gain the benefit of better fidelity that way.
You sprinkled in a bit of truth but interlaced it with falsehood. The hypothesis was that VRAM requirements for games would grow due to the consoles having more VRAM. It was mentioned that at some point the 3080 would not be able to run better textures than the consoles because it doesn't have the VRAM to store them.

We are a year in, with video games still catering to the previous generation of consoles, and the last few games seem to be growing in requirements. Unless you have proof that these new consoles have peaked and they won't be getting any better?
 
People mentioning consoles not being able to run true 4K games (which is a lie, there are plenty of native 4K games out there) and then bringing up Watch Dogs, which even a 3090 will struggle with at 4K (especially without DLSS), is kinda ironic. For 450 quid the consoles are bloody impressive, especially considering that will barely get you a 2060 nowadays unless you are really lucky and manage an Nvidia FE drop.

Regarding the 3080's VRAM, I believe it will age badly; we are barely into "next gen" games and you already get games reaching 9GB easily (Doom Eternal and Resi come to mind, and I am sure there are more). I would also really love to see Nvidia do something about the driver CPU overhead problem, since in some games it's really noticeable (even with a 3090 paired with a 5800X in my case).
 