
10GB VRAM enough for the 3080? Discuss..

Status
Not open for further replies.
Using the term "superior" is just marketing. All that FreeSync/G-Sync does is help prevent tearing; neither is superior at that. Also, the larger market on console will disagree with you. :D

I know they ultimately achieve the same thing.

But I think G-Sync is superior, because with G-Sync there is no confusion over things like FreeSync versus FreeSync 2, and no confusion over which features of the FreeSync standard a given monitor supports and which it doesn't.

A G-Sync certified monitor does everything the standard says it should, which makes things very easy for the consumer.

Buy a G-Sync certified monitor and it will support the full G-Sync standard.

In this regard I think FreeSync still has work to do. A FreeSync certified monitor should support the full suite of the FreeSync standard, not leave the consumer to dig into which parts of the standard a particular monitor supports.

AMD tried to achieve this with FreeSync 2, which was meant to standardise the technology so that a FreeSync 2 certified monitor would support the full FreeSync standard. But that was years ago and the waters are still muddy.
 
On your first point, you are completely wrong. This generation of consoles only had 8GB; the PC needed 16GB to compensate. Now they have doubled it to 16GB. Believe it or not, and I know you don't, consoles are the trendsetters. Nvidia has absolutely no sway in the console gaming market, apart from a title here and there.

The One X has 12GB. What exactly did PCs need 16GB to compensate for?

So you clearly admit that in this instance more RAM is better. What you omit is that Windows 10 is a far cry from what the consoles run, which is more streamlined and efficient; consoles do not require as much RAM as a PC does. This is basic, rudimentary knowledge. It's also rudimentary to know that as Microsoft and Sony provide updates, the OS memory footprint shrinks, in which case developers have more memory to work with.

It's also rudimentary knowledge that the current consoles already use around that much for their streamlined OSes. Would you reasonably expect this new generation to use less, or more?
 
The One X has 12GB. What exactly did PCs need 16GB to compensate for?
Win10 bloat.
Full-screen optimisation and other standard Windows 10 features that run in the background, such as prefetching, antivirus, defrag, scheduled alerts, etc. Background apps you download, such as Discord and the utility apps that sync your RGB lighting. Even buffer memory for the NIC, and visual effects on the desktop. And that is by no means an exhaustive list, just a start. :D

The point is that Windows 10 has bloat compared to the consoles, so memory use on the two is not equivalent.

Edit:
I almost forgot the most offending feature of Windows 10: the standby memory list, which to this very day we compensate for with an app called Intelligent Standby List Cleaner by Wagnardsoft. It's not as necessary as it was in earlier Windows 10 versions, though. ;)
 
Errr, Windows doesn't run in VRAM, boys, and the Xbox One X doesn't have 12GB of dedicated VRAM; it's shared system RAM.
 

Right, so no actual numbers then?

Well, here's some numbers for you:

The original Xbox One had 8GB of RAM: 3GB reserved for the OS, 5GB for games (that's game engine and 'VRAM' combined). Right now I have 12GB in my PC: 6.8GB used, 5GB available. Plus my GPU's VRAM...

So are you really sure about that? I thought this was all rudimentary stuff. *shrugs*
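The budget comparison above can be written out as back-of-envelope arithmetic. The figures are the poster's own and approximate; this is purely illustrative:

```python
# Rough memory-budget comparison between a base console and a PC,
# using the (approximate) figures quoted in the post above.

console_total_gb = 8   # base Xbox One: one shared RAM pool
console_os_gb = 3      # reserved for the console OS
console_games_gb = console_total_gb - console_os_gb  # engine + "VRAM" combined

pc_system_gb = 12      # poster's PC system RAM
pc_used_gb = 6.8       # in use at the desktop
pc_free_gb = pc_system_gb - pc_used_gb  # headroom, before counting GPU VRAM

print(f"Console budget for games: {console_games_gb} GB (shared)")
print(f"PC free system RAM: {pc_free_gb:.1f} GB, plus dedicated GPU VRAM")
```

The point the arithmetic makes is that the PC's free system RAM alone already matches the console's entire game budget, and the PC's VRAM sits on top of that.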
 
I think 8GB is still okay for 1440p. I've personally not seen more than 8GB used at 1440p on my Radeon VII; the most I've seen is around 7GB, though more commonly 5-6GB.

It's hard to say how PS5/Xbox Series X game ports will fare at 1440p a few months or a year from now, though. We'll have to see.

In six years, GPUs will have insane amounts of VRAM compared to today, yet the consoles will still be stuck on their tiny 16GB.

8GB is fine for the short life of a GPU, which ends up on the used market after two years.

A new GPU only has to last 2 years.
A new console has to last 7 - 8 years.

That's the difference. Consoles MUST be groundbreaking at release.
 
The amount of RAM the consoles have keeps going down (16 -> 12GB) and the amount the OS will use keeps going up (2 -> 4 -> 6GB) every time the PCMR crowd posts about it :p

By release the final spec will be 8 GB of shared RAM with the OS using 7.5 GB :p At least according to posters here.. :p
 
One / One S / PS4 / PS4 Pro = 8GB
One X = 12GB
Series X = 16GB

There's nothing really confusing about it; some people just like to forget the 12GB One X exists and claim PCs need 16GB of RAM to match a console with 8GB of shared RAM, which is obviously nonsense.
 
Nvidia's claim (amongst others) that games like RDR2 use no more than "4GB to 6GB of memory" is somewhat accurate.

I Googled RDR2 4K benchmarks and Guru3D measured 6.5GB of VRAM usage.

Over on Reddit they are having the same conversation, and it seems they think that thanks to DirectStorage / RTX IO, VRAM could be used more intelligently.

alcoholbob over here says: "Well you can look at it this way, console assets will be 4K native now, so your high or ultra texture quality should be the same between console and PC, and the Xbox Series X has 10GB of video optimized memory so that should be the cut-off point. It's enough for 4K. If there are higher ultra quality PC only options however, you probably want more than 10GB. This will probably be a very small subset of games."
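The 10GB cut-off quoted above comes from the Series X's split memory pool. A minimal sketch of it, using the publicly reported launch specs (treat the exact bandwidth figures as approximate):

```python
# Sketch of the Xbox Series X split memory pool referenced above.
# Sizes and bandwidths are the publicly reported launch specs.

pools = {
    "gpu_optimized": {"size_gb": 10, "bandwidth_gb_s": 560},  # fast pool, aimed at graphics
    "standard":      {"size_gb": 6,  "bandwidth_gb_s": 336},  # OS + game working set
}

total_gb = sum(p["size_gb"] for p in pools.values())
print(f"Total shared RAM: {total_gb} GB")
print(f"GPU-optimised pool: {pools['gpu_optimized']['size_gb']} GB")
# That 10 GB fast pool is the figure being used as a practical
# cut-off for 4K texture budgets on console ports.
```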
 
Lol, it is common knowledge that Windows 10 uses VRAM. You don't have to believe me, because Google is your friend. If I recall correctly it reserves around 20%. Many requests were made to change this in the Insider builds, so I don't know whether that's changed. Therefore what you're blathering about doesn't make much sense. Lol

Had you known this you would not need to ask me. You can check Windows 10 right now to see how much VRAM is being used, which again is rudimentary, something most people should know by now. So why are you asking me? :D
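For anyone who wants an actual number rather than a guess, NVIDIA's `nvidia-smi` tool can report per-GPU memory use. A small sketch of parsing its CSV output; the sample line here is made up for illustration, not a real measurement:

```python
# Hypothetical check of current VRAM usage by parsing `nvidia-smi` CSV output.
# On a machine with an NVIDIA GPU you would obtain the line by running:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits

sample_output = "612, 11264"  # illustrative only: ~600 MB used of an 11 GB card

def parse_vram(line: str) -> tuple:
    """Return (used_mb, total_mb) from one CSV line of nvidia-smi output."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total

used_mb, total_mb = parse_vram(sample_output)
print(f"Desktop VRAM in use: {used_mb} MB of {total_mb} MB "
      f"({100 * used_mb / total_mb:.1f}%)")
```

Note that this reports what the whole desktop has allocated, which (as argued later in the thread) is typically a few hundred megabytes, not a fixed 20% of the card.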
 
Lol dude, don't worry about VRAM. Unless you plan to keep the card for 3-4 years and have an issue with lowering textures one notch, it really is unlikely to be a problem. Keep your 1080 as a spare, or some other cheaper card, then in two years' time, just before the 4080 comes out, sell the 3080 and use the spare for a while, then jump on the 4080 at release for not much money on top. The 4080 will no doubt have 16GB as a minimum by then. Job done.

The 10GB only becomes a huge issue if you MUST have the highest textures available. But you're not even on 4K, so you don't need 4K textures anyway.
 

Yes you're probably right. I'm only at 1440p too.
 

Uhh, might want to check before you hit reply; it wasn't me who commented on Windows' VRAM usage, and I don't think anybody asked you to explain it. But whilst we're on the subject: yes, it does reserve some, but it's not much, certainly nothing to get excited about. Mine says 600MB right now, although I've got multiple windows open etc. I also don't know how much is released for 3D apps, so... meh.
 
I do agree that 10GB is the bare minimum Nvidia could have put on these cards; really they should have bumped it to something like 12GB, considering they are meant to be high-end 4K gaming cards.

With the huge gulf in VRAM between the 3080 and the 3090, it is quite clear they have done this on purpose to make room for a 3080 Ti/Super.

But it isn't coming before Christmas, that's for sure.
 
I think Nvidia chose this route because of board complexity and, in the case of the 3090, the lack of available GDDR6X modules right now.

If Nvidia had gone for 12GB on the 3080, that would leave no room for a Ti with more memory. The only way around that would be for Nvidia to design the GPU for wider buses, i.e. 384-bit on what would be the base 12GB cards and, what, 512-bit for a 16GB 3080 Ti? The last time Nvidia went that wide was over a decade ago. It requires even more complex boards, more memory modules and more complex power delivery. Maybe the cost was just wrong, or maybe the GPUs didn't scale well with a bus that wide. I don't know, but I think 10GB is the result of trying to reduce costs (selfishly, probably; it is Nvidia) rather than of doing the customers out of 1GB of RAM.
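The bus-width constraint described above can be sketched as simple arithmetic, assuming 1GB (8Gb) GDDR6X modules on 32-bit channels, which matched what was shipping at the 3080's launch:

```python
# Why 10 GB: how memory bus width constrains capacity.
# Each GDDR6X module sits on a 32-bit channel; module density is
# assumed here to be 1 GB (8 Gb), as was available at launch.

def capacity_gb(bus_width_bits: int, module_gb: int = 1) -> int:
    """Capacity implied by a bus width, one module per 32-bit channel."""
    modules = bus_width_bits // 32
    return modules * module_gb

print(capacity_gb(320))  # 3080: 320-bit bus -> 10 modules -> 10 GB
print(capacity_gb(384))  # 384-bit bus -> 12 GB (the hypothetical 12GB 3080)
print(capacity_gb(512))  # 512-bit bus -> 16 GB, at much higher board complexity
```

This is why "just add 2GB" isn't a drop-in change: each extra gigabyte at this density means another 32-bit channel, another module and more routing on the board.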
 
I am well aware that VRAM allocation and actual usage are not one and the same; however, Horizon Zero Dawn patch 1.3 on high settings, 90 FOV at 1920x1200, shows a minimum of circa 5.4GB and a maximum of circa 7.6GB of VRAM during gameplay on my screen. It has been disproven time and time again that Nvidia GPUs utilise VRAM meaningfully better than AMD GPUs. So when my now three-year-old Vega 56 is, let's say, "allocating" close to its maximum VRAM in current games, why does the soon-to-be-released, almost £500 RTX 3070 have just 8GB of VRAM?
 
The 3090 is the Titan equivalent.

The Titan's VRAM has always dwarfed that of the xx80 or xx70 in recent memory.
 
Yeah, but last gen's consoles will no longer be the benchmark.

The PCMR won't be able to get excited/aroused about trouncing an XBone if the XSX is giving a better experience than their £2000 PC :p
 
Eh? The fact that Windows 10 reserves a tiny amount of VRAM (1GB in my case, out of 11GB, with a lot of video windows open) does not mean that Windows 10 bloat, installed applications etc. contribute in a meaningful way to the VRAM requirements of a game; this has been accounted for. It goes back to my very original point: if more were required, it would have been implemented. There is certainly room for a card with more VRAM, but that really doesn't take away from the fact that 10GB is clearly going to be enough. The 24GB is for the crazy stuff they are doing at 8K, and potentially also for the card being split Quadro-style in software into two virtual 12GB rigs, given where the world is going streaming-wise.
 