
10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
Soldato
Joined
31 Oct 2002
Posts
9,870
I keep seeing the LG recommended... I currently game at 27" 1440p and looking at 4k with the next gen GPUs..

You mind showing me your set up? I'm intrigued as to how far you sit away from your screen to make a 48" work. I currently sit just shy of 2ft away... could push it to 3ft if I wall mount a screen... just doesn't seem to be far enough... maybe my brain just can't comprehend it.

Sure. Here's a pic from when I first set it up - I've since replaced the included LG stand with a 3rd-party one, which lets me position the screen another 5-6 inches further back on the desk. My BenQ 27" monitor is to the side for scale.

LJ9YqOD.png

I'll update with another picture once I've finished redecorating and repositioning things in my study.
 
Soldato
Joined
31 Oct 2002
Posts
9,870
Yep. These whizz-kids still haven't figured out that you will need a better GPU before you need more than the 10GB.

You should have a meeting with Nvidia, they clearly don't know what they are talking about when they've decided to release a 20GB 3080 and a 16GB 3070 later this year.....

16GB would have been perfect for the 3080, but it just wasn't possible with the memory bus layout Nvidia went for. That's why they have to replace the 10x1GB modules (10GB) with 10x2GB modules (20GB) for the upcoming 3080 20GB; the 2GB modules weren't ready in time for the 3080 launch.
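To put rough numbers on the bus point, here's a quick Python sketch. The 320-bit bus and 32-bit-per-chip figures are standard GDDR6X specs; the rest is just arithmetic:

```python
# GDDR6X chips each use a 32-bit interface, so the chip count is fixed
# by the bus width, and total capacity comes only in multiples of the
# per-chip density (1GB or 2GB at the 3080's launch).
BUS_WIDTH_BITS = 320      # RTX 3080 memory bus
BITS_PER_CHIP = 32        # GDDR6X interface width per chip

chips = BUS_WIDTH_BITS // BITS_PER_CHIP          # 10 chips on the board
capacities_gb = [chips * density for density in (1, 2)]

print(chips)          # 10
print(capacities_gb)  # [10, 20] -- 16GB isn't reachable on a 320-bit bus
```

A 16GB card falls out naturally on a 256-bit bus (8 chips x 2GB), which is why the rumoured 16GB card is a 3070 variant rather than a 3080.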
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
You should have a meeting with Nvidia, they clearly don't know what they are talking about when they've decided to release a 20GB 3080 and a 16GB 3070 later this year.....

16GB would have been perfect for the 3080, but it just wasn't possible with the memory bus layout Nvidia went for. That's why they have to replace the 10x1GB modules (10GB) with 10x2GB modules (20GB) for the upcoming 3080 20GB; the 2GB modules weren't ready in time for the 3080 launch.

They know the customer base, more epeen is better :D
 
Soldato
Joined
18 May 2010
Posts
22,395
Location
London
I found this via here:

The PlayStation 5 features 16GB of GDDR6 unified RAM with 448GB/sec memory bandwidth. This memory is synergized with the SSD on an architectural level and drastically boosts RAM efficiency. The memory is no longer "parking" data from an HDD; the SSD can deliver data right to the RAM almost instantaneously.
Essentially the SSD significantly reduces latency between data delivery and memory itself. The result sees RAM only holding assets and data for the next 1 second of gameplay. The PS4's 8GB of GDDR5 memory held assets for the next 30 seconds of gameplay.
"There's no need to have loads of data parked in the system memory waiting to potentially be used. The other way of saying that is that most of the RAM is working on the game's behalf."
The SSD allows Sony to keep RAM capacity down and reduce costs.
"The presence of the SSD reduces the need for a massive inter-generational increase in size."
Excerpt from https://www.tweaktown.com/news/7134...ep-dive-into-next-gen-storage-tech/index.html, which was an analysis of the Mark Cerny video "Road to PS5"

So, we have only a 2.7x increase in VRAM because of how the I/O improvement changes the paradigm of how developers utilize memory for next-gen.

We recently learned that this same amazing I/O revolution will also be enabled on the RTX 30 series.
For those of us who pair an NVMe SSD with a 30 series GPU, VRAM isn't going to be a limiting factor in performance thanks to technologies such as DirectStorage, RTX I/O and Sampler Feedback Streaming. See the end of this post for links to articles that go into more detail on what these technologies do.
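The claim in that excerpt boils down to a simple working-set model: VRAM only needs to hold however many seconds of assets storage can't deliver in time. A toy sketch (the streaming rates below are illustrative, not measured figures):

```python
# Toy model of the "just in time" idea: if storage can refill memory
# fast enough, memory only has to hold the next few seconds of assets.
def working_set_gb(stream_rate_gb_s, lookahead_s):
    """RAM needed to buffer `lookahead_s` seconds of streamed assets."""
    return stream_rate_gb_s * lookahead_s

# HDD era: a slow disk forces a long lookahead at a modest fill rate.
hdd_era = working_set_gb(stream_rate_gb_s=0.1, lookahead_s=30)   # ~3 GB parked
# NVMe era: a fast SSD shrinks the buffer to around one second.
ssd_era = working_set_gb(stream_rate_gb_s=5.5, lookahead_s=1)    # ~5.5 GB live
```

The point isn't the exact numbers; it's that capacity and streaming speed trade off against each other, which is the argument for a smaller inter-generational RAM jump.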
 
Soldato
Joined
26 May 2014
Posts
2,958
Yep. These whizz-kids still haven't figured out that you will need a better GPU before you need more than the 10GB.
Simply not true. You don't have to use graphical presets in games. If you have the VRAM, you can keep all the other settings the same whilst cranking up the texture setting to ultra. This has zero effect on your performance, unless you don't have enough VRAM, whilst often providing a nice boost to image quality. Running a game at medium settings with ultra textures is still a hell of a lot better than running it at medium with medium textures. Feel free to fire up Red Dead Redemption 2 if you'd like to see a stark contrast first-hand.

This is actually something I've run into personally recently via revisiting the R9 Fury. In scenarios where it isn't limited by VRAM, the Fury is generally around 10-15% faster than an RX 580 (a card which itself is still absolutely fine for playing the latest games at 1080p). Yet in a lot of newer titles I've found it to be a complete stutterfest without dialling the texture setting down a notch or two. In some games that doesn't have too large an effect on image quality, but in others (like RDR2), it's huge, and so even an 8GB 580 provides a notably better experience, despite the Fury's extra grunt. Other titles, like Doom Eternal, completely lock out 4GB cards from choosing their higher texture settings altogether.

We'll see how VRAM usage fares over the next couple of years, but I'd expect it to balloon significantly when ports of games designed with the new consoles in mind start arriving. I reckon a 10GB 3080 will start looking a lot like a 4GB Fury before Nvidia's next cards arrive, especially since it's a card basically made for 4K+ gaming in upcoming AAA titles, not scratching around at 1080p in games designed with an APU from 2013 as a baseline.
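For a sense of why texture quality is a capacity problem rather than a speed problem, here's a back-of-the-envelope sketch. It assumes uncompressed RGBA8 with a full mip chain; real games use block compression, which cuts these figures by roughly 4-8x:

```python
# Rough VRAM footprint of a single texture. A full mip chain adds
# about a third on top of the base level (1 + 1/4 + 1/16 + ... = 4/3).
def texture_mb(width, height, bytes_per_texel=4, mip_factor=4/3):
    return width * height * bytes_per_texel * mip_factor / (1024 ** 2)

ultra = texture_mb(4096, 4096)   # ~85 MB uncompressed
medium = texture_mb(1024, 1024)  # ~5 MB uncompressed
```

Sampling either one costs the GPU the same per pixel, which is why ultra textures are close to "free" in fps terms, right up until the pool overflows and assets start thrashing over PCIe.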
 
Last edited:
Soldato
Joined
20 Aug 2019
Posts
3,031
Location
SW Florida
The result sees RAM only holding assets and data for the next 1 second of gameplay. The PS4's 8GB of GDDR5 memory held assets for the next 30 seconds of gameplay.

This would seem to indicate that speed can lower required capacity.

The idea that the memory buffer loads what it needs for x amount of time is important. It changes the equation to look more like a "just in time" approach.

I bet one second of 4K mods, texture packs and whatever apocalyptic eventuality people want to throw in there requires a lot less VRAM than 30 seconds of "normal" assets.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Simply not true. You don't have to use graphical presets in games. If you have the VRAM, you can keep all the other settings the same whilst cranking up the texture setting to ultra. This has zero effect on your performance, unless you don't have enough VRAM, whilst often providing a nice boost to image quality. Running a game at medium settings with ultra textures is still a hell of a lot better than running it at medium with medium textures. Feel free to fire up Red Dead Redemption 2 if you'd like to see a stark contrast first-hand.

This is actually something I've run into personally recently via revisiting the R9 Fury. In scenarios where it isn't limited by VRAM, the Fury is generally around 10-15% faster than an RX 580 (a card which itself is still absolutely fine for playing the latest games at 1080p). Yet in a lot of newer titles I've found it to be a complete stutterfest without dialling the texture setting down a notch or two. In some games that doesn't have too large an effect on image quality, but in others (like RDR2), it's huge, and so even an 8GB 580 provides a notably better experience, despite the Fury's extra grunt. Other titles, like Doom Eternal, completely lock out 4GB cards from choosing their higher texture settings altogether.

We'll see how VRAM usage fares over the next couple of years, but I'd expect it to balloon significantly when ports of games designed with the new consoles in mind start arriving. I reckon a 10GB 3080 will start looking a lot like a 4GB Fury before Nvidia's next cards arrive, especially since it's a card basically made for 4K+ gaming in upcoming AAA titles, not scratching around at 1080p in games designed with an APU from 2013 as a baseline.

You do realise the next-gen consoles have only 16GB of RAM total? Most PC gamers will upgrade to a 3060 or equivalent. The next few years will see RT being pushed even more, which the 3080 currently just about manages at 1440p. We already need faster GPUs today. Bring on Hopper or AMD's RDNA3 ASAP, I will be upgrading again.

A good example of why we need more capable GPUs ASAP -

New RTX game, demo available on Steam atm

 
Last edited:
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Sure. Here's a pic from when I first set it up - I've since replaced the included LG stand with a 3rd-party one, which lets me position the screen another 5-6 inches further back on the desk. My BenQ 27" monitor is to the side for scale.

LJ9YqOD.png

I'll update with another picture once I've finished redecorating and repositioning things in my study.

I never considered a monitor of that size sitting so close. Any chance you could open Explorer on both and perhaps show some text in Notepad to give an idea of scale? I'm guessing the 4K is a lot easier to read at that size than the 1440p.
 
Soldato
OP
Joined
26 Aug 2004
Posts
5,033
Location
South Wales
Screen of that size would be way too big for me being that close, guess it's ok for slower paced games but I enjoy my FPS too much for that.

I also see new rumours of a 20GB card, at this rate these will be revealed officially before I even get my 3080!

Oh well, Zen 3 show today!
 
Man of Honour
Joined
13 Oct 2006
Posts
91,321
This would seem to indicate that speed can lower required capacity.

The idea that the memory buffer loads what it needs for x amount of time is important. It changes the equation to look more like a "just in time" approach.

I bet one second of 4K mods, texture packs and whatever apocalyptic eventuality people want to throw in there requires a lot less VRAM than 30 seconds of "normal" assets.

This doesn't make much sense to me from a game design perspective - there is a certain amount of guesswork you can do for streamable resources but generally you'd need to be able to stream the high quality, nearby assets within a game frame. For a lot of other resources you can't really load them in ahead i.e. if the player is using a rocket launcher you need the data for that projectile loaded in time for any random time they might press the trigger and fire the weapon.

There are just so many eventualities in game design that the approach can really only work when you can very carefully curate the player progression and what they encounter and when, and that ultimately tends to make for very boring games.
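To put a rough number on "within a game frame": even a fast NVMe drive can only deliver so much in 16.7ms. An illustrative sketch (7GB/s is a hypothetical high-end PCIe 4.0 drive, not a figure from this thread):

```python
# How much data can storage actually deliver inside a single frame?
def mb_per_frame(ssd_gb_s, fps):
    """Streaming budget in MB for one frame at the given frame rate."""
    return ssd_gb_s * 1024 / fps

budget = mb_per_frame(ssd_gb_s=7.0, fps=60)   # ~119 MB per 60fps frame
```

That's plenty for predictable streaming (terrain ahead of the player), but nowhere near enough to fetch arbitrary assets on demand the instant a trigger is pulled, hence the need to preload the unpredictable stuff.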
 
Soldato
Joined
18 Feb 2015
Posts
6,485
I never considered a monitor of that size sitting so close. Any chance you could open Explorer on both and perhaps show some text in Notepad to give an idea of scale? I'm guessing the 4K is a lot easier to read at that size than the 1440p.

Screen of that size would be way too big for me being that close, guess it's ok for slower paced games but I enjoy my FPS too much for that.

I also see new rumours of a 20GB card, at this rate these will be revealed officially before I even get my 3080!

Oh well, Zen 3 show today!

You get used to it within a few hours, trust me. The difference seems huge because of the side-by-side but you'll quickly use it and it will feel normal.
 
Caporegime
Joined
8 Sep 2005
Posts
27,425
Location
Utopia
I never considered a monitor of that size sitting so close.
Don't, it's really not smart. A TV that large that close to you on a desk is completely unergonomic, and it can lead to developing serious RSI (repetitive strain injury) in the upper body (mainly neck and shoulders) due to all of the movements needed to see all of the screen. Every time I see people with this setup I cringe inwardly at how unaware people are of the basic principles of desk and screen ergonomics. Unfortunately, many people won't realise until it's too late.

Screen of that size would be way too big for me being that close, guess it's ok for slower paced games but I enjoy my FPS too much for that.

I also see new rumours of a 20GB card, at this rate these will be revealed officially before I even get my 3080!

Oh well, Zen 3 show today!

The 20GB will be launched in December, apparently https://videocardz.com/newz/nvidia-geforce-rtx-3080-20gb-to-launch-in-december
 
Last edited:
Soldato
Joined
27 Feb 2015
Posts
12,621
I feel the industry might need more regulation. More and more SKUs are being released this year when they cannot even fulfil the existing ones; clearly they have no concern about whether they'll be able to supply, and just want to rake in the cash from those pre-orders.
 
Soldato
OP
Joined
26 Aug 2004
Posts
5,033
Location
South Wales
You get used to it within a few hours, trust me. The difference seems huge because of the side-by-side but you'll quickly use it and it will feel normal.
Too big to be that close, no chance of adjusting to that within a few hours... weeks, maybe, and even then it's still far too big to be that close to it. I agree with what Richdog said. Ideally you need to see the whole screen without moving your head around. I would not enjoy playing shooters like that, no matter how immersive it might seem at first.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,321
I feel the industry might need more regulation. More and more SKUs are being released this year when they cannot even fulfil the existing ones; clearly they have no concern about whether they'll be able to supply, and just want to rake in the cash from those pre-orders.

I don't know the arrangements with suppliers, but unlike OcUK a lot of places don't take money on pre-orders until stock is ready to despatch.
 