
10GB VRAM enough for the 3080? Discuss...

So Doom Eternal is already out of reach for 8GB cards at max settings at 4K. Can we get any more to add to the list?

Makes you wonder if that's why they're releasing the 3070 later, for when the "Ultra" 4K benchmarks hit in reviews.

[attached image: frame time graph]

If they test Avengers, that's probably gonna be on the list too.
 
Do you really need ultra with current games though? I wish a developer would add an Ultra Plus setting that's much higher than ultra as a social experiment. Just to see how upset people get when they realise today's hardware can't use it. I'm sure they'd blame Nvidia for it.
 
[attached image: frame time graph]

If they test Avengers, that's probably gonna be on the list too.

Is that a crash from the 2080 where the graph stops? This lacks context somewhat: are the frame times due to VRAM, or just bad frame times in general?

Edit: Not trying to be awkward, by the way, although it may seem that way. I just feel that at the moment, as a community, we should bring this all together with evidence and have a real debate on the 10GB topic, as it's not as black and white as we'd hope, rather than just **** flinging from either side.
 
The 3GB 780 Ti was enough in 2013 when the consoles launched with 8GB. That's 3GB of VRAM vs 8GB. It wasn't a problem.

Now in 2020, 10GB of VRAM is suddenly a problem vs the consoles' 16GB? It's a smaller gap than in 2013.

Doesn't anybody remember this? It wasn't a problem. Stop panicking.

What the hell are you smoking, dude...

The PS4 didn't have 8GB of VRAM; it had 8GB of total RAM in the system.

Only 4.5GB of that was available for developers to use at launch; the rest was reserved for the OS. That 4.5GB was then further split between normal RAM and VRAM.

This resulted in the PS4 maybe having 3GB of VRAM available in the best-case scenario, with the remaining 1.5GB used for game logic.

Of course the 780 Ti didn't struggle initially, since it probably had more VRAM available than the PS4 had.

The Infamous Second Son developers even did a technical breakdown of how they used the 4.5GB of RAM that was available to them...

https://wccftech.com/infamous-son-consumed-ps4s-ram-gpu-cpu-visual-optimization-panel-details/
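To put those numbers side by side, here's a quick back-of-envelope sketch in Python. The figures are just the ones quoted above (and in the linked panel), not official Sony specs, so treat it as illustrative only:

```python
# Launch-era PS4 memory budget as described in the post above.
ps4_total_ram_gb = 8.0      # unified GDDR5 pool
ps4_game_budget_gb = 4.5    # available to developers at launch
ps4_os_reserve_gb = ps4_total_ram_gb - ps4_game_budget_gb

# Assumed split of the game budget, per the ~3GB graphics / ~1.5GB logic
# estimate above (an approximation, not a fixed hardware partition).
ps4_graphics_share_gb = 3.0
gtx_780_ti_vram_gb = 3.0

print(f"PS4 OS reserve:       {ps4_os_reserve_gb:.1f} GB")
print(f"PS4 game budget:      {ps4_game_budget_gb:.1f} GB")
print(f"PS4 graphics share:   {ps4_graphics_share_gb:.1f} GB")
print(f"780 Ti VRAM:          {gtx_780_ti_vram_gb:.1f} GB")
# The 780 Ti's 3GB roughly matched what the console could actually dedicate
# to graphics, which is why it held up at the time despite the "8GB" headline.
```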
 
Well I was set on this 3080 coming from a 1080 Ti, but all this VRAM talk has pushed me over the edge. I'll sit it out for a little while longer and see how things fall... until tomorrow when I change my mind again.

Just like me, and, I suspect, a few million others! :p

Me too. Is this the way to the support group?

I think I'll probably go for a 3080 as I don't want to spend more than £700 on a GPU (I'd already be wincing as I hit 'Buy') and I expect a Ti / Super / whatever would be more.

This is for 3440x1440 120Hz so I will try to do the sensible thing and wait for reviews.
 
The PS4 didn't have 8GB of VRAM; it had 8GB of total RAM in the system. Only 4.5GB of that was available for developers to use at launch; the rest was reserved for the OS.

My mistake. I didn't dig any deeper than a 5 second Google search :o
 
Is that a crash from the 2080 where the graph stops? This lacks context somewhat: are the frame times due to VRAM, or just bad frame times in general?

It's from here:
https://www.computerbase.de/2019-02/amd-radeon-vii-test/5/

They tried all sorts of tests to see whether the Radeon VII's doubled VRAM buffer was an advantage, going through various scenarios at 4K and 5K. Chrome should auto-translate the page fine.

The test scene used in Wolfenstein 2 stutters at one point regardless of the hardware, which can be clearly seen in the diagrams. That said, all three graphics cards do the job well at 3840 × 2160. At 5,120 × 2,880 it can be seen that Nvidia has slightly better memory management. The Radeon RX Vega 64 falls apart completely, while the GeForce RTX 2080 still delivers good frame times. The only way to tell that the Nvidia card is running out of memory is that the Radeon VII, which has equally clean frame times, is suddenly a little faster, even though the AMD card is otherwise slower in this game. Since the test scene is a worst-case scenario, Wolfenstein 2 is definitely playable on both graphics cards, even if the frame rate is not high.

Despite being on the highest preset, the Vulkan game offers a graphics option to increase texture streaming by one further level. This loads additional textures into graphics memory, which is intended to prevent possible reloading and the associated stuttering. In practice the function offers no advantage. If it is activated anyway, the 8,192 MB of the Radeon RX Vega 64 and the GeForce RTX 2080 is no longer enough - Wolfenstein 2 then becomes unplayable. The Radeon VII, however, handles it without any problems thanks to its 16 GB and performs just as well as with the second-highest texture streaming level.
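On the evidence-gathering point earlier in the thread: if anyone wants to add their own numbers, here is a minimal logging sketch in Python. It assumes an NVIDIA card with the standard nvidia-smi tool on the PATH, and the interval and filename are arbitrary choices of mine. Bear in mind it reports memory allocated, which is not necessarily how much the game strictly needs:

```python
# Log dedicated VRAM usage once per second to a CSV while you play,
# using nvidia-smi's query interface (ships with the NVIDIA driver).
import subprocess
import time

LOG_FILE = "vram_log.csv"   # arbitrary output path
INTERVAL_S = 1.0            # sampling interval in seconds

with open(LOG_FILE, "w") as log:
    log.write("timestamp,vram_used_mib\n")
    while True:
        used_mib = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        log.write(f"{time.time():.0f},{used_mib}\n")
        log.flush()
        time.sleep(INTERVAL_S)
```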
 
Only 4.5GB of that was available for developers to use at launch; the rest was reserved for the OS. That 4.5GB was then further split between normal RAM and VRAM.

I knew it was low, but I didn't realise it was 4.5GB kind of low. We are in for a huge uplift in texture quality and scene complexity with these next-gen consoles. I can't wait to see what developers will bring to the table.
 

Agreed, it will be exciting to see the improvements we're going to get.

My worry with the 10GB on the 3080 is that I expect consoles not to target native 4K once the launch hype dies down, and instead to design games around 1440p. We know consoles make various compromises which normally amount to medium-type settings compared to the PC, so I don't see how a 3080 rendering native 4K at high/max PC settings will cope. I expect it to hit VRAM limitations probably as soon as next year...

If consoles target 1440p and medium settings with around a 10GB VRAM budget, there is no way that will translate well onto PC.
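As a rough illustration of why native 4K is so much heavier than a 1440p console target, here's a quick sanity check in Python. The per-pixel cost and the number of screen-sized buffers are made-up round numbers for illustration, not figures from any real game:

```python
# Memory for a set of full-resolution render targets (e.g. a G-buffer plus
# depth); textures are a separate, usually much larger, cost on top of this.
def render_target_mib(width, height, bytes_per_pixel=4, num_buffers=6):
    return width * height * bytes_per_pixel * num_buffers / (1024 ** 2)

mem_1440p = render_target_mib(2560, 1440)
mem_4k = render_target_mib(3840, 2160)

print(f"1440p: {mem_1440p:.0f} MiB")
print(f"4K:    {mem_4k:.0f} MiB ({mem_4k / mem_1440p:.2f}x)")
# 4K has 2.25x the pixels of 1440p, so every screen-sized buffer scales by
# the same factor before you even get to higher-resolution texture packs.
```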
 

Over the life span of the consoles the hardware will not be upgraded. So in a couple of years, when games are more complicated and require more compute, they will indeed need to be scaled back for the then years-old consoles.

We of the PC master race, on the other hand, will be running our 4080 Tis.
 
Yeah I don't realistically think 10GB is enough. It may be enough NOW but it's not like I upgrade GPU every year - I'd want it to be enough for 3-4 years and I'm pretty confident it won't be.
 
Maybe this will turn out true, WCCFT rumours before launch:
https://wccftech.com/exclusive-nvid...-partial-specs-and-tentative-launch-schedule/
  1. We then have the PG132-20 and PG132-30 boards, both of which are replacing the RTX 2080 SUPER graphics card and will have 20GB and 10GB worth of vRAM respectively. The PG132-20 board is going to be launching in the first half of October while the PG132-30 board is going to be launching in mid-September. It is worth adding here that these three parts are likely the SKU10, 20 and 30 we have been hearing about and the SKU20 is going to be targetted dead center at AMD's Big Navi offering (and hence the staggered launch schedule). Since AMD's Big Navi will *probably* have 16GB worth of vRAM, it also explains why NVIDIA wants to go with 20GB.

See you in October! :)
 
Yeah I don't realistically think 10GB is enough. It may be enough NOW but it's not like I upgrade GPU every year - I'd want it to be enough for 3-4 years and I'm pretty confident it won't be.
For sure get minimum 16GB if keeping it that long.
 
I'd love an upgrade to a 3080 from my 1080 Ti, but I think I will wait until there are some actual benchmarks comparing the two cards, to see how big the performance increase is and where the 3080 might fall down.
 