The 3GB 780 Ti was enough in 2013 when the consoles launched with 8GB. That's 3GB of VRAM vs 8GB. It wasn't a problem.
Now in 2020, 10GB of VRAM is suddenly a problem vs the consoles' 16GB? That's a smaller gap than in 2013.
Doesn't anybody remember this??? It wasn't a problem. Stop panicking.
Well I was set on this 3080 coming from a 1080 Ti, but all this VRAM talk has pushed me over the edge. I'll sit it out for a little while longer and see how things fall... until tomorrow when I change my mind again.
I feel exactly the same!
Just like me, and, I suspect, a few million others!
What the hell are you smoking, dude...
The PS4 didn't have 8GB of VRAM; it had 8GB of total RAM in the system.
Only 4.5GB of that was available for developers to use at launch; the rest was reserved for the OS. That 4.5GB was then further split between normal RAM and VRAM.
This meant the PS4 had maybe 3GB of VRAM available in the best case, with the remaining 1.5GB used for game logic.
Of course the 780 Ti didn't struggle initially, since it probably had more VRAM available than the PS4 did.
The Infamous Second Son developers even did a technical breakdown of how they used the 4.5GB of RAM available to them:
https://wccftech.com/infamous-son-consumed-ps4s-ram-gpu-cpu-visual-optimization-panel-details/
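Just to spell out the arithmetic from that breakdown (rough launch-era figures only; the OS reservation shrank in later firmware, and the exact split varied per title):

```python
# Back-of-envelope PS4 launch memory split -- illustrative figures only.
TOTAL_RAM_GB = 8.0                            # unified GDDR5 pool
OS_RESERVED_GB = 3.5                          # reserved by the OS at launch
available = TOTAL_RAM_GB - OS_RESERVED_GB     # 4.5 GB left for the game
game_logic_gb = 1.5                           # CPU-side data (per the breakdown above)
vram_like_gb = available - game_logic_gb      # ~3.0 GB left for graphics

print(f"Developer-available: {available:.1f} GB")     # 4.5 GB
print(f"Graphics budget:     {vram_like_gb:.1f} GB")  # 3.0 GB
```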
Is that a crash from the 2080 where the graph stops? This lacks context somewhat: are the frame times due to VRAM, or just bad frame times in general?
Edit: Not trying to be awkward, by the way, although it may seem that way. I just feel that, as a community, we should bring all of this together with evidence and have a real debate on the 10GB topic, as it's not as black and white as we'd hope, rather than just **** flinging from either side.
The test scene used in Wolfenstein 2 stutters at one point regardless of the hardware, which can be clearly seen in the diagrams. That said, all three graphics cards do the job well in 3,840 × 2,160. In 5,120 × 2,880, it can be seen that Nvidia has slightly better memory management. The Radeon RX Vega 64 falls apart completely, while the GeForce RTX 2080 still delivers good frame times. The only sign that the Nvidia card is running out of memory is that the Radeon VII, which has equally clean frame times, suddenly runs a little faster, even though the AMD card is otherwise slower in this game. Since the test scene is a worst-case scenario, Wolfenstein 2 is definitely playable on both graphics cards, even if the frame rate is not high.
In the graphics options, the Vulkan game offers a way to raise texture streaming by one further level beyond the highest preset. This loads additional textures into graphics memory, which is intended to prevent possible reloading and the stuttering that comes with it. In practice, however, the feature offers no advantage. If it is activated anyway, the 8,192 MB of the Radeon RX Vega 64 and the GeForce RTX 2080 is no longer enough, and Wolfenstein 2 becomes unplayable. The Radeon VII, however, handles it without any problems thanks to its 16GB and performs just as well as with the second-highest texture streaming level.
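For anyone wondering what that extra streaming level actually does, here's a toy sketch of the general idea (purely illustrative, all names invented; id Tech's real streamer is far more sophisticated): the game keeps a pool of resident texture data up to a budget, and raising the streaming level raises that budget. Once the working set no longer fits in physical VRAM, the pool evicts and re-uploads every frame, which is the unplayable stuttering described above.

```python
from collections import OrderedDict

class TextureStreamingPool:
    """Toy LRU texture pool: keeps textures resident up to a budget."""

    def __init__(self, budget_mb: float):
        self.budget = budget_mb
        self.resident = OrderedDict()   # texture id -> size in MB
        self.used = 0.0

    def request(self, tex_id: str, size_mb: float) -> str:
        """Called for every texture the current frame needs."""
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)        # mark as recently used
            return "hit"                             # free: already in VRAM
        # Evict least-recently-used textures until the new one fits.
        # If the frame's working set itself exceeds the budget, this
        # churns every single frame -- the thrashing the review describes.
        while self.used + size_mb > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        self.resident[tex_id] = size_mb
        self.used += size_mb
        return "upload"                              # costs a stall/stutter

# Raising the in-game streaming level is, in effect, raising this budget:
pool = TextureStreamingPool(budget_mb=8192)          # e.g. an 8GB card
```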
My mistake on the PS4 VRAM figures; I didn't dig any deeper than a five-second Google search.
I knew it was low, but I didn't realise it was 4.5GB-kind-of-low. We're in for a huge uplift in texture quality and scene complexity with these next-gen consoles. I can't wait to see what developers will bring to the table.
Agreed, it will be exciting to see the improvements we're going to get.
My worry with the 10GB on the 3080 is that I expect consoles to stop targeting native 4K once the launch hype dies down and instead design games around 1440p. We know consoles come with various compromises that usually amount to medium-type settings compared to PC, so I don't see how a 3080 rendering native 4K at high/max PC settings will cope. I expect it to hit VRAM limitations as early as next year...
If consoles target 1440p and medium settings with a VRAM budget of around 10GB, there is no way that will translate well onto PC.
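To put a rough number on that worry (pure back-of-envelope; every figure below is an assumption, and real usage depends entirely on the engine): render-target memory scales with pixel count, while most of the rest scales with asset quality, so a 10GB console budget at 1440p does not stay 10GB at 4K with higher settings.

```python
# All figures are assumptions for illustration, not measurements.
console_budget_gb = 10.0     # hypothetical console graphics budget at 1440p
rt_share = 0.3               # assumed share that is resolution-dependent
pixels_1440p = 2560 * 1440
pixels_4k = 3840 * 2160
scale = pixels_4k / pixels_1440p                 # = 2.25x the pixels

resolution_dependent = console_budget_gb * rt_share * scale
resolution_independent = console_budget_gb * (1 - rt_share)
texture_uplift = 1.2         # assumed cost of PC high/max asset tier

estimate_4k = resolution_dependent + resolution_independent * texture_uplift
print(f"Rough 4K/max estimate: {estimate_4k:.1f} GB")  # roughly 15 GB with these guesses
```

With these made-up shares the estimate lands well above 10GB, which is the whole concern.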
No one cares tbh, as we all know once they are available to buy, they'll be flying out, as will the 3070 with its 8GB.
- We then have the PG132-20 and PG132-30 boards, both of which are replacing the RTX 2080 SUPER graphics card and will have 20GB and 10GB worth of vRAM respectively. The PG132-20 board is going to be launching in the first half of October while the PG132-30 board is going to be launching in mid-September. It is worth adding here that these three parts are likely the SKU10, 20 and 30 we have been hearing about and the SKU20 is going to be targeted dead center at AMD's Big Navi offering (and hence the staggered launch schedule). Since AMD's Big Navi will *probably* have 16GB worth of vRAM, it also explains why NVIDIA wants to go with 20GB.
Yeah, I don't realistically think 10GB is enough. It may be enough NOW, but it's not like I upgrade my GPU every year. I'd want it to be enough for 3-4 years, and I'm pretty confident it won't be.
For sure get a minimum of 16GB if you're keeping it that long.