16GB VRAM enough for the 6900XT? Discuss...

With the RTX 3090 packing 24GB and games becoming more demanding all the time, will 16GB of VRAM be enough to give the AMD Radeon RX 6900 XT decent longevity for a good few years to come, handling current games at the best settings?

Will next-gen consoles 'capable' of 4k gaming change the way developers create their games, increasing texture resolutions and therefore making VRAM capacity even more important for GPU performance?
Erm :p
 
Gears 5, Gears Tactics, AC: Valhalla and Dirt 5 are all 4k 60fps on Xbox Series X.

Two of those are old games, one has the same texture quality as a racing game on my iPhone, and the other may not be locked 4k, since Ubisoft's other Series X games aren't and use dynamic 4k instead. For example, Watch Dogs Legion drops as low as 1440p to maintain its framerate on the Series X.
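
To illustrate what 'dynamic 4k' actually does, here's a toy sketch of a dynamic resolution scaler in Python. The thresholds and scale steps are made-up illustrative numbers, not Ubisoft's actual logic:

```python
# Toy dynamic resolution scaler: if the last frame blew its time budget,
# render fewer pixels next frame; if there's headroom, scale back up.
# All constants here are illustrative, not any real engine's values.
TARGET_MS = 1000.0 / 30.0    # 30 fps frame budget, as in Legion's Series X mode
MIN_SCALE = 1440.0 / 2160.0  # floor at ~1440p, matching what's been observed
MAX_SCALE = 1.0              # ceiling at native 4k

def next_scale(scale: float, last_frame_ms: float) -> float:
    """Pick the render-resolution scale for the next frame."""
    if last_frame_ms > TARGET_MS:            # over budget: shrink
        scale *= 0.95
    elif last_frame_ms < TARGET_MS * 0.85:   # clear headroom: grow
        scale *= 1.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [30.0, 36.0, 38.0, 33.0, 25.0]:   # fake frame times
    scale = next_scale(scale, frame_ms)
    print(f"frame took {frame_ms:4.1f} ms -> next frame at {int(2160 * scale)}p")
```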
 
Was just reading up on memory hierarchy; it seems VRAM has to be sized for a 0% miss probability. That's a pretty steep requirement and would theoretically require an infinite amount of VRAM.
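
Actually, thinking about it more, the usual model sizes VRAM for an acceptable expected miss cost, not zero misses: spillover just falls back to system RAM over PCIe at a much lower bandwidth. A quick back-of-envelope in Python (both bandwidth figures are rough assumptions):

```python
# Effective bandwidth when a fraction of reads miss VRAM and fall back to
# system RAM over PCIe. Both bandwidth figures are rough assumptions.
VRAM_GBPS = 512.0   # assumed on-board memory bandwidth
PCIE_GBPS = 16.0    # assumed PCIe 3.0 x16 fallback bandwidth

def effective_gbps(miss_rate: float) -> float:
    """Bandwidth-weighted harmonic mean of the fast and slow paths."""
    return 1.0 / ((1.0 - miss_rate) / VRAM_GBPS + miss_rate / PCIE_GBPS)

for miss in (0.0, 0.01, 0.05, 0.10):
    print(f"{miss:4.0%} miss rate -> ~{effective_gbps(miss):5.1f} GB/s effective")
```

Even a 1% miss rate drops effective bandwidth by roughly a quarter, which is why spilling over VRAM stutters so badly, but it also shows you don't need literally zero misses, just very few.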
 
Well, that is just misleading. The 3080 replaces the 2080, which had 8GB.

I forgot they used the 2080 in all their comparisons...

If you want to point the finger over misleading comparisons, then perhaps it should be at NVIDIA, who compared it directly with the Ti and not the regular 2080.
 
Got a source? I've just had a quick look at Nvidia's marketing and announcements for the 3080, and every comparison has been directly with the 2080... Jensen mentioned the 2080 Ti once, in relation to saying it's faster.

There'll be a 3080 Ti with higher specs and 16/20GB of memory, I'm sure. It will probably be on TSMC at around £1000 to compete with the RX 6900; that's the successor to the 2080 Ti.
 
I ran a game and it allocated all 16GB of VRAM, so I would say 16GB is not going to be enough, because I have a limited understanding of how GPUs work, but I did once pass by a monitor with a YouTuber on it who was sure he knew everything, thus empowering my own knowledge. However, it may even cause my gentleman's sausage to fall off, and my virginity to continue for many, many more years, owning a card without enough VRAM.
 
I can understand asking if 10GB is enough, but c'mon, this has to be trolling.

It's clearly tongue in cheek because of the other threads, but it's actually a fair enough question to ask. The early benchmarks show the 6900 XT is a faster GPU than either the 3070 or the 3080, so I would expect it to be able to make use of about 10-12GB of VRAM tops. I think that's what the competitor will essentially launch: a 12GB 3080 Ti with a GPU of about the same speed. I think in 2-3 years' time we'll look back on this specific card and see the same thing we saw with the 1080 Ti or even the 2080 Ti, which is that the additional memory they came with was never realistically used for anything.

We even see the 3090 struggling for FPS in any game that pushes towards 10GB of VRAM usage: Watch Dogs Legion at ultra 4k, FS2020 at ultra 4k and Avengers at ultra 4k all consume less than 10GB of VRAM, yet the frame rates are already running into GPU bottlenecks.
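
One caveat when reading those numbers: most overlays report memory that has been *allocated*, and engines routinely over-reserve as a cache, so an allocation figure isn't the working set the game actually needs each frame. If you want to see the raw counter yourself on an NVIDIA card, a minimal sketch with the nvidia-ml-py bindings (assuming they're installed via pip) looks like this:

```python
# Minimal sketch: read the VRAM allocation counter on the first NVIDIA GPU.
# Requires the nvidia-ml-py package (pip install nvidia-ml-py).
# Note this is memory *allocated*, which games over-reserve as a cache;
# it is not the per-frame working set.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"used {info.used / 2**30:.1f} GiB of {info.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```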
 
The thing about adding more VRAM to a 6900 XT is that the suits at AMD wouldn't allow it. Imagine a 32GB variant offering twice the performance for a few pounds more. Not going to happen.
 
Opinion seems to be split on whether 10GB is enough. I'm not that techy-minded, so can someone explain why it wouldn't be enough? I'm going AMD as usual, but I'd just like to understand why there is such a split.
 
Got a source? I've just had a quick look at Nvidia's marketing and announcements for the 3080, and every comparison has been directly with the 2080... Jensen mentioned the 2080 Ti once, in relation to saying it's faster.

There'll be a 3080 Ti with higher specs and 16/20GB of memory, I'm sure. It will probably be on TSMC at around £1000 to compete with the RX 6900; that's the successor to the 2080 Ti.

No, I may have been incorrect if they used the regular 2080 in announcements.
I seem to recall them stating all of the new 30 cards were "faster than a 2080 Ti", but maybe I'm misremembering it, or maybe it was 3rd party reviewers.
 
It's clearly tongue in cheek because of the other threads, but it's actually a fair enough question to ask. The early benchmarks show the 6900 XT is a faster GPU than either the 3070 or the 3080, so I would expect it to be able to make use of about 10-12GB of VRAM tops. I think that's what the competitor will essentially launch: a 12GB 3080 Ti with a GPU of about the same speed. I think in 2-3 years' time we'll look back on this specific card and see the same thing we saw with the 1080 Ti or even the 2080 Ti, which is that the additional memory they came with was never realistically used for anything.

We even see the 3090 struggling for FPS in any game that pushes towards 10GB of VRAM usage: Watch Dogs Legion at ultra 4k, FS2020 at ultra 4k and Avengers at ultra 4k all consume less than 10GB of VRAM, yet the frame rates are already running into GPU bottlenecks.
Far Cry 5 (HQ Textures) and Call of Duty World at War 2 (Extra Quality textures) both use up around 12GB of video memory right off the bat at 5120x1440, and this creeps up to around 15-16GB over time, so no doubt there's some caching going on there.
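
To put rough numbers on why texture packs chew through VRAM: one texture costs width x height x bytes-per-texel, plus about a third more for its mipmap chain. A quick sketch (using the standard RGBA8 and BC7 storage rates):

```python
# Rough VRAM footprint of a single texture: base level plus ~1/3 extra
# for the full mipmap chain (the sum of 1/4^k converges to 4/3).
MIP_FACTOR = 4.0 / 3.0

def texture_mib(width: int, height: int, bytes_per_texel: float) -> float:
    return width * height * bytes_per_texel * MIP_FACTOR / 2**20

for name, bpt in [("RGBA8 uncompressed", 4.0), ("BC7 compressed", 1.0)]:
    print(f"4096x4096 {name}: ~{texture_mib(4096, 4096, bpt):.0f} MiB")
# ~85 MiB uncompressed, ~21 MiB as BC7: a few hundred unique 4k textures
# resident at once is gigabytes, which is what those HQ packs are doing.
```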

The 5700 XT can handle those games at these settings comfortably with playable FPS, but the stuttering that occurs due to video memory saturation is not a pleasant experience.

8GB will be enough for most games, but you will have to make image quality and texture quality sacrifices in some games, whereas users with 16GB won't. I personally wouldn't be comfortable using a GPU with more horsepower than a 5700 XT with only 8GB of memory, as I don't like lowering image quality because of video memory limits.

But as always, to each their own. Some people won't care about having to lower quality to mitigate stuttering and there's nothing wrong with that.
 
I seem to recall them stating all of the new 30 cards were "faster than a 2080 Ti", but maybe I'm misremembering it, or maybe it was 3rd party reviewers.

That was said for the 3080 and 3070 in the announcement. But that's just good marketing: it was their previous flagship, so it's in their interest to mention it (even though the 3070 is maybe 4 FPS faster in a handful of games). It doesn't mean that a particular card is its replacement, though.
 