10GB VRAM enough for the 3080? Discuss...

Kinda pointless question really: 10GB VRAM is what the RTX 3080 has, so you just adjust settings as needed. If not, buy AMD or wait for next gen.
'Adjust settings as needed' on a brand-new, £700 flagship GPU because it doesn't have the VRAM to run Ultra on new game(s) that it would otherwise have enough power to run? Yeah, that sounds like a road to buyer satisfaction on a card you bought less than a month or two ago.

This whole Godfall thing has yet to be confirmed in real-world testing, but as others have posted, there is no smoke without fire, and Ampere has been dogged by concerns about VRAM from both consumers and tech media alike since the moment the specs were released. There's no way I would buy such an expensive card when I'm not even sure, from the moment I buy it, whether it has enough VRAM to deal with games released in the next 6 months... never mind 2 years. It's the kind of uncertainty you just shouldn't have to deal with on an expensive flagship before you even buy it, and I can't remember the last time this was such a concern on a new flagship card. Seriously, it's just nuts to buy something at this price with that kind of uncertainty from the get-go.

AMD going with 16GB of cheaper GDDR6 on Big Navi (GDDR6 that, thanks to their clever engineering, ends up just as fast in real-world performance as a card with newer and more expensive GDDR6X) was a very smart choice, both to put people's minds at ease and to remove that potential limitation as a concern for the entire generation. It's what you expect a company to do on a flagship GPU for 2021. Nvidia even openly said in their developer blog that they were forced to compromise on VRAM due to the price of GDDR6X. AMD didn't have to compromise; they were smarter.

The other day I was also thinking that at some point AMD would use some of their own marketing tactics and cash to work with devs to exploit the extra VRAM and features the 3080 doesn't have, and I see it didn't take them long. If I were AMD I would do exactly the same, and as I have previously posted, I predict some level of buyer's remorse for any 3080 10GB owner in 2021, purely because of the 10GB of VRAM. It just hasn't kicked in yet.
 
999 times out of 1,000 in my experience it is because they aren't comparing like with like. There is a very slight difference, mostly in red saturation (someone linked to it before), but properly set up it is very close and in most cases impossible to tell by eye.

This ain't true IMO, but I would have believed you if I didn't have eyes. Straight swap from a 290X 8GB to a 1080 Ti, and I wasn't the only one who noticed it. Full RGB was the first thing I made sure was selected. He had a second PC so we ran them side by side, though this time it wasn't completely apples to apples as the 290 was on a cheaper monitor, and the difference was still very easy to see. We even brought his kids in, who don't know crap about PC hardware. They also thought the 290 screen looked way better.

It can't just be setup, as this has been going around for a long time now, with many saying the same and not too many thinking the other way around.

The 1080 Ti was a huge improvement and in the main he was pleased, pleased enough to then buy a 2080 Ti as the performance was there, but he was always a bit miffed when at mine that my visuals looked better. People can say and think what they like, but once you see it you have to go on your own experience.
 
With that screen it's a no-brainer: 165 frames synced to perfection at 1440p, plus DLSS, which is becoming more widely adopted now. Don't let the 10GB brigade get in your head; these are features you can make use of right now, and tbh at your resolution you should be more than fine. Hand on heart, I would rather have G-Sync with a slower card than a faster card without it; it makes games feel smooth as butter.
Yeah, but I'm wondering: if I can run high frame rates anyway, would it matter that much? I suppose I can just turn off G-Sync now and see how it runs, and if I don't like it then stick with Nvidia. It's disappointing though, potentially better performance with AMD and more VRAM.
 

I have no idea what is going on in your instance, but I have actually tested my old 780GHz side by side with a 290X on my Dell U2913WM, which comes with an individual, industry-standard certified colour calibration for the panel and has a PBP mode for side-by-side testing. Properly set up (and that doesn't include digital vibrance adjustments), the differences between them are incredibly tiny; in fact, by eye the only difference I could really see was that AMD had slightly better distinction between dark red and dark brown colours in Dishonored, which matches up with the PCM analysis handily.

If there was that level of difference, GN, HU, etc. would be all over it.

You probably remember the thread on this a while back; there are some slight differences, as per here: https://pcmonitors.info/articles/correcting-hdmi-colour-on-nvidia-and-amd-gpus/

As an aside, cheaper monitors can sometimes use tricks to boost perceived colour contrast to make the display look more attractive to those who don't have an eye for colour accuracy, at the expense of image quality.
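For anyone curious what the linked article is getting at: if the GPU outputs limited-range RGB (16-235) over HDMI but the monitor expects full range (0-255), blacks and whites get clipped short and the image looks slightly washed out. A rough sketch of the expansion, purely as an illustration (not code from the article):

```python
# Rough illustration: expanding HDMI "limited range" (16-235) values to
# "full range" (0-255). If the display expects full range but receives
# limited range as-is, black sits at 16 and white at 235, so the picture
# looks a little flat compared to a correctly configured card.

def limited_to_full(value: int) -> int:
    """Expand an 8-bit limited-range value (16-235) to full range (0-255)."""
    expanded = (value - 16) * 255 / (235 - 16)
    return max(0, min(255, round(expanded)))

print(limited_to_full(16))   # -> 0   (true black once expanded)
print(limited_to_full(128))  # -> 130 (mid grey shifts slightly)
print(limited_to_full(235))  # -> 255 (true white once expanded)
```

Which is exactly why a full-vs-limited RGB mismatch on one of the two machines can swamp any real difference between the cards.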
 
I just read over here about Godfall needing 12GB of VRAM too. Looks like the 3070 will suffer, and maybe the 3080, especially if more games are like that. I wonder if Cyberpunk at 4K maxed out needs more than 10GB?
https://www.dsogaming.com/news/nvidia-geforce-rtx3080-godfall-4k-ultra-12gb-vram/

Just to add to this: considering that this game has been in development for probably 2-3 years and we only found out the specs for the 3000 series a month ago, the data in the game probably isn't being inflated by AMD to screw over Nvidia. They probably saw it more as an opportunity to take a shot at Nvidia and got the devs to mention it in the presentation.

It does serve as an interesting look into what to expect from next-gen games. Let's see how they've used all that extra VRAM.
 
Here is something interesting about VRAM usage, from TechPowerUp, that I did not know about:

Because of the high-res texture pack, which is an optional download, we can finally make use of all the VRAM paired with modern GPUs. Even at 1080p, Watch Dogs Legion will use more than 6 GB, which is a good thing as textures are much improved. NVIDIA cards with 3 GB and 4 GB VRAM do much better than their AMD counterparts in this memory constrained situation. For example, the 4 GB RX 5500 XT drops to only 12 FPS, whereas the GTX 1060 3 GB runs twice as fast at 22 FPS. This just confirms what we've been seeing in such scenarios previously—NVIDIA handles memory pressure much better than AMD.

Link


It will be interesting to see the results once Godfall and future next-gen games come out.
 

That *might* explain why AMD consistently tries to offer more VRAM than Nvidia, if its compression tech is inferior to Nvidia's.

Also, 6GB of VRAM used at 1080p is impressive.

True next-gen games will be worse. I just watched an uncompressed 4K video capture of Miles Morales on the PS5 and the texture resolution is insane; I could see little grains in the pavement and road. It's extremely high quality and making full use of that 16GB of memory.
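As a rough back-of-envelope on why high-res textures eat memory so quickly (my own illustrative numbers, not measured from that capture or any particular game):

```python
# Back-of-envelope texture memory estimate. Illustrative numbers only,
# not taken from any particular game.

def texture_size_mb(width, height, bits_per_pixel, mip_chain=True):
    """Approximate VRAM footprint of one texture, in MiB."""
    size_bytes = width * height * bits_per_pixel / 8
    if mip_chain:
        size_bytes *= 4 / 3  # a full mip chain adds roughly one third
    return size_bytes / (1024 ** 2)

# A 4096x4096 texture, block-compressed (e.g. BC7) at 8 bits per pixel:
print(round(texture_size_mb(4096, 4096, 8), 1))   # ~21.3 MiB

# The same texture uncompressed at 32 bits per pixel:
print(round(texture_size_mb(4096, 4096, 32), 1))  # ~85.3 MiB
```

A few hundred unique high-res materials in view, plus render targets and geometry, adds up to several GB very quickly, which is why texture packs are usually the biggest driver of VRAM usage.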
 
'Adjust settings as needed' on a brand-new, £700 flagship GPU because it doesn't have the VRAM to run Ultra on new game(s) that it would otherwise have enough power to run?

The 3080 already can't hit a 60fps average on a number of titles at 4K if you max everything out, and of all the ones we've seen so far, none of them have been a result of insufficient VRAM, not even Watch Dogs. People buying a new card expecting everything to run flawlessly at ultra, over-the-top settings need a reality check, I think.
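If anyone wants to check for themselves whether a dip is actually VRAM-related rather than a lack of raw GPU grunt, one quick-and-dirty way is to log memory use while playing. A minimal sketch using the NVML Python bindings (pip install pynvml, NVIDIA cards only), bearing in mind that this reports allocated memory, which games routinely over-allocate, so it hints at memory pressure rather than proving it:

```python
# Minimal VRAM logger using NVIDIA's NVML bindings (pip install pynvml).
# Reports *allocated* memory, which games often over-allocate, so treat the
# numbers as a hint of memory pressure rather than proof of it.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GB")
        time.sleep(2)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```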
 
Because of the high-res texture pack, which is an optional download, we can finally make use of all the VRAM paired with modern GPUs. Even at 1080p, Watch Dogs Legion will use more than 6 GB, which is a good thing as textures are much improved. NVIDIA cards with 3 GB and 4 GB VRAM do much better than their AMD counterparts in this memory constrained situation. For example, the 4 GB RX 5500 XT drops to only 12 FPS, whereas the GTX 1060 3 GB runs twice as fast at 22 FPS. This just confirms what we've been seeing in such scenarios previously—NVIDIA handles memory pressure much better than AMD.
Did they base this conclusion purely on the 5500 XT? If so, you might want to have a read of this. https://www.techpowerup.com/forums/...-have-enough-bandwidth-for-rx-5500-xt.262236/ :)

Test System
Processor: Intel Core i9-9900K @ 5.0 GHz (Coffee Lake, 16 MB Cache)
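For context on why that test system matters: the 5500 XT is a PCIe x8 card, so on a PCIe 3.0 platform like that i9-9900K it has comparatively little bus bandwidth to fall back on once its 4 GB of VRAM overflows. Rough numbers (approximate effective rates, ignoring protocol overhead):

```python
# Approximate effective PCIe bandwidth per lane, in GB/s.
# The RX 5500 XT only has a x8 link, so once its 4 GB VRAM is exceeded it
# streams assets over the bus, and PCIe 3.0 offers half the per-lane
# bandwidth of PCIe 4.0.
PER_LANE_GBPS = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969}

for gen, per_lane in PER_LANE_GBPS.items():
    print(f"{gen}: x8 ~ {per_lane * 8:.1f} GB/s, x16 ~ {per_lane * 16:.1f} GB/s")

# PCIe 3.0: x8 ~ 7.9 GB/s   (what the 5500 XT gets on an i9-9900K system)
# PCIe 4.0: x8 ~ 15.8 GB/s  (what it gets on a PCIe 4.0 platform)
# Either way it's a small fraction of the card's ~224 GB/s of local VRAM
# bandwidth, which is why spilling out of VRAM hurts so badly.
```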
 
That *might* explain why AMD consistently tries to offer more VRAM than Nvidia, if its compression tech is inferior to Nvidia's.

Also, 6GB of VRAM used at 1080p is impressive.

True next-gen games will be worse. I just watched an uncompressed 4K video capture of Miles Morales on the PS5 and the texture resolution is insane; I could see little grains in the pavement and road. It's extremely high quality and making full use of that 16GB of memory.
Yeah. It will be interesting to see what happens over the next 12 months.

As someone who will likely only be keeping the 3080 for 18 months or so and selling before the next gen hits, I am OK with 10GB.

Before all the specs came out I wanted to see the 3070 with no less than 12GB, and 16GB for the 3080. But it is what it is. No doubt with the next-gen Hopper cards the 4070 will be 16GB minimum, and all this 'is 10GB enough for next-gen titles' rubbish will be forgotten about.
 
Did they base this conclusion purely on the 5500 XT? If so, you might want to have a read of this. https://www.techpowerup.com/forums/...-have-enough-bandwidth-for-rx-5500-xt.262236/ :)

Test System
Processor:
Intel Core i9-9900K @ 5.0 GHz
(Coffee Lake, 16 MB Cache)
But they said “This just confirms what we've been seeing in such scenarios previously—NVIDIA handles memory pressure much better than AMD”, so I assume they have seen it more than once.

Either way, it does not change my mind on anything. We will see what is what in 12 months' time :)
 
I just read over here about Godfall needing 12GB of VRAM too. Looks like the 3070 will suffer, and maybe the 3080, especially if more games are like that. I wonder if Cyberpunk at 4K maxed out needs more than 10GB?
https://www.dsogaming.com/news/nvidia-geforce-rtx3080-godfall-4k-ultra-12gb-vram/

Not surprised. 10GB on the flagship is a total joke. Luckily for Nvidia, it will be quickly rectified with the 3080 Ti. I wonder if they'll go with 20GB on the same 320-bit memory bus, or increase the bus to 384-bit and go with 12/24GB? I don't think they'll get away with 12GB either, from a PR standpoint.

I wonder how quickly they can rush it out?
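Rough numbers on how those capacities fall out of the bus width, assuming one 32-bit GDDR6X package per channel in either 1GB (8Gb) or 2GB (16Gb) densities:

```python
# Capacity options follow from bus width: one 32-bit GDDR6X package per
# channel, in either 1 GB (8 Gb) or 2 GB (16 Gb) densities (assumption for
# illustration; the 3090 actually reaches 24 GB with 1 GB chips in clamshell).

MODULE_SIZES_GB = (1, 2)

for bus_bits in (320, 384):
    channels = bus_bits // 32
    options = [channels * size for size in MODULE_SIZES_GB]
    print(f"{bus_bits}-bit bus -> {channels} channels -> {options} GB")

# 320-bit -> 10 channels -> [10, 20] GB   (3080, or a 20GB variant)
# 384-bit -> 12 channels -> [12, 24] GB
```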
 
That *might* explain why AMD consistently tries to offer more VRAM than Nvidia, if its compression tech is inferior to Nvidia's.

Also, 6GB of VRAM used at 1080p is impressive.

True next-gen games will be worse. I just watched an uncompressed 4K video capture of Miles Morales on the PS5 and the texture resolution is insane; I could see little grains in the pavement and road. It's extremely high quality and making full use of that 16GB of memory.

Nvidia's compression technology is inferior to AMD's. See, I can make claims too with no evidence :rolleyes:

It's just amazing how you spout nonsense like that before we've had third-party reviews testing said memory compression; RDNA2 is a new architecture, yo.
 
It is TechPowerUp making the claim, not anyone here.

Still yet to see a single game that needs more than 10GB, yet you keep spouting nonsense that it is not enough in Doom etc.
 
Kinda pointless question really: 10GB VRAM is what the RTX 3080 has, so you just adjust settings as needed. Unfortunately GDDR6X isn't cheap, nor particularly power efficient.

Game developers will try to keep to the VRAM budget available on recent GPUs; there's not much point in designing settings for GPUs costing over £1,000, like the RTX 3090.

If not, buy AMD or wait for next gen.

GDDR6X is made to be more power efficient than GDDR6: at 21Gb/s, GDDR6X uses 15% less power per bit than GDDR6 at 14Gb/s. https://venturebeat.com/2020/09/15/...-unlocking-4k-on-nvidias-rtx-30-series-cards/

Everyone ignores the fact that AMD have much lower memory bandwidth, and at 4K that will hurt performance. This is why the whole line-up from the 6900 XT down has the same VRAM: if they reduced the RAM, the bandwidth loss would be massive. It would be like the whole Nvidia line-up going with the 3090's 24GB of VRAM and 384-bit bus.
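Rough numbers on the raw bandwidth gap, using the public launch specs (RTX 3080: 19Gbps GDDR6X on a 320-bit bus; RX 6800 XT: 16Gbps GDDR6 on a 256-bit bus, with the 128MB Infinity Cache being AMD's answer to the narrower bus, which this simple calculation doesn't capture):

```python
# Raw memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8.
# Launch specs assumed: RTX 3080 = 19 Gbps GDDR6X, 320-bit;
# RX 6800 XT = 16 Gbps GDDR6, 256-bit (plus 128 MB Infinity Cache,
# which isn't reflected in this raw number).

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(f"RTX 3080:   {bandwidth_gb_s(19, 320):.0f} GB/s")  # 760 GB/s
print(f"RX 6800 XT: {bandwidth_gb_s(16, 256):.0f} GB/s")  # 512 GB/s
```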

Also, every AMD benchmark ignores RT. They run games in DX11 (DX12 is faster: https://youtu.be/gUVItGDbygM?t=164), like Battlefield V. Shadow of the Tomb Raider has no AA, RT or DLSS used; same with Wolfenstein: Youngblood, no RT or DLSS. You can tell by the FPS numbers that something is off. The only reason to do that is that Nvidia cards are faster with different settings.

I doubt the 6800 XT can run an RT game at 4K @ 60fps at this point. I would guess AMD cards are faster in DX11. Time will tell if this is true.
 