
10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.

Yup resident evil village characters are very nicely detailed and there are a lot of areas which look fantastic, more so the indoor environments.

But surely we could do that for any game? i.e. google and find complaints about anything, even if it is a small minority of people having issues? Bit like that horizon zero dawn texture "issue", which turned out to be down to one guy's system and/or an issue with the game, and was resolved with patches and/or a change to his system. Like I said, based on my experience, I didn't have any severe issues, and it seems the "majority" of users on here didn't either. Also, cyberpunk seems to be doing alright on metacritic for pc etc., so these severe LOD/texture pop-in issues mustn't be that bad if the "overall" majority are rating it good and not having these game-ruining issues....

Again, that's a preference though, isn't it? i.e. if I just took an initial look at the below, I would automatically go "oh, the non ray traced version looks better", but then take a closer look and think of how real-life lighting and your eyes work: the ray traced version is far more accurate (not to mention, screenshots don't show just how much better ray tracing is):

dS86mBo.png

3p3NAMa.png

No bias here (I've had an 8400, 8800, 3850, 4850, 7850, vega 56 and now a 3080, and I play any game whether it is amd or nvidia sponsored; it just so happens to be nvidia sponsored games atm, which I prefer). I buy whatever is best for bang per buck at the time, and that just so happened to be the 3080 this time round for my needs (more so because there was zero chance of getting a 6800xt at MSRP). Now when was the last time you had an nvidia gpu? :p ;) :D

Like I said before, it's swings and roundabouts on here with both sides, again:

- fury x when it released with its low vram: the same arguments were had back then as are happening right now
- crysis 2 and witcher 3's overdone tessellation: the same arguments as are being had right now, but with vram instead

Mark my words, watch as the tables are reversed for the next big feature/gen ;)

Also, it has been proven that RE has "dumbed down" ray tracing. Unless you have got something that proves and states otherwise?


Not exactly rocket science.... amd don't have ray tracing perf. as good as nvidia's, therefore consoles/amd have to dial back the effects, otherwise they cripple themselves. Although that's better than nvidia going overboard to the point they cripple even themselves, a la witcher 3 and crysis 2..... Why would you turn off ray tracing though? Unless, as stated, you have a "preference" for the way fake lighting looks, but given this game's setting and visuals/artistic style, you really would want ray tracing turned on...

Isn't this thread about the 3080/10GB VRAM????? I don't think anyone disagrees that 8GB is absolutely not enough going forward for 4k.... (was the 3070 ever pushed as being a 4k60 gpu either???) Not to mention the 3080 and 3070 have different types of VRAM, so you can't exactly do a like-for-like comparison based purely on how much vram is being used, the same way people said the same about the fury x and its 4GB HBM....


So pretty much what I said then: because they reuse assets.... this is one way they can keep vram low, therefore there are methods where you can "optimise" your game and still have it look considerably better than vram-hungry games. Like I said, it's all about the end result: if a game has far more unique assets but due to that, they have had to make sacrifices in other areas which results in a worse looking end result, who cares then? Now compare that to a game which has reused assets and looks far better.... Guess which game I and I imagine the majority of people will prefer... which is exactly why the likes of div, rdr 2, gta 5, batman ak, cyberpunk, days gone are referred to as some of the best graphical games....

which category is that? I can't remember.

:D
 
I can't fathom the utter nonsense coming from people saying 10GB is enough even when there is ample evidence from multiple games that shows it isn't.

My own experience with my 3080 is that most times it is enough but on some occasions it utterly tanks. DCS Normandy map with quick mission against lots of enemies utterly tanks a 3080 on my Pimax 8KX. Oh but that's only one map in one sim game and with a Pimax 8KX VR HMD, so that's an outlier and doesn't count. My 6800 non XT (now sold) played that DCS Normandy mission on my Pimax 8KX far better than the 3080. So it absolutely wasn't a lack of GPU grunt but the VRAM limit.

You can't win with these people who keep moving the goalposts to mean "10GB is enough in the games I care about and we can dismiss the rest".

So what is it guys? Is 10GB enough as long as we stop finding instances where it isn't?
 
How about HW acceleration bloat then? Even if you have "10 GB", if you leave these programs in the background in VRAM-heavy games, you're bound to have

- low 1% lows
- possible frame drops
- possible performance degradation

I get degraded framerate performance across a variety of ray tracing games at 1080p with a 3070 if I don't disable hardware acceleration in the programs I use in the background (you may say that I can just shut them down, but I like them being in the background. This is why I chose PC as a platform; being able to quickly switch between Discord, Chrome and other programs is not something I can do freely on a console.)

vlUs9AY.png

Tef76sU.png

In the end, my tweaks helped a lot in Cyberpunk with RT enabled. But:

- Juddery and laggy rendering performance for Steam, Discord, Chrome; it is clear these apps were designed with HW acceleration in mind
- Extra burden on the CPU, because they now run entirely on the CPU (which I can sacrifice, thankfully)

I have even created a shortcut to kill dwm.exe, because it gets bigger over time for unknown reasons and never evicts VRAM when games need it. I have made extensive tests on this subject. Windows 10 is clearly an unoptimized mess.

These are all valid points for a 3080. These reviews are mostly done on setups where every kind of background app is closed. What if the casual user doesn't do that? This is one of the most casual situations you can be in. Windows 10 and 3-4 programs clearly use nearly "1 GB" of VRAM. So you don't even have a raw 8 GB or raw 10 GB with a 3070 and 3080 unless you shut down everything or turn off HW acceleration (which is a valuable tip. I bet most 3080 users who love multitasking will have to use these kinds of CHEAP tactics for bearable frametimes in the near future, as I'm already being forced to at 1080p with certain RT games like Cyberpunk and RE Village)
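The "you don't even have a raw 10 GB" arithmetic above is easy to sanity-check. Here's a toy Python sketch of the headroom calculation; the process names and MB figures are made-up examples of what Task Manager's "Dedicated GPU memory" column might show with HW acceleration left on, not actual measurements.

```python
# Toy illustration of the background-app VRAM bloat arithmetic.
# All per-process figures below are hypothetical, for illustration only.

def vram_headroom_mb(total_mb, background_mb):
    """Dedicated VRAM left for a game after background apps take their cut."""
    return total_mb - sum(background_mb.values())

# Hypothetical per-process dedicated VRAM usage (MB).
background = {
    "dwm.exe": 350,
    "chrome.exe": 400,
    "Discord.exe": 150,
    "steamwebhelper.exe": 100,
}

# A "10 GB" 3080 in this scenario really offers ~9.2 GB to the game.
print(vram_headroom_mb(10 * 1024, background))  # 9240
```

The same sum applied to a 3070's 8 GB leaves proportionally less, which is why the poster feels the squeeze sooner at the same settings.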
 
Hardware Unboxed raised an interesting question about some of the games coming out now: with 8 and 10GB of VRAM proving to be not really enough, are the RTX 3070 and 3080 starting to struggle a little in some later games?

It's ok, don't worry, Nvidia will have higher-capacity VRAM GPUs to sell you soon :)

They also mention RT performance on RDNA2 GPUs, which appears to be improving with later games. So at this point they are conceding that the RT games they tested were designed for Nvidia, and that AMD's RT performance should not be judged on those games. Yes, I said this at the time, and they say they said this all along; no they ####### didn't.

https://youtu.be/oQ-qFOcgzGo?t=273
 
Hardware Unboxed raised an interesting question about some of the games coming out now: with 8 and 10GB of VRAM proving to be not really enough, are the RTX 3070 and 3080 starting to struggle a little in some later games?

It's ok, don't worry, Nvidia will have higher-capacity VRAM GPUs to sell you soon :)

They also mention RT performance on RDNA2 GPUs, which appears to be improving with later games. So at this point they are conceding that the RT games they tested were designed for Nvidia, and that AMD's RT performance should not be judged on those games. Yes, I said this at the time, and they say they said this all along; no they ####### didn't.

https://youtu.be/oQ-qFOcgzGo?t=273

Extra raster firepower definitely helps offset the RT performance deficit to a degree as well. Not to mention, RDNA2 benefits hugely from SAM and that is also a boost. The only thing missing from the equation for now is FSR.
 
They also mention RT performance on RDNA2 GPUs, which appears to be improving with later games. So at this point they are conceding that the RT games they tested were designed for Nvidia, and that AMD's RT performance should not be judged on those games. Yes, I said this at the time, and they say they said this all along; no they ####### didn't.

RT performance is going to be a moving target for many years, and it makes sense that RT favours Nvidia as AMD only recently entered the game.

Software and implementation techniques will improve as it becomes a more common feature in games, which should reduce the overhead.
 
Yup resident evil village characters are

I assume RT is on the left. Man, the difference between it being on or off shouldn't be that huge in the second pic. Not sure why it is so big; surely they could be closer together. It'd be like playing a different game with that much difference in lighting.

Also, can't believe people are still arguing this point. It's been known since the beginning that 10gb wasn't going to be enough, but it was fine for most current games and probably some future ones too. I guess the price point/performance of the 3080 just meant some people didn't want to accept it.
 
A lot of these reviewers just plain called AMD's RT performance very average at best and very bad at worst. The kindest reasoning a lot of them offered was that it was AMD's first attempt at it and Nvidia had a lot more experience; the fact that these were all Nvidia sponsored titles was rarely given as a reason.

I suppose they believed that. Perhaps one day they will stop underestimating AMD and stop having to rewrite their own history to save face, but how many times do AMD have to slap them in the face before they realise AMD should be much more carefully considered in what they can do?
 
I can't fathom the utter nonsense coming from people saying 10GB is enough even when there is ample evidence from multiple games that shows it isn't.

My own experience with my 3080 is that most times it is enough but on some occasions it utterly tanks. DCS Normandy map with quick mission against lots of enemies utterly tanks a 3080 on my Pimax 8KX. Oh but that's only one map in one sim game and with a Pimax 8KX VR HMD, so that's an outlier and doesn't count. My 6800 non XT (now sold) played that DCS Normandy mission on my Pimax 8KX far better than the 3080. So it absolutely wasn't a lack of GPU grunt but the VRAM limit.

You can't win with these people who keep moving the goalposts to mean "10GB is enough in the games I care about and we can dismiss the rest".

So what is it guys? Is 10GB enough as long as we stop finding instances where it isn't?

Sorry if it has been brought up before but what other games is a 3080 having issues with because of vram @1440/4k? AFAIK, the only ones people have mentioned so far have been:

- godfall - an outlier and even then, as per the youtube videos etc. posted before, a 3080 looks to be performing very well here

- hzd - debunked since the one person who did a comparison was using a 5700xt/vega iirc and on an older version of the game, which had issues rendering textures

- re village - as above, seems like there aren't any issues with performance and a 3080 actually performs better than a 6800xt (with ray tracing on)

DCS Normandy map? What game is that from?

How about HW acceleration bloat then? Even if you have "10 GB", if you leave these programs in the background in VRAM-heavy games, you're bound to have

- low 1% lows
- possible frame drops
- possible performance degradation

I get degraded framerate performance across a variety of ray tracing games at 1080p with a 3070 if I don't disable hardware acceleration in the programs I use in the background (you may say that I can just shut them down, but I like them being in the background. This is why I chose PC as a platform; being able to quickly switch between Discord, Chrome and other programs is not something I can do freely on a console.)

vlUs9AY.png

Tef76sU.png

In the end, my tweaks helped a lot in Cyberpunk with RT enabled. But:

- Juddery and laggy rendering performance for Steam, Discord, Chrome; it is clear these apps were designed with HW acceleration in mind
- Extra burden on the CPU, because they now run entirely on the CPU (which I can sacrifice, thankfully)

I have even created a shortcut to kill dwm.exe, because it gets bigger over time for unknown reasons and never evicts VRAM when games need it. I have made extensive tests on this subject. Windows 10 is clearly an unoptimized mess.

These are all valid points for a 3080. These reviews are mostly done on setups where every kind of background app is closed. What if the casual user doesn't do that? This is one of the most casual situations you can be in. Windows 10 and 3-4 programs clearly use nearly "1 GB" of VRAM. So you don't even have a raw 8 GB or raw 10 GB with a 3070 and 3080 unless you shut down everything or turn off HW acceleration (which is a valuable tip. I bet most 3080 users who love multitasking will have to use these kinds of CHEAP tactics for bearable frametimes in the near future, as I'm already being forced to at 1080p with certain RT games like Cyberpunk and RE Village)

I never close discord, steam, chrome etc. when gaming on a 3080, 5600x and 16gb ram, even with cyberpunk.

iPYCRHy.png

Although I do install the minimal/clean nvidia drivers, because geforce experience is a POS and I don't want any telemetry BS going on.

As mentioned before, your cpu (2700x iirc) is what is causing a lot of your issues, because of the nvidia driver overhead. Upgrading my 2600 @4.1GHz to a 5600x improved the experience tenfold.

Hardware Unboxed raised an interesting question about some of the games coming out now: with 8 and 10GB of VRAM proving to be not really enough, are the RTX 3070 and 3080 starting to struggle a little in some later games?

It's ok, don't worry, Nvidia will have higher-capacity VRAM GPUs to sell you soon :)

They also mention RT performance on RDNA2 GPUs, which appears to be improving with later games. So at this point they are conceding that the RT games they tested were designed for Nvidia, and that AMD's RT performance should not be judged on those games. Yes, I said this at the time, and they say they said this all along; no they ####### didn't.

https://youtu.be/oQ-qFOcgzGo?t=273

Yup, metro enhanced ray tracing was very good on amd hardware. I don't think amd's current RDNA 2 will ever firmly beat or even match ampere for ray tracing perf. though:

https://www.pcgameshardware.de/Rayt...cials/ART-Mark-Raytracing-Benchmarks-1371125/
 
I assume RT is on the left. Man, the difference between it being on or off shouldn't be that huge in the second pic. Not sure why it is so big; surely they could be closer together. It'd be like playing a different game with that much difference in lighting.

Also, can't believe people are still arguing this point. It's been known since the beginning that 10gb wasn't going to be enough, but it was fine for most current games and probably some future ones too. I guess the price point/performance of the 3080 just meant some people didn't want to accept it.

Ray traced version is on the right.

Not a great screenshot/comparison that first one tbf as you can't see the light sources entirely.
 
Yup, metro enhanced ray tracing was very good on amd hardware. I don't think amd's current RDNA 2 will ever firmly beat or even match ampere for ray tracing perf. though:

https://www.pcgameshardware.de/Rayt...cials/ART-Mark-Raytracing-Benchmarks-1371125/

Agreed; turns out it's a lot better than initially proclaimed tho.

PS: I don't know the relevance of the link you posted; it appears to be a German-language site with a crap ton of advert banners and some synthetic RT benchmark.
 
Isn't this thread about the 3080/10GB VRAM?????
Yes it is but you took it off topic with tangents, so I'll leave it there. :p

I can't fathom the utter nonsense coming from people saying 10GB is enough even when there is ample evidence from multiple games that shows it isn't.

My own experience with my 3080 is that most times it is enough but on some occasions it utterly tanks. DCS Normandy map with quick mission against lots of enemies utterly tanks a 3080 on my Pimax 8KX. Oh but that's only one map in one sim game and with a Pimax 8KX VR HMD, so that's an outlier and doesn't count. My 6800 non XT (now sold) played that DCS Normandy mission on my Pimax 8KX far better than the 3080. So it absolutely wasn't a lack of GPU grunt but the VRAM limit.

You can't win with these people who keep moving the goalposts to mean "10GB is enough in the games I care about and we can dismiss the rest".

So what is it guys? Is 10GB enough as long as we stop finding instances where it isn't?
Pretty much, yeah. You've summed it up well.
 
Ray traced version is on the right.

Not a great screenshot/comparison that first one tbf as you can't see the light sources entirely.

I prefer the screenshots on the left lol

They look overblown on the right.

But the difference still shouldn't be that big, seems a poor attempt by the developers.
 
Sorry if it has been brought up before but what other games is a 3080 having issues with because of vram @1440/4k? AFAIK, the only ones people have mentioned so far have been:

- godfall - an outlier and even then, as per the youtube videos etc. posted before, a 3080 looks to be performing very well here

- hzd - debunked since the one person who did a comparison was using a 5700xt/vega iirc and on an older version of the game, which had issues rendering textures

- re village - as above, seems like there aren't any issues with performance and a 3080 actually performs better than a 6800xt (with ray tracing on)

DCS Normandy map? What game is that from?



I never close discord, steam, chrome etc. when gaming on a 3080, 5600x and 16gb ram, even with cyberpunk.

iPYCRHy.png

Although I do install the minimal/clean nvidia drivers, because geforce experience is a POS and I don't want any telemetry BS going on.

As mentioned, your cpu (2700x iirc) is what is causing a lot of your issues, because of the nvidia driver overhead.



Yup, metro enhanced ray tracing was very good on amd hardware. I don't think amd's current RDNA 2 will ever firmly beat or even match ampere for ray tracing perf. though:

https://www.pcgameshardware.de/Rayt...cials/ART-Mark-Raytracing-Benchmarks-1371125/


https://www.youtube.com/watch?v=sJ_3cqNh-Ag (1+ gb shared memory usage, stutters)

https://www.youtube.com/watch?v=RUdK3W6bYWc (0.7 gb shared memory usage, stutters)

https://www.youtube.com/watch?v=v74fd_w_AxI (game restart, 0.15 gb shared memory usage, pristine performance)

(compare and contrast the last 2 vids. Tell me if it's CPU related, really)


The CPU has NOTHING to do with the topic. We're talking about VRAM here (I proved it. The video is there. The shared memory usage can be seen. The higher PCIe bandwidth usage can be seen [it is because the game constantly streams data from RAM to VRAM]). I'm having VRAM RELATED issues in CYBERPUNK, GODFALL, AND RE VILLAGE AT 1080p/1440p. There's no point bringing up the CPU here (nice trick to derail a discussion, though. props)

And do you play at 4K? If so, weird. I would like to have 10 minutes of video where you cruise around the city with a frametime graph open. I present real and solid facts; I would suggest you do the same. If you're at 1440p, this is irrelevant. 10 GB will be good enough for a couple more years. It is at 4K where most of the discussion starts.
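For why "shared memory usage + high PCIe bandwidth" in the videos above goes hand in hand with stutter, here's a back-of-envelope sketch. The bandwidth figures are nominal spec numbers (a 3080's GDDR6X and PCIe 4.0 x16), not measurements; real sustained throughput is lower still.

```python
# Assets that overflow dedicated VRAM spill into shared (system) memory
# and must then be streamed over the PCIe bus, which is far slower than
# reading from local VRAM. Nominal figures only, for illustration.
VRAM_BANDWIDTH_GBPS = 760.0   # approx. GDDR6X bandwidth on a 3080
PCIE4_X16_GBPS = 32.0         # approx. one-way PCIe 4.0 x16 bandwidth

slowdown = VRAM_BANDWIDTH_GBPS / PCIE4_X16_GBPS
print(f"Fetching spilled assets is roughly {slowdown:.0f}x slower than VRAM")
```

So even a small fraction of the frame's working set living in system RAM can dominate frame time, which matches the pattern in the videos: stutter while shared memory usage is high, pristine performance after a restart clears it.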
 
Metro Exodus Enhanced is skimping on RT effects, probably due to it being a free upgrade.

It's when you dial up the effects, such as in CP2077, that you see AMD crumple. Even Matt admitted that you have to turn down IQ for AMD to be competitive in that title, and therein lies the problem. Why would I buy a next gen card that just doesn't do next gen well?

At best RDNA2's RT is comparable to Turing and we know how that turned out.

So back on topic, is 10GB enough? Well why isn't it today and why won't it be tomorrow?
 
So pretty much what I said then: because they reuse assets.... this is one way they can keep vram low, therefore there are methods where you can "optimise" your game and still have it look considerably better than vram-hungry games. Like I said, it's all about the end result: if a game has far more unique assets but due to that, they have had to make sacrifices in other areas which results in a worse looking end result, who cares then? Now compare that to a game which has reused assets and looks far better.... Guess which game I and I imagine the majority of people will prefer... which is exactly why the likes of div, rdr 2, gta 5, batman ak, cyberpunk, days gone are referred to as some of the best graphical games....



:D

No, not pretty much what you said, not even close. You also, ironically enough, ignored this:

So we are just going to ignore that certain games, genres and artistic directions lend themselves more to reusing certain assets to help build the environment, over other games which may require more unique assets per scene to help build the environment.

I guess you just didn't understand the statement. That is also fine.


You then proceed to claim this

if a game has far more unique assets but due to that, they have had to make sacrifices in other areas

How does increasing the number of unique assets mean they are making sacrifices in other areas relating to graphics? The number of unique assets used to produce a certain look is based on the concept art and/or the artistic direction of the game. That's it.

Guess which game I and I imagine the majority of people will prefer... which is exactly why the likes of div, rdr 2, gta 5, batman ak, cyberpunk, days gone are referred to as being some of the best graphical games....

You also seemed to have ignored this part as well

Define "good" and "better"? Are we talking technically or artistically? Because one of those has nothing to do with VRAM.

I see a theme going on. Then again, it is not like you actually care why the VRAM usage differs between these games, now do you.
 
I prefer the screenshots on the left lol

They look overblown on the right.

But the difference still shouldn't be that big, seems a poor attempt by the developers.

There are not many people who wear sunglasses while gaming :eek:

The image on the right has an increased number of light bounces, making it more realistic. Metro Exodus on release was negatively criticised for its RT due to the overly dark areas.
 