
10GB vram enough for the 3080? Discuss..

[Screenshot: CglI0zx.jpg]

Here's what 4K, RT On, HD Textures, FSR Off looks like for me ;)

Seems okay with FSR @ UQ though...

It's possible that when they fixed the bug that caused low resolution textures to appear on nvidia cards, it increased VRAM usage enough that you will not be able to get away with running those settings on the 10GB card now.
 
And that's why I am bringing his video up, as well as what "some" 3090 owners over on the Ubi support forums posted with regards to FPS plummeting too, i.e. they are having awful FPS issues as well, and yet we have ruled out the amount of VRAM as being the factor, so what is causing their issues?
We haven't ruled it out. With the Joker video, it was going fine, then tanked. He then claimed taking a screenshot solved it. That may solve it for 3090 owners too, as it sounds like a game bug, which is a very different issue from the one I believe I have, where the VRAM never allows me to get going at 4K.

I am using Windows 10, and I have rebar on. Rebar is a difference which may impact things; I'll have a look at it later. I am not updating to Windows 11 until I absolutely have to though (it screws up my audio drivers and ruins my sound setup).
The 6800XT isn't the point, but I believe it is getting playable frame rates (maybe not a locked 60fps, but playable) at 4K with HD textures - it does not need FSR to do this. My 3080 is not getting playable frame rates unless I turn on FSR (reducing video processing requirements etc). Grunt is fairly similar between the two cards across a wide variety of games (RT excepted), so until I can prove otherwise by getting decent gameplay on my machine, VRAM is the big difference here.

I'm not interested in the 6800XT performance however. It's not the question I'm trying to answer.
 
[Screenshot: CglI0zx.jpg]

Here's what 4K, RT On, HD Textures, FSR Off looks like for me ;)

Seems okay with FSR @ UQ though...

It's possible that when they fixed the bug that caused low resolution textures to appear on nvidia cards, it increased VRAM usage enough that you will not be able to get away with running those settings on the 10GB card now.

My videos are after the patch, which fixed the texture rendering issue; it also improved my performance and fixed some of the stuttering too. As noted by several of us, with that patch, RAM usage jumped up by 5/6GB as well (my RAM usage went from 10/11GB to 15/16GB), so obviously whatever Ubi did meant they were offloading some work to system RAM.
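For anyone who wants to check the offloading claim on their own system rather than eyeballing an overlay, below is a minimal logging sketch. It assumes the Python packages nvidia-ml-py (imported as pynvml) and psutil, which are my choice of tooling rather than anything mentioned in the thread, and note that the VRAM figure NVML reports is what the driver has allocated, not necessarily what the game actively needs each frame.

```python
# Minimal sketch: sample dedicated VRAM and system RAM once a second while playing.
# Requires `pip install nvidia-ml-py psutil` (assumed tooling, not from this thread).
import time
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        vram = pynvml.nvmlDeviceGetMemoryInfo(gpu)  # allocated dedicated VRAM, bytes
        ram = psutil.virtual_memory()               # system RAM usage
        print(f"VRAM {vram.used / 2**30:4.1f}/{vram.total / 2**30:.1f} GiB | "
              f"RAM {ram.used / 2**30:4.1f}/{ram.total / 2**30:.1f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Left running in the background before and after the patch (or before and after a slowdown), the kind of jump described above should show up clearly in the log.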

We haven't ruled it out. With the Joker video, it was going fine, then tanked. He then claimed taking a screenshot solved it. That may solve it for 3090 owners too, as it sounds like a game bug, which is a very different issue from the one I believe I have, where the VRAM never allows me to get going at 4K.

I am using Windows 10, and I have rebar on. Rebar is a difference which may impact things; I'll have a look at it later. I am not updating to Windows 11 until I absolutely have to though (it screws up my audio drivers and ruins my sound setup).
The 6800XT isn't the point, but I believe it is getting playable frame rates (maybe not a locked 60fps, but playable) at 4K with HD textures - it does not need FSR to do this. My 3080 is not getting playable frame rates unless I turn on FSR (reducing video processing requirements etc). Grunt is fairly similar between the two cards across a wide variety of games (RT excepted), so until I can prove otherwise by getting decent gameplay on my machine, VRAM is the big difference here.

I'm not interested in the 6800XT performance however. It's not the question I'm trying to answer.

But can't you see it is still the same behaviour as in his video back at launch, and the same as the 3090 owners who reported the issue: it's still FPS plummeting to single digits, there's no two ways about that..... So we have removed the "VRAM amount" from the equation, yet for some reason they still get FPS plummeting? Why is that? Could it be there is something wrong with the game where, on some setups, it just so happens to cause issues, the same way, for example, the game isn't even launching for me yet it is obviously launching fine for you?

If you have a recent nvidia driver, rebar should be turned off on the driver side.

And what about in my case? If I switched to a 6800XT, it would provide more or less the same performance that my 3080 gets me in FC 6, and I would still have to turn FSR on.... To me, it needs to be a locked 60 to be playable; anything less, such as those dips into the 40s reported by PCGH and ComputerBase, is not "playable", the same way the dips into the 40s on my 3080 are not playable.
 
So why the big deal about having to enable FSR on the 3080 to solve the issues on your end and whoever else's, when at the same time no one bats an eyelid at having to enable FSR on the competing card with more VRAM because said card doesn't have enough grunt as it is....
Missed this - it's not a big deal. At all. I would DEFINITELY use it.

But the point I'm trying to make is that it seems 10gb is not enough for this game to play 4k WITHOUT upscaling it from a lower resolution.
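For a sense of how much work FSR actually takes off the card, here is a rough back-of-envelope calculation. It assumes the published FSR 1.0 scale factors (Ultra Quality at roughly 1.3x per axis, and so on); whether Far Cry 6 uses exactly these factors isn't confirmed here, so treat the numbers as ballpark.

```python
# Back-of-envelope: internal render resolution at a 4K output with FSR 1.0.
# Scale factors are the published FSR 1.0 defaults (assumed, not verified for FC6).
modes = {"Native": 1.0, "Ultra Quality": 1.3, "Quality": 1.5,
         "Balanced": 1.7, "Performance": 2.0}
target_w, target_h = 3840, 2160

for name, scale in modes.items():
    w, h = int(target_w / scale), int(target_h / scale)
    pct = 100 * (w * h) / (target_w * target_h)
    print(f"{name:13s}: {w}x{h}  (~{pct:.0f}% of the 4K pixel count)")
```

Even at Ultra Quality the game is rendering roughly 40% fewer pixels, which shrinks the render targets as well as the shading load, so it eases both VRAM pressure and framerate at the same time.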
 
Cheers Bill, I see it is choking hard in actual gameplay at 4K max settings. It's exactly the same as what PCGH and ComputerBase saw. Also the same as what Tommy and Gerard saw, but they were in different parts of the campaign, further on in the game. @tommybhoy, you still got your video?

Ahh. So it's exactly what we were posting. One, two, three, four, five.. users then (and it will keep incrementing as more people use these conditions), so unlikely to be anything other than a hardware limitation. :cool: Also note the posts which show (fingers in ears from multiple people) that playing at 4K resolution seems to be less than 10% of the people that own the card. No wonder they have never seen it then! :cry:

But the point I'm trying to make is that it seems 10gb is not enough for this game to play 4k WITHOUT upscaling it from a lower resolution.

Glad you caught up with the inner knitting posters then Bill?! :rolleyes:

Enjoy the sun.
 
And what about in my case? If I switched to a 6800XT, it would provide more or less the same performance that my 3080 gets me in FC 6, and I would still have to turn FSR on.... To me, it needs to be a locked 60 to be playable; anything less, such as those dips into the 40s reported by PCGH and ComputerBase, is not "playable", the same way the dips into the 40s on my 3080 are not playable.
Agreed. But my dips are to the 10s. That is not comparable. I need FSR to go from the 10s to the 60s, not from the 40s to the 60s. It's very different. 40, though too low for me, is actually playable.

Maybe your machine isn't fully loading textures, which is why you can get it working? Unlikely, I know, but you can't even start it now, so something's not right.
 
Ahh. So it's exactly what we were posting. One, two, three, four, five.. users then (and it will keep incrementing as more people use these conditions), so unlikely to be anything other than a hardware limitation. :cool: Also note the posts which show (fingers in ears from multiple people) that playing at 4K resolution seems to be less than 10% of the people that own the card. No wonder they have never seen it then! :cry:

Enjoy the sun.
We did, ta; we were down on the Yarmouth beach front with the little 'un earlier and popped in to say hello to @mingey. :cry:
 
Missed this - it's not a big deal. At all. I would DEFINITELY use it.

But the point I'm trying to make is that it seems 10gb is not enough for this game to play 4k WITHOUT upscaling it from a lower resolution.

Finally you agree then that certain individuals are just making a big deal of it for the narrative/agenda of "zOMG vram not enough, massive issue in multiple games right now! Let me go and secretly stroke my 3090 now to justify the expensive purchase" :D

As we have all said, why aren't there far more people posting about it on other forums (again, people will always post more when they have issues; just see the CP 2077 release day for proof of this)? And why didn't the rest of the review/technical sites note it (we only have 2? btw, both of which had "rebar" on [nvidia had it enabled at launch but disabled it later on], which is where I faced the issues), including HUB, who did note it in their 3070 video? (Although I believe they had rebar on for the 3080 vs 6800XT video too, so maybe rebar isn't the "sole" cause of the issue either?)

Agreed. But my dips are to the 10s. That is not comparable. I need FSR to go from the 10s to the 60s, not from the 40s to the 60s. It's very different. 40, though too low for me, is actually playable.

Maybe your machine isn't fully loading textures, which is why you can get it working? Unlikely, I know, but you can't even start it now, so something's not right.

Yup, like I said, I don't get FPS plummets anywhere near that. I've already stated where and when I get issues with FPS plummeting to that level though:

- benchmark regardless of rebar
- gameplay with rebar on

If it was happening in gameplay with the same settings, I would say so, but alas, I think we can agree it's not quite as clear-cut as "100% VRAM" and "nothing else" at play here.

You can see my videos; the textures look fine to me. This was an issue before the patch though, as I posted many screenshots showing the issue before the patch in the FC 6 thread.

As I said many times, if we use the same brush to tarnish everyone with some issue, then by that logic everyone should be facing the same issue I am currently encountering, i.e. the game not launching at all for me.


:D
 
Finally you agree then that certain individuals are just making a big deal of it for the narrative/agenda of "zOMG vram not enough, massive issue in multiple games right now! Let me go and secretly stroke my 3090 now to justify the expensive purchase" :D
No, that's not at all what I'm saying; I'm saying 10GB is not enough (seemingly) to run this game at 4K without upscaling. I am fine with that.
I said many times, if we use the same brush to tarnish everyone with some issue, then by that logic everyone should be facing the same issue I am currently encountering, i.e. the game not launching at all for me.
That'll be system issues.
 
No, that's not at all what I'm saying; I'm saying 10GB is not enough (seemingly) to run this game at 4K without upscaling. I am fine with that.

That'll be system issues.
I'm referring to this comment of yours:

It's not a big deal. At all. I would DEFINITELY use it.

Yet 2 individuals are insisting that it is a big deal and that the GPU is suffering in "multiple" games because of it (still waiting on the evidence from them though, since all we have so far is "anecdotal").

Careful now with "blanket statements" Bill..... :cry: It's not enough in "your" and a select few others' situations; for me and HUB it appears fine, if we ignore the low FPS due to the lack of grunt :D ;)

But every other game launches fine, including other games on my uplay connect account, so it must be an issue with the game.... ;)
 
Yet 2 individuals are insisting that it is a big deal and that the GPU is suffering in "multiple" games because of it (still waiting on the evidence from them though, since all we have so far is "anecdotal").
I'm not talking about, promoting or defending this. I'm certainly not sure who is saying it's a big deal; I thought it was generally agreed that turning a setting down was absolutely fine. Let me know where I'm mistaken though.

I'm looking at a 3080 not being able to run Far Cry at 4K, with the evidence (at least for my machine) being that VRAM is the reason. I cannot see any other reason as yet. Other people have also, today, found the same as me. If we can sort it, that's brilliant for everyone.

At the moment, it's your machine that is the outlier (in this forum). Speaking of which
As we have all said, why aren't there far more people posting about it on other forums
Loads on Reddit. You have pointed out that people are having various issues with high-VRAM cards, but there are also loads who find the same as me. It could well be two separate issues here, one being a bug, one being a VRAM limit. All I'm trying to do is clear up the VRAM limit issue, so that we can say for certain that it is an engine bug causing the drop-outs. I'm so far unable to do that.

If VRAM isn't an issue, it should be easy to solve. The answer from a gameplay point of view is to run FSR, which is fine and should be done. But that doesn't disprove the VRAM issue when running native.
 
I'm not talking about, promoting or defending this. I'm certainly not sure who is saying it's a big deal; I thought it was generally agreed that turning a setting down was absolutely fine. Let me know where I'm mistaken though.

I'm looking at a 3080 not being able to run Far Cry at 4K, with the evidence (at least for my machine) being that VRAM is the reason. I cannot see any other reason as yet. Other people have also, today, found the same as me. If we can sort it, that's brilliant for everyone.

At the moment, it's your machine that is the outlier (in this forum). Speaking of which

Loads on Reddit. You have pointed out that people are having various issues with high-VRAM cards, but there are also loads who find the same as me. It could well be two separate issues here, one being a bug, one being a VRAM limit. All I'm trying to do is clear up the VRAM limit issue, so that we can say for certain that it is an engine bug causing the drop-outs. I'm so far unable to do that.

If VRAM isn't an issue, it should be easy to solve. The answer from a gameplay point of view is to run FSR, which is fine and should be done. But that doesn't disprove the VRAM issue when running native.

Clearly you haven't read the man-with-an-axe-to-grind posts then; don't blame you if you have skipped them though, tbf :cry:



Like I've said a few times now when it comes to my scenario, HUB's, and the likes of yours:

why this happens, I don't know, same way I don't know why it also happens to some 3090 users too

Again, FPS plummeting to single digits is still the same issue; how people get to those issues may very well be down to different bugs/external factors, the same way the menu bug you encounter might be something entirely different, but alas, neither you nor I can confirm what this is. I think we can agree there is a lot more going on under the hood than just "100% VRAM" if you look at the "big picture" outside of just your experience or mine. As I've noted a few times, working in the development industry and dealing with very similar issues every day on various products, I am also looking at this from a completely different perspective, as opposed to just a "consumer" POV.

EDIT:

Btw, have a watch of this video (with rebar and FSR on) just to throw in another peculiar issue, and note the FPS at the start of the video and then at the end, particularly at 1:38:


Seems a bit odd why FPS would suddenly be lower just because rebar is turned on (bearing in mind what rebar does, i.e. it lets the CPU address the GPU's VRAM in full rather than through the old 256MB window)? Correct?
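As a quick way to sanity-check the ReBAR side of this, the BAR1 aperture size reported by the driver is a reasonable proxy: around 256MB when Resizable BAR is off, close to the card's full VRAM when it is on. Below is a small sketch using NVML via the pynvml package (my assumed tooling, not something used in the thread); note that even with a large aperture, whether the Nvidia driver actually uses ReBAR in a given game still depends on its per-game profile.

```python
# Sketch: read the BAR1 aperture (the CPU-visible window into VRAM) via NVML.
# ~256 MiB suggests Resizable BAR is off; ~full VRAM size suggests it is on.
# Requires `pip install nvidia-ml-py` (assumed tooling).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

bar1 = pynvml.nvmlDeviceGetBAR1MemoryInfo(handle)
vram = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"BAR1 aperture: {bar1.bar1Total / 2**20:.0f} MiB, "
      f"VRAM: {vram.total / 2**30:.0f} GiB")

pynvml.nvmlShutdown()
```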
 
Seems a bit odd why FPS would suddenly be lower just because rebar is turned on (bearing in mind what rebar does, i.e. it lets the CPU address the GPU's VRAM in full rather than through the old 256MB window)? Correct?
Maybe? I thought rebar wasn't perfect in a number of games, and nVidia defaulted rebar to off and only turned it on in games that it did work correctly with?

I can't see any reason why rebar should hurt performance, but I think it does in a number of games. That would suggest a rebar issue rather than necessarily a game-specific one, wouldn't it?
 
I can't believe this thread is still going. The reason it exists is because 10GB was not enough for a flagship card. People buy high-end cards so they don't have to care about VRAM or even think about it. On a high-end card you turn settings up to what the GPU is capable of with no regard for VRAM, but in this case it was always a question. So yeah, it was not enough, and had Nvidia given it 12GB, which would have needed a higher spec, we would not be here. AMD, with their spec, had to go 16GB as 8GB was not enough. It's not rocket science that Nvidia wanted to sell some 3090s and thought a 12GB card was gonna get too close to being perfect at the price point.
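On the "12GB would have needed a higher spec" point, the constraint is the memory bus: with the 1GB GDDR6X modules available at launch and one module per 32-bit channel, capacity follows directly from bus width. A quick worked example (the bus widths are the actual 3080/3080 Ti specs; the one-module-per-channel layout is how those launch cards were built):

```python
# Capacity follows bus width: one 1 GB GDDR6X module per 32-bit channel at launch.
configs = {"RTX 3080 (320-bit bus)": 320, "RTX 3080 Ti / 3080 12GB (384-bit bus)": 384}
gb_per_module = 1  # launch-era GDDR6X density

for name, bus_bits in configs.items():
    modules = bus_bits // 32
    print(f"{name}: {modules} modules x {gb_per_module} GB = {modules * gb_per_module} GB")
```

So 12GB specifically requires the wider 384-bit memory interface, which is the "higher spec" being referred to.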
 
Maybe? I thought rebar wasn't perfect in a number of games, and nVidia defaulted rebar to off and only turned it on in games that it did work correctly with?

I can't see any reason why rebar should hurt performance, but I think it does in a number of games. That would suggest a rebar issue rather than necessarily a game-specific one, wouldn't it?

As per HUB's testing, it very much depends on your hardware (regardless of AMD or Nvidia GPU) and the game. However, at the same time, there are games where it is defaulted to "off" even though, based on my testing/machine, it actually improved performance, such as in Halo Infinite, so YMMV.

Like my previous answer:

why this happens, I don't know, same way I don't know why it also happens to some 3090 users too

Could that maybe explain the people with 3090s who are encountering issues? Or even explain Joker's issue? (Not sure if he had rebar on or off; it probably was on though, as that video was from launch time.) But then HUB didn't have any issues with their testing, so who knows....

Just thought I would throw that in as another weird issue where fps drops for no apparent reason (although not quite the same as single digit fps).

I can't believe this thread is still going. The reason it exists is because 10GB was not enough for a flagship card. People buy high-end cards so they don't have to care about VRAM or even think about it. On a high-end card you turn settings up to what the GPU is capable of with no regard for VRAM, but in this case it was always a question. So yeah, it was not enough, and had Nvidia given it 12GB, which would have needed a higher spec, we would not be here. AMD, with their spec, had to go 16GB as 8GB was not enough. It's not rocket science that Nvidia wanted to sell some 3090s and thought a 12GB card was gonna get too close to being perfect at the price point.

Precisely, that's why the 3090 existed: so they could milk more money from said victims.




Would like to see some more testing from the tech press on AMD vs Nvidia VRAM consumption in several games too, as a few sites have noted how Nvidia sees less VRAM usage compared to AMD. So perhaps Nvidia handle VRAM utilisation differently/more efficiently on their end than AMD, which is why AMD need more VRAM in the first place? Or maybe Nvidia are cutting corners somewhere by not fully rendering textures 100% in the distance or elsewhere? If it were this, they are doing an incredibly good job given it is undetectable......
 
We've gone almost 2 years and only 1 game has caused the 3080 VRAM-related issues: an AMD-sponsored title with a texture pack and RT designed for AMD's cards. Not bad for a £650 GPU when top of the range for Nvidia now costs £1800.

AMD's £600-900+ GPUs have had issues running RT from day one, and in many more games than just one.
 