10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
Associate
Joined
17 Aug 2009
Posts
1,684
Then you can always throw in the fact that "ultra/epic/very high" settings generally bring little to nothing for a huge perf hit. So essentially, if you adjust settings properly in the first place and know what the limits of the card are, you can quite easily make a product last much longer.

Don't forget, what was Ultra in 2020 might only be High or Medium in 2022.

But you are right, in a lot of games I can't tell the difference between High and Ultra.
 
Soldato
Joined
27 Feb 2015
Posts
12,636
What HU advised is good and sensible tbh. Games tend not to offer great scaling options, and as they have shown, Nightmare or whatever the highest setting is named in the menus seems to barely differ from, say, Ultra. Environment settings seem to be a good one for clouds, reflections and distance, so it's down to how real you want it to look; adjust for your own experience.

The thing is, with this flexibility you ruin the point of benchmarking - the control aspect of any test/experiment is keeping all conditions the same. If users drop settings to gain higher fps, then surely they won't run into the issues others experience, which is where the comparisons get woolly.

I have seen many games where Ultra for things like fog is not worth it.

With shadows you can usually tell the difference, but I can live with softer shadows if I have to. The real problem is textures: proper 4K textures are a clear visual upgrade. Most mods out there for games are texture upgrades, because in many games that is a weakness.

A low quality texture will either be blocky and lacking detail, or it will be upscaled and look a blurry mess. I think some people just don't care, as I have seen people claim there is no difference, but take FF7 Remake as an example: I am playing with a high quality textured character, I then stand next to a wall and it's blurry and low quality - very jarring. However, I am a player who likes to absorb my surroundings and take my time in games; I don't play fast paced competitive shooters where people only seem to care about their frame rate. It feels like two sets of players disagreeing with each other: if you value performance, then VRAM is seen as an overhyped thing; if you value image quality and immersion, then it's important.

Nvidia are masters at marketing. There are nice things about AMD, but they don't make those things known. Nvidia are well aware that the vast majority of the review industry basically measures FPS and bases its reviews on that, so they balance their spec to maximise their scoring in those reviews. I personally see texture streaming as a 'hack' to compensate for a lack of VRAM; in an ideal world you would load an area entirely into memory, VRAM and RAM combined, but people don't like loading screens, and the main hardware manufacturer is resisting higher VRAM specs, so we have streaming as a compromise solution. I have observed two different types of streaming: one where you only get high quality textures, which tends to be vulnerable to pop-in, and one where it loads lower quality first and then, if there is enough VRAM, raises the quality as you move closer. Sometimes this happens quite late, so you can see textures upgrade as you approach, which can be jarring but not as bad as pop-in.
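To make that concrete, here's a minimal Python sketch of that second, progressive kind of streaming - the names, thresholds and sizes are all made up for illustration, not taken from any real engine:

```python
from dataclasses import dataclass

@dataclass
class Texture:
    distance_m: float      # distance from the camera
    full_size_mb: float    # size of the mip-0 (full resolution) version
    resident_mip: int = 2  # start with only the low-quality mip loaded

    def size_mb(self, mip: int) -> float:
        # Each mip level quarters the pixel count, so roughly quarters the size.
        return self.full_size_mb / (4 ** mip)

def desired_mip(distance_m: float) -> int:
    """Pick a target mip from camera distance: 0 = full res, 2 = blurry."""
    return 0 if distance_m < 10 else (1 if distance_m < 40 else 2)

def stream(textures: list[Texture], vram_budget_mb: float) -> float:
    """Everything keeps a low-quality mip; nearby textures upgrade if VRAM allows."""
    used = sum(t.size_mb(t.resident_mip) for t in textures)
    for t in sorted(textures, key=lambda t: t.distance_m):  # closest first
        want = desired_mip(t.distance_m)
        extra = t.size_mb(want) - t.size_mb(t.resident_mip)
        if want < t.resident_mip and used + extra <= vram_budget_mb:
            used += extra
            t.resident_mip = want  # a late upgrade here is the visible "quality pop"
    return used
```

With a big enough budget the upgrade pass always wins, which is why the upgrade-as-you-approach effect only shows up when VRAM is tight.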
 
Caporegime
Joined
4 Jun 2009
Posts
31,362
I have yet to notice any real pop-in or texture loading issues with regards to textures because of VRAM. What I do notice is poor LOD draw distance in pretty much every game, especially Ubisoft open world games. Obviously, if we could dump everything straight into VRAM, that would solve those LOD issues, but sadly not even 16GB, let alone 24GB, would be enough for this - not to mention it is an incredibly lazy and inefficient method.

This is why DirectStorage is going to be a must-have in the future. We have already seen how it benefits this kind of thing in Ratchet & Clank on the PS5, but given it is rather new tech and the majority of PC gamers won't be on the cutting-edge hardware to support it, we sadly won't see it any time soon, although I am hopeful that maybe Nvidia will sponsor some games for their version of it, RTX IO, to get the ball rolling.

https://www.nvidia.com/en-gb/geforce/news/rtx-io-gpu-accelerated-storage-technology/
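The core idea behind DirectStorage/RTX IO is queued, asynchronous reads that overlap with rendering instead of blocking it. A toy Python sketch of that shape - this is not the real API (which is C++, with GPU decompression), just the pattern:

```python
# Toy sketch of queued, asynchronous asset streaming - the idea behind
# DirectStorage/RTX IO, not its actual API.
import queue
import threading
import time

requests: queue.Queue = queue.Queue()

def io_worker():
    """Background thread servicing reads while the render loop keeps running."""
    while True:
        asset = requests.get()
        if asset is None:
            break
        time.sleep(0.05)          # stand-in for disk read + decompression
        print(f"loaded {asset}")  # a real engine would upload to VRAM here

worker = threading.Thread(target=io_worker)
worker.start()

# The render loop never blocks on the disk; it enqueues what the camera
# will need soon and keeps drawing with whatever LOD is already resident.
for frame in range(3):
    requests.put(f"chunk_{frame}")
    time.sleep(0.016)             # ~60 FPS frame budget
requests.put(None)
worker.join()
```

The blocking alternative is a loading screen, or a hitch every time the camera crosses a chunk boundary.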

I look forward to getting a 16GB card. As soon as I do I plan on coming in here to say 10GB is not enough for the lols :D
:cry:

I read an article earlier where Nvidia have found a way to get 2x the ray tracing performance, so looking forward to the 4070/4080 :D
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
28,346
Location
Greater London
I looked forward to getting a 10GB 3080 to find out for myself, and got one just before they upgraded it to 12GB. :p
To be fair it ain't a free upgrade, that extra 2GB costs more. As far as I know there is no FE for that version. If money ain't an issue then I suggest people go for the 3090 - 24GB then :p:D


:cry:

I read an article earlier where Nvidia have found a way to get 2x the ray tracing performance, so looking forward to the 4070/4080
I was expecting that from the 3000 series but was disappointed. Would not be surprised if we get 2x RT performance on the 4000 series. Plus you know what Nvidia are like: as soon as it is out, the 3000 series won't get any further love, and all the work on squeezing things out of game ready drivers will go to the 4000 series. Plus it's much more fun getting the latest gear each gen. It does not have to cost the world either when you sell your old card :D
 
Soldato
Joined
18 Feb 2015
Posts
6,490
Damn, even Nvidia sponsorships aren't enough to keep the vram thirst at bay (at least not past the launch window winkwink nudgenudge):

Ray tracing costs even more performance than before
Ray tracing has always cost a lot of performance in Cyberpunk 2077, regardless of the hardware, and that hasn't changed. [...]
So it's not surprising that a rendering resolution of 2,560 × 1,440 is too much even for the fast Ampere model. There is no more than 39 FPS even with medium ray tracing details, and with high ray tracing details the performance loss is 51 percent - because the GeForce RTX 3080 also runs out of its 10 GB of memory. As a result, Ultra HD without DLSS doesn't even need to be considered - it is already completely unplayable with RT on "Medium".
https://www.computerbase.de/2022-02/cyberpunk-2077-patch-1.5-benchmark/

inb4 "bUt yOu dOn'T uNdersTanD, thE gAmE is unOptimisEd!1!!1"

:cool:
 

G J

Associate
Joined
3 Oct 2008
Posts
1,429
I look forward to getting a 16GB card. As soon as I do I plan on coming in here to say 10GB is not enough for the lols :D

Well let's be honest, this thread is now just a troll. Has been for pages. ;)
 
Caporegime
Joined
4 Jun 2009
Posts
31,362
Damn, even Nvidia sponsorships aren't enough to keep the vram thirst at bay (at least not past the launch window winkwink nudgenudge):


https://www.computerbase.de/2022-02/cyberpunk-2077-patch-1.5-benchmark/

inb4 "bUt yOu dOn'T uNdersTanD, thE gAmE is unOptimisEd!1!!1"

:cool:

I sense bait but I'll bite anyway :cry:

Tiring/busy day, so maybe I'm missing something, but what exactly are they saying with regards to a "supposed vram issue" - or rather, what's their point? It's not very clear... That at 2560x1440 the 3080 is running out of VRAM and performance issues are happening because of it? And this is without DLSS? If so, it's a bit of a pointless point given that, again, the main thing is not enough RT grunt, hence the need for DLSS/FSR in the first place - and of course, when enabling these, VRAM usage drops.

And if we take a look at the rest of their article just below the quoted bit:

Speaking of unplayable: ray tracing on a Radeon was unplayable in Cyberpunk 2077 and remains so. The Radeon RX 6800 XT is already crawling along at 1,920 × 1,080 with RT on "Medium" at just under 38 FPS - a performance loss of a whopping 63 percent. And with RT on "High", the framerate drops to just under 26 FPS - that's 75 percent slower than without the rays. Ray tracing isn't even worth attempting on the AMD flagship Radeon RX 6900 XT.

[Benchmark charts from the ComputerBase article]

So what exactly is the point/relevance here? ;)
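For what it's worth, the percentages in that quote check out against its own FPS figures - a quick back-of-envelope calculation using only the numbers given there:

```python
# Back-of-envelope check on the quoted ComputerBase figures: the "loss" is
# relative to the same card with RT off, so baseline = fps / (1 - loss).
for fps, loss in [(38, 0.63), (26, 0.75)]:  # 6800 XT, RT Medium / RT High at 1080p
    print(f"{fps} FPS at a {loss:.0%} loss implies ~{fps / (1 - loss):.0f} FPS with RT off")
# Both imply roughly the same ~103 FPS no-RT baseline, so the two figures
# in the quote are at least self-consistent.
```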

Shame they didn't test with RT maxed and using DLSS/FSR - at least from skimming the article I couldn't see RT maxed, only Medium and "High"?



As for the new patch, I have only played a bit with it and have noticed no issues with RT maxed and a mix of custom settings for everything else, with DLSS Balanced (needed when using RT maxed; if I drop RT lighting to Medium I can get away with DLSS Quality, but then the game takes a hit to "overall" IQ). VRAM is maxed at 10GB but there are no performance issues. The only way I encountered VRAM issues in CP was when using several 4K and 8K texture mod packs - I had to remove two of the packs to get the VRAM usage to drop.

The new patch has added local RT shadows, which in theory should drop performance; however, it made no difference on my end, at least not that I have noticed so far, and there are also a few others reporting the same, e.g. mrk:

Where is this? RT is improved after the patch. We now have RT local shadows where we only had sun shadows previously, so FPS may drop slightly as a result of turning on RT shadows, since objects now cast RT shadows.

https://wccftech.com/cyberpunk-2077...-on-pc-thanks-to-partnership-with-nvidia/amp/

Still seems to run the same for me with all RT on and where available, set to Psycho.

Also seen several on reddit saying they haven't noticed a difference in fps.



It definitely is not a case of being "unoptimized" either, and if people claim this, they clearly haven't seen the game in action for themselves with RT maxed :cry:

Well let's be honest, this thread is now just a troll. Has been for pages. ;)

Pretty much. I await the day when I can't nuke everyone's so-called arguments/debates on this topic from orbit :cry: ;):cool:

EDIT:

Interesting, they also seem to find FSR awful in this, alongside TechPowerUp... whereas this is the first game where I thought FSR looked good, even at 3440x1440...
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Just tried the 1.5 bench at 1440p with everything maxed out, of course, and DLSS Quality, which gave an average of 62 FPS, while the VRAM used maxed out at ~7200MB.

No idea if Resizable BAR helps or hinders, but I have it enabled.

FSR looks garbage v DLSS.

A good example on reddit showing not just why DLSS outperforms FSR, but also how it can behave better than native.

https://www.reddit.com/r/nvidia/comments/svpvg9/dlss_in_cyberpunk_15_fixes_flickering_metal_fence/

This also proves that those who buy Nvidia have a more defined member :cry:
 
Associate
Joined
25 Apr 2017
Posts
1,127
I wish they would give us an option to turn off the other, newer reflections they have added. I used to play RT Ultra and DLSS Performance in CP2077 pre-1.5 at mostly 60 fps, with drops into the low fifties.

Now I am averaging 52 fps with drops into the high forties, and I have to turn off either the lighting or the reflections to get performance similar to the older version maxed out - and it starts looking worse at that point.

Also, there is something wrong with the RT in the game. I start out in the fifties, but after an hour of gameplay the fps just goes down into the forties and stays there permanently. The only way to fix it is to reload the save, after which my fps goes back into the fifties.

The game definitely looks better than before though. The lighting looks improved and there seem to be higher resolution textures on several surfaces. The game used to have a blurry look with DLSS, which is now resolved. But sadly it looks worse overall than pre-1.5, as I need to disable one major RT setting to achieve the same level of performance.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,762
Also, there is something wrong with the RT in the game. I start out in the fifties, but after an hour of gameplay the fps just goes down into the forties and stays there permanently. The only way to fix it is to reload the save, after which my fps goes back into the fifties.

I don't think that is the RT, though it might be. I get the same thing after 1-2 hours of playing - it goes from averaging ~60 FPS to 45 FPS and feels choppy until I restart the game or reload. It seems like some kind of bug in the streamed asset management, as if the game falsely detects a low memory state and starts swapping stuff out aggressively when it doesn't need to, causing a performance hit.
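Purely speculative, but the failure mode being described would look something like this - a Python sketch with invented names and thresholds, not anything from the actual game:

```python
from dataclasses import dataclass

@dataclass
class Asset:
    last_used: float  # timestamp of last use by the renderer
    size_mb: float
    resident: bool = True

def evict_under_pressure(assets: list[Asset], reported_free_mb: float,
                         low_watermark_mb: float = 1024.0) -> float:
    """Evict streamed assets, least recently used first, while 'low on memory'."""
    if reported_free_mb >= low_watermark_mb:
        return reported_free_mb          # healthy state: evict nothing
    for asset in sorted(assets, key=lambda a: a.last_used):
        if not asset.resident:
            continue
        asset.resident = False           # forces a re-stream (and a stutter) later
        reported_free_mb += asset.size_mb
        if reported_free_mb >= low_watermark_mb:
            break
    return reported_free_mb

# If reported_free_mb gets *stuck* below the watermark (a stale or corrupted
# reading), every pass evicts assets that are immediately streamed back in,
# giving a permanent fps drop that only a reload - which resets the bogus
# state - can fix. That matches the symptom described above.
```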

I've not tested it on my main system yet, but on my 3070 mobile laptop I strangely didn't take much of a hit with the 1.5 patch - maybe 3 FPS less than before - and ultra settings + ultra RT is still just about playable with DLSS Quality mode at 1080p.
 
Soldato
Joined
25 Sep 2009
Posts
9,748
Location
Billericay, UK
Damn, even Nvidia sponsorships aren't enough to keep the vram thirst at bay (at least not past the launch window winkwink nudgenudge):


https://www.computerbase.de/2022-02/cyberpunk-2077-patch-1.5-benchmark/

inb4 "bUt yOu dOn'T uNdersTanD, thE gAmE is unOptimisEd!1!!1"

:cool:
On the whole that's a decent analysis, but I'm not sure how they concluded the 3080 'ran out of vram'. Typically, when there's not enough buffer memory on the graphics card, the minimums fall off a cliff as the game has to fall back on slower system memory, but looking at the 1% results that's not happening.
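For reference, "1% lows" here means the average of the slowest 1% of frames - one common definition, sketched below - and it's exactly the metric where a VRAM spill would show up first:

```python
# One common definition of "1% low" FPS: the average frame time of the
# slowest 1% of frames in a capture, converted back to FPS.
def one_percent_low(frametimes_ms: list[float]) -> float:
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                # the worst 1% of samples
    return 1000.0 / (sum(worst[:n]) / n)

# A VRAM spill shows up here long before it moves the average: a few frames
# stall on PCIe transfers while the rest run normally.
frames = [16.7] * 990 + [50.0] * 10              # mostly 60 FPS, ten big stalls
avg = 1000.0 * len(frames) / sum(frames)
print(f"average: {avg:.0f} FPS, 1% low: {one_percent_low(frames):.0f} FPS")
# -> average: ~59 FPS, 1% low: 20 FPS - the minimums "fall off a cliff"
#    even though the average barely moves.
```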
 
Caporegime
Joined
4 Jun 2009
Posts
31,362
On the whole that's a decent analysis, but I'm not sure how they concluded the 3080 'ran out of vram'. Typically, when there's not enough buffer memory on the graphics card, the minimums fall off a cliff as the game has to fall back on slower system memory, but looking at the 1% results that's not happening.

Like I said bait.... :cry: :D

But I do agree, I can't really see the issue at 2560x1440 like they say... however, it does look like at 4K without any DLSS/FSR, VRAM is choking the 3080, which is why the 6800 XT then takes the lead.
 