Cyberpunk 2077 Ultra performance

Lol.

Call of Duty Black Ops maximum settings with the HQ Texture pack says hi.
Is that allocated, or what is actually used, though? It's well known that Call of Duty games allocate a lot more than they use. Just ask Kaapstad, he showed as much years ago.
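For anyone who wants to poke at the allocated-vs-used distinction themselves, here's a minimal sketch (Windows, DXGI 1.4) that reads the video memory budget and the calling process's committed usage. CurrentUsage is what the process has allocated against the budget, not what the GPU actually touches each frame; a game or overlay would run this from inside the process being measured. Error handling is trimmed.

```cpp
// Minimal sketch: read the video memory budget and the calling process's
// committed ("allocated") usage via DXGI 1.4. CurrentUsage is what the
// process has committed against the budget, not what the GPU touches per
// frame. Error handling trimmed for brevity.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // first GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("Budget: %.2f GiB, CurrentUsage (allocated): %.2f GiB\n",
                info.Budget / (1024.0 * 1024.0 * 1024.0),
                info.CurrentUsage / (1024.0 * 1024.0 * 1024.0));
    return 0;
}
```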


You're right, I exaggerated; it was 8GB. In the early days of the thread, posters were talking like games wouldn't grow in size, but that argument was quickly nipped in the bud. However, some people seem to think games growing in size is a rare occurrence rather than the norm. *Insert pointless comment about selling card*

The irony being that with Nvidia reshuffling their line up to have more VRAM, the size increase in games is going to happen a lot sooner than I expected.
VRAM requirements have always gone up with time. I do not understand why anyone would think otherwise. Not really news ;)


I play Black Ops with the settings on 90% memory allocation, so the game uses 22GB of VRAM. Beat that :p

This is now a contest to see who can have the most VRAM allocated by a game.
Must mean his 16GB is not enough. Must be a stuttery, not-smooth experience... :D
 
I read it and said it was ironic because of the below; even though you may have said this in jest, others will feed off it, and my ability to even try to discuss performance and features in this game is lost.
Seriously? I love me some banter, but don't lose sight of my posts and don't take them personally. Like I said, I respect your view, and that alone should tell you that I am not against what you say. In fairness, I had no idea what you were running, nor did I care, and I accepted many of your posts as valid points. I would just like AMD users to be able to run RT in CP2077 and then make up their minds. I certainly wasn't making an "us vs them" post lol.
 
I'm not seeing any "you are with us or against us". All I'm seeing are RTX and/or Nvidia diehard fans ignoring people's posts simply stating that, without RTX, Cyberpunk is terrible compared to what has been done many times before in other games. As soon as someone posts screenshots/evidence showing games like Alien: Isolation or RDR2's reflections, the RTX PR team come in and go "but but but, RTX looks so much better in Cyberpunk, anything non-RTX sucks massively"....

Then, when asked for a simple yes or no, you get a huge TL;DR drivel post dissecting different methods/optimisations etc., i.e. not relevant to the simple question that was asked: "forget RTX, does Cyberpunk's default SSR look as good as what has been done in multiple games over the last few years?"


Like I posted a page or 2 back, it's exactly like monitor PR teams trying to highlight differences:

[image: monitor marketing comparison]

When in reality, the difference isn't that big unless the product/image has been sabotaged in order to make the advantages of said new item/tech stand out even more.

Exactly this
 
I play Black Ops with the settings on 90% memory allocation, so the game uses 22GB of VRAM. Beat that :p

This is now a contest to see who can have the most VRAM allocated by a game.

I’ve got to ask, does it run better for you with the 90% VRAM option enabled?

I’m sure it does for me. Everything just runs more smoothly, the game and cut scenes.
 
I think FS2020 is over 20GB, will see if I can screenshot.

24GB.... might need to go 64GB soon? All settings maxed, looks nice.
[screenshots: FS2020 at maxed settings showing VRAM usage]
 
Games needing more than 10GB is inevitable. We are just waiting to laugh at the people who thought it wasn't possible. Lol

Edit: And laugh at the notion that VRAM is hard tied to performance. Lol

I don't think there's anyone I've read who thinks it's impossible to go over 10GB of usage. That's a straw man of the far more reasonable positions people have actually taken on this issue, which is that right now we're not using more than 10GB for anything, and likely won't in the near future, because you become GPU-bottlenecked long before you're VRAM-bottlenecked.

VRAM of course is tied to performance. The purpose of VRAM is to hold data that the GPU can do calculations on; that's 100% the purpose of having VRAM. If the GPU doesn't need to do work on something that is in VRAM, then there's no point in that thing being loaded into VRAM; it's just wasting space. If the GPU has to do calculations based on what is in VRAM, then it makes perfect sense that as you put more assets into VRAM, you increase the workload on the GPU, and performance goes down.

That's intuitively the case for pretty much all gamers. If you've ever played a game that shows estimated VRAM usage in the video settings menu, you'll see that any time you increase visual fidelity by turning up settings, the estimated VRAM usage goes up, and performance goes down. To say the two aren't hard-tied together is some weird fantasy world; it's just not true in the slightest.

The reality today is that with the GPUs we have and the workload they can sustain, we cannot make use of more than 10GB. Any games that come close to using all 10GB of a 3080 are slideshows; we demonstrated this quite conclusively in the "is 10GB enough" thread, which had hundreds of pages of examples. FS2020 uses about 9.5GB at peak (4K ultra) measuring real usage, and even a 3090 runs at about 25fps at those settings. Same for Watch Dogs: Legion at 4K ultra, where you can get real memory usage pretty close to 10GB but performance is like 5fps or something really low. Avengers at 4K ultra is the same thing: about 9GB of memory used but, if memory serves, about 17fps.

The consoles already have access to 12GB of VRAM for the game's assets, so why wouldn't developers make games that need at least 12GB of VRAM? The current 3070 and 3080 won't age like fine wine.

Er, well they don't. They both have 16GB of memory to share between everything, which has to serve as both the "system RAM" and "video RAM" equivalents of a PC. So you need to fit the console's OS and any running apps into that pool, then the game's own memory usage (which on a PC would sit in system RAM, anywhere up to something like 4GB these days), and whatever is left over can be used for VRAM purposes. The bottom line is that works out to about 10GB maximum.
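Worked through with rough numbers (the OS and game-logic splits here are assumptions for illustration, not official figures), that shared pool shakes out to roughly 10GB for GPU assets:

```cpp
// Back-of-envelope console memory budget. The 16 GiB unified pool has to
// cover the OS, the game's CPU-side data, and GPU assets. The OS/game
// splits below are assumptions for illustration, not official figures.
#include <cstdio>

int main() {
    const double total_gib = 16.0;  // unified GDDR6 pool
    const double os_gib    = 2.5;   // OS + system reservation (assumed)
    const double game_gib  = 3.5;   // game code/logic, the PC's "system RAM" share (assumed)
    std::printf("Left for GPU assets: ~%.1f GiB\n", total_gib - os_gib - game_gib);  // ~10 GiB
    return 0;
}
```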

And more to the point, the console APUs aren't very good; they're like mid-range video cards from last generation on PC. They cannot meaningfully make use of even 10GB of VRAM; if you loaded games up with 10GB worth of assets, the frame rate would crawl to a halt. Besides, the new consoles also have much faster NVMe drives, which can stream assets directly into memory at high speed, so the reliance on assets wasting space in VRAM just in case you happen to need them is going away anyway. Read up on Microsoft's DirectStorage.
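For the curious, the PC-side flow is roughly: create a queue against your D3D12 device, open a file, enqueue reads that target GPU resources directly, then submit. A minimal sketch against the public dstorage.h header; the device, buffer and "asset.bin" path are placeholders, and fence-based completion plus error handling are omitted:

```cpp
// Minimal sketch: stream an asset straight from NVMe into a GPU buffer with
// DirectStorage instead of parking it in VRAM "just in case". Assumes an
// existing D3D12 device and destination buffer; "asset.bin" is a placeholder.
// Fence-based completion and error handling are omitted.
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>
#pragma comment(lib, "dstorage.lib")

using Microsoft::WRL::ComPtr;

void StreamAsset(ID3D12Device* device, ID3D12Resource* gpuBuffer, UINT32 sizeBytes) {
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"asset.bin", IID_PPV_ARGS(&file));  // placeholder path

    DSTORAGE_REQUEST request{};
    request.Options.SourceType          = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType     = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = sizeBytes;
    request.Destination.Buffer.Resource = gpuBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = sizeBytes;

    queue->EnqueueRequest(&request);
    queue->Submit();  // kick off the transfer; wait on a fence in real code
}
```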
 
I'm not seeing any "you are with us or against us". All I'm seeing are RTX and/or Nvidia diehard fans ignoring people's posts simply stating that, without RTX, Cyberpunk is terrible compared to what has been done many times before in other games. As soon as someone posts screenshots/evidence showing games like Alien: Isolation or RDR2's reflections, the RTX PR team come in and go "but but but, RTX looks so much better in Cyberpunk, anything non-RTX sucks massively"....

Then, when asked for a simple yes or no, you get a huge TL;DR drivel post dissecting different methods/optimisations etc., i.e. not relevant to the simple question that was asked: "forget RTX, does Cyberpunk's default SSR look as good as what has been done in multiple games over the last few years?"

When in reality, the difference isn't that big unless the product/image has been sabotaged in order to make the advantages of said new item/tech stand out even more.

I dunno, I've posted a lot in the CP2077 gaming thread as I played it through, and there was plenty of discussion of the visuals. People capable of using RTX are preferentially using it because it looks better, and no one is unilaterally trashing how the game looks without RTX. It looks really good without RTX; it just looks better with RTX.

If anything, the hyperbole I've seen runs the other way: blowing the reaction to ray tracing out of proportion. Who has actually said that "anything non-RTX sucks massively"? That's not the kind of thing I'm actually reading on the forums. You'd be right to call that hyperbole, but who is actually saying it? And is it a "PR team" or just one fanboy?

Graphics methods getting better over time is non-controversial. You don't typically see people arguing that, say, SMAA isn't that big of a difference, or that per-object motion blur isn't, or that SSAO isn't. If it's better, it's better; people accept that and use it to improve their experience. But the moment it's an effect that one team can do and the other can't (at least not practically, yet), the significance of the effect is immediately drawn into question. That is quite transparently a cope. I'm sure the next-gen AMD cards will dedicate more die space to RT ops and then boast ray-traced visuals, and I suspect this questioning of just how good ray tracing is will fade into obscurity.

Avoiding hyperbole cuts both ways.
 
I don't think there's anyone I've read who thinks it's impossible to go over 10GB of usage. That's a straw man of the far more reasonable positions people have actually taken on this issue, which is that right now we're not using more than 10GB for anything, and likely won't in the near future, because you become GPU-bottlenecked long before you're VRAM-bottlenecked.
Kind of like the straw man where people state that having more VRAM doesn't provide more performance, even though nobody has made that argument. But that didn't stop you or others parroting it, now did it? I guess I shouldn't stoop to the level of others by exaggerating my posts.
But let's address your actual post. As I remember it, there were people pointing to current games and proclaiming that we wouldn't need any more, and this was before you started waving your banner about VRAM usage being hard-tied to GPU horsepower. That's what I'm talking about.

VRAM of course is tied to performance.
Ironic that you start your paragraph with a straw man.


VRAM of course is tied to performance. The purpose of VRAM is to hold data that the GPU can do calculations on; that's 100% the purpose of having VRAM. If the GPU doesn't need to do work on something that is in VRAM, then there's no point in that thing being loaded into VRAM; it's just wasting space. If the GPU has to do calculations based on what is in VRAM, then it makes perfect sense that as you put more assets into VRAM, you increase the workload on the GPU, and performance goes down.
Pointless waffle


That's intuitively the case for pretty much all gamers. If you've ever played a game that shows estimated VRAM usage in the video settings menu, you'll see that any time you increase visual fidelity by turning up settings, the estimated VRAM usage goes up, and performance goes down. To say the two aren't hard-tied together is some weird fantasy world; it's just not true in the slightest.
I would describe that as a soft tie, not a hard tie, since both the change in VRAM for a given settings change and the total VRAM consumption at given settings can differ significantly between games/game engines.

However, considering that my original statement was in reference to you stating that a 3080 cannot use more than 10GB without being bottlenecked (let me know if I have misconstrued your position), I consider this a tangent.


VRAM of course is tied to performance. The purpose of VRAM is to hold data that the GPU can do calculations on; that's 100% the purpose of having VRAM. If the GPU doesn't need to do work on something that is in VRAM, then there's no point in that thing being loaded into VRAM; it's just wasting space. If the GPU has to do calculations based on what is in VRAM, then it makes perfect sense that as you put more assets into VRAM, you increase the workload on the GPU, and performance goes down.

That's intuitively the case for pretty much all gamers. If you've ever played a game that shows estimated VRAM usage in the video settings menu, you'll see that any time you increase visual fidelity by turning up settings, the estimated VRAM usage goes up, and performance goes down. To say the two aren't hard-tied together is some weird fantasy world; it's just not true in the slightest.

The reality today is that with the GPUs we have and the workload they can sustain, we cannot make use of more than 10GB. Any games that come close to using all 10GB of a 3080 are slideshows; we demonstrated this quite conclusively in the "is 10GB enough" thread, which had hundreds of pages of examples. FS2020 uses about 9.5GB at peak (4K ultra) measuring real usage, and even a 3090 runs at about 25fps at those settings. Same for Watch Dogs: Legion at 4K ultra, where you can get real memory usage pretty close to 10GB but performance is like 5fps or something really low. Avengers at 4K ultra is the same thing: about 9GB of memory used but, if memory serves, about 17fps.

As has been mentioned and ignored: textures have a small effect on performance compared to other settings, but consume the most VRAM. The difference between texture settings can also be the most noticeable, depending on the game.
Most of the performance-intensive effects have a smaller effect on VRAM consumption than textures do (from Nvidia's own specs guide that I posted).


You seem hung up on 4K, except screen resolution does not have a significant effect on VRAM usage. There may be indirect effects, such as engines lowering settings when switching screen resolution, but that is on a game-by-game basis. It is therefore possible for a game to exceed 10GB of VRAM use at a lower resolution, where frame rates may still exceed 60fps.
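A quick back-of-envelope shows why resolution alone moves VRAM so little: render targets scale with pixel count but are tiny next to texture pools. A minimal sketch with illustrative formats and counts:

```cpp
// Back-of-envelope: why resolution alone doesn't move VRAM much.
// Size of a single 32-bit RGBA render target at 1440p vs 4K (illustrative).
#include <cstdio>

constexpr double MiB(unsigned w, unsigned h, unsigned bytesPerPixel) {
    return double(w) * h * bytesPerPixel / (1024.0 * 1024.0);
}

int main() {
    std::printf("1440p RGBA8 target: %.1f MiB\n", MiB(2560, 1440, 4));  // ~14 MiB
    std::printf("4K    RGBA8 target: %.1f MiB\n", MiB(3840, 2160, 4));  // ~32 MiB
    // Even a G-buffer of ~6 such targets only grows by ~100 MiB going from
    // 1440p to 4K, while a maxed texture pool occupies several GiB at any
    // resolution.
    return 0;
}
```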


If VRAM usage is hard-tied to performance as you are arguing, why does Doom Eternal get nearly 200fps on a 3080 while using over 8GB of VRAM? Or is the game going to crater to sub-45fps once it hits the magical 10GB number?
This is a rhetorical question. I'm certain there are other games using above 8GB of VRAM while exhibiting very good performance (maybe not as good as Doom). It is clear that VRAM isn't hard-tied to performance; or, to word it another way, you cannot predict the performance of a game on a specific GPU by looking at VRAM usage (this is what "hard-tied" means). Games using 10GB of VRAM, or close to it, aren't going to crater down to sub-45fps levels as you and others like to believe. It is going to depend on the game engine and the game.

Let's also not forget that these are computers, and people often multitask on their PCs. *Awaits the drivel about how XYZ closes everything down when gaming*

I'm getting dizzy, almost as if I'm going round in circles.

And more to the point, the console APUs aren't very good; they're like mid-range video cards from last generation on PC. They cannot meaningfully make use of even 10GB of VRAM; if you loaded games up with 10GB worth of assets, the frame rate would crawl to a halt.

You should go work for Sony and MS and show them how to design a proper console. They clearly could have saved a ton of money by putting in less RAM, but someone there didn't think of it. If they'd had you on the team, this would not have happened.


Edit: Feel free to continue this without me. I've put more than enough time into the previous thread on this matter.
 
I’ve got to ask, does it run better for you with the 90% VRAM option enabled?

I’m sure it does for me. Everything just runs more smoothly, the game and cut scenes.

I haven't checked; the game doesn't have a built-in benchmark, so it's hard to make a completely fair comparison.

If I had to guess, I think the VRAM allocation slider was added to the game for people with cards that have 8GB or less. I don't think there will be any difference in performance on my card between 70% and 90%, simply because 70% is still about 17GB, which should be more than enough.
 
I'm not seeing any "you are with us or against us". All I'm seeing are RTX and/or Nvidia diehard fans ignoring people's posts simply stating that, without RTX, Cyberpunk is terrible compared to what has been done many times before in other games. As soon as someone posts screenshots/evidence showing games like Alien: Isolation or RDR2's reflections, the RTX PR team come in and go "but but but, RTX looks so much better in Cyberpunk, anything non-RTX sucks massively"....

Then, when asked for a simple yes or no, you get a huge TL;DR drivel post dissecting different methods/optimisations etc., i.e. not relevant to the simple question that was asked: "forget RTX, does Cyberpunk's default SSR look as good as what has been done in multiple games over the last few years?"


Like I posted a page or 2 back, it's exactly like monitor PR teams trying to highlight differences:

[image: monitor marketing comparison]

When in reality, the difference isn't that big unless the product/image has been sabotaged in order to make the advantages of said new item/tech stand out even more.
Not only is this correct, but they are a minority compared to what could have been the community at large if they had taken the time to develop the game properly.


https://www.guru3d.com/news-story/c...st-80-of-its-active-players-since-launch.html

It appears that, at large, popular opinion is that this game is a complete disaster. And there are reports that the developers will scrap most of the code and start over, no doubt due to those class-action lawsuits they are trying to "defend" themselves from.


 
Of course the player base will drop quickly; people finish the game and move on, and it has no multiplayer to keep them playing.
It's still the top-selling PC game seven weeks in a row.
 