Cyberpunk 2077 Ultra performance

Same old four frustrated users trying to convince themselves and others that CP2077 looks bad, that RT looks bad, that raster is great, blah blah blah...

Don’t you have anything better to do? :D
 
Depends if you have the performance budget for a better, more expensive, higher-quality type of raster reflection. Here's a solution: why not use the RT reflections (the more expensive, better-quality type of reflection)? Or did you dismiss them as emperor's-new-clothes effects after watching a YouTube video from someone like Not an Apple Fan, join the "you can't see the RT in Cyberpunk 2077" crowd, and get an AMD card for faster raster? Then you scream about why the raster reflections are so bad; it's because they are raster reflections. Nobody is going to build a big raster screen-space ray-tracing reflection (the RLR type) when RT reflections are available and waiting in the wings.



Graph 1: Oh wait, maybe not faster raster after all, and lower RT performance. Maybe you should have got an Nvidia card; that way raster reflections would not be a problem, just turn on RT reflections. "But I got a 1660/5700 XT because the emperor's new clothes effects are not worth it." Always with the buyer's remorse.

Then log in to the overclockers.co.uk forums to complain to the Nvidia card users in the Cyberpunk thread that your raster graphics have poor reflections. In a game with RT reflections enabled, "why are the RT reflections that good?" you cry. I mean, how silly can you get? They were always that good. "Why are raster reflections crap?" They have always been that way. Sounds like extreme buyer's remorse to me: got an AMD card, found out raster was not the only important feature.

RT+DLSS are nice features after all, but apparently every game has got to give me RT-like reflections in raster. Some games did it, like X and Y and Z. Why not Cyberpunk 2077?

I mean, how dense is this going to get? Sounds like extreme buyer's remorse.

You're very strange. I don't have buyer's remorse, and I do have an RTX card. I also appreciate what RT can do and think it's the future; I only question why, in this game, non-RT is poor in comparison to other, older games.
 
Rasterised effects will be turned down to make the RTX-on effects look much better; this is why some of them look poor, so as to emphasise the difference. It was the same with Hairworks/TressFX: there were already hair animation methods and other physics methods in other games, but in any game with Hairworks/TressFX the "off" effects were toned down, so the "on" effects looked like much more of a jump. Most reviews will only do a comparison within the same game for image quality and won't check whether implementations are done better elsewhere, so it works brilliantly as a marketing tool!

It's been done enough times by Nvidia/AMD/ATI to emphasise some new feature over the older generation. After all, they want to sell new GPUs to you, and you sitting on your "old" ones is "inconvenient" for them.

It's typical marketing. Just go to a shop and see reps trying to compare their latest TV model with the earlier one, or an expensive HDMI cable with an el-cheapo one... you will find they have generally mucked around with the image quality settings on one demo unit to make the other one look better.

But enough gamers and hardware enthusiasts have the attention span of gnats, so it works very well on them. They have such belief in companies such as CDPR, Nvidia, AMD, etc. that they can't fathom that these companies will try every trick to oversell something to them.
 
Not exactly. X-Plane 11 in Vulkan mode at 4K, with photo scenery and a couple of command-line arguments to load mesh data etc. into VRAM (which helps counter some stuttering when looking around), will use about 18GB of VRAM on its own and around 20GB overall. (Further improvements to their Vulkan engine may yet reduce the VRAM requirements.)

So yeah, I have actually found a use for having this much VRAM. It's still a card with a terrible value-for-money ratio, though.
X-Plane? You not on FS2020? :)

Yeah, I am sure there are some mods for Skyrim or something like that which may go over it also, but that is not exactly what I was talking about. Even so: great, a handful of games where tinkering about with mods will get you over 10GB. Better quickly sell my 3080 and get a 3090 :p
 
Just out of interest, wrinkly, are you playing on an OLED screen?

If not, you're missing out here, regardless of ray tracing etc. Ultimately, games will always look "cardboard-like" on **** LCD screens.

The difference OLED makes to games looking less flat is massive. Throw in HDR and you're talking PS3 vs PS5 next-gen differences. I always describe OLED as like looking out of a window. Sadly, this is something you really do need to see with your own eyes, as the OLED advantages can't be shown on LCD screens.

Nope. I'm terrified of burn-in. My PC is on 24/7 with work, gaming, TV, etc. I have an old 1440p/60Hz IPS panel (DGM 2701, https://www.tftcentral.co.uk/reviews/dgm_ips-2701wph.htm) that I've been meaning to upgrade for almost as long as my 3770K. Input lag is bad, but the panel is very nice. I've recently been playing with Dynamic Contrast Ratio, which gives a little pop to colour, and I have Corsair's LS100 lighting behind the panel to throw screen colours onto the wall.

I've been after HDR since it was first announced, but it looks like it's only now that it has become a properly supported standard?
 
Rasterised effects will be turned down to make the RTX-on effects look much better; this is why some of them look poor, so as to emphasise the difference. It was the same with Hairworks/TressFX: there were already hair animation methods and other physics methods in other games, but in any game with Hairworks/TressFX the "off" effects were toned down, so the "on" effects looked like much more of a jump. It's typical marketing. Just go to a shop and see reps trying to compare their latest TV model with the earlier one, or an expensive HDMI cable with an el-cheapo one... you will find they have generally mucked around with the image quality settings on one demo unit to make the other one look better.

But enough gamers and hardware enthusiasts have the attention span of gnats, so it works very well on them. They have such belief in companies such as CDPR, Nvidia, AMD, etc. that they can't fathom that these companies will try every trick to oversell something to them.

Raster-based reflections are not good; that's the whole point of creating reflections using RT in the first place. Sounds like extreme buyer's remorse. Sounds like you should have got an Nvidia card.
 
You're very strange. I don't have buyer's remorse, and I do have an RTX card. I also appreciate what RT can do and think it's the future; I only question why, in this game, non-RT is poor in comparison to other, older games.

As I pointed out, the other games use different methods for reflections. Re-rendering the whole scene for every reflection angle in Cyberpunk 2077 would destroy performance, but that's what Hitman 2 appears to be doing at times. Each game has its own frame-time budget, and you can afford to spend that performance only in certain ways.
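To put a number on that cost: a planar reflection works by mirroring the camera about the reflective plane and rendering the scene a second time, so each reflective plane roughly doubles the draw work. A minimal sketch of the matrix involved, in Python with numpy (purely illustrative, not any engine's actual code):

```
import numpy as np

def reflection_matrix(plane_normal, plane_point):
    """4x4 matrix that mirrors geometry about an arbitrary plane.
    Planar reflections re-render the whole scene through a camera
    transformed by this matrix, hence the frame-time cost."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    d = -float(np.dot(n, plane_point))   # plane equation: n.x + d = 0
    m = np.eye(4)
    m[:3, :3] -= 2.0 * np.outer(n, n)    # mirror direction vectors
    m[:3, 3] = -2.0 * d * n              # mirror the translation part
    return m

# e.g. a flat water surface at height y = 1.5:
# M = reflection_matrix((0.0, 1.0, 0.0), (0.0, 1.5, 0.0))
```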

Cyberpunk 2077's SSR looks no different from Control's. Two different engines, made by two different teams. Instead of creating something like RLR for raster, they use RT instead. RT reflections are better, and in an RT game that supports DX12U there is no point in not using RT for the quality reflections.

From a quality standpoint, raster does have reduced image quality compared to RT. The issue here is that you have reached the point where you've worked that out, but instead of accepting the true reason you have made up a crackpot conspiracy: that companies are reducing raster image quality so RT looks better. RT simply does look better.

All Cyberpunk 2077 does is implement well-known raster methods, like SSR. Yes, that is never going to look as good as RT reflections.

Raster water reflections in The Witcher 3 (15:37):


Raster water reflections in Cyberpunk 2077 (0:01):


Both are more or less the same on the water. Both games use SSR; with raster, both reflections are blurred on the water and not as clear as RT.

Unreal Engine 4.12 water (SSR vs Planar reflections)

The same type of SSR reflections in Unreal Engine 4; even the planar reflections are not as clear as Cyberpunk 2077's RT reflections. Both look far worse than RT reflections.
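For anyone wondering why SSR water always looks smeared and incomplete: it has nothing to sample except the frame already on screen, so it marches a ray through the depth buffer and simply gives up for anything off-screen or hidden. A toy sketch of that march in Python (the fixed step count, thickness value, and all names are illustrative only, not how any shipping shader is written):

```
import numpy as np

def ssr_march(depth, origin, direction, steps=64, thickness=0.05):
    """March a reflected ray through screen space; return the hit pixel
    or None. `depth` is an (H, W) array of view-space depths; `origin`
    and `direction` are (x, y, z) with x/y in 0..1 screen coordinates."""
    h, w = depth.shape
    pos = np.array(origin, dtype=float)
    step = np.array(direction, dtype=float) / steps
    for _ in range(steps):
        pos += step
        px, py = int(pos[0] * w), int(pos[1] * h)
        if not (0 <= px < w and 0 <= py < h):
            return None            # ray left the screen: nothing to reflect
        scene_z = depth[py, px]
        if scene_z < pos[2] <= scene_z + thickness:
            return (px, py)        # ray passed just behind a surface: hit
    return None                    # ran out of steps: fade out the reflection
```

Every path that returns None there is exactly the missing or faded reflection detail people point at in screenshots.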

https://forums.cdprojektred.com/ind...ted-game-technologies-in-the-witcher-3.34419/
 
Raster-based reflections are not good; that's the whole point of creating reflections using RT in the first place. Sounds like extreme buyer's remorse. Sounds like you should have got an Nvidia card.

I have an Nvidia GPU, and what has this got to do with what I said?? Cyberpunk 2077 is an RPG, and I played the original pen-and-paper RPG. I couldn't give two ***** about whether some reflections are better in the game when the RPG elements are a bit disappointing, world interaction is not as deep as CDPR touted, and it's buggy. This is what YOU don't seem to understand: a good RPG can be replayed for years with progressively better hardware.

It sounds like YOU have extreme buyer's remorse, so you are trying to justify whatever hardware you have by projecting onto others because Cyberpunk 2077 hasn't lived up to expectations. I have mates who bought Ampere GPUs for this, played it for a while, and have literally started playing something else. It appears a few here have invested way too much in the game and now need to defend CDPR to the ends of the earth, so they have latched onto graphics. In a number of respects The Witcher 3 is a better RPG.
 
Rasterised effects will be turned down to make the RTX-on effects look much better; this is why some of them look poor, so as to emphasise the difference. It was the same with Hairworks/TressFX: there were already hair animation methods and other physics methods in other games, but in any game with Hairworks/TressFX the "off" effects were toned down, so the "on" effects looked like much more of a jump. Most reviews will only do a comparison within the same game for image quality and won't check whether implementations are done better elsewhere, so it works brilliantly as a marketing tool!
Keep in mind that whenever you see some effects done well with rasterisation, you don't know how much work went into them versus what the middleware solution looks like in terms of cost/performance. The hair examples you give actually work against your argument, because a lot of the great hair effects done in games were afterwards done with things like TressFX as a base from which they forked. Why? Because it was already great, and it was cheaper than starting from scratch. See lots of SqEnx games:

Hair simulation is an in-house technology, based on AMD’s TressFX, which decouples the hair simulation from the actual rendered hair strands. While noise simulation for hair has been used before in video games, a real hair simulation algorithm provides much better results for our characters.
https://www.makinggames.biz/program...nd-divided-gpu-frame-exploration,2304358.html
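As a toy illustration of the decoupling that quote describes (simulate a few guide strands, then derive the thousands of rendered strands from them), something like the following, where every name and constant is made up for the example and has nothing to do with TressFX's real code:

```
import numpy as np

SEG_LEN = 0.1  # illustrative fixed hair-segment length

def step_guides(guides, prev, dt=1/60, gravity=(0.0, -9.8, 0.0)):
    """Verlet-integrate guide-strand vertices, pin the roots, then
    re-enforce segment lengths so each strand acts like a chain.
    `guides`/`prev`: (n_guides, n_verts, 3) current/previous positions."""
    new = 2.0 * guides - prev + np.asarray(gravity) * dt * dt
    new[:, 0] = guides[:, 0]                       # roots stay on the scalp
    for i in range(1, guides.shape[1]):
        seg = new[:, i] - new[:, i - 1]
        length = np.linalg.norm(seg, axis=1, keepdims=True)
        new[:, i] = new[:, i - 1] + seg / np.maximum(length, 1e-6) * SEG_LEN
    return new, guides

def render_strands(guides, weights):
    """Each rendered strand is a weighted blend of nearby guides, so
    thousands of drawn strands ride on a few dozen simulated ones."""
    # weights: (n_render, n_guides), rows summing to 1
    return np.einsum('rg,gvd->rvd', weights, guides)
```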

So if you can achieve better results visually with RT (and this is indisputable), why would you spend all that extra time trying to do it with rasterisation (for a high-end option)? You wouldn't; it just doesn't make sense. Obviously the "base layer" will be rasterisation, which will be whatever it will be, aka the console target. Plus, tbh, a lot of the time when game devs pushed really high-end rasterised options, even when done with Nvidia/AMD, they were never very stable and were prone to so many issues (HFTS, PCSS, VXAO etc.).
 
X-Plane? You not on FS2020? :)

I've got it installed, but no... it's pretty (if you ignore the problems with photogrammetry), but the flight model, clunky UI, half the buttons reading INOP, and an often murderous AP that just rolls you towards the ground don't do it for me. And I get bored of the sightseeing quite quickly. :)


Oh, another one on VRAM... with Cyberpunk I've seen at least 12GB of VRAM usage (could be 16GB, not sure) when maxing it out at 4K (no DLSS either). Unfortunately, performance was also not good enough. It dropped to less than 10GB of VRAM usage when I got to settings that did run well enough.
 
Keep in mind that whenever you see some effects done well with rasterisation, you don't know how much work went into them versus what the middleware solution looks like in terms of cost/performance. The hair examples you give actually work against your argument, because a lot of the great hair effects done in games were afterwards done with things like TressFX as a base from which they forked. Why? Because it was already great, and it was cheaper than starting from scratch. See lots of SqEnx games:


https://www.makinggames.biz/program...nd-divided-gpu-frame-exploration,2304358.html

So if you can achieve better results visually with RT (and this is indisputable), why would you spend all that extra time trying to do it with rasterisation (for a high-end option)? You wouldn't; it just doesn't make sense. Obviously the "base layer" will be rasterisation, which will be whatever it will be, aka the console target. Plus, tbh, a lot of the time when game devs pushed really high-end rasterised options, even when done with Nvidia/AMD, they were never very stable and were prone to so many issues (HFTS, PCSS, VXAO etc.).

Cyberpunk 2077 was in development for 8 years IIRC, and the engine is based on the one in The Witcher 3. At its heart it's an engine which does things in a rasterised way, which means RT was added relatively late in the R&D cycle. It's an inter-generational game, hence why the console version does not even implement RT yet; it tells me the features were added very late in development. It's also why RT performance is not well optimised at native resolution, even on the best RT-capable GPUs available right now.

It was exactly the same with The Witcher 3: CDPR added the Gameworks effects in the last 6 months of development IIRC, and again performance was not brilliant at launch. If you want to see a game which seems to have RT at its heart, I suspect Atomic Heart will be that game. If you look carefully at the Turing reveal two years ago, JHH showed off some work from the game.

If you look at Cyberpunk 2077 builds from back then, none had RT effects (Atomic Heart was already demoing them). So adding RT effects added another graphics fork to the game, and honestly I wish they had not done so AT LAUNCH, primarily because the game had features cut and lacked QA/QC despite the devs having to crunch.

I wish the engineers had spent their time actually getting the base game right. As with Star Citizen, there was clearly feature creep.

Also, the issue is that most of the market is on existing hardware, so the biggest argument is about serving most of the market. For one, most consoles people actually own can't handle it. Most GPUs gamers own cannot do RT; to put it in context, there are more GTX 1060 owners on Steam than all RTX 2060/2070/2070 Super owners combined, and twice as many GTX 1650/1660 owners as RTX 2060 owners.

The issue is that the RTX 2060/2070/2070 Super can't do RT very well, and the AMD GPUs are barely better.

So realistically RT is a minority feature for most gamers; it seems a bigger deal on enthusiast forums. Most PC gamers, and I suspect most of the 13 million people who bought Cyberpunk 2077, are not playing it on high-end systems with an Ampere or RDNA2-based GPU.

For one, supply is a problem, and two, a lot of people will not upgrade an existing system for a game. Remember, it's the uber-fans who care enough to post on tech forums about how much they are spending to upgrade their rig for this or that game.

So that means most Cyberpunk 2077 players are playing with RT off or with reduced RT settings (or having to reduce resolution and other settings). All this argument about people having buyer's remorse means diddly squat; most won't be playing it on hardware where RT, or "maximum" RT settings, will be a consideration.

It was also advertised as an RPG, so realistically the rest of it is more important anyway.

It also tends to be AMD or Nvidia throwing money at devs to push their way of doing things. This is why people are surprised when certain console games look far better than they have any right to, and that is because they are not constrained by having to push new hardware sales every year or so.

If you look at many of these games with GPU-aided physics, they actually cut back on physics effects which were already seen in other games. It happened in a number of PhysX games, where even simple physics that already used existing APIs/libraries were totally removed. You only have to look at games such as Red Faction to realise the level of physics doable without the need for a new GPU. It's typical marketing.

It's going to be the same with RDNA2 and Ampere: if RDNA3 and Hopper land next year, expect RDNA2 and Ampere to suffer the same fate. Instead of trying to maximise performance, Nvidia and AMD will use the brute-force method to emphasise the improvements in their new GPUs. This is the big thing holding back PC gaming: BOTH Nvidia and AMD need to push new hardware sales, so realistically performance is being left on the table at the altar of new GPU sales.

Many tech companies play these tricks to sell their latest and greatest; even Apple makes sure newer versions of iOS start eating up more RAM so they can emphasise how much better the new generation is.

PS: Apologies for the essay! :o
 
We already know that not one game to date needs over 10GB anyway. Trust me, there are a few users out there waiting for the day just so they can post in the "10GB is not enough" thread. Lol.

Yeah, I think you may be on point with selling it on before it becomes a thing next year. Until then it's a factoid. :)

Cyberpunk 2077 was in development for 8 years IIRC, and the engine is based on the one in The Witcher 3. At its heart it's an engine which does things in a rasterised way, which means RT was added relatively late in the R&D cycle.

Was thinking the same myself. It will be games released in late 2021 that use RT properly on the 30-series cards, IMO.
 
I have an Nvidia GPU, and what has this got to do with what I said?? Cyberpunk 2077 is an RPG, and I played the original pen-and-paper RPG. I couldn't give two ***** about whether some reflections are better in the game when the RPG elements are a bit disappointing, world interaction is not as deep as CDPR touted, and it's buggy. This is what YOU don't seem to understand: a good RPG can be replayed for years with progressively better hardware.

It sounds like YOU have extreme buyer's remorse, so you are trying to justify whatever hardware you have by projecting onto others because Cyberpunk 2077 hasn't lived up to expectations. I have mates who bought Ampere GPUs for this, played it for a while, and have literally started playing something else. It appears a few here have invested way too much in the game and now need to defend CDPR to the ends of the earth, so they have latched onto graphics. In a number of respects The Witcher 3 is a better RPG.

Rasterised effects will be turned down to make the RTX-on effects look much better; this is why some of them look poor, so as to emphasise the difference. It was the same with Hairworks/TressFX: there were already hair animation methods and other physics methods in other games, but in any game with Hairworks/TressFX the "off" effects were toned down, so the "on" effects looked like much more of a jump. Most reviews will only do a comparison within the same game for image quality and won't check whether implementations are done better elsewhere, so it works brilliantly as a marketing tool!

It's been done enough times by Nvidia/AMD/ATI to emphasise some new feature over the older generation. After all, they want to sell new GPUs to you, and you sitting on your "old" ones is "inconvenient" for them.

It's typical marketing. Just go to a shop and see reps trying to compare their latest TV model with the earlier one, or an expensive HDMI cable with an el-cheapo one... you will find they have generally mucked around with the image quality settings on one demo unit to make the other one look better.

But enough gamers and hardware enthusiasts have the attention span of gnats, so it works very well on them. They have such belief in companies such as CDPR, Nvidia, AMD, etc. that they can't fathom that these companies will try every trick to oversell something to them.

Then why make up a conspiracy theory about it and run down the quality improvement of RT reflections? What game are you playing?
 
Was thinking the same myself. It will be games released in late 2021 that use RT properly on the 30-series cards, IMO.

I think Atomic Heart will be that game - JHH showed off a scene from the game back in 2018 during the Turing reveal.


The 2020 trailer looks funky:
https://www.youtube.com/watch?v=FJ7cCN-DmFY


https://www.youtube.com/watch?v=kxdWSyoBcH0

I don't know if it will actually be a good game, but if people think Cyberpunk 2077 is the new Crysis, I think Atomic Heart will prove that theory wrong! :p
 
I've got it installed, but no... it's pretty (if you ignore the problems with photogrammetry), but the flight model, clunky UI, half the buttons reading INOP, and an often murderous AP that just rolls you towards the ground don't do it for me. And I get bored of the sightseeing quite quickly. :)


Oh, another one on VRAM... with Cyberpunk I've seen at least 12GB of VRAM usage (could be 16GB, not sure) when maxing it out at 4K (no DLSS either). Unfortunately, performance was also not good enough. It dropped to less than 10GB of VRAM usage when I got to settings that did run well enough.
I assume you are looking at the VRAM requested rather than actually used, because if that is the case, I had nearly a full 12GB allocated on my Titan XP in Final Fantasy 15 a few years ago. But that is not the actual usage. I think you need the latest Afterburner with a plugin, or something like that, to see actual usage. @PrincessFrosty knows more about it.

It could be that it does go above 10GB with RT on and no DLSS, as you say. But that is pointless, as you run out of performance way before VRAM.
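For what it's worth, the allocated-vs-used distinction is easy to see at the allocation level with the NVML Python bindings; this reports what's reserved (the "requested" figure overlays show), while the per-frame working set needs driver-level tooling like the Afterburner beta mentioned above. A quick sketch (assumes an Nvidia GPU and the nvidia-ml-py package):

```
# pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Device-wide view: memory *allocated* across all processes, i.e. the
# same "requested" number most monitoring overlays report.
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"allocated {info.used / 2**30:.1f} GiB of {info.total / 2**30:.1f} GiB")

# Per-process allocations: still reservations, not the working set a
# game actually touches every frame.
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = "n/a" if p.usedGpuMemory is None else f"{p.usedGpuMemory / 2**30:.1f} GiB"
    print(p.pid, used)

pynvml.nvmlShutdown()
```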
 