*** Cyberpunk 2077 ***

I really hope they build the sequel using Unreal Engine 5. They should also do the same for Witcher 4.

Some story DLC this year would be nice. I'll wait to grab an RTX 4070/80 before playing it again, though. I played it at release on a 3080 but feel I need more performance in the RT department, which drags frames down. The 4000 series sounds like it will bring a nice leap in that area :D
 
I revisited this after building my new PC (12700K @ 5.1/4 + 3080 Ti UV), and with full psycho RT settings and DLSS on Balanced it maintains 65-80+ FPS @ 3440x1440 and looks stunning.

The game itself still needs work, but it's more playable than it was at launch, though that's not saying much.
 
I wouldn't want to play on anything less than the Quality setting now that I am on 3440x1440, as that drops the internal resolution to around 1080p. Balanced is getting close to 720p, I think. Also, I would need some new story DLC before playing again. By then we will have the 4070/80 out :)
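For reference, a rough back-of-the-envelope check of those internal resolutions (using the commonly quoted DLSS scale factors of about 0.667 for Quality, 0.58 for Balanced and 0.5 for Performance; treat the factors and the resulting numbers as approximate):

# Approximate internal render resolutions for DLSS modes at a given output resolution.
# The scale factors are the commonly quoted ones, not official constants.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(width, height, mode):
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3440, 1440, mode)
    print(f"{mode:>17}: {w}x{h}")

# At 3440x1440 this gives roughly 2294x960 (Quality), 1995x835 (Balanced) and
# 1720x720 (Performance), so Quality is a little under 1080p-height internally
# and Balanced sits between that and Performance's 720p-height input.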
 
Damned if I can see the difference between balanced and quality, but then I'm not running a side by side comparison.
 
Not tried it, to be honest. I only recently downgraded my resolution from 4K. But my feeling is I would not want to go much lower than 1080p internally with DLSS, which at your resolution would be the Quality setting, but I will have to do some tests myself at some point :)
 
Been playing this for the last month or so. Only get a few hours a week so games can last me ages.

Other than the odd floating NPC and one really weird incident where a car seemingly started to be crushed by an invisible force, the game has run perfectly for me.
 
Interesting, well somewhat, how this game uses both the P and E cores...

[Screenshot: Cyber-Punk-3080-Ti.png]


....not too many manage to do that.

I'll need to actually start playing this, soon..........
 
I've found it to be quite healthy usage, but even still, CPU usage in-game as a whole is around 35-40% for me at max settings at 3440x1440: only 2 P cores seem to be actually doing the legwork.

[Screenshot: CIwhV9s.jpg]
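If anyone wants to check the P/E split without eyeballing a screenshot, something like this logs per-logical-CPU load while the game runs (a minimal sketch assuming Python with psutil installed, and the 12700K's usual layout of logical CPUs 0-15 as P-core threads and 16-19 as E-cores; verify that mapping for your own chip first):

# Log per-logical-CPU utilisation once a second while the game runs (Ctrl+C to stop).
# Assumes a 12700K-style layout: logical CPUs 0-15 = P-core threads, 16-19 = E-cores.
import psutil

P_THREADS = range(0, 16)   # assumption: P-core hyperthreads
E_CORES = range(16, 20)    # assumption: E-cores

while True:
    per_cpu = psutil.cpu_percent(interval=1.0, percpu=True)
    p_avg = sum(per_cpu[i] for i in P_THREADS) / len(P_THREADS)
    e_avg = sum(per_cpu[i] for i in E_CORES) / len(E_CORES)
    busy = [i for i, load in enumerate(per_cpu) if load > 50]
    print(f"P avg {p_avg:5.1f}%  |  E avg {e_avg:5.1f}%  |  threads over 50%: {busy}")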
 
From what I have noted in other games, for example Workers & Resources: Soviet Republic, that one uses mostly two of the P cores and zero E cores.
Civ 6 seems to do something similar to Cyberpunk. Compared to some games, CP is pretty good with its CPU usage.

I bet, like mine, it loads your 3080 Ti to almost max clocks and actual load at times. At least it shows that for me, on that screen from the benchmark.
How is your performance with your 3080 Ti at your resolution, mine being 1440p?
 
Yep, the GPU is at 99-100% pretty much at all times when not in the menus. I am at 3440x1440 though, with max RT and HDR enabled too. Performance seems fine, however, and I am getting solid high frames well above 60 with DLSS enabled and no tearing thanks to G-Sync :cool:
 
As long as you are within the G-Sync range it is generally fine. With your UW resolution, not quite 4K IIRC, you should be able to stay above the lower end of what G-Sync needs in most if not all games.
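Just to illustrate the point about the range (the 48-144 Hz window below is a placeholder; the real numbers depend on the panel):

# Toy check of where a frame rate sits relative to a monitor's VRR window.
# The 48-144 Hz window is a stand-in - look up your panel's actual range.
VRR_MIN_HZ = 48
VRR_MAX_HZ = 144

def vrr_status(fps):
    if fps > VRR_MAX_HZ:
        return "above the window (cap the frame rate or you get tearing/V-Sync behaviour)"
    if fps >= VRR_MIN_HZ:
        return "inside the window - the refresh rate tracks the frame rate"
    return "below the window - the module typically falls back to frame doubling"

for fps in (38, 65, 90, 160):
    print(f"{fps:>3} fps: {vrr_status(fps)}")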
 
DLSS makes it look like crap to my eyes, way too blurry :( I ran it maxed out minus ray tracing at 1440p on a 3090; we have a while to go on GPU power before ray tracing is actually usable IMO (DLSS doesn't count :p)
 
I notice the softer image vs native especially on my 4K monitor - can't say it overly bothers me but I'd still rather have the native image overall - despite the latest implementation of DLSS having advantages in certain areas.
 
I started playing this with my new rig (in sig)

It has DLSS set to 0.25. I'm finding it blurry; what shall I do to sharpen the image but still get 60fps on my ultrawide monitor?

I'm thinking turn DLSS off and drop RT, then put everything to medium/high settings.
 
Joke's on you... I am waiting for the 6090 Ti to get the ULTIMATE performance from CP2077, so I will get a better experience than you.
I will be able to use the latest version of DLSS (prob 6.990995 by then), meaning you will lose out.

You should wait until then to enjoy the game, otherwise it won't be worth it. The image will be too soft.
 