*** Cyberpunk 2077 ***

Well I mean, take Johnny's lines from the trailer...

"How many times you willing to get burned before you stop trusting someone?"

"How many times you gotta take a bullet for these **********ers in the name of empty promises?"

All I could think of was the meltdown at launch lol.
 
Sounds by design. Very good if so.
 
Compared some recent benchmarks with a mate; he got a 7900 XTX recently and has the QD-OLED too, and his CPU is a 5900X. We both used the same settings: he used FSR Balanced, I used DLSS Balanced, both on the RT Ultra preset otherwise.

Ryzen 5900X + 7900 XTX + 32GB DDR4 @ 3800MHz:
rSW8OxE.jpg


i7 12700KF + 3080 Ti FE + 64GB DDR4 @ 3200MHz:
6vZP2RW.jpg

Basically neck and neck lol. AMD's RT performance has definitely improved, but it's still a gen behind for sure. Good though.
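For context on what "Balanced" means here: both DLSS and FSR 2 render internally at roughly 58% of the output resolution per axis on the Balanced preset, so the comparison is close to like-for-like in render load. A quick sketch of the internal resolution (the actual monitor resolution isn't stated above, so 1440p is just an example):

```python
# Rough internal render resolution for the "Balanced" upscaler preset.
# Both DLSS and FSR 2 use a ~0.58x per-axis scale at Balanced.

def render_resolution(out_w, out_h, scale=0.58):
    """Return the internal (pre-upscale) resolution for a given output size."""
    return int(out_w * scale), int(out_h * scale)

# Example at 1440p output (assumed resolution, not confirmed in the thread):
w, h = render_resolution(2560, 1440)
print(w, h)  # 1484 835
```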
 
Would be interesting to see what the GPUs can manage at native, mind (albeit no one really plays that way) - DLSS/FSR can skew the results a bit as they aren't a perfect like-for-like.
 
I've done the bench with DLSS off, so I'll await his result with FSR off and post back :) I'm expecting similar results: even though the raster perf of the XTX will smash the 3080 Ti, with RT enabled the playing field will most likely even out.
 

The things which annoyed me most about the game are the absolutely woeful clothing system and its associated bonuses/armour, and the absolutely tragic character upgrade trees, etc. A lot of the rest I can personally live with, but one of the biggest draws of a game like this is a compelling ability to uniquely customise your character and upgrade their abilities, and that doesn't look like it will ever be fixed (it would need a huge revamp of the game).
 
Here we go:

Ryzen 5900X + 7900 XTX + 32GB DDR4 @ 3800MHz:
pYxhpi3.jpg

i7 12700KF + 3080 Ti FE + 64GB DDR4 @ 3200MHz:
iZSGm3w.jpg

What's impressive is 44fps max at native vs 105fps with DLSS. So DLSS 3 frame gen would probably yield more than my monitor's 144Hz refresh rate, even from a 4070 Ti.
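A back-of-envelope check of that claim: DLSS 3 frame gen interpolates roughly one generated frame per rendered frame, so something like a 1.5-2.0x fps multiplier is a commonly seen range (the exact factor is an assumption here; real gains vary per game and GPU):

```python
# Would frame gen push the 105fps DLSS result past a 144Hz refresh rate?
# The 1.5-2.0x scaling range is an assumption, not a measured figure.

def frame_gen_estimate(base_fps, low=1.5, high=2.0):
    """Return a (conservative, optimistic) fps estimate with frame generation."""
    return base_fps * low, base_fps * high

lo, hi = frame_gen_estimate(105)
print(lo, hi)    # 157.5 210.0
print(lo > 144)  # True even at the conservative end
```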


No, I will not!
 

That’s a poor showing from a 7900 XTX. So glad I picked up a 3080 Ti for half the price :D
 

Await matt coming in going "those are incorrect, I can get 20+% extra performance over that*"

*highly fine-tuned, best-of-the-best components with liquid metal cooling, a fresh Windows install, and an AC unit right beside the PC

;) :cry: :D

Should post that in either the RT, 40xx or better yet, the rdna 3 thread @mrk :D :cry:
 

nvidia-jensen-huang.gif
 
What's that? You upgraded from your 3090 to a 4070 Ti, is it? :cry:



michael-jordan-crying.gif
 
If the Intel system had some slightly better DDR4, it would likely be neck and neck.

3200MHz is pretty slow for Alder Lake unless the timings are extremely low, which is doubtful with 64GB though.
CL18. My RAM is 3600MHz but I run it at 3200 because of stability on this BIOS/board with 2 sticks of 32GB modules - I CBA with the tediousness of trying every voltage range and timing etc. to get it 100% stable, so I did the next best thing and dropped to 1:1 3200MHz. I've noticed zero perf diff in games though, so I doubt the fps would have been any different in the above. DDR4 4000MHz might have added 2-3fps, but then again what's the point for such measly gains at such extra cost :p
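The "zero perf diff" observation fits the raw numbers: dual-channel DDR4 has a 64-bit (8-byte) bus per channel, so the theoretical peak bandwidth gap between 3200 and the rated 3600 is small. A quick calculation:

```python
# Theoretical peak bandwidth for DDR4 (8 bytes per channel per transfer).
def ddr4_bandwidth_gbs(mt_s, channels=2, bus_bytes=8):
    """Peak bandwidth in GB/s for a given transfer rate in MT/s."""
    return mt_s * bus_bytes * channels / 1000

print(ddr4_bandwidth_gbs(3200))  # 51.2
print(ddr4_bandwidth_gbs(3600))  # 57.6
```

About a 12% bandwidth difference on paper, which rarely translates into more than a couple of fps in GPU-bound games.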
 