That is on planets where you can see stars. Interesting how it's Wccftech again - didn't they start the DLSS controversy too? Maybe MLID is correct, that Nvidia CBA now and would rather play the sympathy card.
Yet in New Atlantis during the daytime, with no stars etc., my mate's RX 6700 XT is still faster than my RTX 3060 Ti. You can look at the ground and it's still slower. You can go into The Well, which is an interior space, and it's still the same. Unless you think stars are rendering inside buildings.
We compared performance using the exact same settings, and I saw him play it too. We have the same CPU, same RAM settings, etc., and a similar-speed SSD as well.
Also, @keef247 actually played the game on the July Nvidia launch drivers for their RTX 4070. They updated to the latest "Game Ready" drivers and there was no change in performance for them.
So maybe Nvidia needs to fix its drivers too.
I expect when Cyberpunk 2.0 runs like crap on AMD/Intel hardware it will be put down to rubbish AMD hardware, drivers, poor dev relations, etc. When you switch on RT in Cyberpunk 2077 1.0, an RX 7900 XT goes from RTX 3090 level in RT down to the level of an RTX 3070 Ti.
#Totally normal behaviour.
The lack of stars etc. was noticed by a couple of end users on Reddit, so it's not just some press site making this up. So it's either an AMD and/or game issue, the same way Nvidia's lesser performance is either driver and/or game related.
As for ray tracing and Cyberpunk decreasing AMD performance, that is 100% down to AMD's lack of investment in dedicated hardware acceleration for RT. It's nothing new, as this is the case across every RT title (aside from 1-2 AMD-sponsored RT games), so it's not really surprising. Of course, if you want to play the "Nvidia bad" card, you'd be better off using the RTX Remix titles, where AMD GPUs completely crash to like 5 fps and have graphical artifacts.
I suppose AMD not having ray reconstruction like Nvidia, and thus lesser image quality and worse performance, will also be Nvidia's fault too?