The delay is to try to fix the game on the current-gen consoles - poor Xbox is probably struggling at 600p
As for your Nvidia conspiracy theories - none of this is new. Many other RTX games have similar performance with all RT features enabled, and devs still use it because graphics sell games, not framerate. Stay off the Kool-Aid, not everything is an Nvidia conspiracy.
Bless, you are still full of hope!
Some of us have seen the tricks played by most of these companies over the last 20 years. Remember how you also talk about Intel and their tricks against AMD? Well, Nvidia does it too. Nvidia is not your mate and neither is AMD - they are here to sell new products. CDPR never delayed a game for 3 months to improve console performance according to forums, and never used that time to add Gameworks features instead, 2 months before launch. None of the x87 PhysX stuff, the tessellation, etc. Never happened. The tech press all imagined it.
Everyone, Grim5 says all the Turing GPUs will run Cyberpunk 2077 fine, so there is no need to buy Ampere - and if Ampere runs it much better and Turing does not, it's a Kool-Aid conspiracy theory. It was The Way It Was Meant To Be Played.
Every RTX game runs at under 1080p native resolution, at well under 60FPS (close to 30FPS in many parts) on a £1000 GPU, and needs upscaling to get to 1080p. Those reviewers had it all so wrong with their reviews. Metro: Exodus and Control both need upscaling at 1080p, apparently.
Also, PC is not about FPS - that is why gamers don't buy expensive overclocked CPUs, and 120Hz/144Hz/240Hz monitors don't exist, as PC is all about cinematic gaming with 30FPS lows, like consoles. Locked 60FPS, who wants that? I remember a bloke called Rollo once who was a fountain of knowledge and gave me such insights.
Phew, I was worried there for a second. I take it all back - all GPUs will run this game perfectly fine, and a potato CPU will be OK.
The Xbox Series S has a 4 TFLOP RDNA2 GPU - so clearly AMD has some very small RDNA2 cards planned
Or they will repurpose RDNA1... I doubt it will be made to target any kind of RT. AMD has a habit of mixing and matching designs within each generation nowadays. RDNA1 retains partial GCN features, so it should make porting over existing games a bit easier.
I get your point, but the reality is that if you are not blinded by AAA games, and you know that benchmarks are easily skewed by engine/brand/sponsor deals, then as long as the AMD card has enough raw horsepower, it should handle the majority of indie titles and mods that the masses actually play.
You mean all the reviews which will test this game, and probably see not-so-great launch performance. There is no point sponsoring short FPS games which people forget about after a few months, when Nvidia is sponsoring many RPGs and MMOs which people play for 100s or 1000s of hours - hence reviewers keep them in their review lists much longer. Look at Unreal Engine: Nvidia forged very close links with Epic, and as a result lots of indie games will show better performance natively on UE4, as it integrates a ton of Nvidia-specific features. You need to optimise for AMD on the PC version. Only this year has there been some movement on trying to incorporate TressFX into UE4.
I don't understand them at times. They have more money now, and they need to make the most of their GPU investments. If not, they need to brute-force performance, which means they need to push the GPUs harder, meaning more power consumption, etc.
The worst thing is that half the battle is already won, as this is on the new consoles - AMD really does not leverage its console wins as well as it should. Thank goodness (for them) Nvidia didn't have one of the home consoles, otherwise they would be in deeper trouble.