They haven't optimized for Intel and NVIDIA at all. I pushed my 9900K to 5.0 GHz and gained minuscule FPS. Disabling Hyper-Threading shot FPS up by almost 10, which tells me no QA was done on Intel. Can't remember the last game where I had to do that.
This is not true at all; Dunia is in fact notorious for favouring Intel CPUs (in no small part because it started from a CryEngine base, which itself is pretty much an Nvidia/Intel engine). But like that base it also leans heavily on the main thread, so there are diminishing returns past 8c/8t. Disabling HT (or SMT) and gaining FPS is not a sign of bad QA; it's common in more than a handful of games. This isn't Cinebench, you can't expect perfect scaling, and even heavily CPU-optimised titles like Doom Eternal don't scale across threads into infinity (their gains also start petering out after 8c/8t, and that's a far simpler game than an open-world one). Games are simply too complex for things to go that smoothly, and that's before you even get into production issues.
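To put the diminishing returns in perspective, here's a rough back-of-the-envelope Amdahl's law sketch. The 40% serial (main-thread) fraction is an assumed placeholder for illustration, not a measurement of Dunia or any other engine:

```python
# Rough Amdahl's law sketch: how much speedup extra threads buy you when
# a fixed share of the frame is stuck on the main thread. The 40% serial
# fraction below is an illustrative assumption, not measured data.

def speedup(threads: int, serial_fraction: float = 0.4) -> float:
    """Theoretical speedup over 1 thread when `serial_fraction` of the
    frame cannot be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / threads)

if __name__ == "__main__":
    for t in (1, 2, 4, 8, 12, 16):
        print(f"{t:2d} threads -> {speedup(t):.2f}x")
    # Gains flatten hard once the main thread dominates:
    # 8 threads -> 2.11x, 16 threads -> 2.29x.
```

Going from 8 to 16 threads only moves the needle from ~2.1x to ~2.3x in this toy model, which is why freeing up per-core resources (e.g. by disabling HT) can end up worth more than extra logical cores.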
On the NVIDIA side, the game's VRAM allocation is extremely shoddy. I am sitting at just 6GB utilization without the pack and the textures don't look good at all. With the pack on, the game starts acting up once VRAM usage exceeds 10.8GB and refuses to use anything beyond that. I think the engine is just inefficient: Cyberpunk uses roughly 8GB of VRAM and its textures are night and day compared to FC6's stock textures.
It's true the game could use a more granular approach to VRAM management, à la SFS (Sampler Feedback Streaming), but that's cutting-edge tech and this is still a cross-gen tech base. Imo it's one of the better problems to have, because all you need is more VRAM and you're good to go - which is inevitable as memory capacities only go up. For a future replay, or for people buying cards next year (or simply not the VRAM-starved ones today), it won't be an issue in the slightest.
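For a rough idea of what a more granular, SFS-style approach buys you, here's a toy sketch. This is plain Python, not the actual DirectX 12 Sampler Feedback API, and the tile size, class and texture names are all made up; the point is just that residency is driven by what the GPU actually sampled last frame rather than by whole mip chains:

```python
# Toy illustration of feedback-driven texture streaming (NOT the real
# DX12 Sampler Feedback API). Idea: only keep the tiles the GPU sampled
# last frame resident, instead of loading entire mip chains up front.

from dataclasses import dataclass, field

TILE_BYTES = 64 * 1024  # 64 KiB tiles, typical for tiled resources


@dataclass
class StreamedTexture:
    name: str
    resident_tiles: set = field(default_factory=set)  # entries are (mip, x, y)

    def apply_feedback(self, sampled_tiles, vram_budget_bytes):
        """Make sampled tiles resident, evict unsampled ones, respect budget."""
        wanted = set(sampled_tiles)
        # Evict anything the GPU didn't touch last frame.
        self.resident_tiles &= wanted
        # Load missing tiles while budget remains.
        for tile in wanted - self.resident_tiles:
            if (len(self.resident_tiles) + 1) * TILE_BYTES > vram_budget_bytes:
                break  # out of budget: low mips stay, detail tiles wait
            self.resident_tiles.add(tile)
        return len(self.resident_tiles) * TILE_BYTES


# Usage: only tiles near the camera get sampled, so VRAM use tracks
# what's on screen rather than what exists in the world.
tex = StreamedTexture("building_facade_hypothetical")
used = tex.apply_feedback({(0, 3, 7), (0, 3, 8), (1, 1, 3)},
                          vram_budget_bytes=8 * 1024**3)
print(f"{used / 1024:.0f} KiB resident")
```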
On the other hand, what you're saying about Cyberpunk is the exact opposite: that game has more granular management of textures & LODs, but in turn you get LOWER QUALITY, because the world around you morphs all the time as LOD transitions happen constantly. Tbh, if you look at billboards and whatnot, you can see significant issues with those textures ever loading to their highest level, which is actually very low res (<480p) to begin with. And when you're driving, the game puts itself into potato mode to alleviate streaming pressure, so all LODs and streaming get nuked.
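A crude sketch of the kind of heuristic being described, with thresholds and names invented purely for illustration (no claim this matches what CP2077 actually does internally): when the camera moves fast, the streamer can't keep up, so it caps the allowed detail level instead of stalling:

```python
# Crude sketch of a speed-based streaming heuristic like the "potato
# mode while driving" behaviour described above. All thresholds and
# names are invented for illustration, not taken from any real engine.

def max_allowed_lod(camera_speed_ms: float, base_lod: int = 0) -> int:
    """Return the highest-detail LOD level (0 = best) the streamer
    will request at a given camera speed."""
    if camera_speed_ms < 5.0:      # on foot: full detail
        return base_lod
    if camera_speed_ms < 15.0:     # slow driving: drop one level
        return base_lod + 1
    return base_lod + 3            # fast driving: heavy LOD demotion

for speed in (1.5, 10.0, 30.0):
    print(f"{speed:5.1f} m/s -> LOD {max_allowed_lod(speed)}")
```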
I am just thankful the optimization isn't as bad as Watch Dogs: Legion's. Frame rate just gets shot to **** when I get in a car and drive around the city with RTX ultra.
Actually, Watch Dogs: Legion's optimization was quite good. The main issue was again one of ignorance: people just cranked everything up without understanding what the settings did. Things like shadows (think what Nvidia tried to do with HFTS) had very demanding, advanced forms at ultra, and combined with other CPU-heavy settings (the game had plenty on offer) plus its nature as an open-world raytraced title, people expected a level of performance that was simply unrealistic for those settings on existing hardware. That's how it's meant to be! Otherwise we'd just have a console version getting brute-forced.
I get more FPS in Cyberpunk 2077 maxed out with RT Shadows, RT Lighting and RT Reflections than in WD: Legion with just RT Reflections.
That's because there's more going on than meets the eye. RT reflections =/= RT reflections; there are many hidden variables behind that name. Never mind that WD:L gives Cyberpunk almost as much of a spanking as Far Cry 6 does when it comes to LOD & streaming management. So yes, WD:L asks more of the CPU than CP2077, but for good reason.
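To illustrate the "hidden variables" point: two settings both labelled "RT reflections" can differ wildly in cost depending on what sits behind the label (internal resolution, how rough a surface still gets rays, rays per pixel, and on the CPU side how much of the scene goes into the BVH). A toy estimate of just the GPU-side knobs, with every number a made-up placeholder:

```python
# Toy cost model showing why "RT reflections" in one game is not the
# same workload as in another. Every number here is a made-up
# placeholder, purely to show how the hidden knobs multiply out.

def reflection_rays_per_frame(width, height, resolution_scale,
                              roughness_cutoff, rays_per_pixel):
    """Rays traced per frame for a reflection pass under a few of the
    usual hidden knobs."""
    traced_pixels = width * height * resolution_scale ** 2
    # Higher roughness cutoff = more surfaces qualify as "reflective".
    reflective_share = min(1.0, 0.2 + 0.6 * roughness_cutoff)
    return traced_pixels * reflective_share * rays_per_pixel

# Game A: quarter-res reflections, near-mirror surfaces only, 1 ray/px.
a = reflection_rays_per_frame(2560, 1440, 0.5, 0.1, 1)
# Game B: full-res reflections, rough surfaces included, 2 rays/px.
b = reflection_rays_per_frame(2560, 1440, 1.0, 0.9, 2)
print(f"Game A: {a/1e6:.1f}M rays, Game B: {b/1e6:.1f}M rays ({b/a:.0f}x)")
```

Same checkbox name, roughly a 20x gap in work in this toy example - which is why comparing FPS across games by setting labels alone doesn't say much.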
See:
https://www.pcgameshardware.de/Rayt...acing-CPU-Cost-Benchmarks-Frametimes-1371787/
Ubisoft are just terrible at PC ports.
This is generally untrue. Yes, sometimes they just hand you what is essentially the console version (the last 3 AC games), but they've been at the forefront of pushing quality HDR on PC, adding all sorts of PC QOL settings, as well as advanced PC tech (raytracing, great LODs, 4K texture packs, early adoption of temporal reconstruction, etc.), plus various niceties such as FreeSync Premium Pro (which helps with tone-mapping a fair bit). They actually do a good job with their PC versions overall, more so than any other big publisher, and they also put out PC-specific videos to help people navigate the differences.