NVIDIA 4000 Series

Some people actually like tinkering with game settings.
I know I do and I like to choose which graphical options to sacrifice to get more performance.
I can make a game run the way I want to, not the way someone else tells me is best.
Can you disable stuff like chromatic aberration, film grain etc. on consoles? Probably not; maybe some devs let you disable motion blur and choose between performance and quality.
My PS5 has been collecting dust for the past year; I haven't even turned it on once.
I actually removed the NVMe drive from it and moved it to my PC as my main drive.

Yeah, that's not something I can relate to any more, which probably explains my bemusement at the current climate of £1,000 GPUs that are mid-tier.

I tend to disable motion blur in games if it annoys me, but that's about it.

All I ever did with PC games was take the recommended settings, then maybe make a tweak if there were gains to be made.

My PC became a Dota 2 machine, nothing more. Any other big release just goes on Xbox because I know it's going to work fine. I still browse this forum and keep an eye on PC stuff, but I currently see nothing but crap stuttering reported on games that look no better to me.
 
Shortly after I recorded this video there was a patch; it's added another 2 FPS onto what is seen here. The cost of recording is 2 FPS as well. :)

7900 XTX, 4K + FSR Ultra Quality + Ultra Ray Tracing.
Will switch to 4K + FSR Quality at some point.
Looks and plays pretty good with Ultra RT/Ultra Quality + FSR. What timing with the patch, but at least you'll have a good before and after if you decide to run it again with the same settings.

That old geezer with 'Arry is a nimble old git, climbed the rock like he was a ninja before he "repair-o'd" the path to the ruins.
 
I currently see nothing but crap stuttering reported on games that look no better to me.
Yeah, stuttering IS a big issue with PC gaming, especially when it comes to UE4 games.
I don't understand why only a few games do shader precompilation at the start, which fixes it completely.
I guess it's too much to ask from developers…
 
The HWunboxed review is using a similar spec to my CPU and RAM (32GB DDR5 6000, CL30), the only difference being that they used an AIO CPU cooler.

It seems like a well-optimised system. Frankly, most won't be spending more than £100 on a CPU cooler though (theirs costs about £150).

Interesting to see that the game is playable on a Vega 64 at 1080p (medium preset).
 
The HWunboxed review is using a similar spec to my CPU and RAM (32GB DDR5 6000, CL30).
The 7700X is a great gaming CPU; in the HUB tests across loads of games it came in second place, just behind the 13900K. The 7800X3D could be a force to be reckoned with.
 
Just tried it on a 4090 + 12900K; the game runs great, I've no idea what the outrage is all about. Even 4K native + RT Ultra is entirely possible at 60+ if you turn RT shadows off. With DLSS Quality and everything including shadows on Ultra you get 75 fps in the heaviest scenes, 85 to 100 on average. It's definitely running better than Cyberpunk, and it actually looks pretty good, I have to say.

Also, the game never exceeded 10GB of VRAM. In fact it went as low as 4GB, but on average it was at 7-7.5GB.
 
I wish they just had a console preset that copies the Series X settings exactly but lets you set the FPS cap higher if you wish.

At least you'd know you were on par with consoles then, and you could easily test their PC optimisation too.
 
At least you'd know you were on par with consoles then.
You don't need to be; if you turn RT shadows off, the game performs absolutely insanely well. If you don't mind using DLSS Quality, you'll be hitting CPU limitations at 4K DLSS Quality + RT Ultra, LOL.
 
Just tried it on a 4090 + 12900K; the game runs great, I've no idea what the outrage is all about.

Isn't the outrage from people who don't have a £2,500+ system?
 
Isn't the outrage from people who don't have a £2,500+ system?
But that's my point: when it can easily handle native 4K + RT Ultra (except shadows) on a 4090, you can just as easily play the game decently on lower-end hardware. If you don't have a 4090 you might need to use DLSS Quality for 4K + RT, but that's to be expected. With RT off the game isn't heavy at all, and it still looks great.
 
As I've said, the game is pretty easy to run in terms of GPU; there are multiple settings to bring performance to acceptable levels. If you want to play with RT, you should be more worried about your CPU than your GPU. Unless you have the latest and greatest CPU, you'll be dropping into the 40s in certain heavy areas :eek:
 
Remarkable, isn't it?
Then you had people saying the 3070/Ti was better than the 2080 Ti... The proof is now there: VRAM is very important for keeping performance levels up, and the same goes for system RAM, as some of us on here have known for decades.

The real kicker is they have turned cards with 4K-capable grunt into 1080p cards because they don't have enough VRAM. Nvidia...
 
You can't just trivially swap engine versions like that. It takes a ton of work, and masses more to get the best out of it, especially on bigger games. You need to decide early in the dev cycle which engine version to use.
Ahaa, well, I remember reading somewhere that UE4 games could easily be made to work on UE5, as that was supposed to be one of UE5's advantages: it helps devs move their games to the new engine and use its new features. Seems there's more going on than is obvious, and I agree that if it takes years or tons of work then I understand why it's still UE4. Maybe it was just a bad engine to have used for such a game from the start, given the levels they were aiming for.
 
Hey Matt, for benchmarking, watch out for the frame generation bug. Steve at HUB spent a lot of time turning it off: it would randomly turn itself on, and the only way to disable it is to go into the menus and turn frame generation on (the menu says it's off even though it's on), then turn it off again, and then benchmark. You have to keep doing this, otherwise some of your runs might have FG on and some off.
 
TBF, if you just want to "game" and don't care too much about FPS or the absolute best visuals, then consoles are a much better experience. At the same time, I like the 21:9 format and high fps, and also keyboard and mouse for when a game just works better with them :)

This is why PC gaming wins for me: the ultrawide resolutions and monitors. I can never go back to a 16:9 desktop after being on a 34" 3440x1440 21:9; I'm now even past 21:9, on a 49" 5120x1440 32:9, and would like to go to a 57" 7680x2160 32:9 as a future upgrade.

The only reason PC gaming and desktop use are worth it for me is the resolutions and aspect ratios available on PC. If next-gen consoles come out with ultrawide resolution support and TVs come out with 21:9 screens again at higher resolutions (we did have one from Philips years ago, but it was low resolution), I would be all over that, as movies and gaming would be amazing in the living room.
 
If you turn RT shadows off, the game performs absolutely insanely well.

I've kind of got mixed feelings about RT shadows in this game. They're kind of doing it wrong, which is understandable in a game that isn't purely ray/path traced, and they look harder, with sharper edges; the non-RT shadows look a bit more natural and fit the aesthetic better. But sometimes the RT shadows handle complex scenes with fewer visual issues.
 
I wonder if even faster NVMe drives (PCIe 5.0) might do a better job with games like this.

Especially if they ever get around to supporting DirectStorage 1.1, which is still very new.
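For reference, this is roughly what the DirectStorage path looks like from a game's side. It's just a minimal sketch using the public DirectStorage API; the D3D12 device, destination buffer, fence and file name are all assumptions for illustration, not taken from any actual game:

```cpp
#include <dstorage.h>   // DirectStorage SDK
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Assumed to exist already: a D3D12 device, a GPU destination buffer and a
// fence to wait on. The file name and sizes are placeholders.
void LoadAssetWithDirectStorage(ID3D12Device* device,
                                ID3D12Resource* destBuffer,
                                ID3D12Fence* fence,
                                UINT64 fenceValue,
                                UINT32 bytesToRead)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // One queue per priority/source type; requests are batched onto it.
    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets.pak", IID_PPV_ARGS(&file));   // placeholder name

    // Describe one read: straight from the file into a GPU buffer.
    DSTORAGE_REQUEST request{};
    request.Options.SourceType          = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType     = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = bytesToRead;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = bytesToRead;
    request.UncompressedSize            = bytesToRead;        // no GPU decompression here

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence, fenceValue);   // fence fires once the data has landed
    queue->Submit();
}
```

The point is that the data goes from the NVMe drive to VRAM with very little CPU involvement, which is why the API only really pays off with fast NVMe drives.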

A PCIe 5.0 x4 slot maxes out at about 15.7GB/s of throughput (32 GT/s per lane across four lanes, minus encoding overhead), which does seem like a hell of a lot.

Presumably, the console version streams in lower-quality textures, so the already impressive console NVMe drives handle this game comfortably?

I'm just wondering when it might be worth getting an NVMe drive for games...

The finalised PCIe 6.0 specification was released in January 2022, so I suppose we'll see even faster NVMe drives in 2-3 years' time, potentially up to 30GB/s.

Maybe someone could check Task Manager to see if there's a lot of drive activity when playing Hogwarts Legacy?
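If Task Manager feels too coarse, here's a small sketch using the Windows PDH performance-counter API that logs disk read throughput once a second while the game runs (the counter path and sample window are just placeholders you'd adjust):

```cpp
#include <windows.h>
#include <pdh.h>
#include <cstdio>

#pragma comment(lib, "pdh.lib")

int main()
{
    PDH_HQUERY query = nullptr;
    PDH_HCOUNTER counter = nullptr;

    PdhOpenQueryW(nullptr, 0, &query);
    // Total bytes read per second across all physical disks.
    PdhAddEnglishCounterW(query, L"\\PhysicalDisk(_Total)\\Disk Read Bytes/sec", 0, &counter);
    PdhCollectQueryData(query);   // prime the counter

    for (int i = 0; i < 120; ++i)   // roughly two minutes of samples
    {
        Sleep(1000);
        PdhCollectQueryData(query);

        PDH_FMT_COUNTERVALUE value{};
        PdhGetFormattedCounterValue(counter, PDH_FMT_DOUBLE, nullptr, &value);
        printf("%7.1f MB/s read\n", value.doubleValue / (1024.0 * 1024.0));
    }

    PdhCloseQuery(query);
    return 0;
}
```

Running that in the background while playing would show whether the game is actually streaming heavily from the drive or barely touching it.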
 