***Hogwarts Legacy - RPG***

Isn't it amazing that we spend more time fussing over the performance issues in modern games than playing the games themselves?! It blows my mind that even mid-range gaming systems should be capable of ploughing polygons into dust, yet here we are, where even a £1600 GPU has to make do with frame generation and the like to get 60fps. The screen-space glitches as stuff moves to the edge of the viewport are also something that should not be this obvious; games from years gone by refined that sort of issue, so why are we seeing it so clearly now...

Here we have a game that looks better with RT turned off and various ULTRA settings dropped to HIGH :cry:

What a state game development is in. I can only think of one or two games released in the last year that launched without major issues requiring gamers themselves to come up with workarounds to sort-of fix them.
 
You should cap your frame rate a few FPS below your monitor's top spec. You don't really want Vsync doing anything, but G-Sync stops working right at the top of the range.

In my case I can use 120 because my monitor does 144Hz.

You should turn Vsync off in NVCP as well as in the game, and let G-Sync operate within its FPS range.

Some people said to set the camera acceleration to 0 as well.

Beyond that, switch off most of the extra stuff like motion blur or whatever, if it exists. I think I only left Depth of Field on, but that one is optional.

If you have RT on try it with RT off and see how your performance compares.
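If it helps to see where "a few FPS below" lands for different panels, here's a rough Python sketch of the idea. The 3fps margin is just the commonly quoted rule of thumb rather than anything official (I personally go further and use 120 on 144Hz):

```python
# Rough sketch: pick a frame cap a few fps below the panel's max refresh
# so G-Sync stays inside its VRR range and Vsync never has to step in.
# The 3fps margin is the commonly quoted rule of thumb, not a hard rule.

def suggest_frame_cap(refresh_hz: int, margin_fps: int = 3) -> int:
    """Return a frame cap safely below the monitor's refresh rate."""
    return refresh_hz - margin_fps

if __name__ == "__main__":
    for hz in (60, 120, 144, 165, 240):
        print(f"{hz}Hz panel -> cap at ~{suggest_frame_cap(hz)}fps")
```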

Shouldn't vsync be turned on in NVCP?

 
Apparently not any more. From more recent stuff I've read, capping FPS below your monitor's max G-Sync rate should be fine even at lower framerates; G-Sync can double up frames when the framerate drops too low.

You can enable it, but it will introduce a bit of input lag; otherwise it won't kill anything. Try it out with Vsync fully disabled.
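On the frame-doubling point, this little sketch shows the rough idea. The 48-144Hz VRR range is just an assumed example and the driver's actual low-framerate compensation is smarter than a plain multiplier, so treat it as an illustration only:

```python
# Sketch of the frame-doubling idea: if the game's framerate falls below
# the panel's VRR floor, the driver repeats each frame enough times to
# land the physical refresh back inside the VRR window.
# The 48-144Hz range is an assumed example, not any particular monitor.

def effective_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    """Return the refresh rate the display would actually run at."""
    if fps >= vrr_min:
        return min(fps, vrr_max)
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

if __name__ == "__main__":
    for fps in (30, 40, 50, 100):
        print(f"{fps}fps -> panel refreshes at ~{effective_refresh(fps):.0f}Hz")
```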
 
I never have Vsync on anywhere; G-Sync handles everything perfectly. I don't know how relevant those guides are in today's landscape, but as long as I set a frame cap below the refresh rate I use (144Hz) there is no tearing. Even uncapped above that there is no tearing in many games; it just depends on the game and its rendering pipeline.

Essentially I see the lowest mouse-to-screen latency with Vsync off in both the game and NVCP, and G-Sync enabled. I can then toggle the RTSS frame cap on or off with a hotkey depending on whether I'm playing the game or running benchmark tests etc.

This is all assuming a VRR G-Sync display is being used, of course. Anywhere near 60fps with Vsync on, I can feel a slight latency in the mouse movement which does not exist with Vsync off. Some games are not smooth regardless of Vsync/G-Sync etc.; it just comes down to the engine being used. Witcher 3 next-gen is a prime example: the panning movement is not smooth by the nature of the engine it uses, whereas Dead Space Remake is as smooth as butter. Cyberpunk is smooth too, but without Reflex+Boost turned on you can feel the mouse latency, because with RT enabled the game runs closer to the 60fps mark.

Hogwarts is perfectly smooth and mouse input latency is instant with Reflex+Boost enabled; no issues there. The issues we seem to have, in terms of performance, are transitional stutter and random fps dips!
 
I used to be someone who'd have it on in NVCP due to that guide, but I think things have changed since then.

I have it off everywhere now and use the in-game fps limiter (in COD MW2, for example) at 3 frames below the refresh rate, or an external limiter if the game doesn't have one. It definitely provides the least latency for me, but I play online FPS, so every little helps in that department.
 
I just bypass any in-game caps entirely and use RTSS exclusively for that. It makes things easier, as not all games implement frame caps the same way; some can introduce further latency or add stutter, for example, whereas RTSS only adds a single frame of latency.

I suspect the same applies to FreeSync displays too.
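To illustrate what a frame cap is actually doing under the hood, here's a bare-bones limiter loop in Python. It's purely a sketch (the 117fps figure and the render_one_frame callback are made up), but it shows that the whole trick is deciding where in the frame the wait happens, which is exactly what games get wrong in different ways:

```python
# Bare-bones frame limiter sketch. After each frame it waits until the
# next frame's deadline before starting again. Where a real game places
# this wait relative to input sampling and present is what decides how
# much latency the cap adds, which is why in-game caps vary so much.
import time

def run_capped(render_one_frame, cap_fps: float = 117.0) -> None:
    frame_time = 1.0 / cap_fps          # target time per frame, in seconds
    next_deadline = time.perf_counter()
    while True:
        render_one_frame()              # stand-in for simulate + render + present
        next_deadline += frame_time
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)       # wait out the rest of the frame budget
        else:
            # Missed the deadline; resync so we don't try to "catch up".
            next_deadline = time.perf_counter()

if __name__ == "__main__":
    # Dummy "frame" that just burns a couple of milliseconds.
    run_capped(lambda: time.sleep(0.002))
```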
 
I set a global FPS limit via RTSS as well; it seems to work well.

There is an option to set an FPS limit via NVCP, but when I looked into it people didn't recommend it; RTSS doesn't impact the performance of anything else.

Then in-game I can also set an FPS limit if I want to, but not all games have one.
 
I've always read it's best to use the in-game limiter if it has one, especially in titles that also have reflex/boost options.

I think it would mostly be old titles now that have shoddy built-in caps. I've certainly not noticed any issues using them in recent titles.
 

For this one, if I limit frames at RTSS to 60 and set Hogwarts uncapped, it uses more power than if I also limit Hogwarts to 60.

I think under the hood by not limiting Hogwarts it's rendering more frames and then RTSS is only allowing it to show 60.

Behaviour may differ by game, but this is certainly what I've been seeing.
 

I remember reading somewhere that it's better to use NVCP for frame limiting rather than RTSS, so that's what I use. Never had a problem with it.
 
It was a recent Digital Foundry video where Alex mentioned the inconsistency of in-game fps limiters and how games can all implement them differently, so it's better to use a known, consistent external tool (in this case RTSS). I think it was in the Spiderman Remaster review.
 
Just playing with HDR enabled; it's actually some really nice quality HDR, I have to say.

Also, I've unlocked the Raiden look from Mortal Kombat :D
How have you managed to get HDR working on the Ultrawide? It's greyed out for me and I can only do windowed fullscreen rather than true fullscreen.

I have managed to stop it tripping the PSU now by limiting power to 90%, so thanks to whoever suggested that one. I did order the 1000W Asus Loki PSU, but it's almost £300, so I might send it back if things remain stable at this lower power limit.
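For what it's worth, the 90% limit can also be applied outside Afterburner. Here's a small sketch of how one might do it with nvidia-smi on an NVIDIA card; the 0.9 factor just mirrors the 90% above, the GPU index of 0 is assumed, setting the limit needs an elevated prompt, and the driver will clamp the value to the board's allowed min/max anyway:

```python
# Sketch: apply a ~90% power limit via nvidia-smi instead of the
# Afterburner slider. Assumes an NVIDIA GPU and an elevated prompt.
import subprocess

def default_power_limit_watts(gpu_index: int = 0) -> float:
    """Read the board's default power limit in watts from nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu_index}",
         "--query-gpu=power.default_limit",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip())

def apply_power_limit(fraction: float = 0.9, gpu_index: int = 0) -> None:
    """Set the power limit to a fraction of the default (needs admin rights)."""
    target = round(default_power_limit_watts(gpu_index) * fraction)
    subprocess.check_call(["nvidia-smi", f"--id={gpu_index}", "-pl", str(target)])

if __name__ == "__main__":
    apply_power_limit(0.9)
```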
 
For this one, if I limit frames at RTSS to 60 and set Hogwarts uncapped, it uses more power than if I also limit Hogwarts to 60.

I think under the hood by not limiting Hogwarts it's rendering more frames and then RTSS is only allowing it to show 60.

Behaviour may differ by game, but this is certainly what I've been seeing.

Capping at the source rather than capping in the pipeline would make sense, but sense doesn't always make sense :p For me there wasn't much difference in GPU power draw between the two when we tested yesterday; it was only about 4 watts of difference. The game may be rendering uncapped, but RTSS is still working at a low level and hooking into the engine, so it can still do the frame capping within the pipeline (assuming the RTSS/MSI Afterburner settings have been set accordingly; I know there are a couple of settings that hook in at a low level).

If RTSS is capped at 60 then I see much lower power draw; it was sub-200 watts at 60fps, but then it randomly shoots up to 300 watts for the GPU. Regardless of where the cap is set, this game seems really random with its resource use.

Another example of its randomness: I saw RAM use go from 18GB to 10GB in the blink of an eye. It seems to throw resource use all over the place lol.
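If anyone wants to catch those swings in the act, polling nvidia-smi while playing is an easy way to log them. A quick sketch below; the one-second interval and the query fields are just my choices, and this only covers GPU power and VRAM, not system RAM:

```python
# Sketch: poll GPU power draw and VRAM use once a second via nvidia-smi
# to catch the sub-200W -> 300W swings mentioned above. NVIDIA only.
import subprocess
import time

QUERY = "timestamp,power.draw,memory.used"

def sample() -> str:
    """One CSV line: timestamp, power draw (W), VRAM used (MiB)."""
    return subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True,
    ).strip()

if __name__ == "__main__":
    try:
        while True:
            print(sample())
            time.sleep(1)
    except KeyboardInterrupt:
        pass
```

nvidia-smi can also loop on its own with -l 1 if you'd rather just watch the console instead of running a script.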
 
Some minor tips for people:

  • On the main gear selection screen there is an option to modify the appearance of gear, so you can stop looking like a clown :)

I was most happy when I found this option - I thought it was ridiculous that I was dressed like a class A ****head in school and the teachers didn't pass comment - the game almost lost a bit of magic for that reason alone!
 