The Last of Us Remake | March 28th 2023

  • Thread starter: mrk
Naughty Dog sent me an email in response to a bug report form submission about these crashes, so I've replied with all the answers they requested, including the log files and stuff, and also mentioned the load times with the info I posted previously. Hopefully they're working on it!
 
Finding it really strange... at native res (1440p), I don't get much difference (in FPS) between medium/high settings, and again, it doesn't look like the GPU is being fully utilised. My VRAM stays just a touch under 10GB on high @ 1440p. I'm finding 50-55 fps with drops into the 40s rather unplayable, especially with G-Sync on, as the low refresh rate gives me headaches/motion sickness. It also feels stuttery and not smooth at all. I'm used to playing MW2 @ 140 fps/Hz with G-Sync, so 40-50 fps just feels awful for me... I'm not getting any crashes, even if I turn DLSS on.

What are those with 4090's seeing in terms of FPS @ 1440p on ultra?

P.S. Not sure if I'm on to something here, but regardless of what resolution/settings I play at, my CPU and GPU are only 40-50% utilised at most, and neither is getting anywhere near hot... and no, I definitely haven't set a frame rate cap. I just followed a YouTube video covering the best settings to improve FPS, and I'm still stuck at 50 fps with drops into the 40s...
4090 at 3440x1440 - varies - sometimes it's 130fps, then it will drop to 70fps at times.
 
4090 at 3440x1440 - varies - sometimes it's 130fps, then it will drop to 70fps at times.

And that’s on ultra yeah? I guess the 4090s are getting over the super poor optimisation with brute force…

I tried verifying the integrity of the files in Steam. Good news is that the little patch they did meant my CPU ran over 15 degrees cooler during the shader optimisation, whilst still only taking 10 mins. Bad news is that it did nothing for my FPS. Can't get over 45-55fps, even with DLSS on. :-(
 
Installed the v1.0.1.5 patch; will test later.

Need to test longer, but I'm pretty sure it was crashing when I had DLSS on; with it off I haven't had a crash yet. On a 3080 at 1440p ultra it doesn't have the grunt to stay above 60fps, but with the high preset it stays above 60fps. I'd guess ultra with DLSS Quality probably looks about the same as high without DLSS.

Another thing: I turned off all the grain settings; much prefer the cleaner look without it.
 
Initial start-up yesterday: shaders took about 20-odd mins, then loading about 1-2 mins. Managed to go through the prologue and all worked fine. Then got my first crash during the first fight in the first main mission. Since then I pretty much crash within 1-2 mins of starting. Tried with and without DLSS.
Noticed the shaders after the patch took an age (I updated my drivers) and my fans were going mental, CPU and GPU getting spiked just during shader caching!
Anyone still using Afterburner? Could that be a cause?
 
Initial start-up yesterday: shaders took about 20-odd mins, then loading about 1-2 mins. Managed to go through the prologue and all worked fine. Then got my first crash during the first fight in the first main mission. Since then I pretty much crash within 1-2 mins of starting. Tried with and without DLSS.
Noticed the shaders after the patch took an age (I updated my drivers) and my fans were going mental, CPU and GPU getting spiked just during shader caching!
Anyone still using Afterburner? Could that be a cause?

Yeh, using AB here, no issues.
 
OK, so I'm getting the same 45-55 fps maximum regardless of graphics settings, low all the way to ultra, which is crazy. I enabled all the in-game stats, and they show CPU utilisation no higher than 2% (between 0-2%) while also saying the CPU is limiting the FPS. I'd be happy with the GPU FPS, which is up at 70 fps on ultra @ 1440p. No way should my 7900X be limiting me to 45-55 fps (regardless of quality setting). I know Windows has had issues with accurately reporting CPU usage, but clearly something is broken. I think the update/patch yesterday has made things worse; quite a few on Reddit have said the same.

Any ideas? I already have core isolation turned off...
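For anyone else seeing weird CPU numbers in the overlay, one way to sanity-check what Windows is reporting is to log per-core usage with a quick Python script while the game runs (just a rough sketch; psutil is a third-party package, pip install psutil). If one core sits pinned near 100% while the rest idle, the game's main thread is the bottleneck even though total utilisation looks low:

```python
# Log per-core CPU utilisation once a second for 30 seconds,
# independently of the in-game stats overlay.
import psutil

for _ in range(30):
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    total = sum(per_core) / len(per_core)
    print(f"total: {total:5.1f}%  per-core: {per_core}")
```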
 
Never mind, it crashed on the high preset now. I think it's got worse with the v1.0.1.5 patch; I'll just wait for some more patches before trying it again.
 
Hmmmm, just played for an hour with zero issues. I bumped it down to 1080p rather than 1440p, so it looks a bit naff! My issue seems to be with VRAM; as long as I'm at about 80% usage I'm OK. Although I guarantee next time I start it up I'll crash within a few mins :)
 
A few more observations: the torch has a pretty big fps impact when turned on. Inside a dim room I can see the fps drop from 103fps to 88fps. The amount varies a few fps either way depending on the scene, but torch on = fps drop! This is with everything volumetric/effects/particle-related set to Ultra.

Also, for those wanting more fps without sacrificing graphics quality settings, you could try dropping the render resolution in the Display settings screen. I played around with this and noticed a very similar output image whilst boosting the fps quite a bit: outdoors it goes from 66fps (with no DLSS on right now, due to those crashes) to 93fps. Visually it looks near identical; there's a slight change to aliasing, of course, but that's to be expected as you're expanding a lower render res via the GPU's scaler (or the monitor's, depending on your GPU driver settings), and again, it's subtle.


Examples:

[screenshot]

Render scale 80% (2752x1152, Ultra textures):
[screenshot]

Render scale 100% (3440x1440, High textures):
[screenshot]

Render scale 80% (2752x1152, Ultra textures):
[screenshot]

Render scale 100% (3440x1440, High textures):
[screenshot]

So you can run more ultra settings and get more fps at 80% render scale, whereas at native res you would have had to lower some settings to stay within your VRAM budget and keep the fps desirable.
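For anyone wondering why 80% scale buys so much: pixel count scales with the square of the render scale, so 80% scale means only 64% of the pixels. A rough sketch of the arithmetic using my numbers above:

```python
# Back-of-envelope maths for render scale vs pixel load (GPU-bound case),
# using the figures from the screenshots above.
native = (3440, 1440)
scale = 0.80

render = (int(native[0] * scale), int(native[1] * scale))        # (2752, 1152)
pixel_ratio = (render[0] * render[1]) / (native[0] * native[1])  # 0.64

print(render, f"-> {pixel_ratio:.2f}x the pixels of native")
# Naive upper bound if fps scaled purely with pixel count:
print(f"66 fps / {pixel_ratio:.2f} = ~{66 / pixel_ratio:.0f} fps")
# I actually measured 93 fps, since not every GPU cost scales with resolution.
```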
 
So after an hour it runs without crashing, which is good. However, my 9700K is constantly at 90-100% utilisation and quickly gets temps into the mid 80s. Shaders are cached and files verified.

An i7 9700K surely cannot be the bottleneck. It cost me £350 new and it's not really that old!
 
So after an hour it runs without crashing, which is good. However, my 9700K is constantly at 90-100% utilisation and quickly gets temps into the mid 80s. Shaders are cached and files verified.

An i7 9700K surely cannot be the bottleneck. It cost me £350 new and it's not really that old!

What res and GPU are you on? If it's below 1440p, then you're into CPU-limited territory, so that CPU will be heavily loaded.

Ignoring the technical issues for a moment, one good thing about the engine in its current state is that it does utilise all the CPU cores, so it seems to be genuinely multi-threaded, which is uncommon; most new games focus all their effort on one or two cores at most. The issue with the game's current multi-threading is that it appears to be inefficient. For example, on my 12700KF it's using over 50% of all but one core (which sits at 47%), and it shouldn't be using the E-cores at all, so Iron Galaxy have for some reason gone with "if cores exist, use as much as possible". In this sort of use case, being too heavily multi-threaded will cause performance issues; it's like they've just told the engine to use whatever the system has and run with it.
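If anyone wants to test whether the E-cores are actually part of the problem, you can pin the process to the P-cores only and compare fps. A rough sketch with Python's psutil below; the exe name is my guess (check Task Manager) and the 0-15 = P-core-threads mapping is the usual 12700K layout on Windows, so verify yours before running:

```python
# Experiment: restrict the game to P-core threads only and see if fps changes.
# Needs the third-party psutil package (pip install psutil) and an elevated prompt.
# ASSUMPTIONS: exe name is hypothetical; logical CPUs 0-15 being the eight
# hyper-threaded P-cores matches the typical 12700K enumeration, not guaranteed.
import psutil

GAME_EXE = "tlou-i.exe"  # hypothetical name; substitute the real one

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == GAME_EXE:
        proc.cpu_affinity(list(range(16)))  # drop logical CPUs 16-19 (E-cores)
        print(f"Pinned PID {proc.pid} to P-core threads 0-15")
```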
 
What res and GPU are you on? If it's below 1440p, then you're into CPU-limited territory, so that CPU will be heavily loaded.

1440p high with an RTX 3080. The GPU isn't really being stressed at all, not going above 60% utilisation.

I just don't understand how 9th gen is stuck at 100% while 12th/13th gen are coasting along at 30-40% utilisation. That's quite the improvement in tech!
 
1440p high with an RTX 3080. The GPU isn't really being stressed at all, not going above 60% utilisation.

I just don't understand how 9th gen is stuck at 100% while 12th/13th gen are coasting along at 30-40% utilisation. That's quite the improvement in tech!
GPU utilisation that low with CPU use that high suggests it's still doing some compilation in the background; at least, that's the usual behaviour. At 1440p you should be seeing near 100% GPU use and CPU use below 60% (total package use, that is; individual cores may well be above 50%). Weird stuff going on there!

Reminds me of Uncharted 4 in that it looks great but has very unsatisfying combat, with AI that's as dumb as a box of rocks.
Have you played this? I've got both, and the Uncharted AI is dumb as bricks lol. The Last of Us AI uses tactics and cunning methods to get you; a lot of the time I was surprised by human AI hunting me down in sneaky ways and working as a team.
 
GPU utilisation that low with CPU use that high suggests it's still doing some compilation in the background; at least, that's the usual behaviour. At 1440p you should be seeing near 100% GPU use and CPU use below 60% (total package use, that is; individual cores may well be above 50%). Weird stuff going on there!


Have you played this? I've got both, and the Uncharted AI is dumb as bricks lol. The Last of Us AI uses tactics and cunning methods to get you; a lot of the time I was surprised by human AI hunting me down in sneaky ways and working as a team.

I've only played the first 90 mins or so, but I had a situation where I was downstairs and a bunch of AI were upstairs on a balcony in prime position; instead of using their height advantage, they went full YOLO and ran down the stairs toward me, guns blazing.

Is AI intelligence affected by difficulty level, or does that just affect their damage output?
 