It’s a fair point.
How so, when the graphics his software is producing are poor for the hardware?
> I've had 4 crashes now, all using exclusive full screen (also on a 5090).

Wonder if it's a 5090 thing; no crashes yet on my 5070 Ti (fullscreen), and no crashes yet on my other half's 4070 Ti (fullscreen).
No idea. Anecdotally, it appears people using RTX 4000 series cards (and AMD GPUs) are having fewer issues.
> Which utilities are you using to display framerates etc?

There is an in-game FPS counter you can utilise. Normally I use Afterburner/RivaTuner though.
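What those overlays are actually reporting is, at heart, a rolling average of recent frame times inverted into frames per second. A toy sketch of the idea in Python (purely illustrative; the FpsCounter class here is hypothetical, not how RivaTuner or the in-game counter is implemented):

```python
from collections import deque
import time

class FpsCounter:
    """Rolling-average FPS from recent frame times (illustrative sketch)."""

    def __init__(self, window=120):
        self.frame_times = deque(maxlen=window)  # seconds per frame
        self.last = time.perf_counter()

    def tick(self):
        """Call once per rendered frame; returns the current average FPS."""
        now = time.perf_counter()
        self.frame_times.append(now - self.last)
        self.last = now
        avg = sum(self.frame_times) / len(self.frame_times)
        return 1.0 / avg if avg > 0 else 0.0
```

The window size trades responsiveness against a stable readout, and metrics like 1% lows come from the same frame-time samples, just taken from the worst end of the distribution.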
> Wonder if it's a 5090 thing; no crashes yet on my 5070 Ti (fullscreen), and no crashes yet on my other half's 4070 Ti (fullscreen).

No crashes yet on my 5090, after around 9 hours of play, in borderless windowed mode.
> It's a fair point.

It would be a fair point if the graphics were super amazing wow fab groovy, but they're nothing to write home about. What's all the GPU power being used on? I don't think BL4 is a particularly well optimised game.
> What's all the GPU power being used on?

That hidden UE5 magic.
Have to say I'm with Randy on the 4K thing - why is everyone so obsessed with "4K gaming"?
I get that for some types of game, like RTS or world-building stuff, the high resolution is very desirable. For 'action' style games though, it's a total waste, sapping FPS for no real visual benefit. DLSS is so good these days that you can't tell it from native 99% of the time; I've been using DLSS Quality with BL4 and genuinely can't see the difference.
I'm actually intending to get a 5K2K monitor at some point but for productivity usage. When it comes to games, I'm not expecting to run fast-paced games at native 5120x2160 most of the time and will instead be leaning more heavily on DLSS.
I find it amusing how disparaging people are about upscaling or framegen.
When I had my 4K monitor it definitely looked better, but ultimately the drop in FPS was just too much for me and I returned to my ultrawide OLED 1440. It didn't look as good (apart from the colours, which looked better on my OLED than my 4K) but gave way more FPS. I just couldn't justify the percentage loss of FPS versus the percentage gain in image quality. One day, when there's more oomph in the boxes, I'll no doubt return to 4K; mind you, by then people will be gaming on 8K monitors!
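To put rough numbers on that trade-off (assuming the ultrawide is a 3440x1440 panel, which is my assumption, not stated above):

```python
# Pixel counts: 4K UHD versus a 3440x1440 ultrawide (assumed resolution).
uhd = 3840 * 2160        # 8,294,400 pixels per frame
ultrawide = 3440 * 1440  # 4,953,600 pixels per frame

ratio = uhd / ultrawide
print(f"4K pushes {ratio:.2f}x the pixels of 3440x1440")  # ~1.67x

# If GPU cost scaled purely with pixel count, 4K would mean roughly
# 40% fewer FPS at identical settings (1 / 1.67 is about 0.60).
```

In practice the scaling isn't perfectly linear, but a drop of that order is consistent with the experience described.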
That's the point I was trying to make, that sometimes you're far better off dropping the res to improve FPS.
Of course 4K or RT will look better; the point is that quite often 1440p with high FPS will be a far better overall experience than 4K with low FPS, and people lose sight of this. The attitude seems to be that if a game can't run at 4K/RT with everything on, it must be broken.
If you can lean on DLSS to upscale to your display's native res then even better. Frankly I'd challenge anyone to tell the difference between 4K native and DLSS Quality in a fast-paced game, yet there still seems to be so much disdain targeted at using DLSS versus "native" rendering.
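To put numbers on why DLSS Quality claws back so much performance: the commonly cited per-axis scale factors for the DLSS presets give internal render resolutions like these (a rough sketch; the exact factors can vary by game and DLSS version):

```python
# Commonly cited per-axis DLSS scale factors (can vary by game/version).
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

NATIVE_W, NATIVE_H = 3840, 2160  # 4K output resolution

for name, scale in PRESETS.items():
    w, h = int(NATIVE_W * scale), int(NATIVE_H * scale)
    share = (w * h) / (NATIVE_W * NATIVE_H)
    print(f"{name:>11}: renders {w}x{h} internally (~{share:.0%} of native)")

# Quality at 4K renders roughly 2560x1440 internally, under half the
# native pixel count, which is where most of the FPS headroom comes from
# before the image is upscaled back to 3840x2160.
```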
> That hidden UE5 magic.

I think you're onto something ... perhaps some of the "power" is being siphoned off to keep Epic's Fortnite servers running.
> It would be a fair point if the graphics were super amazing wow fab groovy, but they're nothing to write home about. What's all the GPU power being used on? I don't think BL4 is a particularly well optimised game.

It still is a fair point, but yes, the game does appear to be poorly optimised.