
The RT Related Games, Benchmarks, Software, Etc Thread.

Status
Not open for further replies.
RT off looks better if you want to see someone hiding in the dark bit.

That's why you have a flashlight ;)

But yeah, it is a bit dark there, to be fair. Overall though it's considerably more realistic, since the sun appears to be over on the right side.

Suspect that is down to the limited RT bounces though. I imagine once you move closer to the car, for example, the area would lighten up quite a bit, similar to what we saw in Metro with the transitions from bright outdoors to indoors.

Main thing with RT is that it looks a hundred times better in motion when it comes to lighting, shadows etc.
 
Some gameplay footage from a 3080 Ti / 5950X at 4K, max settings with DLSS Quality. Performance is not too bad considering it's 4K with DLSS Quality and the RT settings in use.

Textures look a bit crappy, but the game is visually stunning with that RT lighting, shadows etc., especially when you look at the sheer scale + density + variety of the city:



Defo 3440x1440 and max settings for me especially after reading there is no HDR support?!
 
RT off looks better if you want to see someone hiding in the dark bit.

Good thing it's not a multiplayer game (I think). For single player it looks more immersive; it's supposed to be an action-horror-survival game, so it should be dim or dark in parts at least.
 
I think it is just down to the sheer difference in contrast/lighting conditions in that particular scene too. I don't think you'll see many scenes with such a stark difference; it was pretty much the same for Metro (where some made the same complaint). We can see something similar below:

[image: m0kJccx.png]

Does seem like they have put little to no effort into the old gen methods tbh.

Not having HDR support is downright lazy and a silly move given the setting; this is one game where it would really shine.
 
The game is missing stuff that seems either lazy or they just don't have the resources for everything.

As you say: no HDR (though Windows 11 has Auto HDR), no FSR, no upscaling on consoles, and the game doesn't make use of Kraken, Velocity or DirectStorage, so it takes between 27 and 36 seconds to load a save file regardless of the system you play it on.
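A rough back-of-envelope model of why faster storage barely moves the needle on those load times when a game decompresses assets on one CPU thread instead of using something like DirectStorage. All figures here are invented for illustration, not Dying Light 2's actual numbers:

```python
# Toy model: load time = read time + decompress time, done serially.
# When single-threaded decompression dominates, faster storage barely helps.

SAVE_SIZE_GB = 4.0        # hypothetical compressed asset payload for a save
DECOMPRESS_GBPS = 0.15    # hypothetical single-core decompression throughput

def load_time(read_gbps: float) -> float:
    """Seconds to read then decompress the payload, back to back."""
    return SAVE_SIZE_GB / read_gbps + SAVE_SIZE_GB / DECOMPRESS_GBPS

for name, speed in [("SATA SSD", 0.5), ("NVMe Gen3", 3.0), ("NVMe Gen4", 7.0)]:
    print(f"{name}: {load_time(speed):.1f} s")
```

With these made-up numbers the decompression term is ~27 s on every drive, so going from SATA to Gen4 NVMe only shaves off a few seconds, which is the same flat profile people are reporting.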
 
The game is missing stuff that seems either lazy or they just don't have the resources for everything.

As you say: no HDR (though Windows 11 has Auto HDR), no FSR, no upscaling on consoles, and the game doesn't make use of Kraken, Velocity or DirectStorage, so it takes between 27 and 36 seconds to load a save file regardless of the system you play it on.

Suppose the game has been in development for 7 years? HDR displays were barely a thing back then, so I can forgive some missing technical features as long as the gameplay shows 7 years' worth of work.

Game does have FSR too:

https://www.dsogaming.com/pc-perfor...2-nvidia-dlss-amd-fsr-ray-tracing-benchmarks/

The AMD FSR implementation appears to be okay-ish, though it produces a noticeably blurrier image. On the other hand, NVIDIA DLSS is bugged and suffers from ghosting issues. As said, NVIDIA is working on a fix. Once the green team manages to address these issues, DLSS will be a must for all RTX owners.
 
I found a benchmark which includes AMD GPUs; it's from TechRadar, who just put up an article with benchmarks: https://www.techradar.com/uk/news/dying-light-2-pc-performance


The benchmark is 4K, max settings and RT on. Nvidia GPUs have DLSS on Quality and AMD GPUs have FSR on Quality.

AMD GPUs have minimum framerate under 10fps and this is with FSR on, it's basically unplayable for AMD owners
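Worth remembering what "Quality" mode actually means at 4K: both upscalers render internally at a much lower resolution and reconstruct up. A quick sketch using FSR 1.0's documented per-axis scale factors (DLSS 2's ratios are very close but not identical):

```python
# Internal render resolution before upscaling, per preset.
# FSR 1.0 per-axis scale factors: Quality 1.5x, Balanced 1.7x, Performance 2.0x.

def internal_res(output, scale):
    """Resolution actually rendered when upscaling by `scale` per axis."""
    return tuple(round(d / scale) for d in output)

OUTPUT_4K = (3840, 2160)
for name, scale in [("Quality", 1.5), ("Balanced", 1.7), ("Performance", 2.0)]:
    w, h = internal_res(OUTPUT_4K, scale)
    print(f"{name}: {w}x{h}")
```

So those "4K with upscaling" numbers are really ~1440p renders, and the cards still can't hold 60 with RT maxed.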

Lol if true.


It looks like no GPU can run this game with RT natively at 60fps; they all need DLSS or FSR to get there.
Can you imagine paying a four-figure sum for the best consumer GPU money can buy and still not being able to max out all games? All that VRAM means nothing without enough grunt, bud :D
 
If the game really ships without HDR then that is truly sick. No amount of RT can compensate for that.

Unfortunately I have a mixed impression of the RT here. I think their problem is similar to what Metro Exodus had with its initial RT, except there it worked in their favour and suited the atmosphere; here the low number of ray bounces is very evident, with over-darkened AO in many instances. You simply cannot have so little light in an outdoor open-world game with this much vegetation. The GI part seems as weak as it did in Cyberpunk (RTL Psycho), probably only used as an enhancement for the existing GI, except in CP the base GI was much stronger, so it was an acceptable compromise. Reflections don't get to play much of a role here either. It is the sun shadows that do the heavy lifting in most situations in this game, so I'd say those are a must to turn on.
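The over-darkening from a low bounce budget falls straight out of a toy energy model: every path that hits the bounce cap returns black, so all the indirect light it would still have gathered is discarded. A minimal 1D sketch, purely illustrative and nothing like the game's actual renderer:

```python
# Toy energy model: indirect light gathered after k bounces falls off as
# albedo**k (a geometric series). Capping the bounce count throws away the
# entire remaining tail, which reads on screen as over-darkened GI/AO.

ALBEDO = 0.6  # hypothetical average surface reflectance

def gathered(max_bounces: int) -> float:
    """Total indirect energy collected with a hard bounce cap."""
    return sum(ALBEDO ** k for k in range(1, max_bounces + 1))

full = ALBEDO / (1 - ALBEDO)  # infinite-bounce limit of the series
for n in (1, 2, 4):
    print(f"{n} bounce(s): {gathered(n) / full:.0%} of full indirect light")
```

With a 0.6 albedo, a single bounce only recovers 40% of the full indirect contribution, which is why enclosed or shadowed areas end up so much darker than they should be.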

But as much as I am slagging their RT, if we look at how the rasterised-only image looks, it makes a very bad impression. Sometimes it works in its favour when you get the golden hour effect (whereas with RT near buildings you sometimes get a much colder image if you're opposite the sun: more accurate, yes, but less aesthetically pleasing), but a lot of the time it just puts even more emphasis on the low quality of textures and pure geometric detail the game ships with (also similar to Metro Exodus) and is a complete mess.

One reviewer said it best: in many ways this is like an Xbox 360 title taken to the extreme and brought into 2022. On the one hand you have advanced RT techniques necessitating obscene amounts of computing power; on the other, tufts of grass in the urban environment which are literally just 2D cards plopped down and randomised, looking more like placeholders than a finished product. The studio simply doesn't have the technology base required for dealing with so much vegetation, and it shows. They are in way over their heads.

One of the guys on gamegpu showed it compared to The Division 2 and the difference was immediate and staggering; no doubt that game is in a class of its own and shows how far you can push traditional raster techniques at the end of a cycle. Different worlds entirely, and we know they did a LOT of work on RT + vegetation for Avatar, so that'll raise the bar yet again. Amateur devs (Techland) vs pros (Massive); it's embarrassing to even make the comparison, but here they are juxtaposed anyway.


You can find some more tests & performance here:
https://gamegpu.com/action-/-fps-/-...obzor-i-sravnenie-graficheskikh-nastroek-igry
https://gamegpu.com/action-/-fps-/-tps/dying-light-2-stay-human-test-gpu-cpu

As for RT vs Vram
If you're on Nvidia then the RT grunt will do you no good, because you will be choked by the lack of memory. If you're with AMD then the grunt is not there, but of the two, raster vs raster, AMD wins. So unless you have a 3080 (and even that's an edge case) or better, having dedicated RT cores should give you no comfort in this case (3070 and lower).

[image: T8Pej7x.jpg.png]

Oops, guess it ended up a rant.
 
More benchmarks, this one from a Russian site, and it includes both DLSS/FSR numbers and native numbers.

Something funny: with DLSS on, the RTX 3070 is a bit ahead of the RTX 2080 Ti, but with DLSS off the 3070 only has half the 2080 Ti's performance. I guess the old friend, 8GB VRAM, is knocking on the door again :cry:
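One way to see why spilling out of VRAM is so brutal: the overflow has to stream back and forth over PCIe, which is an order of magnitude slower than VRAM. A back-of-envelope sketch with entirely hypothetical numbers (real drivers page far more cleverly than this):

```python
# Back-of-envelope cost of a working set spilling out of VRAM over PCIe.
# All figures hypothetical; this is not a measurement of any real game.

PCIE3_X16_GBPS = 16.0  # rough practical PCIe 3.0 x16 bandwidth

def spill_penalty_ms(working_set_gb: float, vram_gb: float,
                     touched_fraction: float = 0.25) -> float:
    """Extra ms per frame if a fraction of the overflow is re-fetched each frame."""
    overflow = max(0.0, working_set_gb - vram_gb)
    return overflow * touched_fraction / PCIE3_X16_GBPS * 1000

for vram in (8, 11):
    print(f"{vram} GB card: +{spill_penalty_ms(10.0, vram):.1f} ms/frame")
```

With a hypothetical 10 GB working set, the 11 GB card pays nothing while the 8 GB card eats tens of extra milliseconds per frame, which is roughly the "half the performance" cliff in that chart.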

 
More benchmarks, this one from a Russian site, and it includes both DLSS/FSR numbers and native numbers.

Something funny: with DLSS on, the RTX 3070 is a bit ahead of the RTX 2080 Ti, but with DLSS off the 3070 only has half the 2080 Ti's performance. I guess the old friend, 8GB VRAM, is knocking on the door again :cry:


The FPS on every single card in that graph sucks; are you really mocking the 3070 8GB because the 2080 Ti gets a monster 15 fps? :p
 
The FPS on every single card in that graph sucks; are you really mocking the 3070 8GB because the 2080 Ti gets a monster 15 fps? :p
He is mocking it because he enjoys looking down at us from his high horse with his 24GB 3090 :p

Which, by the way, can't even do 60fps with the aid of DLSS, from what we have seen so far :cry::cry::cry:
 
As for RT vs Vram
If you're on Nvidia then the RT grunt will do you no good, because you will be choked by the lack of memory. If you're with AMD then the grunt is not there, but of the two, raster vs raster, AMD wins. So unless you have a 3080 (and even that's an edge case) or better, having dedicated RT cores should give you no comfort in this case (3070 and lower).

Looking purely at VRAM numbers between manufacturers is incredibly simplistic. They're very different architectures with different hardware, so using "a > b" to draw a conclusion is anecdotal at best. Play the game and see whether there are issues/differences.

Pretty sad for console users that they're only getting 1080p/30. The game itself is rather heavy but any level of RT obviously hurts RDNA2. That being said, given current GPU prices, getting a console for £400+ is really reasonable.
 
I found a benchmark which includes AMD GPUs; it's from TechRadar, who just put up an article with benchmarks: https://www.techradar.com/uk/news/dying-light-2-pc-performance


The benchmark is 4K, max settings and RT on. Nvidia GPUs have DLSS on Quality and AMD GPUs have FSR on Quality.

AMD GPUs have minimum framerate under 10fps and this is with FSR on, it's basically unplayable for AMD owners

Lol if true.



Can you imagine paying a 4 figure sum for the best consumer GPU money can buy and still not be able to max out all games? All that vram means nothing if not enough grunt bud :D

Ooooooooooooof! Single digits!!!!! :p ;) :cry:

I'm not surprised in the slightest. PCGamesHardware and another site said back when RDNA 2 and Ampere released that Ampere was considerably better, not just in overall RT performance but in how it handles RT, especially when RT effects are applied on top of each other, i.e. the more complex it becomes, the bigger the hit for RDNA 2.

Given that the consoles are only using 2 of the 7 (?) RT effects and only hitting 1080p/30fps with those two, the writing was on the wall for what would happen when you throw even more at RDNA 2. Sadly for RDNA 2 users, RT looks like a must-have here; without any RT the game looks like a PS3/PS4 title :o
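Part of why stacking RT effects hurts so much is just frame-time arithmetic: each effect adds cost in milliseconds, but fps is the reciprocal, so every extra effect appears to cost more fps than the last. A toy budget with invented costs, not measurements from this game:

```python
# Toy frame-time budget: RT effect costs add linearly in milliseconds,
# but fps = 1000/ms, so the fps hit compounds. All costs are invented.

BASE_MS = 12.0  # hypothetical raster-only frame time (~83 fps)
RT_COSTS_MS = {"shadows": 4.0, "AO": 3.0, "GI": 6.0, "reflections": 8.0}

total = BASE_MS
print(f"raster only: {1000 / total:.0f} fps")
for effect, cost in RT_COSTS_MS.items():
    total += cost
    print(f"+ RT {effect}: {1000 / total:.0f} fps")
```

And that's before accounting for the per-effect costs themselves being higher on hardware that handles RT less efficiently, which makes the compounding even worse on RDNA 2.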

Looking purely at VRAM numbers between manufacturers is incredibly simplistic. They're very different architectures with different hardware, so using "a > b" to draw a conclusion is anecdotal at best. Play the game and see whether there are issues/differences.

Pretty sad for console users that they're only getting 1080p/30. The game itself is rather heavy but any level of RT obviously hurts RDNA2. That being said, given current GPU prices, getting a console for £400+ is really reasonable.

Yeah, a bit pointless looking just at how much VRAM is allocated and not overall perf. As we have seen time and time again, games can and will use all the VRAM they can with no benefit to performance, i.e. see HZD, RE Village, Godfall.

I think this is sadly another case of spreading themselves too thin and holding back overall performance and visuals, i.e. optimising for old-gen consoles too; the sooner they are dropped the better. It also kind of looks like this game was originally intended for the old-gen consoles (no surprise given when development started...).

It does look like 8GB is a bit problematic though.....
 
I swear games like this are designed to tank current cards so that people want the next generation as soon as it's released. The 3090 should easily get over 60 in this game; it's nothing special. Poor or deliberate optimisation, sponsored by Nvidia :D
 