
Dying Light 2 PC Tech Review

From the pictures I've seen, the game looks good generally, but non-RT seems to suffer from too much light at times and RT from too little. I think on the DF video I liked the indoor scene with the seats best with the console settings, rather than the 'full PC' settings they seemed to prefer.

Mack from WAB mentioned indoors looked a bit jank:

 
Getting realistic indoor lighting with rasterisation is very time-intensive: each and every indoor area must be manually tuned and tested by adding, removing and placing lights and adjusting a bunch of parameters. It's a painstaking process, so naturally it doesn't look good in many games.
 

He said even with RT it looks a bit meh, and no wonder, as internal scenes are more difficult to calculate properly. The issue again is that most RT-capable dGPUs are poor at RT even at 1080p. You not only need sufficient RT capability to render the scene at a certain framerate... but to render the scene accurately. It's why RT for the film industry needs tons of processing power, as they want not only enough performance but high accuracy.

So it's still a choice between performance and quality when it comes to games. You can still have poor-quality RT if you choose to use fewer rays per object. This is why we are still running games with hybrid rasterised and RT effects.

The dGPUs which are OK at RT, i.e. the RTX 3080 and above, are a tiny minority of all dGPUs owned.
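To show why fewer rays per object means worse quality, here's a toy Monte Carlo sketch (purely illustrative, not from the game or any real renderer): the noise in a lighting estimate shrinks roughly with the square root of the ray count, so cutting rays to save performance makes indoor lighting visibly blotchier.

```python
import random
import statistics

TRUE_BRIGHTNESS = 0.3  # hypothetical: 30% of rays from this point reach a light

def estimate_brightness(rays):
    # Each ray either reaches the light (contributes 1.0) or is blocked (0.0);
    # the pixel's brightness is the average over the rays fired.
    hits = sum(1.0 for _ in range(rays) if random.random() < TRUE_BRIGHTNESS)
    return hits / rays

def noise(rays, trials=2000):
    # Spread (standard deviation) of the estimate across many pixels/trials.
    random.seed(42)
    return statistics.pstdev(estimate_brightness(rays) for _ in range(trials))

for rays in (1, 4, 16, 64):
    print(f"{rays:3d} rays -> noise {noise(rays):.3f}")
```

Quadrupling the rays roughly halves the noise, which is why "cheap" RT with few rays per pixel looks splotchy indoors and leans heavily on denoising.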
 
My gameplay footage - I tried to show a mix of a few areas, and flicking the flashlight on and off in dark areas (mainly at night near the end), to let people see how it looks in motion, as screenshots aren't great:

 
Mack from WAB mentioned indoors looked a bit jank:

To me, the photos both indoor and outdoor are hit and miss tbh. Some look fine, some look good, some make me wonder where the extra light bulb is hiding to illuminate stuff and some make me wonder why shadows make part of a scene look pitch black. Think I'd just play with RT off and deal with the extra light and take the extra frames tbh - though it seems to run pretty meh on his 2080.
 

The issue is that in more complex lighting scenes you need to do more calculations, hence the lighting can fall apart IMHO. I suspect as time progresses and we get more powerful RT-capable dGPUs things will start to improve. ATM I think developers have to sort of compromise.
 
@CAT-THE-FIFTH Yep, I agree - I've seen a few people mention light bounces; whether that's the issue or not, I don't know. As I said, I thought the console settings on the DF video looked best, though I've no idea how they stack up in other areas.
 

Probably because they need to mix the RT and rasterised effects more optimally to get the "look" the developers want?

I think once we start to double RT performance per tier, things will start to get better IMHO. So either next generation or the generation after. The biggest issue is RT performance on entry and mainstream tier dGPUs.

For example if you look at Cyberpunk 2077 with RT on:
https://tpucdn.com/review/msi-geforce-rtx-3050-gaming-x/images/cyberpunk-2077-rt-1920-1080.png

The RTX 3060 Ti is 54% faster at 1080p than an RTX 2060 Super, which is Nvidia's upper mainstream/entry-level enthusiast dGPU. The RTX 3060 is only 34% faster than an RTX 2060!

However, the RTX 3080 is 72% faster than the RTX 2080 at 1080p, and 81% faster at 1440p. The RTX 3090 is 63% faster than an RTX 2080 Ti at 1440p. This is part of the issue - many here own the faster dGPUs, which have shown the greatest per-generation RT performance increases, especially at higher resolutions.
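For anyone wondering how those "X% faster" figures are derived, it's just relative average FPS. A quick sketch (the FPS numbers below are made up for illustration, not taken from the TPU chart):

```python
def uplift(new_fps: float, old_fps: float) -> int:
    """Generational uplift as a percentage: (new / old - 1) * 100."""
    return round((new_fps / old_fps - 1) * 100)

# Hypothetical example: 40 fps vs 26 fps works out to a 54% uplift,
# which is how a "54% faster" headline figure comes about.
print(uplift(40, 26))
```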
 
I do not understand the hate towards FSR in this title. I have had some more looks at FSR today and noticed some differences in shadows; in some parts FSR actually looks better, and in others it's mixed.

Take this first example - I'm not sure what is happening here. I'm playing maxed out, no RT, with Auto HDR enabled.

https://imgsli.com/OTQzMjE - Notice the bottom-left shadow is darker with native, but as you move up you see more shadow detail with FSR. Tbh I think it might be the game's day/night cycle, but seeing them side by side there's no real winner between FSR on or off.

https://imgsli.com/OTQzMTg - Hard to spot a difference without getting all zoomy

https://imgsli.com/OTQzMTk - I actually think FSR in this side by side looks better.
 
Probably because they need to mix the RT and rasterised effects more optimally to get the "look" the developers want?

I think once we start to double RT performance per tier, things will start to get better IMHO. So either next generation or the generation after. The biggest issue is RT performance on entry and mainstream tier dGPUs.

For example if you look at Cyberpunk 2077 with RT on:
https://tpucdn.com/review/msi-geforce-rtx-3050-gaming-x/images/cyberpunk-2077-rt-1920-1080.png

The RTX 3060 Ti is 54% faster at 1080p than an RTX 2060 Super, which is Nvidia's upper mainstream/entry-level enthusiast dGPU. The RTX 3060 is only 34% faster than an RTX 2060!

However, the RTX 3080 is 72% faster than the RTX 2080 at 1080p, and 81% faster at 1440p. The RTX 3090 is 63% faster than an RTX 2080 Ti at 1440p. This is part of the issue - many here own the faster dGPUs, which have shown the greatest per-generation RT performance increases, especially at higher resolutions.
Yep, and I'd rather they mix it to get a good look than try to go overboard with RT and have it look (at least IMO) worse. Also, you'd have better performance until those more capable GPUs arrive.

I do not understand the hate towards FSR in this title. I have had some more looks at FSR today and noticed some differences in shadows; in some parts FSR actually looks better, and in others it's mixed.

Take this first example - I'm not sure what is happening here. I'm playing maxed out, no RT, with Auto HDR enabled.

https://imgsli.com/OTQzMjE - Notice the bottom-left shadow is darker with native, but as you move up you see more shadow detail with FSR. Tbh I think it might be the game's day/night cycle, but seeing them side by side there's no real winner between FSR on or off.

https://imgsli.com/OTQzMTg - Hard to spot a difference without getting all zoomy

https://imgsli.com/OTQzMTk - I actually think FSR in this side by side looks better.

What resolution is that at? The pictures I saw of FSR didn't look great (DLSS was definitely better), but I think they were done at 1080p. That said, IIRC the PCGamer article's issue with DLSS was motion clarity, which they disliked.
Those pictures look fine to me.
 
Yep, and I'd rather they mix it to get a good look than try to go overboard with RT and have it look (at least IMO) worse. Also, you'd have better performance until those more capable GPUs arrive.



What resolution is that at? The pictures I saw of FSR didn't look great (DLSS was definitely better), but I think they were done at 1080p. That said, IIRC the PCGamer article's issue with DLSS was motion clarity, which they disliked.
Those pictures look fine to me.

I play at 4K.
 
https://imgsli.com/OTQzMTk - I actually think FSR in this side by side looks better.

The FSR image looks a lot less sharp than native, near and far. DLSS-type tech will always have the potential to be the better option.

That being said, issues are always more obvious in stills, so as long as you're happy with it while playing the game (increased FPS can often offset other issues), then that's fine.
 
He said even with RT it looks a bit meh, and no wonder, as internal scenes are more difficult to calculate properly. The issue again is that most RT-capable dGPUs are poor at RT even at 1080p. You not only need sufficient RT capability to render the scene at a certain framerate... but to render the scene accurately. It's why RT for the film industry needs tons of processing power, as they want not only enough performance but high accuracy.

So it's still a choice between performance and quality when it comes to games. You can still have poor-quality RT if you choose to use fewer rays per object. This is why we are still running games with hybrid rasterised and RT effects.

The dGPUs which are OK at RT, i.e. the RTX 3080 and above, are a tiny minority of all dGPUs owned.

The 3060 Ti does heavy RT at 1080p60 (lows) with DLSS Quality in CP2077... it looks and plays great.
 
Yep, and I'd rather they mix it to get a good look than try to go overboard with RT and have it look (at least IMO) worse. Also, you'd have better performance until those more capable GPUs arrive.

Agreed.

The 3060 Ti does heavy RT at 1080p60 (lows) with DLSS Quality in CP2077... it looks and plays great.

I have an RTX 3060 Ti FE myself, and have over 300 hours in the game across a GTX 1080 and the RTX 3060 Ti (the FE, bought last summer). It was a £400 dGPU under a year old, and if I need to use DLSS Quality right now, which means RT effects are rendered at sub-1080p resolution, I hate to think how well it will fare over the next 12 months. I foresee lower and lower quality DLSS settings being used.

But the issue is I game at QHD, so it means I had to turn down RT settings and run DLSS at that resolution. In the end I just switched off everything apart from reflections, as the poorly implemented rasterised water in Cyberpunk 2077 was annoying.
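To put numbers on the "sub-1080p" point: upscalers render internally at a fraction of the output resolution on each axis before reconstructing. Using the commonly quoted per-axis scale factors (Quality ~2/3, Balanced ~0.58, Performance 1/2 - treat these as approximations, not vendor-confirmed values), a quick sketch:

```python
# Approximate per-axis render-scale factors for DLSS/FSR quality modes
# (commonly quoted figures; an assumption, not vendor-confirmed).
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width, height, mode):
    # Internal render resolution before upscaling to the output resolution.
    s = MODES[mode]
    return round(width * s), round(height * s)

for w, h in ((2560, 1440), (3840, 2160)):
    for mode in MODES:
        print(f"{w}x{h} {mode}: {internal_res(w, h, mode)}")
```

So DLSS Quality at 1440p renders around 1707x960 - below 1080p - which is why the RT effects end up being computed at sub-1080p.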
 
I do not understand the hate towards FSR in this title. I have had some more looks at FSR today and noticed some differences in shadows; in some parts FSR actually looks better, and in others it's mixed.

Take this first example - I'm not sure what is happening here. I'm playing maxed out, no RT, with Auto HDR enabled.

https://imgsli.com/OTQzMjE - Notice the bottom-left shadow is darker with native, but as you move up you see more shadow detail with FSR. Tbh I think it might be the game's day/night cycle, but seeing them side by side there's no real winner between FSR on or off.

https://imgsli.com/OTQzMTg - Hard to spot a difference without getting all zoomy

https://imgsli.com/OTQzMTk - I actually think FSR in this side by side looks better.


FSR is fine for AMD users; there's just no reason to use it if you have DLSS, that's all. No reason to hate, though - one is just better than the other, and it depends on what GPU you have.
 

It's basically killing GPUs and consoles - the above is the PS5 and Xbox review too, so it gives people an idea of how demanding this game is on GPUs and consoles.

The PS5 and Xbox can't do native 4K at the highest quality setting - with no RT they are basically only able to do 30 FPS, and with RT on the resolution is 1080p. :rolleyes: Seems we need to wait for the PS5 Pro and the updated Xbox to get even 60 FPS with RT off at higher than 1080p. These are the consoles that were sold on 8K and 120 FPS... not looking good for games with RT on consoles or AMD GPUs. Seems DLSS and FSR are going to be our saviours going forward.
One must remember that the consoles have 'weak' GPUs (around a non-Super 2070 in performance). Now that is enough to display decent enough graphics, but it's around 2015 PC performance, and that isn't going to impress the PC crowd now, let alone in five years' time. Console optimisation is often bandied about, but that simply means reducing the quality, albeit in a clever way - in the same way you can dial down the settings of a 2070 so it can run the latest games. At best the latest consoles are a 1080p product, but how long they can stay at 1080p is anyone's guess.
 
I have an RTX 3060 Ti FE myself, and have over 300 hours in the game across a GTX 1080 and the RTX 3060 Ti (the FE, bought last summer). It was a £400 dGPU under a year old, and if I need to use DLSS Quality right now, which means RT effects are rendered at sub-1080p resolution, I hate to think how well it will fare over the next 12 months. I foresee lower and lower quality DLSS settings being used.

But the issue is I game at QHD, so it means I had to turn down RT settings and run DLSS at that resolution. In the end I just switched off everything apart from reflections, as the poorly implemented rasterised water in Cyberpunk 2077 was annoying.

Yeah, 1440p heavy RT on a 3070 is stretching it. The 3060 Ti is defo 1080p max, and DLSS is definitely needed for those 1% lows. Give it a year and we'll be having to limit light bounces, but we should still be able to get the base RT visual boost. A lot of RX 6000 buyers are going to be extremely disappointed - whether they admit it or not. I really hope AMD nail RT on the 7000 series and give us a solid alternative at a reasonable price.
 