What do gamers actually think about Ray-Tracing?

Cyberpunk can be used as a tech demo for path tracing, sure, making anything but the top GPUs unplayable.

Thing is, Cyberpunk also runs on the Steam Deck,
and they made it look very good without any kind of ray tracing turned on.
So it's not really Crysis.
 
My 4070Ti ran it fine enough. My 5070Ti runs it even better. He just went for the wrong card :p :cry:

Are you trying to make me regret buying AMD?

The Nvidia alternative was the 4060 Ti 16GB at the same price; it's no better in this than my card and falls short everywhere else.

The 4070 Ti was an £800 GPU. Why was it an £800 GPU? How is it that a ##70 class card commands £800? Is it something to do with the mentality of the sort of people who buy them? Is there something about those people that makes Nvidia think this is fine, and be correct in that analysis?

This is the reference one, which is actually limited to 240 watts. Mine is an AIB one, and even a cheap one is better cooled and runs at 265 watts, hence the 11% difference between my score and this. Mine is like a 3080, which I'm guessing now, or rather a couple of years ago, became a 'bad' RT card? I've beaten 3090s in some RT titles.

[screenshot: 1je0vM5.jpeg]
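
For what it's worth, a quick back-of-the-envelope on that gap (purely illustrative, assuming the score scales roughly with the power limit, which it never does perfectly):

```python
# Reference card power limit vs my AIB card's limit (figures from the post above).
reference_watts = 240
aib_watts = 265

# Extra power headroom as a fraction; the ~11% score gap is in the same ballpark,
# though clocks, cooling and silicon quality matter as much as raw wattage.
headroom = (aib_watts - reference_watts) / reference_watts
print(f"Extra power headroom: {headroom:.1%}")  # ~10.4%
```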
 
I run C2077 on a 16GB 5060 Ti with PT on and most settings at high (Hardware Unboxed's quality settings). I run about 100 mods as well and use 12-14GB of VRAM depending on where I am in-game. With the options I use I get a very playable experience. In C2077 I think the PT looks so good it's worth compromising other settings to get a playable FPS. YMMV.

I will agree that RT often doesn't produce enough of a better visual experience to be worth the FPS loss, especially if it brings the performance down enough to be very noticeable.

I have found in C2077 that when the RT/PT goes askew in an area it can be very obvious, dragging the player out of any immersion. Also, when using RT and especially PT, some areas become very dark, bringing with it the need for mods that allow flashlights and/or night vision in game.
 
@Calin Banc DLSS is a compromise. I'm not buying a crappier GPU because at the time it had a better compromise solution; we should not have to compromise at all, especially at these prices. The fact that a lot of people have already normalised this in their reasoning, to the point of reaching for it as an excuse, is ***** terrifying. Stop it, please...
 
Cyberpunk 2077 is heavily Nvidia sponsored, so OFC the RT modes will generally run better on Nvidia and the AMD cards will have to brute force it. In pure rasterised settings the base game runs better on many AMD cards, which sort of fits with the work they did for consoles. I doubt they have really bothered doing any work on getting the RT settings to run OK on AMD cards. AMD will need to stump up the sponsorship money.

For example, they finally introduced FSR3 FG in the new update, but hadn't incorporated FSR 3.1, etc. If you're an AMD user, just drop some settings. It's no different than Crysis 2, which whacked up the tessellation; if you had an ATI card, you could fiddle with the settings and driver settings to get most of the performance for little visual loss.

But then at the same point, you have games like COD, which is one of the most popular games and is AMD sponsored, and an RX7700XT was matching an RTX4070. Or Oblivion Remastered, which works well on AMD cards too.
 
Cyberpunk 2077 is heavily Nvidia sponsored, so OFC the RT modes will generally run better on Nvidia and the AMD cards will have to brute force it. But then at the same point, you have games like COD, which is one of the most popular games and is AMD sponsored, and an RX7700XT was matching an RTX4070.
I know, it's as I said a "worst case scenario", and an equally priced Nvidia card still can't beat it. The whole idea that AMD's RT is 'bad' is based off Cyberpunk; outside of it, even with it, and even for this generation, it actually isn't. It's just the result of brainwashing.
 
I know, it's as I said a "worst case scenario", and an equally priced Nvidia card still can't beat it. The whole idea that AMD's RT is 'bad' is based off Cyberpunk; outside of it, even with it, and even for this generation, it actually isn't. It's just the result of brainwashing.

Because that is the reason companies do it - it goes back even to the 3dfx days, with games supporting their Glide API. AMD really needs to understand this, do more COD-level sponsorships and get more big titles to run better on its hardware.

Unfortunately money talks, so if Nvidia is flashing the leather jackets, etc., the devs will quite happily just make stuff work better on their hardware.
 
I had an RTX 3060 Ti and it was fine with some RT effects on at QHD with DLSS Quality. Compare that to my mate with an RX 6700 XT, who had to play it with fewer effects on (or at a lower quality).

Once they made the big update for Phantom Liberty, performance cratered and yet I didn't really see much in the way of improvement. Tried using FSR FG, but it only helped a bit, because the base FPS was too low anyway. You really need as close to 60 FPS as possible. This was on a system with a Ryzen 7 7800X3D, PCI-E 5.0, DDR5, etc.

Another example of poor optimisation from a developer breaking performance for older cards, just to cater for newer ones.

I think they did add stuff from patch to patch, but I haven't got into testing it too much. When I had the rtx2080 it ran 1080p@performance DLSS path tracing at around 30fps, which was pretty impressive tbh.

With that said, my problem with the game is the fixed nature of lights. You can't really destroy / turn off the vast majority of the light sources. That's a big no no for a game that wants to be next gen and all that.

@Calin Banc DLSS is a compromise. I'm not buying a crappier GPU because at the time it had a better compromise solution; we should not have to compromise at all, especially at these prices. The fact that a lot of people have already normalised this in their reasoning, to the point of reaching for it as an excuse, is ***** terrifying. Stop it, please...

I'm happy with the compromise (with a decent implementation), considering the drawbacks of the classic raster and AA.
 
Because that is the reason companies do it - it goes back even to the 3dfx days, with games supporting their Glide API. AMD really needs to understand this, do more COD-level sponsorships and get more big titles to run better on its hardware.

Yeah, I would go one step further than that. Game developers are using AMD's IP for consoles when they wish to, and need to optimise for those; I would add a clause that you don't get to do that unless you do the same for the PC version. AMD have always been very averse to behaving like this, while Nvidia behave like this as a matter of course. Pretending to be the nice guy doesn't win you any respect or favour; it's simply appreciated, in that it makes it easier for them to comply when your competitor makes them screw you over.

Step up AMD...
 
I think they did add stuff from patch to patch, but I haven't got into testing it too much. When I had the rtx2080 it ran 1080p@performance DLSS path tracing at around 30fps, which was pretty impressive tbh.

With that said, my problem with the game is the fixed nature of lights. You can't really destroy / turn off the vast majority of the light sources. That's a big no no for a game that wants to be next gen and all that.

That update cratered performance on my RTX 3060 Ti, after testing the same areas under normal RT in the base game. I had already manually optimised my settings to run well on the card. Then I played a bit of Phantom Liberty, which was closer to 30 FPS with RT. Modded in FSR3 FG, which helped a bit in smoothing out the frame drops, but it wasn't a pleasant experience overall. This was also on a Ryzen 7 7800X3D with PCI-E 5.0 too.

I just switched off RT and it was much better. The RX 9070 does run it much better overall, but I had lost interest in the game after that. There is apparently an FSR4 update, so I might give it another try.

Yeah, I would go one step further than that. Game developers are using AMD's IP for consoles when they wish to, and need to optimise for those; I would add a clause that you don't get to do that unless you do the same for the PC version. AMD have always been very averse to behaving like this, while Nvidia behave like this as a matter of course. Pretending to be the nice guy doesn't win you any respect or favour; it's simply appreciated, in that it makes it easier for them to comply when your competitor makes them screw you over.

Step up AMD...

Look at what happened with Starfield: they had it running better on AMD hardware, Nvidia PR didn't like that, and within a brief period BGS fixed performance and put in DLSS. Yet it didn't even launch with FSR3 FG either. Even with FSR4, OptiScaler showed how they can have a driver-level toggle for it, but AMD are still asleep at the wheel.
 
I think they did add stuff from patch to patch, but I haven't got into testing it too much. When I had the rtx2080 it ran 1080p@performance DLSS path tracing at around 30fps, which was pretty impressive tbh.

With that said, my problem with the game is the fixed nature of lights. You can't really destroy / turn off the vast majority of the light sources. That's a big no no for a game that wants to be next gen and all that.



I'm happy with the compromise (with a decent implementation), considering the drawbacks of the classic raster and AA.
The important bit is that you accept it is a compromise, unlike many here.
 
That update cratered performance on my RTX 3060 Ti, after testing the same areas under normal RT in the base game. I had already manually optimised my settings to run well on the card. Then I played a bit of Phantom Liberty, which was closer to 30 FPS with RT. Modded in FSR3 FG, which helped a bit in smoothing out the frame drops, but it wasn't a pleasant experience overall. This was also on a Ryzen 7 7800X3D with PCI-E 5.0 too.

I just switched off RT and it was much better. The RX 9070 does run it much better overall, but I had lost interest in the game after that. There is apparently an FSR4 update, so I might give it another try.



Look at what happened with Starfield: they had it running better on AMD hardware, Nvidia PR didn't like that, and within a brief period BGS fixed performance and put in DLSS. Yet it didn't even launch with FSR3 FG either. Even with FSR4, OptiScaler showed how they can have a driver-level toggle for it, but AMD are still asleep at the wheel.

Looking into it, I saw people with 4080s and 4090s reporting issues as well. What they have in common is an underutilized GPU, so perhaps an issue with drivers, cache, etc. On my end (Ryzen 5800X3D, RTX 4080, 32GB 3200MHz), it seems about the same as before, perhaps a little better. Although when locking the FPS at 30 it seems a bit more erratic than before, with no issues at high FPS. I do play with mods, updated to their latest versions available through Vortex...
So going back to my original post, that RTX 4060 16GB should get closer to 50 or 60 FPS (assuming it's not hit with the same bug), which is fine with FG.
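
Rough numbers on why that's fine with FG (my own illustration, not measurements, assuming a single 2x frame generation step):

```python
# 2x frame generation roughly doubles the presented frame rate, but
# responsiveness still tracks the underlying render rate (plus a bit of
# added latency from the FG step itself).
for base_fps in (30, 50, 60):
    presented_fps = base_fps * 2            # what the FPS counter shows with 2x FG
    render_frametime_ms = 1000 / base_fps   # what input latency roughly follows
    print(f"base {base_fps} fps -> ~{presented_fps} fps presented, "
          f"~{render_frametime_ms:.0f} ms per rendered frame")
```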

Starfield was a mess. My card was pulling only around 200W out of the 315W or whatever it can do at max. The lack of DLSS was also no excuse. Overall, a very bad execution from Bethesda (which is not a surprise).

The important bit is that you accept it is a compromise, unlike many here.

Well, going with more aggressive settings can't offer the same image quality. I don't have a 4K display; perhaps at native 4K DLSS Quality is better than what the game offers by default, but I can't say yes or no.
In my experience DLAA is ideal, but I do game at a lower resolution/display than 4K.
 
@CAT-THE-FIFTH do you know why games often run better on Linux despite running through a middleware translation layer? Because Proton (the middleware translation layer, developed by Valve) converts the game's Direct3D calls to Vulkan (which grew out of AMD's Mantle), using specific AMD hardware IP.

DX12 is also another version or fork of Mantle, only with AMD's IP stripped out; not because they are anti-AMD, but because they are lazy. They don't want to work with vendors to maintain their API properly, so it's completely generic unless all vendors have the same functionality.
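
To make the "translation layer" idea concrete, here's a toy sketch of the sort of thing DXVK/vkd3d-proton (the pieces inside Proton) do: intercept a Direct3D-style draw call and re-issue it as the nearest Vulkan equivalent. The function names below are made up for illustration; the real layers obviously target the actual D3D and Vulkan APIs.

```python
# Toy model of API translation, not real graphics code.
def vk_cmd_draw_indexed(index_count, instance_count, first_index,
                        vertex_offset, first_instance):
    # Stand-in for Vulkan's vkCmdDrawIndexed; a real layer would record this
    # into a Vulkan command buffer instead of printing it.
    print(f"vkCmdDrawIndexed({index_count}, {instance_count}, "
          f"{first_index}, {vertex_offset}, {first_instance})")

def d3d_draw_indexed(index_count, start_index, base_vertex):
    # Mirrors the shape of D3D11's DrawIndexed call and forwards it.
    vk_cmd_draw_indexed(index_count, instance_count=1, first_index=start_index,
                        vertex_offset=base_vertex, first_instance=0)

d3d_draw_indexed(36, 0, 0)  # a game issuing one draw call for a cube
```

Multiply that by thousands of calls per frame and you can see why the overhead is usually small but not zero.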

Linux has enjoyed a surge in uptake, from about 2% market share a couple of years ago to 5% now and growing, and there are now Linux devs specifically developing gaming and desktop distros to replace Windows, for example Bazzite.

The best thing we can do is just move away from Windows.

 
There's a new benchmark out on Steam called NextVisuals Forest. It uses UE5.6 and has an explore mode where you can walk around forest and temple areas. Looks quite nice really, but it's fairly basic as of now.

This is using DLSS for AA, but the render res is 100%. It uses DLSS4 preset K as well, which is cool.
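
For anyone wondering what the lower render res options map to, these are the usual per-axis DLSS scale factors (100% being effectively DLAA), sketched for a 1440p output as an example:

```python
# Standard DLSS per-axis scale factors (approximate); 100% render res means
# DLSS is doing anti-aliasing only (DLAA), no upscaling.
scales = {"100% (DLAA)": 1.0, "Quality": 0.667, "Balanced": 0.58,
          "Performance": 0.5, "Ultra Performance": 0.333}
out_w, out_h = 2560, 1440  # example 1440p output
for name, s in scales.items():
    print(f"{name:>17}: {round(out_w * s)} x {round(out_h * s)} internal")
```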

[screenshot: 62n2nP0.jpeg]




^^ lower render res options tried out after the 10 minute mark.


[screenshot: LGsqEkm.jpeg]

[screenshot: zCdUIrw.jpeg]

[screenshot: 1OGga2D.jpeg]


In explore mode there is none of the mouse input latency that is often seen with UE5 demos (there was with the Matrix demo, for example); this feels like raw input with no latency.
 
The whole idea that AMD's RT is 'bad' is based off Cyberpunk; outside of it, even with it, and even for this generation, it actually isn't. It's just the result of brainwashing.

No, it's because when the RT settings are cranked up to PT levels, AMD cards cannot compete with Nvidia's. Even cheaper Nvidia cards.
It's been proven over and over again, not just in Cyberpunk, but in games such as Doom, Wukong and Indiana Jones, etc.

AMD's RT has improved, no doubt. And in many titles it's fine, because RT is only lightly implemented, so the majority of the FPS is still based on raster performance.
This isn't to bash AMD; they still have a great series of cards which compete in the most popular segment, and their RT isn't bad. It's just not up there with Nvidia's, and to say people only believe Nvidia's is better because of Cyberpunk, or because they've been brainwashed, is simply not true.
 
No, it's because when the RT settings are cranked up to PT levels, AMD cards cannot compete with Nvidia's. Even cheaper Nvidia cards.
It's been proven over and over again, not just in Cyberpunk, but in games such as Doom, Wukong and Indiana Jones, etc.

AMD's RT has improved, no doubt. And in many titles it's fine, because RT is only lightly implemented, so the majority of the FPS is still based on raster performance.
This isn't to bash AMD; they still have a great series of cards which compete in the most popular segment, and their RT isn't bad. It's just not up there with Nvidia's, and to say people only believe Nvidia's is better because of Cyberpunk, or because they've been brainwashed, is simply not true.
AMD GPUs are not compatible with PT; they emulate it. They aren't going to support it in its current form because it's not standardised, it's Nvidia's tech.

Microsoft are rolling out a DX12 update that standardises path tracing and ray reconstruction. But no AMD GPU will ever support PT in Cyberpunk or Wukong because they aren't going to licence Nvidia IP; it's as simple as that.
 
No, it's because when the RT settings are cranked up to PT levels, AMD cards cannot compete with Nvidia's. Even cheaper Nvidia cards.
It's been proven over and over again, not just in Cyberpunk, but in games such as Doom, Wukong and Indiana Jones, etc.

AMD's RT has improved, no doubt. And in many titles it's fine, because RT is only lightly implemented, so the majority of the FPS is still based on raster performance.
This isn't to bash AMD; they still have a great series of cards which compete in the most popular segment, and their RT isn't bad. It's just not up there with Nvidia's, and to say people only believe Nvidia's is better because of Cyberpunk, or because they've been brainwashed, is simply not true.

As things stand, AMD also have no direct equivalent to DLSS4 with the transformer model + 2x frame gen - even when you adjust settings to get the same 60-70 FPS base frame rate and ~100-130 FPS with FG, it isn't as good an experience using FSR and other forms of frame gen. Which makes quite a difference in the playability of some games when using higher levels of RT/PT features - even though personally I'm not a great fan of DLSS and don't like frame gen.

I spent quite a while testing the different incarnations and settings possible for this with Oblivion Remastered.
 