Ratchet and Clank: Rift Apart RDNA 2 Ray Tracing

You only have to look at games like HZD and what the dev did with such weak hardware. With PC, too many devs are happy not to bother, because they expect us to buy new hardware. However, with all these GPU shortages due to miners, etc., a lot of PC gamers are going to be stuck on subpar GPUs. So devs will now need to take this new reality into consideration and downgrade many of these titles for PC. Even looking at Steam, look at what hardware most gamers actually have. Most of the GPUs listed are worse than the Xbox Series X GPU, and it's hard to say whether most can challenge the PS5 GPU either. Enthusiasts have to appreciate they are ahead of the curve.

The same goes for the CPUs. The consoles have something equivalent to a Ryzen 7 3700X/Ryzen 7 4700H. Nothing special for many on here, but lots of people are still stuck on slower CPUs too. Many still use SATA SSDs, etc.
 
Yeah, but the point is, no dev gives a **** about PC gaming.

Sony exclusives get insane budgets and talented teams. PC has what? Indies, occasional bad multiplat ports and 'exclusive' games that look like **** (esports, MMOs, etc.).

Now, because of the GPU shortages, it will be even worse. Watch how most next-gen AAA multiplats will start to avoid PC releases :cry:
 
Yeah, but the point is, no dev gives a **** about PC gaming.

Sony exclusives get insane budgets and talented teams. PC has what? Indies, occasional bad multiplat ports and 'exclusive' games that look like **** (esports, MMOs, etc.).

Now, because of the GPU shortages, it will be even worse. Watch how most next-gen AAA multiplats will start to avoid PC releases :cry:

It's what some of us saw might happen with increasing GPU prices, as the mainstream average gaming PC would get relatively worse and worse. Mining has only made it more evident. It's kind of sad if you look at the Steam Hardware Survey:
https://store.steampowered.com/hwsurvey/videocard/

Look at the top 20 GPUs on there, and only the RTX2070 Super and the RTX3070 (at 20th) are faster than an RX5700XT:
https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-ti-founders-edition/28.html

Even the latest mainstream RTX3060 is slower than an RX5700XT. The issue is the PS5 has a GPU which is basically an overclocked RX5700XT with newer features, and the Xbox Series X has a faster GPU overall.

So that means an RTX2070 Super at best matches a PS5, so only the RTX3070 at 20th is probably beating an Xbox Series X. The RTX3080 barely makes a blip on the list, and neither does the RTX3060. The AMD RX6000 series is nowhere to be found.

Hence most gaming PCs on Steam are barely capable of running RT effects to any degree, and even then most average gaming PCs are going to be beaten in RT performance by a PS5 or Xbox Series X. It makes far more sense for game devs to use consoles as the starting point for pushing RT currently. That doesn't even take into consideration the many still using quad-core CPUs, SATA SSDs, etc.

That is why all the RTX3080, RX6800, etc. owners going around saying the console can't do this or that don't seem to realise a console will be a giant upgrade in hardware for most PC gamers. Yes, maybe in their circles it isn't, but they are enthusiasts who throw money at PC hardware.
 
So that means an RTX2070 Super at best matches a PS5, so only the RTX3070 at 20th is probably beating an Xbox Series X. The RTX3080 barely makes a blip on the list, and neither does the RTX3060. The AMD RX6000 series is nowhere to be found.

That will only last until NVIDIA releases their new GPUs

The PS5 will probably push more fidelity/frames than a 3080 in 4-5 years, and I'm pretty sure the SX will decimate the 3090 in 4-5 years too.

Of course, the elite "community" of revered Overclockers.co.uk will consider a 3090 low end 4 years later and will find it being equal/near to an Xbox a normal thing, or blame Nvidia and bail out of the discussion (like they do with their GTX 770s and 780s: they blame Nvidia and bail out. They normalise a $600 high-end 780 Ti being destroyed by the PS4 by calling it old. Or they will blame "that's a port", "this is an outlier", "Nvidia's fault, hurr durr, should've bought an AMD card instead" [as if the consumer could've known at the time]).
 
It was true for a long time: games will use fewer resources on consoles. You could notice this since the days of the PS2 or Dreamcast/GameCube. Sure, a game like San Andreas will look better on PC, but how about trying it on a 300 MHz CPU and 32 MB of RAM like the PS2 had? That's a configuration for GTA2, not San Andreas. :D
Since then, the gap has become even bigger, with Sony and MS buying almost every good game dev. And the price of PC hardware was already exaggerated before all the cards started going to scalpers. F... you buy a 6700 XT, which is much worse than the GPU in the Xbox Series X, for the same money you pay for the whole console. That is, if you are one of the lucky few who can find a card at that price.
 
Like you can see above, a lot of the time the optimizations consoles get are not the sort of thing you'd normally find in the settings on a PC. Some of them are exposed to PC users through graphics menu options, but there are hundreds and hundreds of variables that go into modern game engines to control level of detail, and it's always been the case that consoles are optimized by having many of these tweaked to get performance up to acceptable levels. Watch the video above: Alex scrolls through the config file to give you an idea of everything you could in theory change. It's crazy long, which is why he only covers a fraction of it.

On PC, many of these settings are just bundled into the preset "high", "ultra" modes or whatever, mostly to make them accessible and easy to change for novices who don't know what they're doing. But most games do store these config files in plain text, so you can just edit them manually if you wish to push settings hard (and harder than the presets allow).
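To make that concrete, here's a rough sketch of the kind of manual override the presets don't expose. Everything in it is made up for illustration (the file name, the variable names, the key=value layout); real engines each use their own config format and keys, so check the actual file before trying anything like this.

# hypothetical example: bump two LOD-related values past what any preset offers
from pathlib import Path

cfg_path = Path("GameSettings.ini")        # hypothetical config file name
overrides = {
    "LODDistanceScale": "2.0",             # hypothetical key: draw detail further out
    "ShadowCascadeResolution": "4096",     # hypothetical key: beyond the Ultra bundle
}

out = []
for line in cfg_path.read_text().splitlines():
    key = line.split("=", 1)[0].strip()
    out.append(f"{key}={overrides[key]}" if key in overrides else line)
cfg_path.write_text("\n".join(out) + "\n")

Same idea as editing the file by hand in Notepad; a script just makes it repeatable after a patch resets the config.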
So your point is that consoles are more optimized; I get that. But what I am suggesting is that we don't get any meaningful benefit in most cases for the loss of optimized settings on PC. Ultra and High settings are just there as part of the basic PC options (sometimes not even that) and just exist to tank PC performance for no meaningful gain in image quality. So PC gamers are being fleeced both ways: we get neither the optimization nor great IQ. Also, if it is so easy to set console-like settings, developers should bundle them as a new preset named "Console Equivalent" for PCs, but that's too much to ask for these days.
 
That will only last until NVIDIA releases their new GPUs

The problem is that even the RTX3060 was only released in February, so it will be at least the end of next year until we get a replacement. Even if the fully enabled GA106 is released, it's still going to be RX5700XT-level performance. FE models used to be the upper end of the RRP, but now they define the base price. So even if prices move back to "normality", an RTX3060 is going to cost more than £300, and an RTX3060 Ti over £400 for the most part.

Then the issue is that, with the price escalation per generation, they might jack up the price of the RTX4060 to closer to £400, which will probably be a bit slower than an RTX3060 Ti (but have more VRAM and better RT performance), and count that as progress, and so on.

The same is also happening with CPUs now.

The PS5 will probably push more fidelity/frames than a 3080 in 4-5 years, and I'm pretty sure the SX will decimate the 3090 in 4-5 years too.

Of course, the elite "community" of revered Overclockers.co.uk will consider a 3090 low end 4 years later and will find it being equal/near to an Xbox a normal thing, or blame Nvidia and bail out of the discussion (like they do with their GTX 770s and 780s: they blame Nvidia and bail out. They normalise a $600 high-end 780 Ti being destroyed by the PS4 by calling it old. Or they will blame "that's a port", "this is an outlier", "Nvidia's fault, hurr durr, should've bought an AMD card instead" [as if the consumer could've known at the time]).

The same happens with GPU VRAM, core count, etc. Extra VRAM and more cores supposedly don't matter. Many will upgrade quicker, so they don't care. However, what happens then is that mainstream gamers get screwed over even more, because they need to overprovision hardware due to longer lifespans.
 
Let's see:

4-5 cars on the horizon, 3 NPCs visible, a fairly simple, easy-to-render scene can drop to 50-53 fps with a 5800X (it would probably go below 40 fps with a 3700X)

Actual proof that it drops to near 45 with a 3700X:

https://youtu.be/xvXMZe6nVws?t=126

[screenshot: nZTaKsx.png]


Lots of cars (5-6), 4-5 visible NPCs, rock solid 60 fps along the entire road:


[screenshot: zFYZb5O.png]
You can't really do a comparison like that, because WD:L on PC has many ultra settings which are exceedingly CPU-intensive, so while you're counting some visible things, there's a lot more going on in the background. In particular, the shadow settings on the higher presets are much more advanced than what's on any of the consoles and absolutely murder the CPU (and when you add all the rest, the murder becomes even more gruesome). Luckily, you can actually find the exact settings used on the consoles for WD:L in its game files on PC (DF's video comparing RT on the console version vs PC shows an example), so you can do an accurate test like that if you wish.
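If anyone wants to try that settings-parity test, here's a rough sketch of how you could diff the two sets of values before benchmarking. The file names and the key=value format are hypothetical; the real WD:L config layout will differ, so treat this as an illustration of the approach rather than a recipe.

# hypothetical example: list every setting where the "console" dump and your PC preset disagree
def load_settings(path):
    settings = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if "=" in line and not line.startswith("#"):
                key, value = line.split("=", 1)
                settings[key.strip()] = value.strip()
    return settings

console = load_settings("console_settings.txt")   # hypothetical dump pulled from the game files
pc = load_settings("pc_ultra_settings.txt")       # hypothetical export of your current PC preset

for key in sorted(set(console) | set(pc)):
    if console.get(key) != pc.get(key):
        print(f"{key}: console={console.get(key)}  pc={pc.get(key)}")

Match everything it prints and you at least know both platforms are rendering the same workload.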
 
The issue is, if you need to start examining a 4K console screenshot and a PC screenshot under a microscope for differences, you have already lost the battle. Most gamers certainly won't be able to notice it when running around the world, especially on their TV or a 1080p PC screen.

At least with Crysis you could just see the big difference!
 
The issue is, if you need to start examining a 4K console screenshot and a PC screenshot under a microscope for differences, you have already lost the battle.
Lose what battle? We're talking about performance. The reason it runs the way it does on consoles and not on PC has nothing to do with magic optimizations that can only be done on consoles; it has to do with the settings chosen. So people who can't tell the difference between settings should just turn them down and not suffer those dips (and save on having to buy more powerful hardware).

Granted, I only now realise the guy I replied to was replying to someone else who I have on ignore, so I couldn't see the full context. Oops.
 
Lose what battle? We're talking about performance. The reason it runs the way it does on consoles and not on PC has nothing to do with magic optimizations that can only be done on consoles; it has to do with the settings chosen. So people who can't tell the difference between settings should just turn them down and not suffer those dips (and save on having to buy more powerful hardware).

Granted, I only now realise the guy I replied to was replying to someone else who I have on ignore, so I couldn't see the full context. Oops.

Earlier in the thread people were trying to analyse the IQ of different images and talking about console settings being worse than PC settings. Maybe that is true. The issue is that, unless it's like Crysis where it's obvious in gameplay, if you need to take screenshots and meticulously analyse the image to see differences, then 99% of gamers won't notice in play, as they are more concerned with the gameplay. It's why the whole high vs ultra debate happens.

As someone else mentioned, it's really on PC devs then to make those "console" settings a thing in the menu. Burying them such that you need to fiddle with config files is probably beyond what most gamers will want to do, IMHO. But even with performance, you have to appreciate that most gamers only care if it's smooth. Most won't have an FPS counter on to see the FPS values. Sometimes I do think PC gamers worry too much about FPS. In the past, most games didn't really run at high FPS. It only became a thing with stuff like Quake, IIRC, because the engines glitched out at certain FPS values and competitive players could exploit it.

WRT the whole optimisations business, consoles do have better "optimisations" because they are not exactly the same as PCs in features. There are aspects of the hardware which are non-standard. Just look at the PS4:

Sony has also taken steps to make it easier for developers to use the graphics component for general-purpose computing tasks. Cerny identifies three custom features dedicated to that mission:
  • An additional bus has been grafted to the GPU, providing a direct link to system memory that bypasses the GPU’s caches. This dedicated bus offers "almost 20GB/s" of bandwidth, according to Cerny.
  • The GPU’s L2 cache has been enhanced to better support simultaneous use by graphics and compute workloads. Compute-related cache lines are marked as "volatile" and can be written or invalidated selectively.
  • The number of "sources" for GPU compute commands has been increased dramatically. The GCN architecture supports one graphics source and two compute sources, according to Cerny, but the PS4 boosts the number of compute command sources to 64.

The designs themselves are semi-custom and take on feedback from devs about what base-level features they want to use. So optimising code for a console is not entirely the same as optimising for the PC, as the consoles don't work 100% in the same way as a normal PC. I suspect there are console-specific tools too.

PCs use general-purpose components, but it's that general-purpose aspect which also leads to overheads in certain areas. It's the trade-off for being able to mix and match so many components. Even with both being AMD, DF were amused when a PS5 beat the RX6800 in RT performance, despite the latter being the faster GPU overall. That hints at a software issue (IIRC it was in Control).

Even for our hardware, there have been many instances where Linux has had far better CPU performance than Windows. Look at how early 32-core CPUs caused Windows problems. This is exactly the same reason why Apple can get impressive-looking performance out of its SoCs. It's also why military and space systems can use radiation-hardened 486 CPUs and old MIPS-based hardware and still do impressive things with them. In pure TFLOPs they are weak, but talking to people who have experience with embedded systems, software overhead is very much a thing. It's why faster does not always equate to better if the tools/OSes are less well developed.

Old but relevant as to why consoles get more out of the hardware than anything on Windows...

https://www.youtube.com/watch?v=nIoZB-cnjc0

It's a good listen!
 
Ratchet & Clank: Rift Apart offers three visual options – Fidelity Mode, Performance Mode, and Ray Tracing Performance Mode. All three modes offer essentially the same experience in terms of world detail and effects, with resolution, framerate, and ray tracing being the only things differentiating them. Fidelity Mode offers ray tracing and a locked 4K (achieved using various upscaling techniques) at 30fps, Performance Mode drops ray tracing and features dynamic 4K resolution that can drop as low as 1440p (1800p average) at 60fps, and RT Performance Mode offers ray tracing and dynamic resolution that can drop as low as 1080p (1440p average) at 60fps.

Testing shows all visual modes maintain their target FPS nearly all the time, although there are small dips of up to 5 frames in some scenarios, specifically a) during some cutscenes, and b) when going through one of the larger rifts that transport Ratchet to a new world. Since you’re not really playing the game during these moments, these small dips shouldn’t have much, if any, effect on your experience.

https://wccftech.com/ratchet-and-clank-rift-apart-performance-report-rifts-cause-dips/
 
So your point is that consoles are more optimized; I get that. But what I am suggesting is that we don't get any meaningful benefit in most cases for the loss of optimized settings on PC. Ultra and High settings are just there as part of the basic PC options (sometimes not even that) and just exist to tank PC performance for no meaningful gain in image quality. So PC gamers are being fleeced both ways: we get neither the optimization nor great IQ. Also, if it is so easy to set console-like settings, developers should bundle them as a new preset named "Console Equivalent" for PCs, but that's too much to ask for these days.

I use the term "optimized" in quotes because it's not really optimization in the classic computing sense. Optimization means a very specific thing, which is to take some function f(x) and make it compute faster or with fewer resources while the output remains the same for any given input. So with rendering, what you want is more FPS at the same visual quality. The problem is that the kind of "optimization" we see on the consoles is really just turning down visual settings in a selective way so you get the most visual bang for your performance buck. It's a smart thing to do, because you're spending your resource of GPU cycles on the things that give you the biggest win. BUT it's not the same as truly optimizing the game: you're just tweaking settings, and compared to the PC with everything turned up to Ultra, you're simply looking at an inferior output on the console. So you naturally expect that to perform better on any given hardware. This is why I said that if you want to compare performance on PC/console, you need to make sure you at least have settings parity first, so it's at least a fair test.
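A toy illustration of that distinction (nothing to do with any real engine; the function names are made up): a genuine optimization returns the identical result for the same input, just cheaper, while console-style tuning quietly changes the input, so the output is no longer the same.

def total_shading_cost_naive(num_samples):
    # reference implementation: add up the cost of samples 1..num_samples one by one
    total = 0
    for s in range(1, num_samples + 1):
        total += s
    return total

def total_shading_cost_optimised(num_samples):
    # genuine optimization: closed-form sum, identical result, O(1) work
    return num_samples * (num_samples + 1) // 2

def total_shading_cost_turned_down(num_samples):
    # "optimized" the console way: just use half the samples -- faster, but a different output
    return total_shading_cost_naive(num_samples // 2)

n = 1000
assert total_shading_cost_naive(n) == total_shading_cost_optimised(n)  # same f(x), less work
print(total_shading_cost_turned_down(n))  # cheaper again, but no longer the same result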

All of that said, there are definitely more chances for real, genuine optimizations on consoles. First of all, they have fixed hardware across all users, so if you're hunting for ways to be more efficient, anything you find is a win for everyone. Whereas on the PC you have a whole range of hardware, and if you find an optimization for a specific video card it has limited users, meaning there's less incentive to hunt for them on PC. As @Gerard also pointed out, consoles allow for real optimizations that come from letting engineers program close to the metal. This is because consoles strive to remove as many layers of abstraction as possible between the game engine and the instructions being executed on the GPU. That's impossible on the PC; it needs all those layers of abstraction to support an open, competitive market of different video cards and components.

As for visual trade-offs, game developers are going to aim for the low-hanging fruit first. In a marketplace of competing games that want to use nice visuals to attract gamers, they will go after whatever visual effects give the most bang for the buck. But a corollary of this is that, as time goes on, you're forced to move towards effects which are more computationally expensive and have a relatively more subtle impact on visuals. Whether those trade-offs are worth it or not is kind of just personal preference on image quality vs cost vs performance. It's why consoles, being more or less budget gaming devices, still sell extremely well. Many gamers will be told something is 4K when in fact there's dynamic scaling going on, and most won't be able to tell the difference. High-end settings for PC games are kind of a niche thing; it's enthusiasts who understand how a lot of this tech works and want to experience it in action, even if the visual impact is subtle.

Alex @ DF argues for the very same thing regarding console settings on the PC: it'd be really handy for the hardware and gaming community when comparing games. Hopefully the console DRM will be cracked, people will get access to the file system on them, and modders can go hunting for this info.
 
I use the term "optimized" in quotes because it's not really optimization in the classic computing sense. Optimization means a very specific thing, which is to take some function f(x) and make it compute faster or with fewer resources while the output remains the same for any given input. So with rendering, what you want is more FPS at the same visual quality. The problem is that the kind of "optimization" we see on the consoles is really just turning down visual settings in a selective way so you get the most visual bang for your performance buck. It's a smart thing to do, because you're spending your resource of GPU cycles on the things that give you the biggest win. BUT it's not the same as truly optimizing the game: you're just tweaking settings, and compared to the PC with everything turned up to Ultra, you're simply looking at an inferior output on the console. So you naturally expect that to perform better on any given hardware. This is why I said that if you want to compare performance on PC/console, you need to make sure you at least have settings parity first, so it's at least a fair test.

All of that said, there are definitely more chances for real, genuine optimizations on consoles. First of all, they have fixed hardware across all users, so if you're hunting for ways to be more efficient, anything you find is a win for everyone. Whereas on the PC you have a whole range of hardware, and if you find an optimization for a specific video card it has limited users, meaning there's less incentive to hunt for them on PC. As @Gerard also pointed out, consoles allow for real optimizations that come from letting engineers program close to the metal. This is because consoles strive to remove as many layers of abstraction as possible between the game engine and the instructions being executed on the GPU. That's impossible on the PC; it needs all those layers of abstraction to support an open, competitive market of different video cards and components.

As for visual trade-offs, game developers are going to aim for the low-hanging fruit first. In a marketplace of competing games that want to use nice visuals to attract gamers, they will go after whatever visual effects give the most bang for the buck. But a corollary of this is that, as time goes on, you're forced to move towards effects which are more computationally expensive and have a relatively more subtle impact on visuals. Whether those trade-offs are worth it or not is kind of just personal preference on image quality vs cost vs performance. It's why consoles, being more or less budget gaming devices, still sell extremely well. Many gamers will be told something is 4K when in fact there's dynamic scaling going on, and most won't be able to tell the difference. High-end settings for PC games are kind of a niche thing; it's enthusiasts who understand how a lot of this tech works and want to experience it in action, even if the visual impact is subtle.

Alex @ DF argues for the very same thing regarding console settings on the PC: it'd be really handy for the hardware and gaming community when comparing games. Hopefully the console DRM will be cracked, people will get access to the file system on them, and modders can go hunting for this info.
I think I got your point a while ago. To summarize:

1. Consoles have better optimization because of specific hardware and low-level access. We already know this.
2. What we might think of as optimization for consoles might just be various specific hidden settings that are lower than on PC.

About the last paragraph: I think we all know that games have been made (for a long while now) with a console-first mentality. There are various market reasons for this. However, I still think that PC hardware is not being utilized to its limit. I am not talking about small imperceptible differences for large performance hits. There are things that could be PC-specific which are not implemented by devs. A few examples: 128- or 256-player matches, huge texture packs (not just one step above consoles), and the ability to run single-player games at 144+ FPS (which is not possible on many engines even at the lowest settings).
 
I'm not sure R&C has RT-based GI; it certainly doesn't look like it.

I personally found the lighting in ME Enhanced to be good, although it was another RT-lit title. I even commented with the inclusion of DF's video explaining why some areas were lighter. Technically it was correct; artistically it's up for debate. If you look back, the original ME was called out for being too dark with RT on, which was due to single-bounce lighting.
I have already explained why I don't think it was technically correct, but the summary was light fall-off and materials absorbing light and reducing its power before reflecting it.

It just seems like whenever anyone comments on RT they go for the low-hanging fruit, a.k.a. reflections, and sometimes they notice light bounce.

It's almost as if RT effects are really hard to notice outside of the areas I just mentioned. You don't see any comments on materials and whether they look right, or judgements on the whole scene.
 
The problem is that even the RTX3060 was only released in February, so it will be at least the end of next year until we get a replacement. Even if the fully enabled GA106 is released, it's still going to be RX5700XT-level performance. FE models used to be the upper end of the RRP, but now they define the base price. So even if prices move back to "normality", an RTX3060 is going to cost more than £300, and an RTX3060 Ti over £400 for the most part.

Then the issue is that, with the price escalation per generation, they might jack up the price of the RTX4060 to closer to £400, which will probably be a bit slower than an RTX3060 Ti (but have more VRAM and better RT performance), and count that as progress, and so on.

The same is also happening with CPUs now.



The same happens with GPU VRAM, core count, etc. Extra VRAM and more cores supposedly don't matter. Many will upgrade quicker, so they don't care. However, what happens then is that mainstream gamers get screwed over even more, because they need to overprovision hardware due to longer lifespans.

If Intel can bring out something in the £150-200 bracket that does RX5700XT performance by the end of the year, they will clean house.

But it's looking like the first quarter of next year by all reports.
 
If Intel can bring out something in the £150-200 bracket that does RX5700XT performance by the end of the year, they will clean house.

But it's looking like the first quarter of next year by all reports.

How much DDR/PCB/GPU silicon do you think you're going to get in 2022 for a parts cost of £70-100? Not much. Consoles seem cheap because they're subsidised by game prices. If Intel has a competitive GPU, it will be priced within a few per cent either way of the rest of the market, IMO.
 
I'm playing the game now in Fidelity mode; it's the best for me. I tried Performance RT mode, but the visuals are a bit rubbish. Sure, it runs at 60fps, but it's dropping down to 1080p, which is just too soft on my screen.
 
I think I got your point a while ago. To summarize:

1. Consoles have better optimization because of specific hardware and low-level access. We already know this.
2. What we might think of as optimization for consoles might just be various specific hidden settings that are lower than on PC.

About the last paragraph: I think we all know that games have been made (for a long while now) with a console-first mentality. There are various market reasons for this. However, I still think that PC hardware is not being utilized to its limit. I am not talking about small imperceptible differences for large performance hits. There are things that could be PC-specific which are not implemented by devs. A few examples: 128- or 256-player matches, huge texture packs (not just one step above consoles), and the ability to run single-player games at 144+ FPS (which is not possible on many engines even at the lowest settings).

Well... it's not "might"; it very definitely is, and it has been the case for just about as long as we've had multi-platform games. The only real issue is that it's not that well known among console gamers; they're sold the marketing that their consoles are super high end, and this generation they're being sold 4K, 120Hz, ray tracing and a bunch of other stuff. But the reality is that there's a massive amount of compromise just to get any one of those things working in a AAA title. It's a shame, because it leads to the angry rage-quit stuff you saw with oguzsoso, where even if you literally do a technical breakdown of the differences in large amounts of really specific detail, it just leads to defensiveness and denial.

There are examples of what you're talking about. If you want extremely large player battles, you can play a game like Planetside 2, which has very high player numbers on the same server. I'm not sure exactly how many; Wikipedia claims 2000, which seems way too high, but I do know the original Planetside capped at about 333 per continent and PS2 is at least that or higher. Huge texture packs exceeding what we already have generally won't get made because they won't get used: they make fairly little difference to the visual outcome at most common screen resolutions, because the texel count often exceeds the pixel count it's being displayed at. For example, if you look at the 4K UHD texture pack for R6 Siege and read comments from people, most find it a waste of time at 1080p and only those at 4K really notice the difference. WDL specifically does have a high-res texture pack for the PC; it's large enough to have an impact without being a waste. Running games at 144fps is typically just a hardware limitation: it's hard for CPUs to get modern engines up to that kind of tick rate, but it certainly is something you can trade visuals for if you like, and some people do; they prefer 1080p or 1440p over 4K, and lower video settings, in order to maintain higher frame rates.

It doesn't matter if it's high-end visual settings, very high-res textures or very high frame rates: they all share one thing in common, which is diminishing returns. Going from 40fps to 60fps is more noticeable than 120 to 140, and going from 20 players to 40 is more noticeable than 300 to 320, despite the delta being the same. Probably something to do with Weber's law; if you want to massively geek out, watch this: https://www.youtube.com/watch?v=hHG8io5qIU8
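You can see it just in rough frame-time arithmetic: 40fps is 25ms per frame and 60fps is about 16.7ms, so that jump buys roughly 8.3ms of headroom per frame; 120fps is about 8.3ms and 140fps about 7.1ms, so the same 20fps jump is only worth about 1.2ms. The identical fps delta is a much smaller change in what you actually perceive.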
 