
The RT Related Games, Benchmarks, Software, Etc Thread.

Status
Not open for further replies.
The consoles will probably have to scale to 960p anyway, just like they have to with Phantom Liberty when RT is enabled :p

UE5.2 uses RT in either HW or SW mode, and since none of the consoles have HW RT anywhere near the power of a PC...

Starfield = scales from 800p on Xbox, and that's with no RT at all.
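Dynamic resolution scaling of the kind the consoles lean on here is essentially a feedback loop on frame time. A minimal illustrative sketch in Python (the 60fps budget, the damping exponent and the 0.44 floor that maps 2160p down to roughly 950p are all assumptions for the example, not any engine's real values):

```python
# Toy dynamic-resolution controller: nudge the render scale so the measured
# frame time converges on the 60fps budget. Purely illustrative numbers.
TARGET_MS = 1000 / 60   # 16.7 ms frame budget
NATIVE_H = 2160         # "4K" output height

def adjust_scale(scale: float, frame_ms: float) -> float:
    """Return a new render scale clamped to [0.44, 1.0] (0.44 * 2160 ~ 950p)."""
    error = frame_ms / TARGET_MS        # > 1.0 means the budget was missed
    scale /= error ** 0.5               # damped correction toward the budget
    return max(0.44, min(1.0, scale))

scale = 1.0
for frame_ms in (25.0, 22.0, 18.0, 16.5, 16.0):   # a heavy RT scene settling down
    scale = adjust_scale(scale, frame_ms)
    print(f"{frame_ms:5.1f} ms -> render at ~{int(NATIVE_H * scale)}p")
```

Real implementations filter over many frames and scale each axis separately, but the shape is the same: miss the budget, drop the internal resolution, then upscale to the output resolution.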
 
It will be interesting to see how newer games perform on the consoles. The magic of 4K@60fps can only go so far before falling flat, and fast storage won't work miracles when it comes to actually rendering the scene. We have already seen examples of this.
 

Looking at the average hardware spec on Steam, if an Xbox Series X has problems, then so will most gaming PCs.

GPU performance

Remember, the PS5 dGPU is apparently around RX6650XT/RX6700 level, but the Xbox Series X has more TFLOPs, more "RT cores" and far more memory bandwidth. The Xbox Series X has more RT cores than an RX6700XT, so overall it is probably close to one once you account for its lower peak clockspeeds.
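The TFLOPs figures quoted for both machines come from the same standard formula: peak FP32 = shader ALUs × 2 ops (one FMA per clock) × clock. A quick sanity check against the publicly quoted console specs reproduces the familiar 10.28 vs 12.15 TFLOPs numbers:

```python
# Peak FP32 TFLOPs = shader ALUs * 2 ops (one FMA per clock) * clock (GHz) / 1000.
def fp32_tflops(shader_alus: int, clock_ghz: float) -> float:
    return shader_alus * 2 * clock_ghz / 1000

# Publicly quoted figures: 64 ALUs per CU; PS5 has 36 CUs (variable clock up
# to 2.23GHz), Series X has 52 CUs (fixed 1.825GHz).
print(f"PS5:      {fp32_tflops(36 * 64, 2.23):.2f} TFLOPs")
print(f"Series X: {fp32_tflops(52 * 64, 1.825):.2f} TFLOPs")
```

Note this is a theoretical peak only; it says nothing about RT throughput or memory bandwidth, which is why the comparison above doesn't stop at TFLOPs.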

It is also a true RDNA2 dGPU, whereas the PS5 is a hybrid RDNA1/RDNA2 design.

The RX6700XT and RTX3060 trade blows with each other in RT.

Even if we just look at rasterised performance, most of the top 10 dGPUs on Steam have similar or lower rasterised performance:

Adding up the numbers, around 12~15% of the dGPUs listed on Steam have more RT performance than an RTX3060!
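The "adding up the numbers" step is just summing the survey shares of the cards judged faster than an RTX3060 in RT. A sketch with made-up placeholder shares (the real Steam figures change monthly; every number below is purely illustrative):

```python
# Hypothetical survey shares (card: % of surveyed dGPUs) - NOT real Steam data,
# just placeholders to show the tallying step.
faster_than_3060_in_rt = {
    "RTX 3060 Ti": 3.0,
    "RTX 3070":    2.9,
    "RTX 3080":    2.1,
    "RTX 4070":    1.8,
    "RTX 3070 Ti": 1.4,
    "RTX 4070 Ti": 1.1,
    "RTX 4090":    1.0,
}
total = sum(faster_than_3060_in_rt.values())
print(f"~{total:.1f}% of surveyed dGPUs beat an RTX 3060 in RT")
```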

CPU performance

It's most likely that the Xbox Series X has more CPU performance too. The PS5's CPU has a weaker FPU compared to normal Zen 2:

Most gamers don't use overclocked CPUs either:


So the average gaming CPU on Steam is a six-core CPU with HT, running at under 4GHz!
 

They've been out for three years and cost half the money. You should compare an RT-capable PC with 90% of the GPUs on the Steam hardware survey :)
 
I don't think those run 4K displays anyway.
 

But the issue was the comment that consoles are nothing compared to PCs due to their lack of RT power, when by that measure the majority of PCs on Steam are no better. This is why Nvidia/AMD are trying to spin upscaling as the best thing since sliced bread, after PCMR used to mock consoles for being "weak" and needing "upscaling".

I suppose when greedy companies such as Nvidia and AMD want to sell you last generation's performance at last generation's pricing, it's sad that 2020 consoles are comparable to (or even faster than) an average gaming PC with a six-core CPU and a GTX1650/GTX1060/RTX3060.

There is also a reason the most common gaming resolution is 1080p: mainstream dGPUs are getting relatively worse and worse! :(

When the Xbox 360 came out with its cut-down 7900-series dGPU, within two years we had the 8800GT, which in today's money would still be under £400 and was probably twice as fast or more. Now we get trash like the RTX4060/RTX4060TI/RX7600 under £400 three years later.

That meme is no longer true - the most common gaming PCs are just as weak now.
 
That's also because mining was a setback and everything got *****d up in the process. I'd say consoles being cheaper is normal; I don't care about that.

Anyway, at least those PCs know their place - they're not sold as 4K or 8K machines :)) Leaving RT aside, those consoles won't do 4K@60 native with a high amount of detail. Not even 30fps in most cases.
 

They tried that last time with Turing, and things stagnated until Turing v2; the mainstream has been getting worse and worse since Pascal and Polaris. The stagnation in VRAM is an example of this. I should know, because I buy mainstream parts. All this mocking of consoles is hilarious given all the DLSS/FSR marketing trying to boast about 8K gaming FFS:

Linus tried it with the RTX3090 and RTX4090, and you can tell by the titles how he felt.

[Image: "GeForce RTX 3090 - the world's first 8K gaming GPU" marketing slide]



AMD and high refresh rate 8K gaming:
[Image: AMD 8K gaming marketing slide]



Nvidia also talks about 4K gaming... on an RTX3060, ALL RTX3000 laptops (including the RTX3050 4GB) and GTX1650 cards:

[Images: Nvidia 4K gaming marketing slides]


Have a jibe at consoles, then release trash like the RTX4060 8GB, etc.

"The consoles will probably have to scale to 960p anyway just like they have to with Phantom Liberty when RT is enabled :p

UE5.2 uses RT either in HW or SW mode, and since none of the consoles have RT via HW anywhere near the power of PC..."

Yes - all these digs at consoles, when PCMR on tech forums seem not to realise that this is the reality on Steam:
[Image: Steam Hardware Survey top-GPU chart]


Many of those dGPUs are probably not appreciably faster than an Xbox Series X dGPU THREE YEARS after it released. THREE YEARS. The RTX4060/RX7600 are slower than my old RTX3060TI FFS. How does the RTX4060/RTX4060TI/RX7600 push things forward in any useful way in that list? Push more intensive RT effects, talk about 4K/8K gaming and release more trash.

We can have pathtracing at 320p upscaled to 1080p then? :p

So if PCMR wants to laugh at consoles, they should laugh at most PCs too, because of the trash releases we get now. Nvidia/AMD take jibes at consoles and talk about 4K/8K gaming whilst restricting VRAM on mainstream releases. It seems Nvidia/AMD want mainstream PC gaming to be console level at a much higher cost.

The issue on tech forums is like car forums: all the high spenders congregate together and think everyone has an RTX4090/RTX4080/RX7900XTX and so on.

Stuff like the RTX4060/RTX4060TI and RX7600 are unmitigated disasters for PC gaming. Mining is over and the pandemic is now technically over.

There was a time I also used to join in and laugh at consoles. But not anymore. The whole market is going backwards. No wonder the consumer dGPU market is contracting!
 

On PCs without much RT acceleration power, you can simply turn settings down to get good RT performance, or turn RT off altogether and spend the recovered headroom on fps and resolution. You don't have the same level of control on a console; you only get the presets the developer decides to implement. Likewise, texture quality will be superior on PC too. This isn't a debate worth having: even on a PC of comparable spec to either console (2070 Super+), games will always be better than the console version if, and only if, they are optimised properly for both platforms.

And then there are mods that allow even more performance gains and, in many cases, picture-quality gains too. This is not possible on a console, where you are stuck with lower quality at 60fps, or higher quality at 30fps but with lower dynamic resolutions.

And since UE5 is now the go-to engine everyone seems to be banging on about, and since it has RT as part of the engine core itself, with upscaler support out of the box, the gap widens even further.
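For reference, the HW/SW split in UE5's Lumen is driven by console variables. A hedged DefaultEngine.ini sketch (cvar names as commonly documented for UE 5.x; worth verifying against your engine version):

```ini
; Illustrative DefaultEngine.ini fragment (not verbatim from Epic docs);
; verify cvar names against your UE 5.x build.
[/Script/Engine.RendererSettings]
; 1 = Lumen for global illumination and reflections
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; 1 = hardware ray tracing path, 0 = software (distance-field) Lumen
r.Lumen.HardwareRayTracing=1
; skin cache is required for the hardware RT path
r.SkinCache.CompileShaders=True
```

The consoles typically ship with the software path; the hardware path is the one that depends on the RT throughput being argued about above.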
 

Yet two years after the Xbox 360, the 8800GT came out at less than the cost of an Xbox 360 and was over twice as fast. How the mighty have fallen.

PC enthusiasts on tech forums really don't appreciate the level of hardware most gamers have:

[Image: Steam Hardware Survey top-GPU chart]



I looked at the Steam Hardware Survey. Barely 15% of all the dGPUs on there are faster than an RTX3060. The Xbox Series X dGPU is actually quite wide: it has more shaders and RT cores than an RX6700XT, with significantly more memory bandwidth. As people testing the RTX3060 and RX6700XT have shown, they trade blows in RT titles, and the RX6700XT is much faster without RT.

One of those titles is UE5-based, and the RX6700XT isn't beaten by an RTX3060/RTX4060 with RT on. It is the same in previous tests of Fortnite with RT on. So how is the Xbox going to be slower than an RTX3060/RTX4060 in UE5?

What do you think is going to take up more of that top chart in the next two years? The RTX4060, which can't even handily beat an RX6700XT in UE5 titles, especially with only 8GB of VRAM? It is also slower than the RTX3060TI, and the RTX4060TI is no faster than an RTX3070.

The laptop RTX3060 cards are also only 6GB. Despite your jibes at consoles, the Xbox is as strong as or stronger than probably 14 to 15 cards on the above list, THREE YEARS after it launched - 85% of the Steam hardware survey top 30.

@KompuKare should chime in too. I am a mainstream gamer and I can see this myself, so don't try to tell me what I am starting to see. Most PC gamers are not buying £600+ cards. Most of my friends have mainstream hardware. People have looked at the market over the last few years and just gone "meh".

So how is that going to push RT adoption forward, if MOST of the cards offer much the same performance for years? Why don't you try an RTX4060 as your main gaming dGPU for a year then? You dumped your RTX3080 because you thought it was too slow! :p

Most PCs don't even have TLC NVMe SSDs - so many gaming systems (even pre-builts and laptops) have DRAM-less QLC drives. The consoles use TLC drives with DRAM, attached to dedicated I/O controllers.

Most PCs don't use DirectStorage either. Even when you do use it, there is a penalty on dGPU performance - so how is that mainstream RTX4060/RTX4060TI going to cope? Most gamers I know have quad-core or six-core CPUs too.

Even DF looked at this, using a console-equivalent desktop, and as time went on the desktop fell behind as the consoles got better and better optimisations.

The same goes for all the stuff about texture quality - you mean like all those recent titles with poor texture quality on mainstream 8GB dGPUs? The consoles are barely affected by this because they can use more than 8GB as VRAM.


---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------


UE5 is also developed with consoles in mind:

It runs quite decently on AMD dGPUs, and Epic has demoed it plenty of times on the PS5:


Tim Sweeney said:
The Unreal Engine 5 demo on PlayStation 5 was the culmination of years of discussions between Sony and Epic on future graphics and storage architectures.


The Nanite and Lumen tech powering it will be fully supported on both PS5 and Xbox Series X and will be awesome on both.

Lots of console games will use it fine:

Epic is betting big on consoles.

Yet in 2023 we have Nvidia releasing its entire range of dGPUs up to £490 as 8GB VRAM cards. The most common CPU on Steam is six cores UNDER 4GHz. That means most gaming PCs have stock-clocked CPUs, running at console-like clockspeeds and performance.

The PS5 will be replaced by a faster PS5 Pro within 12 months. Microsoft looks like it might bring the Xbox Series X replacement forward rather than do a mid-life refresh.

So how is trash like the RTX4060, RTX4060TI, etc going to last? It's one thing to prefer PC gaming - I prefer PC gaming too - but it's another not to see that the market is basically going down the drain.

I am not going to trash consoles when Nvidia and AMD are releasing subpar crap. No amount of upscaling is going to change that. Every £200+ dGPU should trash a console by now, three years on. Maybe I just need to agree to disagree with you people.
 
A lot of those accounts are for simpler games, won't even play some current/old-gen titles properly and, most likely, won't pay full price for new AAA games either way.

Back in the 8800GT days there were no crypto bubbles or AI, nor a pandemic-generated shortage on top.

Now there isn't really a "killer app" to push gaming that much. Yes, there is Cyberpunk 2077, but that's not a "must have" and it can be played fine in raster. There isn't any Crysis going on... And that's the thing: UE5 should allow games to be made more easily while running on relatively modest hardware.

But... we're still playing (basically) the same games as 10 years ago, gameplay-wise. At least make them beautiful :)
 

The difference is that back then these companies were more enthusiast-driven and valued gamers. Now they worry more about what the accountants say and treat gamers as whales, so they believe they can release any junk and gamers will buy it. Then there are surprised-panda faces when mainstream gamers are not enthused by 10% to 20% extra performance after 2-3 years, which can't even show the consoles' economy hardware a clean pair of heels.

But at the same time they feel proud when they push ray tracing, higher resolutions, higher FPS and ever more intensive effects which crater performance.

With Crysis, Crytek could push an intensive game. Why? They probably knew we would get a card such as the 8800GT, which would make the game viable for most plebs to run. The same goes for HL2, when ATI/Nvidia had the great X800/6000 series out. Lots of these games were only viable because decent (and decently priced) hardware was available to the full range of gamers. It's about numbers in the end. Devs could look forward - but how can they do that now? Look at how many say that 8GB is not enough for next-generation games... then Nvidia/AMD ignore them and launch more 8GB cards up to £490.

Now fast-forward to the present day. Outside some sponsored tech-demo games, how many devs will want to take the risk of pushing things too far? Cyberpunk 2077 is a 2020 game, and Nvidia has been deeply involved in using it as a tech demo to push its cards. How many of the hundreds of games released each year will go to that level? If most PC gamers don't have the hardware, and the consoles don't either, why bother pushing too far?

People keep having a go at consoles, but I don't see them as the main obstacle to better effects nowadays - it's the poor state of mainstream dGPUs. Remember, we got Crysis barely two years after the Xbox 360, and it was a PC-only title. How many big titles are PC-only now? Not even Star Citizen, which is PC-only, seems to push RT that much - and they certainly don't lack money. It's quite clear the average dGPU isn't good enough to meet the marketing Nvidia/AMD want to push.

Trash like the RTX4060TI/RTX4060/RX7600 is holding back PC gaming. Yesterday's performance at yesterday's pricing.
 

Of course I don't disagree that the current graphics card offering is pants, but that's just one side of the story.

The UE5 demo was running on a PS5, yes? With movie-quality assets, pixel-accurate shadows and other goodies. Where are those games? Those should run, no problem, on something like an RTX2080 equivalent at around 1080p, give or take. That's also the normal resolution for the consoles anyway, not 4K.

Hogwarts, Forspoken and TLOU are no Crysis, and yet they run (ran?) like ****. There isn't a valid reason for such high requirements.

SC is not using RT (it probably will at some point), but they actually bother to push tech. I think it's the best engine out there when it comes to parallel work/threading. They actually do a sort of Red Faction thing on ships, rather than just having precomputed destruction like all other games do. There is no loading screen for an entire solar system, and it doesn't demand stupidly fast storage and compression/decompression algorithms like the consoles do (or DirectStorage).

However, coming back to UE5: since you have high-quality models and such, don't do the "similar experience across systems" BS and limit what a PC can do. Allow more stuff on screen, better physics, more NPCs, higher draw distances; include whatever tech each vendor has; do it all, and do it right. Don't do the NMS Vulkan BS, which was basically just a move to run on more OSes while still being single-thread limited. There are plenty of gamers out there with fast enough systems.

And one more thing: when AMD had Mantle, which allowed even weaker systems to play your game (better utilisation of resources), how many devs jumped on it? It was in their best interest, after all...
 

People always talk about the graphics of Crysis, but what made it great were the little touches: the destructible environments, the nanosuit which let you change your play style, multiple ways to reach an objective, etc. Even the use of flying enemies at the end meant two different AI models for enemies. Most modern games use bipedal enemies because they can reuse the NPC AI models.

But even with CPUs, we seem to have been stuck at 6 and 8 cores on most normal systems for years now. DirectStorage exists basically because MS hasn't bothered for years. SSDs have been viable on PC for ages - one has to question why Windows is so poor at I/O, and why Intel/AMD are not making platforms more efficient at I/O. Motherboards are getting more and more expensive, and we still have £200 motherboards stuck at PCI-E 4.0! Needing a dGPU to handle I/O sounds like a poor use of resources.
 
Being a little too optimistic, aren't you? ... :D

Build a £450 PC and see what settings it will run at.

People should be bashing the current GPU landscape, not trying to make silly digs at consoles three years after their release.

Fortunately, Sony and Capcom send some quality games our way, because if they stuck to consoles exclusively there would be hardly anything decent to play and we would still be circle-jerking over crap old games like Control.
 

It's not that deep, I was just messing around hence the smiley emote at the end.
 
Not specifically RT, but upscaling is required for decent RT performance, so it fits here. DF's comment on the whole missing-DLSS-in-AMD-sponsored-games deal:

 