
What do gamers actually think about Ray-Tracing?

Nexus, can I ask a question? When designing a game, do you use models, textures and polygons? I'm talking about 3D games. Because as far as I'm aware, also being a developer, these basic building blocks of a game are all raster. We've had years and years of looking for replacements and none have been forthcoming. There are faster ways to display certain types of content (tessellation, voxels etc.) but basically it's all triangles, or balls in the voxel case. RT/PT is a render path; it affects lighting using said geometry, models and textures. Without them you would not have RT/PT, unless I'm doing something wrong. The levels in games are exactly the same whether rendered with raster or with an RT/PT render path. It's just the lighting, and how that light reacts, that is different.

Now, am I dismissing RT/PT? Not at all; you just can't have one without the other at the present moment in time. RT/PT does not magically conjure up said models/textures/polygons.

A simpler way of putting it is that RT/PT is a renderer of those models, lights etc., but it must have them to do its work. It's one way, but not the only way, to get an output. Lumen is another way that uses screen-space rendering rather than ray tracing, and Lumen will use ray tracing in the future as the technology catches up.
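To make the "same building blocks" point concrete, here's a tiny sketch in plain Python (my own toy example, not any engine's real code): tracing a ray is just a different query against the same triangle data a rasteriser would draw.

```python
# Moller-Trumbore ray/triangle intersection - the same vertex data a rasteriser
# consumes can be hit-tested by a ray; RT/PT is a different way of *using* the
# geometry, not a replacement for it.

def sub(a, b):   return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def cross(a, b): return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def dot(a, b):   return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the distance t along the ray to the triangle, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    det = dot(e1, h)
    if abs(det) < eps:                      # ray is parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = sub(origin, v0)
    u = inv * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = inv * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = inv * dot(e2, q)
    return t if t > eps else None

# One triangle straight ahead of the camera, one ray through its middle:
tri = ((-1.0, -1.0, 5.0), (1.0, -1.0, 5.0), (0.0, 1.0, 5.0))
print(ray_hits_triangle((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), *tri))   # ~5.0
```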

You're not wrong on your points there. The Blur Busters guy summed it up well; this is about "fake frames", but it touches on the same point about raster, ray tracing etc. and how games are rendered:


Yesterday, “fake frames” was meant to refer to classical black-box TV interpolation. It is funny how the mainstream calls them “fake frames”;
But, truth be told, GPUs are currently metaphorically “faking” photorealistic scenes by drawing polygons/triangles, textures, and shaders. Reprojection-based workflows are just another method of “faking” frames, much like an MPEG/H.26x video standard “fakes it” via I-frames, B-frames and P-frames.
That’s why, during a bit of data loss, video goes “kablooey” and turns into garbage with artifacts if a mere 1 bit gets corrupted in a predicted/interpolated frame of an MPEGx/H.26x video stream, until the next full non-predicted/interpolated frame comes in (1-2 seconds later).
Over the long term, 3D rendering is transitioning to a multitiered workflow too (just like digital video did over 30 years ago out of the sheer necessity of bandwidth budgets). Now our sheer necessity is the Moore’s Law slowdown bottleneck: we are unable to get much extra performance via the traditional “faking-it-via-polygons” methods, so this is a shortcut around Moore’s Law.
The litmus test is going lagless and artifactless, much like the various interpolated frame subtypes built into your streaming habits (Netflix, Disney, Blu-ray, E-Cinema) and other current video standards that use prediction in their compression systems.
Just as compressors have knowledge of the original material, modern GPU reprojection can gain knowledge via z-buffers and between-frame input reads, and “fake it” perceptually flawlessly, unlike 1993’s artifacty MPEG-1. Even the reprojection-based double-image artifacts disappear too!
TL;DR: Faking frames isn’t bad anymore if you remove the “black box” factor and make it perceptually lagless and lossless relative to other methods of “faking frames”, like drawing triangles and textures.
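To make the reprojection bit less abstract, here's a minimal sketch (numpy, made-up camera setup, not Blur Busters' or any vendor's actual algorithm): if you know the world-space position behind a pixel, which is recoverable from the z-buffer, a newer camera pose can re-project that pixel without re-rendering the scene.

```python
import numpy as np

def perspective(fov_y_deg, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0], m[1, 1] = f / aspect, f
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2.0 * far * near / (near - far)
    m[3, 2] = -1.0
    return m

def project(world_pos, cam_pos, proj):
    """Project a world-space point for a camera at cam_pos looking down -Z."""
    view = np.eye(4)
    view[:3, 3] = -np.asarray(cam_pos)           # view transform is just a translation here
    clip = proj @ view @ np.append(world_pos, 1.0)
    return clip[:2] / clip[3]                    # normalised device coords in [-1, 1]

proj = perspective(60.0, 16.0 / 9.0, 0.1, 100.0)
point = np.array([0.5, 0.0, -10.0])              # world position recovered for some pixel

old_ndc = project(point, cam_pos=(0.0, 0.0, 0.0), proj=proj)
new_ndc = project(point, cam_pos=(0.2, 0.0, 0.0), proj=proj)  # player strafed right a touch
print(old_ndc, new_ndc)   # the pixel slides slightly left in the new frame; nothing was re-rendered
```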

Regarding Lumen, isn't it ray tracing through and through, with the choice being either software or hardware ray tracing?


Lumen uses multiple ray-tracing methods to solve Global Illumination and Reflections. Screen Traces are done first, followed by a more reliable method.

Lumen uses Software Ray Tracing through Signed Distance Fields by default, but can achieve higher quality on supporting video cards when Hardware Ray Tracing is enabled.
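For anyone wondering what "Software Ray Tracing through Signed Distance Fields" means in principle, here's a toy sphere-tracing sketch (plain Python, my own scene, definitely not Epic's Lumen code): you march along the ray, each step jumping by the distance to the nearest surface that the SDF reports.

```python
import math

def sphere_sdf(p, centre=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance to a single sphere: negative inside, positive outside."""
    return math.dist(p, centre) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, hit_eps=1e-3, max_dist=100.0):
    """March along the ray, stepping by the SDF value (nothing can be closer than that)."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < hit_eps:
            return t                 # close enough to the surface: call it a hit
        t += d                       # safe step size straight from the distance field
        if t > max_dist:
            break
    return None                      # ray escaped the scene

print(sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sphere_sdf))   # ~4.0: front of the sphere
```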

Of course, when I refer to RT becoming mainstream and raster becoming a thing of the past, I am referring entirely to lighting, shadows, reflections, GI etc.

But the point at the minute is that even in a game that was hyped for its RT visuals, even as an enthusiast I didn't really notice a difference - trying to sell that to "normal" non-enthusiasts is a non-starter.

At the minute the main push for RT seems to be from developers, because it makes their lives easier.

People rate Spider-Man 2 and Avatar as two of the best-looking games we have to date. Obviously it's not all down to RT, but it does play a part in achieving the graphical fidelity seen in those two games, which is what I mean: there will come a point where you won't even know whether it's RT or not, you will simply say "the game looks fantastic".

I think this is shown very well by AW 2, CP 2077, Spider-Man 2 and Avatar. The visuals of the first two games are always praised, and RT/PT talk is at the forefront of why they look so good, whereas with Spider-Man 2 and Avatar you don't really hear the great visuals being put down specifically to ray tracing. Obviously Nvidia are heavily invested in the marketing for the first two titles, which again is a genius move by Nvidia, making it seem like RT is very much an Nvidia thing.
 
Well, Lumen is a bit of a kludge. You can use an Nvidia fork of Unreal Engine 5 to use their hardware, but it's a bit buggy and not supported by Epic. Software Lumen is a screen-space renderer plus Signed Distance Fields. Basically it ray traces what's on the screen, not real life. Seeing as we play games on a screen, that has its pros and cons.
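As a rough illustration of that "traces what's on the screen" limitation, here's a deliberately simplified 1D screen-space march (my own toy code, not Lumen's implementation): the ray can only hit depths that were rasterised into the depth buffer, so anything off-screen or occluded simply doesn't exist as far as it's concerned.

```python
def screen_space_trace(depth_buffer, start_x, start_depth, step_x, step_depth, max_steps=64):
    """March a ray across screen columns; report the first column where the ray
    passes behind the rasterised depth, i.e. hits something that is on screen."""
    x, depth = float(start_x), float(start_depth)
    for _ in range(max_steps):
        x += step_x
        depth += step_depth
        col = int(round(x))
        if not 0 <= col < len(depth_buffer):
            return None                    # ray left the screen: no information at all
        if depth >= depth_buffer[col]:
            return col                     # ray went behind a visible surface: a hit
    return None

# A tiny 8-column depth buffer: open floor except a wall rasterised at column 5.
depths = [10.0, 10.0, 10.0, 10.0, 10.0, 2.0, 10.0, 10.0]
print(screen_space_trace(depths, start_x=0, start_depth=1.0, step_x=1.0, step_depth=0.5))  # 5
```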
 
I don't disagree, but we don't just have Nvidia to thank for this; AMD also haven't made it a priority to push their RT to compete with Nvidia. As usual, if we had competition, things would be in a better state. Nvidia are so far ahead of AMD in the RT and feature-set department that they can price substantially higher, and AMD unfortunately think they can do the same but come in maybe 10-15% cheaper for a lesser package overall. Thankfully Intel are doing pretty well with their RT performance and solutions, which I hope will make AMD take note so as to avoid being overtaken, but alas, AMD's focus is the console market (which is probably safer/better from their POV).

Again, this is where Nvidia have done incredibly well by making people think RT is an Nvidia thing....

RT was not invented by Nvidia, but they are the ones who have pushed it hard in the last few years as a selling point. The issue is that this is at odds with the poor midrange dGPUs they have released. I have an RTX3060TI, which is much faster than the RTX3060. Is it better than my mate's RX6700XT at RT? Sure. Is it a great experience in many instances? Nope, unless I make compromises elsewhere in the game. Also, the Nvidia of old didn't care what ATI/AMD did; the 8800GT was an example of that, with the best card from ATI being the HD3870! ATI couldn't even beat the 4th card in the Nvidia line-up!

Daniel Owen summarises it well (and even HUB does, if you watched the video in the OP). RT performance broadly sucks on cheaper dGPUs because they are too weak. Lots of people are using cheaper and weaker dGPUs (just look on Steam), which is why over half of the 54,000 people in the poll didn't care about RT. Of the remaining 48%, only 15% would turn it up to high, etc. The rest will either switch it off or run it at low. This replicates other polls too.

It becomes more of a selling point when you are buying £1000 dGPUs. Imagine if the RTX4070 had been a sub-£400 RTX4060TI and the RTX4070TI a sub-£500 RTX4070. That would have been a huge jump in rasterised and RT performance.

For developers on PC to really push RT, they need the mainstream dGPUs to be good at it. Between Nvidia not caring unless you want to spend over £600 on a card, and AMD just plodding along with RT in their own desktop cards, it really means it will be down to consoles. So if Sony/MS really push heavier RT with the mid-life console updates and the next consoles, that will force Nvidia, and AMD by extension, to stop plodding along with mainstream cards.

It's no coincidence that the initial RTX3000/RX6000 pricing was decent - new consoles had just launched.

I think Nvidia is deliberately holding back performance on the 60 class, and has been for a few generations now, as people who buy these generally want to keep their cards for longer and that doesn't match Nvidia's ideals. This gen we've now seen this creep into the 70 class as well.

They are copying Apple and didn't someone on here say JHH wanted a normal dGPU to sell for the price of a whole console?
 
Please don't take this as gospel, and it may never be released, but I've heard through the developer grapevine of AMD cards being sent out with a Xilinx FPGA on them that helps in certain scenarios. What that is, I don't know, but ???
 

I don't disagree on the first point, as I did state this in my OP:

Personally, my take is that there is nothing wrong with the RT tech itself, but as they somewhat touched upon, the issue is the price point needed to get a decent experience. You basically need a 7900XT or 3080+ to really enjoy and appreciate it, and going forward, as games start to use heavier RT, you ideally need a 4070 Ti Super/4080+. So sadly, lots of people just won't use it, which is a valid viewpoint of course. For me, as long as I can maintain ideally 70/80 fps, I'm good, although I do ideally want 100+ fps, which is somewhat possible tbf with upscaling and frame gen, even on the 3+ year old 3080, with the exception of AW 2 and CP 2077 PT.

As shown time and time again though, the problem lies just as much, if not more, with the developers not implementing RT well. Even without RT, lots of games run **** and, in some cases, even worse than games with RT turned on.....

If 4A (Metro Enhanced), Massive and Insomniac can get great results from RT, then why can't other devs.... (I know why, but I'm phrasing it this way to highlight where the real issue resides.)
 
Ray tracing will be good, provided that developers use it correctly and the hardware required to run it well isn't prohibitively expensive. At this stage, neither of those things is true. If it's an option to turn on, I will turn it on, but when it impacts performance as much as it currently does, it's the first thing I turn off.

I said a couple of years ago that for me, it's a few generations away, and I still think that it is the case. Diminishing returns on GPUs are also extending that time frame.

This is basically it for me.
Pity I can't find it, but there was someone, somewhere (yes, not useful, hence why I can't find it now) who posted shots of a conversion, and it clearly showed the problem with the old adage "RT is easier than baked-in raster lighting": added afterwards, the light was totally wrong. Far too dark (as usual), and what it really takes is a game designed from the ground up to be RT only.
Ignore the "can't find it" bit, it was this HL2 conversion:
qXIFGnR.jpg

And it shows exactly what I mean: the RT version is totally wrong. The lighting there is more realistic, but there simply isn't enough of it, as the game was never designed for it.
In a poor job like this, even if some global gamma-like setting could increase the lighting, it still wouldn't show what the first shot does.
That, and maybe the RT scene really needs full PT, as the light is far too unrealistically "straight" with far too strong shadows. Real light doesn't behave like that and mostly ends up looking far more like the RT-off version.

So for this survey, well with this game in mind I would answer a mix of the last three options:
yvfxk5X.png

as all apply to some extent.
 
Please don't take this as gospel and it may be not be released but i've heard through developer grapevines of AMD cards being sent out with a Xilinx FPGA on it that aids is certain scenarios. What that is I don't know but ???

Stuff like the above, but it's essentially just for machine learning or AI-accelerated tasks like automotive applications.
 
Please don't take this as gospel and it may be not be released but i've heard through developer grapevines of AMD cards being sent out with a Xilinx FPGA on it that aids is certain scenarios. What that is I don't know but ???

Well, they are already using some of their IP AFAIK for the NPUs in their APUs. Maybe AMD is going to target RT a different way, but whichever way it happens, it will be because Sony/MS have pushed for it, because they essentially fund a lot of AMD's GPU development.


Even then, those same decent devs could do more with more performance on the table. Also, the issue is that many PC games are just ported over from consoles without any real care. It's why I don't like early access games: it makes sense for smaller indie developers, not these massive companies who want to palm off half-finished games. The best thing is for gamers not to buy such games until they get updated, or, if you can't wait, to buy when they are way below RRP.

But the issue is that something like an RTX4060/RTX4060TI is more like an overclocked RTX3060/RTX3060TI with FG thrown in. Imagine if they had been sold as a £200 RTX4050 and a £250 RTX4050TI? Instead we only get a slight bump in rasterised and RT performance. So that means that for 4 years, maybe even a bit longer, the sub-£400 segment has essentially the same performance.
My only upgrade path involves spending £150-£200 more and getting almost the same price/performance after three years. It's the same issue facing a lot of mainstream gamers I know - we just feel ripped off. At least with Turing, Nvidia/AMD actually changed course and it was salvaged a bit. This generation is worse than Turing for mainstream gamers.
 
See, this is where I think Lumen would do better. It approaches it from a viewer/screen perspective, not real life. Nothing that can't be fixed, though, with maybe a soft light at the top and taking a bit of the shininess off the wood.
 

Not sure I would say it is "wrong"; it's night time in a dark room with very little light.... also, you do have a flashlight to use. BTW, you can create invisible light sources which would lighten up areas, but then it would just look weird/out of place, like all those rooms which have a strange illuminating glow, e.g. Metro (4A) Enhanced, where they have an invisible light bulb/light source in the centre to create the lighting:

S1qViYF.png


What do you mean here: "full PT as the light is far too unrealistically 'straight' with far too strong shadows. Real light doesn't behave like that and mostly ends up looking far more like the RT-off version"? That's how shadows work when you have light sources pointing at objects; the shadow strength and softness vary depending on the distance from the light and the strength of the light.
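For what it's worth, the distance dependence can be put on the back of an envelope (generic similar-triangles geometry with made-up numbers, not any particular game's shadow code): the bigger the light and the further the occluder sits from the surface catching the shadow, the wider the soft penumbra edge.

```python
def penumbra_width(light_diameter, occluder_dist, receiver_dist):
    """Width of the soft shadow edge cast by an occluder between an area light and a receiver.
    Both distances are measured from the light along the same axis (receiver_dist > occluder_dist)."""
    return light_diameter * (receiver_dist - occluder_dist) / occluder_dist

# An object right next to the wall casts a crisp shadow...
print(penumbra_width(0.3, occluder_dist=1.9, receiver_dist=2.0))   # ~0.016 m
# ...the same object far from the wall casts a much softer one.
print(penumbra_width(0.3, occluder_dist=0.5, receiver_dist=2.0))   # 0.9 m
```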

I do agree, though, that it requires some more thought in level design, which, again, ultimately leads to better game-world design and visuals with more realistic environments.

I have found the opposite for most games with regard to darkness/lighting; in most scenes RT has actually made things brighter/lighter due to lighting correctly bouncing about, e.g. Metro EE:

3p3NAMa.png


dS86mBo.png
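A rough way to see why a bounce brightens spots that get no direct light at all (a textbook one-bounce estimate with made-up numbers, not Metro's or any engine's GI code): a lit wall patch re-emits some of the light it receives, and that re-emitted light reaches points that direct lighting alone would leave pitch black.

```python
import math

def indirect_from_patch(direct_on_patch, albedo, patch_area, dist, cos_at_patch, cos_at_point):
    """One-bounce irradiance at a point from a small diffuse patch
    (differential form-factor approximation for a Lambertian surface)."""
    radiosity = albedo * direct_on_patch            # light the patch re-emits (W/m^2)
    return radiosity * patch_area * cos_at_patch * cos_at_point / (math.pi * dist**2)

# Direct lighting only: a point in shadow stays pitch black.
direct_at_point = 0.0
# Add one bounce from a 1 m^2 light-coloured wall patch 2 m away receiving 500 W/m^2:
bounce = indirect_from_patch(500.0, albedo=0.7, patch_area=1.0,
                             dist=2.0, cos_at_patch=0.8, cos_at_point=0.8)
print(direct_at_point, direct_at_point + bounce)    # 0.0 vs ~17.8 W/m^2
```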


Also, an example from The Finals where it's actually beneficial to have it turned on for player visibility:

JXBX87Kh.png


6pg7fGLh.png


Y5GcCpNh.png


5w8ELKdh.png


Someone else's example of raster-only methods; imagine how much lighter/easier it would be to see this room if RT GI were used:

m6CcVxrh.png


Of course, if you're on a street corner at night with no light sources, then it will be pitch black, as you would expect, but there are ways around this even with RT.
 
I guess mainly that the light effect looks too clinical, and that having at least some extra path-traced bouncing would diffuse it somewhat.

I know, realism vs enjoyment... and I've complained about anything which feels like watching Aliens or The X-Files (i.e. almost all black, where the camera people really get to show off their cinematography skills), as they can be far too dark.

But if I actually was somewhere properly dark in real life, and had let my eyes adjust, then generally light is not straight; it bounces around and gets reflected even at night (disclaimer: I live in a town, so light pollution is always there).

And I feel that the HL2 scene (and other RT I have seen) is like my eyes haven't adjusted to the dark yet. Being a computer-generated scene, they never will, as there is no additional detail to see.

Unsure what the answer is. A webcam tracking how open your pupils are isn't realistic outside VR!
 

Hard to tell from the screenshot tbh, but it does look like there's a spotlight in the left corner, which is a pretty strong light source, and the outside light I imagine is coming from a full moon and/or lighting outside the building. You kind of need the game to be out to get a true sense of it and see the environment for yourself, though.

I do get what you mean in terms of enjoyment and so on. E.g. I love survival games, and generally, most of the time at night it's next to useless going out to explore and do stuff due to it literally being pitch black. But this is where, in the game, you can use mechanics like crafting torches, flashlights, flares and so on, and in the case of Palworld, use a Pal to light your surrounding area. It still isn't quite as enjoyable as exploring during the day, but it adds a different experience/way to play. E.g. in Palworld here, if it weren't for that fire, it would literally just be a black screen with only the night sky visible.

mRSB5qR.png


It's also a bit like DL 2, which I'm replaying now due to the RTX HDR mod: at the start, when you go into the hospital, it's pitch black and impossible to navigate without using your torch. Needless to say, it's maybe not enjoyable but utterly terrifying, and it forces you to turn on your flashlight, which kind of makes it even more terrifying.... I get the impression this is what the devs intended for areas like this.
 
To me RT is pretty pointless to use at the moment; graphics cards really don't have the grunt to run it. It's the future, but the key word there is future; it's still miles away from being worth using in my opinion.

The biggest advantage of PC gaming for me is the fluidity of higher fps. RT is counterproductive and makes the experience much worse if you're a fan of actual gameplay/feel over just staring at pretty images.

I have flipped a few games between RT on and off, and to me it doesn't make anywhere near enough of a visual difference to give up that extra fps. If you stare at it, yes, you can tell a difference, but if you are actually just playing the game it's barely noticeable; it's just not something within a game that you are focusing on. Unless of course the game is crap and it's the only thing worth paying attention to.
 
Well, you should know what I mean when I refer to all industries when we're talking about RT.... as in, anything where you will have simulated effects on screen. Or do you think I'm implying that RT is applicable to the gardening, schooling etc. industries too?

I dared not assume, hence I asked.

Again, you're making RT out to be an Nvidia thing with comments like "NVIDIA's list of RT enabled games".... AMD have a pretty high number of sponsored RT games, and there are also plenty of titles with no involvement from either brand.

You linked NVIDIA's list, so you made it an NVIDIA thing... I am not sure why you're deflecting towards me things you wrote (including the one higher up)?

(...)
Do you want more dynamic environments with better destruction? Do you want better visuals on the whole? If so, then it matters to you and anyone else who fits into such categories. I personally want better games, but at the same time I also want to see better graphics. Isn't the reason we all buy new GPUs etc. partly to allow us to dial up settings, as well as to get better performance?

This topic is about the survey's results, which is what I am commenting on, and you're getting side-tracked. My point (again) is that currently RT is largely irrelevant for most gamers, as the survey in question (and a few others) revealed - for a few reasons, but mainly because it's not the right time for it yet, as GPUs that can run it well are too expensive. That's it. Hence the results we see. What I want is cheap GPUs for the masses that can run these things, so we can all enjoy the future - but that, I believe, is still in the far future.

If you really don't care about graphics as per this post:

Then why buy the best gaming display, GPU and so on? Surely you could just enjoy the games on lesser hardware?

I care about graphics; I care much more about gameplay. :) Also, if not for work, I'd not have pushed for a 4090 - it's a silly purchase for games alone (unless one really has loads of spare cash and nothing else to spend it on). I'd likely be fine with a 4070 Ti (the 4070S would be fine speed-wise, but the vRAM is meh), or the AMD equivalent, though I've started to like the recent HDR magic from NVIDIA and wouldn't want to lose that now. ;) They're all still too expensive, but it is what it is.

Yes I know, I created a thread for this:


Windows Auto HDR is pretty crap, since it raises blacks and doesn't use the correct gamma; the Windows one is also done on a whitelist basis and, in my experience, works in very few titles.

After proper calibration using Microsoft's Windows HDR Calibration app, I have not noticed any issues with Auto HDR on Win 11 and my OLED monitor. The NVIDIA one is still a bit better, plus it works in the handful of games in which Auto HDR refused to work.

It's fine to say it's not mainstream and all; I don't dispute this. I only dispute claims like "gimmick" and "raster isn't going anywhere" (of course it's not going to happen overnight, for the listed reasons), but it is only a matter of time, and I think people may be in for a surprise come the next-gen consoles and especially AMD's next-gen GPUs on the RT front. Time will tell.

As a few people have also written a few times in this thread, raster truly isn't going anywhere anytime soon. RT/PT/Lumen (and the like) will sit on top of it, as it does in UE5, but underneath, raster will still push the whole geometry etc. One doesn't exclude the other, as is the case now. Also, I have already defined what I consider a gimmick - again, badly implemented RT effects bolted on top of raster just to tick a box, which is sadly still the majority of games IMHO.
 