
Raytracing - Would you buy in to it now?

Associate
Joined
2 Oct 2020
Posts
120
Nope. Going to take years for this to mature and become mainstream/worthwhile. Right now it's more like a gimmick, and the performance cost is huge.

Maturity doesn't just mean hardware; it means effective use of RT in software too. Not just "maximum shiny" :p
Have you seen the Cyberpunk RT video by Digital Foundry? If not, it might be worth a watch: https://www.youtube.com/watch?v=6bqA8F6B6NQ

I guess it depends on what you're playing - if it's competitive shooters then yes, RT is not worth the hit; if it's an RPG that's all about building a believable world, then it's worth it IMO.
 
Caporegime
Joined
17 Feb 2006
Posts
28,680
Location
Cornwall
My LG BX isn't very bright in the grand scheme of things but seems to pull off HDR quite well.

The problem I find is that once I've played a game on the OLED, I don't want to play it on the PC monitor, because everything looks garbage on the monitor by comparison, even when sitting up close to the OLED.

It was the same even with my old LED TV (Samsung NU7400).

My Dell S2721DGF seems better with HDR on than off in games that seem to properly support it like Cyberpunk and FS2020.
Samsung just released a new micro-LED TV which it hopes will rival OLED in future.

The only problem is it costs $150,000 :p :p
 
Soldato
Joined
3 Jan 2006
Posts
24,219
Location
Chadderton, Oldham
Samsung just released a new micro-LED TV which it hopes will rival OLED in future.

The only problem is it costs $150,000 :p :p

I wouldn't say rival; it outright beats OLED: no risk of screen burn, and it's much brighter than any OLED. But the price won't be within reach of many of us for maybe four years or so, at a guess?
 
Caporegime
Joined
17 Feb 2006
Posts
28,680
Location
Cornwall
I wouldn't say rival; it outright beats OLED: no risk of screen burn, and it's much brighter than any OLED. But the price won't be within reach of many of us for maybe four years or so, at a guess?
Probably decades. Apparently the "micro" LEDs Samsung use in their concept uLED TVs are still several mm in size, hence the products have to be massively oversized compared to regular TVs and monitors. That explains why they've gone for stupendously large "display wall" concepts first.

They'd need to be shrunk a lot smaller to work as a TV or monitor.

It doesn't sound like we're very close to that at all. Boo.
 
Soldato
Joined
5 Nov 2010
Posts
22,380
Why are people comparing RT and HDR..?
They're completely different things and you can have both...

All these "I would rather HDR than RT" posts just look stupid :confused:

I suppose it comes down to the cost associated.

For instance, I don't have a monitor capable of HDR, nor a graphics card capable of RT. If I were looking to upgrade my GPU, would I spend a big chunk of money on a graphics card capable of running RT well, or consider spending less and focus on getting a half-decent HDR-capable monitor (i.e. not your VESA DisplayHDR 400 rubbish)?
 
Soldato
Joined
13 Aug 2012
Posts
4,277
Why are people comparing RT and HDR..?
They're completely different things and you can have both...

All these "I would rather HDR than RT" posts just look stupid :confused:

I agree, and would also like to add that HDR is much better with ray tracing maxed out.

I don't think people have seen proper HDR in PC games until they've experienced it with ray tracing.

The life it brings to the game makes all other games look decades old and flat.
 
Soldato
Joined
13 Aug 2012
Posts
4,277
Samsung just released a new micro-LED TV which hopes to rival OLED in future.

The only problem is it costs $150,000 :p :p

The full-array Samsungs already rival OLEDs, and are generally cheaper when going for the big ones; QLEDs don't age over time, so they'll still look just as good years later.

Both have their benefits, and I know most on here prefer OLEDs, but HDTVTest summed it up by saying QLED is better for bright rooms and OLED better for dark rooms.

I would buy an LG OLED next time, but that's nothing to do with the TV or the tech; it's just Samsung being a bad company for updates and not fixing stuff, whereas LG have been great this last year with communication and updates.
 
Soldato
Joined
26 May 2014
Posts
2,692
The full-array Samsungs already rival OLEDs
They don't even come close for anything except extremely bright content in a room situated on the surface of the sun. The haloing and blooming artifacts in darker scenes are far, far worse than any of OLED's downsides. Vincent is a huge OLED fanboy too, so I'm not sure I'd quote him as suggesting it's 50/50. The OLEDs wiped the floor with the Q90T in his awards this year.
 
Soldato
Joined
18 Oct 2002
Posts
3,996
RT in Cyberpunk is something that has blown me away. Look beyond the bugs of the game, and anyone who has run it with ray tracing on has to be impressed. Static screens don't do it justice, in truth, but playing in UW (3440x1440) had my jaw on my keyboard (normally one of my many chins), and turning RT off and back on brings a smile back to my face.

Now that the gentlemen's relish is wiped off the KB: prior to CP2077, Control and Metro Exodus were the showcases for me, and I could see back then what a difference it could make. Soooooooo, would you buy a GPU for ray tracing now? I understand AMD don't have it in CP2077 as of yet, but they will, so would that sway your decision?

Once GPUs from any manufacturer can do 100fps with RT on at native 1440p, I will jump in. Probably two years away.

I don't personally rate any of those three games you have quoted, so it's not an immediate problem for me. We need something like Grand Theft Auto 6, Skyrim 2 or, dare I say it, 'Division 3' with RT to appear to make me want it.

The Cyberpunk launch looks like a total disaster zone; it reminds me of Battlefield 4. It's gonna be six months before that pile of rust starts to look like the game it's supposed to be.
 
Soldato
Joined
13 Aug 2012
Posts
4,277
Agreed. Inherently different techs and, funds permitting, you can have both.
They do complement each other though. When using ray tracing, the HDR highlights get brighter and darker in real time as you move about, depending on what lights are around.

As shown in the latest DF video, without ray tracing an extra street light doesn't increase brightness at all, so the HDR peaks all sit at the same level, giving a flat experience. But with ray tracing the peak brightness is going up and down all the time.

I first saw it in the COD campaign, and realised how well they make each other better (RT and HDR).
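The street-light effect described above can be sketched as a toy calculation (purely illustrative — not real renderer code; the function name, nit values and inverse-square falloff are all made up for the example). With baked lighting the output level is fixed at authoring time, so the HDR peak stays flat; with ray-traced lighting each light's contribution is evaluated per frame, so adding a light raises the peak the display receives:

```python
# Toy model: why HDR peak brightness varies with ray-traced lighting.

def shaded_luminance(base_nits, lights, ray_traced):
    """Return a toy pixel luminance in nits.

    base_nits  -- luminance baked into the scene at authoring time
    lights     -- list of (intensity_nits, distance) tuples for dynamic lights
    ray_traced -- if True, dynamic lights are evaluated per frame
    """
    if not ray_traced:
        # Baked/rasterised path: extra lights don't raise the level,
        # so the HDR peak stays flat regardless of what's in the scene.
        return base_nits
    # RT path: crude inverse-square falloff per light, summed each frame.
    return base_nits + sum(i / (d * d) for i, d in lights)

street_lights = [(400.0, 2.0)]
flat    = shaded_luminance(100.0, street_lights, ray_traced=False)
dynamic = shaded_luminance(100.0, street_lights, ray_traced=True)
extra   = shaded_luminance(100.0, street_lights + [(400.0, 2.0)], ray_traced=True)
print(flat, dynamic, extra)  # 100.0 200.0 300.0
```

Same scene, but only the ray-traced path reacts to the second street light — which is the "peaks going up and down" behaviour the DF video points at.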
 
Soldato
Joined
13 Aug 2012
Posts
4,277
They don't even come close for anything except extremely bright content in a room situated on the surface of the sun. The haloing and blooming artifacts in darker scenes are far, far worse than any of OLED's downsides. Vincent is a huge OLED fanboy too, so I'm not sure I'd quote him as suggesting it's 50/50. The OLEDs wiped the floor with the Q90T in his awards this year.

Yeah, they messed up on a lot of the new TVs; the Q90 is broken in game mode, and there's no excuse for that when game mode and its features are so good on the 2018 ones. Also, Samsung are now only putting the good panels in the 8K sets, to make them stand out more.

For me the halo thing isn't something I see, so it doesn't bother me. And yeah, my living room is super bright (three windows and double patio doors), plus I sometimes like to play the same game for 500 hours straight, leave my desktop on 24 hours a day, and play only HDR games, where the QLED is about 2,000 nits brighter on the highlights.

So for me QLED suits my needs better.

I was surprised he said that as well; I thought he was going to side with an OLED win in the OLED vs QLED debate.
 
Soldato
Joined
10 Oct 2012
Posts
4,088
I've tried ray tracing in Cyberpunk and it made no difference to me outside of tanking my fps, so no, I don't care about RT. I've tried RT in World of Warcraft and MechWarrior 5, and again it makes no sense to me. I'm not saying RT is useless; I just feel that other areas are much more important, like fluidity, and perhaps stuff like physics and AI, which haven't seen much (if any) improvement in recent years.
 
Associate
Joined
2 Oct 2020
Posts
120
I've tried ray tracing in Cyberpunk and it made no difference to me outside of tanking my fps, so no, I don't care about RT. I've tried RT in World of Warcraft and MechWarrior 5, and again it makes no sense to me. I'm not saying RT is useless; I just feel that other areas are much more important, like fluidity, and perhaps stuff like physics and AI, which haven't seen much (if any) improvement in recent years.
You could continue that train of thought and include AA, higher resolutions, textures, wireframes, graphics... and we end up back at MUDs. All those innovations add layer upon layer to the overall believability of the game world.
 
Caporegime
Joined
17 Feb 2006
Posts
28,680
Location
Cornwall
You could continue that train of thought and include AA, higher resolutions, textures, wireframes, graphics... and we end up back at MUDs. All those innovations add layer upon layer to the overall believability of the game world.
We've also had our fair share of dead ends, i.e. failed and abandoned tech. See 3D (in both games and TVs).

RT is unlikely to be abandoned, but I just don't see it being a huge priority for many gamers right now.

Honestly, who was talking about RT before nVidia needed a headline use for the dedicated cores in their new gaming products? Did anyone see a general clamouring for RT from gamers? Nope. It wasn't even discussed.

Then nV decided to build dedicated RT cores (alongside their Tensor/AI cores) and *bam*, pushed RT to the gamer population.

And now people are worrying about RT, and the hardware is still barely good enough for the minimum amount of RT without completely killing framerates.

This isn't pushed by gamers. This is pushed by nV trying to a) find a use for that dedicated silicon and b) find a way to inhibit AMD's performance.

I consider it to be a fairly cynical play by nV. I don't for one second think we're close to RT being essential in modern games.
 
Soldato
Joined
4 Feb 2006
Posts
3,066
We've also had our fair share of dead ends, i.e. failed and abandoned tech. See 3D (in both games and TVs).

RT is unlikely to be abandoned, but I just don't see it being a huge priority for many gamers right now.

Honestly, who was talking about RT before nVidia needed a headline use for the dedicated cores in their new gaming products? Did anyone see a general clamouring for RT from gamers? Nope. It wasn't even discussed.

Then nV decided to build dedicated RT cores (alongside their Tensor/AI cores) and *bam*, pushed RT to the gamer population.

And now people are worrying about RT, and the hardware is still barely good enough for the minimum amount of RT without completely killing framerates.

This isn't pushed by gamers. This is pushed by nV trying to a) find a use for that dedicated silicon and b) find a way to inhibit AMD's performance.

I consider it to be a fairly cynical play by nV. I don't for one second think we're close to RT being essential in modern games.

Performance takes a huge hit when using RT, so until they can figure out ways to reduce the impact, it will not be a feature most will be able to use, let alone afford. Those with 144Hz screens are certainly not going to sacrifice fps for visual effects which tank performance but give similar results to older techniques. If more console games use the effects then perhaps it will become popular, but for now it's not a priority. I certainly wouldn't be handing over £800 just for RT in a few games.
 
Soldato
Joined
10 Oct 2012
Posts
4,088
You could continue that train of thought and include AA, higher resolutions, textures, wireframes, graphics... and we end up back at MUDs. All those innovations add layer upon layer to the overall believability of the game world.
I think that's oversimplifying it, and perhaps turning what I wrote into something it's not meant to be. I think graphical fidelity has diminishing returns the higher you go, and there are other areas we should focus on instead; examples would be AI or physics. Then, when that stuff has been advanced more, we can focus on moving the goalposts again for graphical settings. Right now a lot of games are pretty much a Kardashian: all graphics, no depth.
 