
Poll: Ray Tracing - Do we care?

Ray Tracing - Do you care?


Poll closed. 183 voters.
By the time ray-tracing is widely used in games, these 'new' cards will be dog slow at it and barely worth using (outside of a tech demo or two). That's how it's always gone. Better to skip this generation if 7nm cards aren't too far off, if you want the ray-tracing tech.

This, and what people on forums need to understand is that there are a few issues here:
1.) Many games are still made for consoles first, so if these don't support ray tracing in any way, it will be a tacked-on feature to sell high-end cards.
2.) There needs to be proper support in mass-market cards, i.e. the 60-series ones and so on, and if they cannot run the effects reasonably well to any degree, a lot of the market won't bother.
3.) There are loads of legacy cards which might have to run the effects in software, which will lead to worse performance reductions, and again limit what can be run by larger sections of the market.

Also, what annoys me with both Nvidia and AMD is that they tend to overuse the effects on specific objects, to the extent it can look weird, so they can sell it, instead of using the same level in a more even way over more objects. It's what annoyed me about PhysX, hair physics, etc., and it could break immersion. You ended up with Geralt or Lara having massively animated hair, but most of the NPCs looked like they had permed hair.
 
AMD threw cash at developers to get DX12 implemented. I actually see a lot of developers adding RTX support freely because it can help differentiate their game. There is also the fact that game engines like Unreal and Unity have already added RTX support or are in the process of doing so.

Real-time ray tracing is what game developers have wanted for the last 30 years. A low-level graphics API was completely undesirable for the vast majority, who see DX12 as a step backwards.

Rubbish. Developers have been calling DirectX rubbish for over a decade. DX11 + GameWorks is backward tech.
 
Don't care. Yes, it looks nice, but there are far too many problems in the gaming industry at the moment (like a lack of good games, problems at EA, etc.) for me to care. If it were a leap to fully ray-traced scenes and games, I might care. But it's not, and it is a proprietary thing again forcing you into a choice (which tbh isn't difficult with AMD being so poo), but you will still be in the minority when compared to console gamers, who the games will be primarily made for.

This means a ton of investment from Nvidia who have shied away from doing so over the years.

There have been many techs come and go and fall by the wayside for the same reasons over the years, so I am not paying out a massive chunk of money for a few tech demos and a couple of lamps rendered with RT. I will wait a few years yet and see what happens.

Until then I will be sticking to my TXP and XB1X.
 
Only way I will care is if:

- it doesn't kill performance like previous Nvidia GameWorks effects
- it's used in more than just a handful of games
- it actually makes games look better; I thought a lot of Nvidia's effects looked crap and massively overdone, e.g. their smoke
 
AMD threw cash at developers to get DX12 implemented. I actually see a lot of developers adding RTX support freely because it can help differentiate their game. There is also the fact that game engines like Unreal and Unity have already added RTX support or are in the process of doing so.

Doesn't seem like the devs have a problem when it's consoles with low level APIs.

Real-time ray tracing is what game developers have wanted for the last 30 years. A low-level graphics API was completely undesirable for the vast majority, who see DX12 as a step backwards.

Lol! Yes, devs have loved having everything go into a DirectX black box and not knowing why things fail. Devs aren't getting ray tracing, they are getting hybrid ray tracing, where they get to use a bit of ray tracing for spot effects that tank framerates on all but the highest-end hardware, but still have to use all the shortcuts and tricks they've been developing for the last 30 years for everything else.
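To make the "hybrid" idea concrete, here's a toy, self-contained sketch - nothing to do with DXR or any real API, every name is made up. Most surfaces get a cheap raster-style flat shade, and only surfaces flagged as reflective get one traced reflection bounce:

```python
import math

# Toy scene: spheres as (center, radius, base_brightness, reflective?).
SCENE = [
    ((0.0, 0.0, -3.0), 1.0, 0.8, False),   # "diffuse" sphere
    ((2.0, 0.0, -3.0), 1.0, 0.1, True),    # mirror-ish sphere
]

def normalise(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def ray_sphere(origin, direction, center, radius):
    """Distance t to the nearest hit, or None. Direction must be unit length."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c                   # |direction| == 1, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0         # toy: nearest root only
    return t if t > 1e-4 else None           # small epsilon avoids self-hits

def nearest_hit(origin, direction):
    best = None
    for sphere in SCENE:
        t = ray_sphere(origin, direction, sphere[0], sphere[1])
        if t is not None and (best is None or t < best[0]):
            best = (t, sphere)
    return best

def shade(origin, direction, depth=0):
    hit = nearest_hit(origin, direction)
    if hit is None:
        return 0.3                           # background brightness
    t, (center, radius, brightness, reflective) = hit
    if not reflective or depth >= 1:
        return brightness                    # cheap "raster-style" shading
    # The hybrid part: one traced reflection bounce, only for flagged surfaces.
    p = tuple(o + t * d for o, d in zip(origin, direction))
    n = normalise(tuple(a - b for a, b in zip(p, center)))
    d_dot_n = sum(d * x for d, x in zip(direction, n))
    refl = tuple(d - 2.0 * d_dot_n * x for d, x in zip(direction, n))
    return 0.5 * brightness + 0.5 * shade(p, refl, depth + 1)
```

The spot-effect trade-off is visible in the cost structure: the traced bounce only runs for pixels that hit a flagged surface, while everything else stays on the cheap path.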
 
Doesn't seem like the devs have a problem when it's consoles with low level APIs.



Lol! Yes, devs have loved having everything go into a DirectX black box and not knowing why things fail. Devs aren't getting ray tracing, they are getting hybrid ray tracing, where they get to use a bit of ray tracing for spot effects that tank framerates on all but the highest-end hardware, but still have to use all the shortcuts and tricks they've been developing for the last 30 years for everything else.

Most developers I've ever had anything to do with aren't fans of working at a low level all the time - some exceptions aside, mostly people who are hardcore engine developers.

Most wanted more ability to see beneath the hood when necessary, and the ability to drop in selectively when specific areas could do with hand optimisation - not to be forced into the kind of approach taken by DX12, which is a complete misunderstanding of what developers were making noises about (not unusual with MS these days, unfortunately).
 
Most developers I've ever had anything to do with aren't fans of working at a low level all the time - some exceptions aside, mostly people who are hardcore engine developers.

Most wanted more ability to see beneath the hood when necessary, and the ability to drop in selectively when specific areas could do with hand optimisation - not to be forced into the kind of approach taken by DX12, which is a complete misunderstanding of what developers were making noises about (not unusual with MS these days, unfortunately).
Well, that was the point: the big engine developers were supposed to integrate these lower-level APIs properly, and then the developers would use those engines for their games with most of the hard work already done. Instead, most development studios used DX12 as a marketing point, and it was just a wrapper around DX11, which is always going to be less efficient.

I do like the excellent work done by Id Software with Vulkan in Doom and Wolfenstein 2 though.
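The "engine integrates the low-level API once, games never see it" idea can be sketched like this - purely illustrative Python with every name hypothetical. Game code calls one interface; only the backend knows whether draws go straight to the driver (DX11-style) or are recorded into command lists and submitted explicitly (DX12/Vulkan-style):

```python
from abc import ABC, abstractmethod

class Backend(ABC):
    """Hypothetical rendering backend interface an engine might expose."""
    @abstractmethod
    def draw(self, mesh): ...
    @abstractmethod
    def present(self): ...

class ImmediateBackend(Backend):
    """DX11-like: every draw call goes straight to the 'driver'."""
    def __init__(self):
        self.submitted = []
    def draw(self, mesh):
        self.submitted.append(mesh)      # driver handles scheduling for you
    def present(self):
        pass                             # nothing left to do

class CommandListBackend(Backend):
    """DX12/Vulkan-like: record now, submit explicitly in one batch."""
    def __init__(self):
        self.recorded = []
        self.submitted = []
    def draw(self, mesh):
        self.recorded.append(mesh)       # just recording; nothing runs yet
    def present(self):
        self.submitted.extend(self.recorded)   # one explicit submission
        self.recorded.clear()

def render_frame(backend, meshes):
    # Game-side code is identical either way - the engine did the hard part once.
    for m in meshes:
        backend.draw(m)
    backend.present()
```

When a studio instead bolts DX12 on late, the equivalent of `CommandListBackend` ends up faking the old immediate model on top of the new API, which is the "wrapper around DX11" situation described above.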
 
then the developers would use those engines for their games with most of the hard work already done.

That is part of the problem: it doesn't really work like that with DX12, which is a big part of why, where it is put into use, it is more of a wrapper.

If you look up many standard features in the documentation for DX12, instead of a documented function call or some example code you can customise, as was usual for DX11 and earlier, you often find a run-down of the concept and theory of implementing said feature, along with flowcharts, and then basically "off you go, then". What developers were actually wanting is something more in between.
 
That is part of the problem: it doesn't really work like that with DX12, which is a big part of why, where it is put into use, it is more of a wrapper.

If you look up many standard features in the documentation for DX12, instead of a documented function call or some example code you can customise, as was usual for DX11 and earlier, you often find a run-down of the concept and theory of implementing said feature, along with flowcharts, and then basically "off you go, then". What developers were actually wanting is something more in between.
Well, that is how it was sold to the public - that it was something in between - which is why they were careful to use the term 'lower level' rather than 'low level'. And my point about game engine developers was made to say why it's workable now, in terms of long-term and comprehensive adoption, whereas it wasn't 20 years ago, as everyone made their own engine.

It's strange that DICE were one of the most vocal in wanting this and yet have one of the worst DX12 implementations. Their Mantle implementation was alright, though. So is it a problem that's inherent in DX12?
 
It's strange that DICE were one of the most vocal in wanting this and yet have one of the worst DX12 implementations. Their Mantle implementation was alright, though. So is it a problem that's inherent in DX12?

A lot of the Mantle work was done hands-on by 1-2 people from AMD, I believe, so it doesn't surprise me their DX12 renderer is poor. The same passion and pride that goes into the visual work on their games does not extend to the nuts-and-bolts programming, which is mostly slapped in shoddily, a half-arsed effort at best: get it mostly sort of working, move onto the next bit, and never revisit.
 
How is this going to help PC gaming differentiate itself from consoles?
It's embarrassing as it is that consoles can produce fidelity and IQ in gaming roughly on par with high-end PC setups with about 1/3 the powah. And yet no one complains about the huge amount of input lag/latency that's inherent in consoles as a result. Adding RT into console games will only enhance them and attract more people towards consoles, because the startup cost to get the same on PC will be more than double the price of a console... including the games.

The only way RT is going to take off is if it's refined for console use as well.
 
No. It's just marketing nonsense at this point. Another PhysX if you will. It'll probably tank your frame rate too when enabled in-game.
It is not just marketing nonsense, as it's great for people who use RT casually like myself, or for developers who use RT to develop games, which a lot do. While it can tank FPS, if it's used correctly with a hybrid method it can boost FPS while making things look better.
 
The only way RT is going to take off is if it's refined for console use as well.

This. So much this. Thing is, even if it does happen to consoles (and it probably will eventually) then it will likely be using another method, meaning game devs will still have to sit down and spend a long time on the PC version. Anyone doubting this? Look at what you have had since Crysis, over ten years ago. I am talking about exclusive games that are so hardware-dependent that they cannot run on consoles. And that would be... yup, none.

Since buying an Xbox One X I have discovered why PC gaming is so stale. I also discovered just how nice a game on it looks from 17ft away (55" 4K TV). Now sure, if I take a screenshot and then sit and scrutinise it on the PC it looks nowhere near as good, but at 17ft away you would need to be able to spot a needle in a haystack to tell the difference.

Remember kids, if it can't run on a console you ain't getting it at all.
 
Seems there will be 4 games with ray tracing in the near future:
Battlefield 5, Metro, Shadow of the Tomb Raider and Control from Remedy.

Should give us a good picture of how it can look in games and whether it's worth it.

There will be more than 4 games with ray tracing:

Assetto Corsa Competizione release date 12 Sept 2018
Shadow of the Tomb Raider release date 14 Sept 2018
Battlefield 5 release in Oct 2018
Metro: Exodus release in Feb 2019
Control release in 2019 and many more games

I hope Nvidia will bundle 3 free games Assetto Corsa Competizione, Shadow of the Tomb Raider and Battlefield 5 with RTX 2070, RTX 2080 and RTX 2080 Ti in response to AMD Raise The Game Promotion with 3 free games Assassin’s Creed: Odyssey, Strange Brigade and Star Control: Origins bundled with RX 570, RX 580, Vega 56 and Vega 64 cards.

Old, current and future games that could possibly get ray tracing through a patch, hack or mods:

Final Fantasy XV
Elder Scrolls V: Skyrim SE
Elder Scrolls VI
GTA IV
GTA V
GTA VI

Ray tracing in Metro: Exodus looked really amazing, and it is much better than rasterization.


Ray Tracing - Do we care? Yes I do!
 
@AthlonXP1800
Do you have any links showing these games as having rt in them?
And did you read the comments in that video LOL

This. So much this. Thing is, even if it does happen to consoles (and it probably will eventually) then it will likely be using another method.
You mean that other method used for PhysX on consoles. I think it was SSE (might have been something else though) instead of x87 code.

The same hypocrisy found in the use of PhysX between PC (CPU) and Xbox 360/PS3/Wii will no doubt repeat itself with ray tracing on consoles at some future point.
 
Doesn't seem like the devs have a problem when it's consoles with low level APIs.
Because consoles have fixed hardware, making the required optimization much easier. Moreover, most devs end up wrapping the low-level API in their own high-level interface, or using even higher-level game engines.

Lol! Yes devs have loved having everything go into a DirectX black box and not know why things fail.
Having a black box is fine; this is desirable. Finding out why things fail is the job of the debugging tools, which are very extensive for DX11.
Devs aren't getting ray-tracing, they are getting hybrid ray tracing, where they get to use a bit of ray tracing for spot effects that tank framerates on all but the highest hardware, but still have to use all the shortcuts and tricks they've been developing for the last 30 years for everything else.


There is nothing wrong with hybrid rasterization and ray tracing. It means they can skip huge amounts of the tricks used in the last 30 years and use RT where it makes sense: illumination, reflection, refraction, caustics. This is exactly what developers want.
 
Not particularly interested in it currently. Since performance is a (conveniently) unknown quantity, it could very well be a checkbox feature that will need a few years, and obviously newer generations of cards, to get decent performance out of the odd game that has it.
 