Poll: Ray Tracing - Do we care?

Properly done ray tracing, not just overly shiny stuff for the sake of it, is immense. I've long wanted proper real-time capabilities relevant to game use. The little details added by caustics, bounced light, etc. really increase the overall fidelity when done right. It also allows for very intricate and accurate lighting without a massive additional performance penalty relative to the base cost of ray tracing a scene.

It also meshes nicely with things like physics if done properly. Currently it is very expensive and complicated to add full dynamic lighting, shadows, reflections, etc. to every physics object if you have a lot on screen; once you have viable real-time ray tracing, folding those objects into the full lighting model of the scene carries little to no additional penalty.
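To illustrate the point, here's a toy sketch (nothing like real engine code; every name and number in it is invented): once shadows come from rays tested against one shared scene structure, a newly spawned physics object just gets added to that structure and is shadowed correctly, with no extra per-object systems.

```cpp
// Toy sketch (all names invented) of why dynamic objects are cheap to light
// once a scene is ray traced: a shadow ray tests one shared object list, so a
// newly spawned physics object gets correct shadows with no extra systems.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec { double x, y, z; };
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec c; double r; };

// Does the segment from 'p' toward 'light' hit any sphere before the light?
static bool shadowed(Vec p, Vec light, const std::vector<Sphere>& scene) {
    Vec d = sub(light, p);
    double len = std::sqrt(dot(d, d));
    Vec dir = {d.x / len, d.y / len, d.z / len};
    for (const Sphere& s : scene) {
        Vec oc = sub(p, s.c);
        double b = dot(oc, dir);
        double disc = b * b - (dot(oc, oc) - s.r * s.r);
        if (disc < 0) continue;
        double t = -b - std::sqrt(disc);
        if (t > 1e-4 && t < len) return true; // occluder between p and light
    }
    return false;
}

int main() {
    std::vector<Sphere> scene;
    Vec light{0, 10, 0}, point{0, 0, 0};
    std::printf("shadowed before: %d\n", shadowed(point, light, scene));
    // "Physics" drops a crate between the point and the light. No shadow-map
    // pass, no special casing: the next shadow ray simply sees it.
    scene.push_back({{0, 5, 0}, 1.0});
    std::printf("shadowed after:  %d\n", shadowed(point, light, scene));
}
```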
 
I do like eye candy, and am looking forward to the visual fidelity ray tracing will bring. But I'm also incredibly interested in how it could be leveraged to improve gaming audio. It's an area that really needs to be advanced after years of stagnation. I know AMD tried with TrueAudio, but it never really took off.
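For anyone curious what I mean, here's a rough toy example of one classic trick that ray-style thinking enables for audio: the image-source method for early reflections. The room, positions and numbers here are all made up for illustration.

```cpp
// Hedged sketch of one way ray/path ideas map onto game audio: the classic
// image-source method. Mirroring the sound source across a wall gives the
// first echo's extra path length, hence its delay and rough attenuation.
// The room layout and values here are invented for the example.
#include <cmath>
#include <cstdio>

struct P { double x, y, z; };
static double dist(P a, P b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

int main() {
    const double speed_of_sound = 343.0; // m/s
    P source{2, 1.5, 3}, listener{8, 1.5, 3};
    double wall_x = 0.0; // a wall on the plane x = 0

    // Direct path.
    double t_direct = dist(source, listener) / speed_of_sound;

    // First-order reflection: mirror the source across the wall; the straight
    // line from the image to the listener has the reflected path's length.
    P image{2 * wall_x - source.x, source.y, source.z};
    double d_refl = dist(image, listener);
    double t_refl = d_refl / speed_of_sound;

    std::printf("direct arrival: %.1f ms\n", t_direct * 1000);
    std::printf("first echo:     %.1f ms (delay %.1f ms, 1/d falloff %.2f)\n",
                t_refl * 1000, (t_refl - t_direct) * 1000, 1.0 / d_refl);
}
```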
 
Ray tracing is slow and computationally expensive; they will have to use tricks to even get simple scenes to render at 30 fps.
That's why they will use hybrid ray tracing: only ray trace the bits that look better and run faster with ray tracing, and do the rest without RT. Fully ray-traced games are years away unless they're really basic. Hybrid ray tracing can cut down on tricks and make life easier for game developers, which in turn cuts the cost of making a game; devs won't need to spend ages faking the various things that RT does automatically. It's all about using RT where it's smart to use it, not using it for everything.
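To make the hybrid idea concrete, a minimal toy sketch (the types, values and threshold are invented for illustration): shade everything with the cheap raster result, and only spend a reflection ray where the material is glossy enough to show it.

```cpp
// Minimal sketch of hybrid rendering: rasterize the base image as usual,
// then spend rays only where they buy something (here, a mirror-like
// material). All types and thresholds are invented for illustration.
#include <cstdio>

struct Pixel { float base_color; float roughness; };

// Stand-ins for the expensive parts of a real renderer.
static float rasterized_shade(const Pixel& p) { return p.base_color; }
static float trace_reflection(const Pixel&) { return 0.9f; } // one costly ray

static float shade_hybrid(const Pixel& p) {
    float c = rasterized_shade(p);          // cheap raster result everywhere
    if (p.roughness < 0.2f)                 // glossy enough to be worth a ray
        c = 0.5f * c + 0.5f * trace_reflection(p);
    return c;                               // rough surfaces skip RT entirely
}

int main() {
    Pixel wall{0.4f, 0.8f}, puddle{0.2f, 0.05f};
    std::printf("wall: %.2f, puddle: %.2f\n",
                shade_hybrid(wall), shade_hybrid(puddle));
}
```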
 
It might not be as big a gimmick as I first thought. Looks like this will help film studios a lot, and as someone already mentioned, in 5-10 years it might trickle down to gaming. I would be tempted to pick this up if it speeds up render cycles significantly. However, in these early days it won't make much of a difference in gaming; I think we will have to do some serious pixel counting with stills rather than video to see the difference.
It does seem like it still needs some work: the Threadripper 2 comparison at the 5:50 mark does show differences in the final rendered scene.
 
The main thing it's going to do is scupper AMD's performance in the games Nvidia use to demo it. I don't think ray tracing is ready for mainstream gaming. They'll probably work on a few games so each uses it a little, totally demolishing performance for AMD, Maxwell and Pascal owners. At the end of the day they want everyone to feel the need to upgrade, and once they finally have everyone they can on RTX cards, the next big thing will be PTX... or is that PTSD?
 
I hope it is amazing. I don't see it being a massive deal this generation, but we said exactly the same about high core counts in CPUs. If the hardware is there, the applications will come. Someone has to build the next thing first, or how will it ever get out there and improve things?

The idea that you are going to be able to catch a glimpse of an enemy in a reflection in a bus shelter window or puddle is immense and will potentially make a massive difference in the future, especially in high end VR.
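The technical reason this works where current screen-space tricks fail is simple; a toy sketch (numbers invented) of the reflection direction shows it:

```cpp
// Toy sketch (invented numbers) of why ray-traced reflections can reveal
// off-screen enemies: the reflected ray r = d - 2(d.n)n is free to point
// at geometry that was never rasterized on screen, where screen-space
// reflection techniques have no data at all.
#include <cstdio>

struct V { double x, y, z; };
static double dot(V a, V b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

int main() {
    V d{0, -0.7071, 0.7071}; // view ray looking down into a puddle
    V n{0, 1, 0};            // puddle surface normal, straight up
    double k = 2.0 * dot(d, n);
    V r{d.x - k * n.x, d.y - k * n.y, d.z - k * n.z};
    // r = (0, 0.7071, 0.7071): up and forward, so it can strike an enemy
    // standing entirely outside the current view.
    std::printf("reflected dir: (%.4f, %.4f, %.4f)\n", r.x, r.y, r.z);
}
```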

An engine automatically and in real time adjusting light sources is super cool.
 
I'm more concerned about the pricing of the new cards than about the features actually. If it's close to the rumoured pricing I'll be giving it a miss regardless of how powerful it is and what features it has. Console gaming is looking more and more like the way to go.
 
I have no idea what sort of performance jump the RTX cards will bring, but it will need to be massive and I think we are a very long way off having graphics cards powerful enough.

Nvidia are just hyping up a feature, I think. The need for ray tracing as standard in games is questionable too.

By the time ray tracing is common in games, and not a one-off or of limited use like in Metro Exodus, we will probably be into 2020 with a more powerful generation of cards.
Jumping on the wagon now, thinking it's future-proof, is going to be the big mistake.

Will this work for AMD as well? It will take ages to be adopted if it isn't in console games too. There aren't many PC exclusives nowadays.

That is the funny bit. Given that the consumer Vega 64 is exactly the same chip found in the Radeon Pro WX 9100, and the latter is advertised for GPU-rendered ray tracing, who knows.

What we do know is that the V64 is great at compute tasks and has plenty of grunt in that department, even if its gaming performance is lacking compared to a similar (compute) GPU.

Given AMD's past history of over-engineering their GPUs ahead of their time, I wouldn't be surprised if a driver update came out enabling hardware ray tracing.

We also know that AMD has announced that RadeonRays 2.0 is supported all the way back to the Hawaii-based FirePro, which used the same chip as the 290X, and already supports ray tracing via DX12, Vulkan, Embree and OpenCL.

Yet that might not be possible on the GCN architecture, just as it isn't with the plain CUDA cores found in e.g. Pascal. However, we do know Vega is the last GCN card, as Navi has already been announced with a completely new core architecture, and its successor with another new one.

Time will tell.
 
However, we do know Vega is the last GCN card, as Navi has already been announced with a completely new core architecture, and its successor with another new one.
That's incredibly unlikely. Because it doesn't make any sense.

Edit: Also, no idea where you got that from. Various sources (inc. WCCF :p) are reporting that Navi is the final iteration of GCN.
 
There is literally no way that AMD will introduce two completely new architectures in the next two generations of cards.

No way.

On your initial post, you are correct. I remembered wrongly after two months, so I had to read back through some articles; I'd confused it with the "next-gen memory" advertised for Navi (probably GDDR6). Navi is GCN 6.0 and indeed the last GCN card.
 
At 7nm there will be plenty of opportunity to allocate die space for RT features.
At 12nm the die size would have to be large to extend general gaming performance plus add the RT hardware, as was seen in the Quadro announcement.
So expect the first gen RTX cards to be expensive and the real fun to start next year @7nm.

If NV go large with RT @7nm and AMD build a big chip that performs well generally plus doesn't have to allocate die space to RT, AMD could gain the overall performance crown outside of the still fledgling RT sphere.
Any change to the status quo creates opportunities that could go either way.
 
I agree Nvidia has probably paid a few game companies to add ray tracing, like Metro, but there won't be much adoption until the middle-to-bottom cards can run it with good performance.
 
I think the real issue is, by the time enough top-tier games use ray tracing in any significant way, your spangly new RTX2080 will be back on eBay for the low low price of 150 British pounds :D:D:D


This is always the case. ATI's TruForm, for example, was baked into a few demos/games, but proper tessellation came along much later and much better. Early adoption didn't help ATI in the slightest.


However, RTX has many uses beyond real-time game rendering. In the short term it will be used to add better lighting and shadows. This is already done on the GPU in games with Global Illumination; RTX will make this faster and allow more lights. No one is talking about real-time fully ray-traced games, but about a hybrid approach where ray tracing is used for specific lighting and shadowing effects in limited areas. This already provides a good improvement in image quality, and RTX takes it to the next level by running existing Global Illumination models much faster (though to be clear, existing GI models will need to be re-coded for RTX, so old games won't get the boost).
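To illustrate the re-coding point, a hedged sketch (all names invented; not how any real engine is structured): the GI query a game asks can stay the same while the backend answering it changes, but the game has to be written against that seam for the swap to be possible.

```cpp
// Hedged sketch of "re-code for RTX": the GI *query* stays the same while
// the backend answering it changes. All names here are invented.
#include <cstdio>

struct GIBackend {
    virtual float indirect_light(float x, float y, float z) const = 0;
    virtual ~GIBackend() = default;
};

// Today: coarse approximate GI (probes, voxels, baked lightmaps...).
struct ApproxGI : GIBackend {
    float indirect_light(float, float, float) const override { return 0.25f; }
};

// With RT hardware: the same query answered by actually tracing rays.
struct RayTracedGI : GIBackend {
    float indirect_light(float, float, float) const override { return 0.31f; }
};

static void shade_scene(const GIBackend& gi) {
    std::printf("indirect at origin: %.2f\n", gi.indirect_light(0, 0, 0));
}

int main() {
    ApproxGI a; RayTracedGI b;
    shade_scene(a); // shipped game: approximate backend
    shade_scene(b); // re-coded game: rays answer the same query
}
```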


There are also big benefits to content creation. Even if baked lightmaps are used, content creators can use RTX for level design with real-time lighting feedback instead of waiting minutes to an hour. Then there are the uses in actual CGI: even if the ray-traced images are not real-time, they get a 6x boost, so instead of waiting a few hours to render a short clip it might be done in 20 minutes.

All of this lets GPUs with RTX offer a lot of advantages well before fully real-time ray-traced games arrive.
 
Based on demos like this, it's all very clever, and no doubt Nvidia will sell loads of Turing cards to the gullible, thinking that their games will now look like that. I'm sure that one day they will, but it's going to take numerous generations of Nvidia RTX cards before they come anywhere near it.
 