
NVIDIA Launches The World's First Interactive Ray Tracing Engine

Yeah but you can't really expect the average consumer to see the need or point, especially when they are gonna be led by their wallet and see it as an extra cost.
 
I don't understand :p What is a ray tracer?

Like Rroff said, it basically calculates every ray of light: which surfaces it hits, how much is reflected, refracted and absorbed, etc. That's essentially the key to creating graphics as realistic as possible, especially with mirrors, glass objects and curved surfaces. Of course, due to the massive amount of processing power required, it's just not feasible in real time yet. Hopefully GPU rendering will be able to sort that out.
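For anyone wondering what that looks like in practice, here's a very stripped-down sketch of the idea in Python. The sphere scene, the trace function and all the numbers are made up purely for illustration, not how any actual engine does it; it just follows a ray to the nearest surface, lights it, then bounces a reflected ray and adds whatever that sees.

```python
# Minimal, illustrative sketch of recursive ray tracing (not any real engine's API).
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, direction, centre, radius):
    """Distance along the ray to the sphere, or None if it misses
    (assumes 'direction' is normalised)."""
    oc = sub(origin, centre)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def trace(origin, direction, scene, light_dir, depth=0):
    """Follow one ray: find the nearest surface it hits, light it,
    then bounce a reflected ray and add whatever that sees.
    Refraction and absorption would be handled the same way."""
    if depth > 3:                       # stop bouncing eventually
        return 0.0
    nearest = None
    for centre, radius, reflectivity in scene:
        t = hit_sphere(origin, direction, centre, radius)
        if t is not None and (nearest is None or t < nearest[0]):
            nearest = (t, centre, reflectivity)
    if nearest is None:
        return 0.1                      # background brightness
    t, centre, reflectivity = nearest
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(sub(point, centre))
    brightness = max(0.0, dot(normal, light_dir))     # simple diffuse term
    bounce = sub(direction, tuple(2.0 * dot(direction, normal) * n for n in normal))
    brightness += reflectivity * trace(point, normalize(bounce), scene, light_dir, depth + 1)
    return brightness

# One pixel's worth of work: a ray from the eye towards a single sphere.
scene = [((0.0, 0.0, -3.0), 1.0, 0.5)]
print(trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), scene, normalize((0.5, 1.0, 0.3))))
```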

No doubt we'll have it one day, it's just a matter of the hardware catching up. Perhaps one of the only instances in which the software is ahead of the hardware.
 
For performance reasons you generally depth test every screen space pixel and then reverse test the ray end point against each light source, etc. Obviously it's a bit more complicated for proper ray tracing when you're doing specular, refracted, etc. effects and so on.

The great thing is most interactions are massively threadable and very little actually needs to be processed synchronously at all.
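Roughly what that independence looks like, as a sketch: first_hit and visible below are just placeholder stand-ins for the real intersection code, the point is only that each pixel's work can be handed to a separate worker with nothing shared between them.

```python
# Sketch of why ray tracing parallelises so well: each pixel is shaded
# independently, so the whole frame can be split across a pool of workers.
from concurrent.futures import ProcessPoolExecutor
from functools import partial

WIDTH, HEIGHT = 640, 480

def first_hit(pixel, scene):
    """Stand-in for the per-pixel depth test: find the nearest surface
    the primary ray through this pixel hits."""
    x, y = pixel
    return (x / WIDTH, y / HEIGHT, -1.0)      # pretend hit point

def visible(point, light, scene):
    """Shadow ray: test back from the hit point towards the light and
    check nothing in the scene blocks it. Placeholder always says yes."""
    return True

def shade_pixel(pixel, scene, lights):
    point = first_hit(pixel, scene)
    if point is None:
        return 0.0
    # Only lights the hit point can actually 'see' contribute.
    return sum(0.5 for light in lights if visible(point, light, scene))

def render(scene, lights):
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    # No pixel depends on any other pixel, so nothing here is synchronous.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(partial(shade_pixel, scene=scene, lights=lights),
                             pixels, chunksize=4096))

if __name__ == "__main__":
    frame = render(scene=[], lights=["key", "fill"])
    print(len(frame), "pixels shaded, first value:", frame[0])
```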
 
Intel might be the "daddy" of ray tracing - but they will never get anywhere in the gaming world - they don't produce drivers that have either the level of feature richness or the update frequency to support new titles...

I agree wush, once this becomes more common we'll see it plastered on every possible corner till people are sick of it :S

Sorry, so despite Intel not yet having a discrete 3D card out, with BILLIONS in the pocket, with enough spare cash lying around to hire Nvidia's and ATI's driver teams and increase their size by 1000% without batting an eyelid, you're claiming, with no proof, that they'll get nowhere because their basic and crap integrated stuff lacks features and constantly updated drivers.

The basic thing is, with that level of hardware, updated drivers are rare because the parts aren't aimed at gaming and rarely have issues with new games: they either can't run them at all, or run them on such low/pathetic settings that the fancy features aren't even used, which frequently means things run just fine.

In all likelihood they've been buying up coding staff for years to work on the new discrete sector. What they've done till now has no bearing whatsoever on anything they might bring to the table, and claiming you know different is just completely ridiculous. As per usual, any excuse to say why Nvidia are better though :rolleyes:

(And that's the first time I've used the rolleyes, I think, ever on these forums btw.)

The real future aim of ray tracing is that, with every particle essentially being rendered through a realistic physics engine, you can use the same calculations to map game physics at the same time. I.e. the scene already traces every path of every ray of light and how it reacts; it can simply be told to assign a particle to a box, and calculate and render where that box will go at the same time(ish).

We're not there yet; we're years away from that to be honest (except probably some tech demos from any/all companies showing off ray tracing). We'll see more basic ray tracing engines to start with, just the rendering being done differently without anything else moved to the GPUs. As with all things, change takes time, and steps.

The problem for anyone competing with Intel is their manufacturing capability: smaller, faster, cheaper, higher yield and lower power than anything AMD/Nvidia can get, with deep, deep, deeeeeeeep pockets behind them on top of that. We're 3 years away from TSMC having real competition and probably 3 years away from seeing ATI/Nvidia being able to use the very latest process for gfx; they'll always be a half/full node behind.

As with all things though, mistakes happen and things don't work. Intel's first cards will probably be "ok" but nothing special: a card that will have to be very capable at non-ray-traced "normal" games anyway, but something that can introduce the industry/devs to ray tracing. It just doesn't matter to Intel if it's a total failure; they've got the money to keep going till they dominate. Quite why it's taken them this long to get in the game I have no idea.
 
History says different... for all their millions, even billions, the few excursions Intel have made into the gaming market have flopped for lack of proper feature support and any update cycle... and unless they pull things around dramatically on that front I don't see them as a competitor to ATI, let alone nVidia.

Sure they have deep pockets, a lot of clout behind them and so on... but so far they've been unable to effectively utilise it when it comes to gaming products... in fact they don't even appear to understand the gaming market, which is very strange...

They've been in a position to pull the rug out from under nVidia/ATI for the last 2 or so years, yet every time they've taken tentative steps in that direction the final product has ended up much less than the hype and been swiftly redesignated to another market area...

Take a look at Larrabee: they were claiming the current spec (which is 60% less than the hyped-up spec) was equivalent to an nVidia 280 series GPU for rendering performance, yet in a demonstration theoretically running on the Source engine it was giving performance numbers only just faster than a 7900GTX.
 
IIRC there was a company that manufactured renderdrives and ray tracing cards a few years ago. It was very expensive kit and really only of any use to specific industries, i.e. design, special FX, etc.

Hopefully this will be a huge leap forward and add more realism to game graphics.
 
Ray tracing is the future of game graphics. There are simply things you cannot do with the rasterisation method, which is currently what all games use.

If you want stuff looking as good as the 3D in films you have to use ray tracing (which is what they use). It's not just something that's good at doing shiny reflective objects, as posted in this thread; it's a far more advanced and complex way of rendering that can reach a far higher level of realism and accuracy.

BTW there's an article here on ray tracing for games. It's not too detailed, so maybe some of the people on here can understand it? ;)
 
It might take a while, but ray tracing will replace rasterising (drawing lots of triangles). The thing is rasterising is basically a hack to work around the costs of raytracing, but the computation for both approaches scales differently.

Rasterising gets more and more expensive as you add scene complexity; there's effectively a linear cost to adding polygons. Ray tracing isn't impacted anywhere near as much by scene complexity; it has a roughly fixed cost per pixel. So really, what companies like Intel and NVidia are looking forward to is the point where scenes get complex enough that the per-pixel cost of ray tracing becomes less than the cost of drawing millions of triangles.
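A back-of-envelope version of that argument, with invented constants just to show the shape of the two curves. It treats the per-ray cost as growing only slowly with scene size (via an acceleration structure like a BVH), which is close to the fixed-cost-per-pixel idea; none of the numbers are real measurements.

```python
# Toy comparison of how the two approaches scale with scene complexity.
import math

PIXELS = 1920 * 1080

def raster_cost(triangles, per_triangle=1.0):
    # Rasterisation touches every triangle: roughly linear in scene size.
    return per_triangle * triangles

def raytrace_cost(triangles, per_ray=50.0):
    # With an acceleration structure, each of the fixed number of primary
    # rays costs roughly log(triangles) to traverse.
    return PIXELS * per_ray * math.log2(max(triangles, 2))

for tris in (10_000, 1_000_000, 100_000_000, 10_000_000_000):
    cheaper = "raster" if raster_cost(tris) < raytrace_cost(tris) else "ray tracing"
    print(f"{tris:>14,} triangles -> {cheaper} wins")
```

With these made-up constants the crossover only happens at absurd triangle counts, which is exactly the point being made: rasterisation wins today, ray tracing wins once scenes get complex enough.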

This is what Larrabee is all about, and to be fair both NVidia and, to a lesser extent, ATI are building their next generation GPUs with an awful lot of parallel "general purpose" compute power.
 
Did anyone watch Babylon 5? IIRC the station, ships etc. were ray traced using Amiga workstations.

I'm sure I read that ILM, DreamWorks etc. used massive banks of CPUs (AMD currently) to render their movies. By massive I think they were talking literally hundreds if not thousands, and they were taking hours on end doing it.

Could a GPU really replace that? In real time.
 

Babylon 5 now? Sure. The Pixar stuff? No.

You can always throw more computers at these things, add more and more effects. A lot of stuff they use that computation for is modelling stuff like individual strands of hair and fur. You won't see stuff like that in a game. Yet.
 
That's a shame. I was looking forward to choosing what color hair my sim had in its butt crack.

Oh well. A new space combat game that looked like Bab 5? Soon? I'd buy that for a dollar.
 
There was a good discussion of ray tracing vs. rasterisation on Beyond3d.com or somewhere around 18 months ago...

Essentially, the conclusion was that future games will probably feature a hybrid approach because of the limitations of both techniques. Ray-tracing performance gets hammered by some things rasterisation does very easily, and vice versa.
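A rough sketch of what that hybrid could look like; everything here (the Pixel layout, the helper stubs, the numbers) is made up for illustration. The shape is just "rasterise first, then spend rays only where rasterisation struggles".

```python
# Illustrative hybrid: raster pass for primary visibility, rays only for
# the pixels that actually need them (mirrors, glass, etc.).
from collections import namedtuple

# Each entry of the 'G-buffer' records what the raster pass found at a pixel.
Pixel = namedtuple("Pixel", "position normal mirror")

def rasterise_gbuffer(scene, camera):
    """Stand-in for an ordinary raster pass over the scene."""
    return [Pixel(position=(x, 0.0, -2.0), normal=(0.0, 1.0, 0.0),
                  mirror=(x % 7 == 0))
            for x in range(16)]

def shade_direct(pixel, lights):
    return 0.5 * len(lights)                  # pretend plain raster shading

def trace_reflection(pixel, scene):
    return 0.3                                # pretend ray-traced reflection

def render_hybrid(scene, camera, lights):
    image = []
    for pixel in rasterise_gbuffer(scene, camera):
        colour = shade_direct(pixel, lights)  # cheap path for every pixel
        if pixel.mirror:
            # Spend rays only where rasterisation struggles.
            colour += trace_reflection(pixel, scene)
        image.append(colour)
    return image

print(render_hybrid(scene=None, camera=None, lights=["key", "fill"]))
```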

Is this a big development? Until it's usable to any extent whatsoever, it may as well be classed as vapourware, like CUDA/Stream/etc - both cool in principle, yet essentially only practical in a very, very limited set of applications.
 
Bear in mind that movies use resolution and complexity we won't see in games for a very long time.

As above, we probably won't see a complete ray tracing system for a long time but a hybrid of methods... id Tech 6 shifts focus to voxel/ray casting again, so we could well see a shift towards this in around 2 years' time.
 
So it doesn't matter whether people go ATI or Nvidia for the new cards... good enough for me.

I suspect it will be like a protracted DirectX implementation: graphics cards are capable of supporting/using the new standard, but no games actually use it for a good 18/24 months afterwards. In this case, I suspect it will be a much longer time.
 