NVIDIA Launches The World's First Interactive Ray Tracing Engine

It won't happen until the consoles can do it; PC gaming is mostly just ports these days, or mass-market stuff that can run on any old GPU.
 
It might take a while, but raytracing will replace rasterising (drawing lots of triangles). The thing is, rasterising is basically a hack to work around the cost of raytracing, and the computation for the two approaches scales differently.

Rasterising gets more and more expensive as you add scene complexity; there's effectively a linear cost to adding polygons. Raytracing isn't impacted anywhere near as much by scene complexity; the cost is roughly fixed per pixel. So what companies like Intel and NVIDIA are really looking forward to is the point where scenes become complex enough that the per-pixel fixed cost of raytracing works out cheaper than drawing millions of triangles.
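To put it in code, here's a toy sketch of that scaling argument - the cost constants are made up, obviously, and a real raytracer needs an acceleration structure like a BVH to get that near-constant per-pixel behaviour:

```python
from math import log2

def rasterise_cost(num_triangles, cost_per_triangle=10):
    # A rasteriser touches every triangle in the scene, so the cost
    # grows linearly with scene complexity.
    return num_triangles * cost_per_triangle

def raytrace_cost(num_pixels, num_triangles, rays_per_pixel=1):
    # A raytracer fires a ray per pixel; with a BVH each ray costs
    # roughly log2(triangles), so scene complexity barely registers.
    return num_pixels * rays_per_pixel * log2(max(num_triangles, 2))

pixels = 1920 * 1200
for tris in (10_000, 1_000_000, 100_000_000):
    print(f"{tris:>11,} tris: rasterise {rasterise_cost(tris):>13,}, "
          f"raytrace {raytrace_cost(pixels, tris):>13,.0f}")
```

With these (invented) constants the rasteriser's linear term overtakes the raytracer's near-flat per-pixel cost somewhere past a few million triangles, which is exactly the crossover being described.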

This is exactly what I was going to post, and well put. Raytracing has dropped off the radar in the last few years, but it was the original method for generating images of 3D environments on computers. Rasterising was really implemented as a workaround because raytracing was so expensive. Back on the Amiga it wasn't unheard of for a single high-resolution image to take 24 hours to render.

Raytracing can be massively parallelised, and perhaps we're getting to the point where the hardware has finally caught up. Can't wait to see the demos.
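A minimal sketch of that parallelism (trace_pixel here is just a stand-in for a real ray/scene intersection): every primary ray is independent, so scanlines can be farmed out to worker processes with no shared state at all.

```python
from multiprocessing import Pool

WIDTH, HEIGHT = 640, 480

def trace_pixel(x, y):
    # Stand-in for a real ray/scene intersection test; returns a
    # grey value so the example stays self-contained.
    return (x ^ y) & 0xFF

def render_row(y):
    # One worker renders one scanline; rows never touch each other,
    # which is why raytracing maps so well onto many cores.
    return [trace_pixel(x, y) for x in range(WIDTH)]

if __name__ == "__main__":
    with Pool() as pool:
        image = pool.map(render_row, range(HEIGHT))
    print(f"rendered {len(image)} rows of {len(image[0])} pixels")
```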
 
Bear in mind that movies use resolutions and complexity we won't see in games for a very long time.

As above, we probably won't see a complete raytracing system for a long time, but rather a hybrid of methods... id Tech 6 shifts focus to voxel/ray casting again, so we could well see a shift towards this in around two years' time.

LOL You are the ultimate optimist.
 
All this talk of voxels had me looking on YouTube for Comanche videos - I'd forgotten how good the Comanche games looked compared to polygon-based terrain engines at the time.

Comanche 4's island levels look like an early Far Cry demo, FFS, and this was back in the days of Win2K!

Back to raytracing, and specifically that Q3RayTrace demo posted above [which looks rather good for such an old game]: it's implied that the RT routines were running solely on the CPU, which was a C2Q Extreme. Now, if you could offload that to the GPU's shaders with OpenCL [or more likely, DX11 compute shaders], wouldn't that make the thing utterly fly?
 
I'm no expert, but even I know raytracing has its processing pitfalls - the sheer computational power needed to render the same scene at increasing resolutions, for starters. We are 5+ years away from 1920x1200 @ 60fps. I'm also pretty certain you can't have HSR with ray tracing engines either - light will still be interacting with objects even if they can't be seen, meaning you can't cull them.
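For a rough sense of scale (back-of-the-envelope, primary rays only - shadow rays, reflections and anti-aliasing multiply it several times over):

```python
# Primary-ray budget for the resolution and frame rate quoted above.
width, height, fps = 1920, 1200, 60
rays_per_second = width * height * fps
print(f"{rays_per_second:,} primary rays/s")  # 138,240,000 primary rays/s
```

That's around 138 million full ray/scene intersection queries per second before a single secondary effect is added.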
 
We need to stop pushing resolutions if we want to adopt raytracing.

This I have to agree with, which is why I'm afraid to upgrade from a 22"; the most I'd ever go is 24", because a lot of demanding games run poorly any higher.

So adding raytracing at that resolution would make things practically unplayable.
 
We need to stop pushing resolutions if we want to adopt raytracing.
2560x1600 seems to be the standard desktop limit so far, and has been for quite a while; even then, those panels cost a small fortune. It's the low-end 24" panels and under that have steadily got cheaper.

If raytracing is to go mainstream, we need a usable, efficient, multi-vendor platform that can operate at current resolutions. Of course, we need hardware capable of doing it too.

Not asking much. :p
 
I'm also pretty certain you can't have HSR with ray tracing engines either - light will still be interacting with objects even if they can't be seen, meaning you can't cull them.

You don't need or want to bother with Hidden Surface Removal in a raytracer; that's one of the benefits - the raytracer does it implicitly. You get free reflections and shadows of off-screen stuff because of this.

HSR is important to a rasterizer, because you want to limit the number of tris you are displaying. It's a pain in the arse as well, generally consuming level designers' time - either modelling things without non-visible polys, or running a whole level through some BSP compiler every time you tweak a wall (or both).
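Here's a minimal sketch of why the raytracer gets HSR for free, assuming a toy scene of spheres: each ray keeps only its nearest hit, so anything behind that point is never shaded, however much geometry is back there.

```python
import math

def intersect_sphere(origin, direction, centre, radius):
    # Distance along the (normalised) ray to the nearest hit, or None
    # if the ray misses the sphere entirely.
    oc = [o - c for o, c in zip(origin, centre)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def trace(origin, direction, spheres):
    # HSR falls out for free: we keep only the closest hit, so any
    # occluded sphere is tested but never shaded.
    nearest = None
    for centre, radius, colour in spheres:
        t = intersect_sphere(origin, direction, centre, radius)
        if t is not None and (nearest is None or t < nearest[0]):
            nearest = (t, colour)
    return nearest[1] if nearest else "background"

scene = [((0, 0, 5), 1, "red"), ((0, 0, 50), 1, "blue")]  # blue is hidden
print(trace((0, 0, 0), (0, 0, 1), scene))  # -> red
```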
 
HSR is important to a rasterizer, because you want to limit the number of tris you are displaying. It's a pain in the arse as well, generally consuming level designers' time - either modelling things without non-visible polys, or running a whole level through some BSP compiler every time you tweak a wall (or both).

With today's hardware, though, this isn't as much of a problem as it used to be - you can just batch your main level geometry and use much more rudimentary visibility routines on large blocks of level tris without a major performance drop.
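That kind of rudimentary routine can be as simple as one bounding-volume test per batch - a rough sketch, with made-up batches and a single clip plane standing in for a full frustum:

```python
def sphere_in_view(centre, radius, planes):
    # planes: (normal, d) pairs with normals pointing into the view
    # volume; a batch is culled only if it's fully behind some plane.
    for (nx, ny, nz), d in planes:
        if nx * centre[0] + ny * centre[1] + nz * centre[2] + d < -radius:
            return False
    return True

# Hypothetical batches: (bounding-sphere centre, radius, tri count).
batches = [((0, 0, 10), 5, 20_000), ((0, 0, -40), 5, 80_000)]
planes = [((0, 0, 1), 0)]  # keep anything in front of the camera

visible = sum(tris for c, r, tris in batches if sphere_in_view(c, r, planes))
print(f"{visible:,} tris submitted")  # 20,000 tris submitted
```

One cheap test decides whether tens of thousands of tris get submitted at all, which is why the per-triangle bookkeeping matters so much less than it used to.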
 
PC gaming must push technical development, as it's the only advantage we have over consoles. Resolution is king.

I agree PC gaming needs to push the envelope, but it's not all about resolution. Look at the Wii / MMORPGs etc. I'm not going to spew up the old chestnuts about games not being as good as they were, etc. etc., but 'playability' is king. That's why we do it.

Never mind that the majority of mainstream consumers (i.e. where the money's at) struggled to tell SD from HD in the TV world. There is a point where the human eye struggles to tell the difference, and not just in measurable terms either. 1080p is irrelevant on smaller screens. Personally I'd say you DON'T need more than 1080p until you game on a screen bigger than 30". Notice I said game, not DTP, graphics design, CAD etc.
 
With today's hardware, though, this isn't as much of a problem as it used to be - you can just batch your main level geometry and use much more rudimentary visibility routines on large blocks of level tris without a major performance drop.

Of course it is; rasterisation is a bunch of hacks upon optimisations upon hacks. To get any kind of performance out of it, the first optimisation you do is cull all of the triangles that you don't need to render, and that needs some runtime data structure like a BSP, octree or portal system - which generally takes a load of designer time and/or a load of computing time. You can quite easily watch a modern GPU's performance go to **** if you render millions of non-visible tris and lean on the Z-buffer.

Raytracing implicitly deals with this problem: only the pixels you see are rendered. You could have 5,000,000 polygons behind the point you can see, and they are intrinsically culled because they are not visible.
 
* edit: and if you are talking about tri-strips, that's just a semi-programmer-friendly optimisation to squirt a load of geometry across the limited PCI Express bus without all the slow read/write cycles. You still bottleneck on the card's ability to render triangles.
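A toy model of that Z-buffer overdraw point: every fragment pays for a depth test (and usually some shading work) even when it loses, so hidden tris are never free.

```python
import random

# One pixel's Z-buffer fed 1,000,000 fragments at random depths, in
# arbitrary order - no sorting, no occlusion culling, just brute force.
random.seed(1)
z_buffer = float("inf")
tested = shaded = 0

for _ in range(1_000_000):
    depth = random.random()
    tested += 1
    if depth < z_buffer:   # nearer than anything so far: passes the test
        z_buffer = depth
        shaded += 1        # only these ever land in the final image

print(f"tested {tested:,} fragments, kept {shaded}")
```

Roughly a dozen fragments out of a million actually survive; all the rest are wasted work that a raytracer simply never does.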
 
I agree PC gaming needs to push the envelope, but it's not all about resolution. Look at the Wii / MMORPGs etc. I'm not going to spew up the old chestnuts about games not being as good as they were, etc. etc., but 'playability' is king. That's why we do it.

Never mind that the majority of mainstream consumers (i.e. where the money's at) struggled to tell SD from HD in the TV world. There is a point where the human eye struggles to tell the difference, and not just in measurable terms either. 1080p is irrelevant on smaller screens. Personally I'd say you DON'T need more than 1080p until you game on a screen bigger than 30". Notice I said game, not DTP, graphics design, CAD etc.

Yeah, maybe I took playability/FPS for granted, but the higher the resolution, the higher the detail.

The average human eye can resolve around 5000x3000 pixels, and I'm very sure gaming at that resolution would blow people's socks off.
 
I disagree with the resolution thing. One of the main advantages of PC has always been resolution. And it's always been the case that some games will come along that even the current top-end hardware cannot run at the highest res. It's just how things are with PC, and you can always adjust graphics settings anyway. Plus it helps stop games looking dated so quickly, and you can later go through them again with newer hardware. If someone doesn't like it, get a console, because I doubt this will ever change, and I wouldn't want it to.

Resolution needs to keep being pushed. It also brings screen image quality closer to how you see things in reality, and closer to the high print quality of good magazines and art posters, which have far higher DPI than any monitor.

I doubt many of you have even gamed at 2560x1600, so you don't know what you're missing.
 
Did anyone watch Babylon 5? IIRC the station, ships etc. were raytraced using Amiga workstations.

I'm sure I read that ILM, DreamWorks etc. use massive banks of CPUs (AMD currently) to render their movies. By massive, I think they were talking literally hundreds if not thousands, and they were taking hours on end doing it.

Could a GPU really replace that, in real time?

It was done on Commodore Amiga 4000s backed up with a boatload of Video Toasters. The Motorola 68040 - happy days.
 
Maybe in a few years this'll just be another tool used by all games; maybe not. Still, it's interesting.
Thing is, it's unclear how complex the raytracing they're talking about is. There's a big difference between a reflective sphere and a NURBS surface with sub-surface scattering next to a water-filled glass showing caustics.
 