In this case, I suspect it will be a much longer time.
If at all, above and beyond the odd "converted" game or demo.
It might take a while, but ray tracing will replace rasterising (drawing lots of triangles). The thing is, rasterising is basically a hack to work around the cost of ray tracing, and the computation for the two approaches scales differently.
Rasterising gets more and more expensive as you add scene complexity; there's effectively a linear cost to adding polygons. Ray tracing isn't impacted anywhere near as much by scene complexity, as it has a roughly fixed cost per pixel. So really what companies like Intel and NVidia are looking forward to is the point where scene complexity grows enough that the fixed per-pixel cost of ray tracing becomes less than the cost of drawing millions of triangles.
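The crossover argument above can be sketched with a toy cost model. This is purely illustrative: the constants, the pixel count, and the assumption that a BVH-style acceleration structure makes each ray cost roughly log(triangles) are all made up for the example, not measured from any real renderer.

```python
import math

PIXELS = 1920 * 1080  # assumed framebuffer size for the toy model

def raster_cost(triangles, per_tri=1.0):
    # Rasterisation touches every triangle: roughly linear in scene size.
    return per_tri * triangles

def raytrace_cost(triangles, per_ray=200.0):
    # Ray tracing pays a fixed per-pixel cost; with an acceleration
    # structure (e.g. a BVH) each ray costs roughly log2(triangles).
    return PIXELS * per_ray * math.log2(max(triangles, 2))

# Double the scene size until rasterising becomes the more expensive option.
n = 2
while raster_cost(n) < raytrace_cost(n):
    n *= 2
print(f"toy-model crossover at roughly {n:,} triangles")
```

With these invented constants the crossover lands in the billions of triangles, which is the forum's point: the argument only favours ray tracing once scenes get extremely complex.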
Bear in mind that movies use resolutions and complexity we won't see in games for a very long time.
As above, we probably won't see a complete ray tracing system for a long time, but rather a hybrid of methods... id Tech 6 shifts focus to voxel/ray casting again, so we could well see a shift towards this in around two years' time.
Comanche 4's island levels look like an early Far Cry demo FFS, and that was back in the days of Win2k!
We need to stop pushing resolutions if we want to adopt raytracing.
2560x1600 seems to be the standard desktop limit so far, and has been for quite a while; even then, those panels cost a small fortune. It's the low-end 24" panels and under that have steadily got cheaper.
I'm also pretty certain you can't have HSR (hidden surface removal) with ray tracing engines either - light still interacts with objects even when they can't be seen directly, so you can't cull them.
HSR is important to a rasteriser, because you want to limit the number of tris you are displaying. It's a pain in the arse as well, generally consuming level designers' time - either modelling things without non-visible polys, or having to run a whole level through some BSP encoder every time you tweak a wall (or both).
PC gaming must push technical development, as it's the only advantage we have over consoles. Resolution is king.
With today's hardware, though, this isn't as much of a problem as it used to be - you can just batch your main level geometry and use much more rudimentary visibility routines on large blocks of level tris without a major performance drop.
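The "batch plus rudimentary visibility" idea above can be sketched very simply: rather than culling per triangle, group level geometry into big batches and test one bounding box per batch against the view volume. This is a minimal 2-D sketch with made-up names and numbers, not code from any real engine.

```python
from dataclasses import dataclass

@dataclass
class Batch:
    """A big block of level geometry with an axis-aligned bounding box."""
    name: str
    min_x: float
    max_x: float
    min_y: float
    max_y: float
    triangle_count: int

def batch_visible(b, cam_min_x, cam_max_x, cam_min_y, cam_max_y):
    # Coarse test: does the batch's box overlap the camera's view rectangle?
    return not (b.max_x < cam_min_x or b.min_x > cam_max_x or
                b.max_y < cam_min_y or b.min_y > cam_max_y)

# Hypothetical level split into two large batches.
level = [
    Batch("courtyard", 0, 50, 0, 50, 120_000),
    Batch("basement", 200, 260, 0, 40, 80_000),
]

# Any batch overlapping the view gets submitted wholesale; the GPU's
# own clipping handles the triangles that fall outside the frustum.
visible = [b for b in level if batch_visible(b, 0, 100, 0, 100)]
```

The point is that one box test replaces thousands of per-triangle decisions, which is why coarse visibility is cheap enough on modern hardware.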
I agree PC gaming needs to push the envelope, but it's not all about resolution. Look at the Wii, MMORPGs, etc. I'm not going to spew up the old chestnuts about games not being as good as they were, but 'playability' is king. That's why we do it.
Never mind that the majority of mainstream consumers (i.e. where the money's at) struggled to tell SD from HD in the TV world. There is a point where the human eye struggles to tell the difference, and not just quantifiably either. 1080p is irrelevant on smaller screens. Personally I'd say you DON'T need more than 1080p until you game on a screen bigger than 30". Notice I said game, not DTP, graphics design, CAD, etc.
Did anyone watch Babylon 5? IIRC the station, ships, etc. were ray-traced using Amiga workstations.
I'm sure I read that ILM, DreamWorks, etc. used massive banks of CPUs (currently AMD) to render their movies. By massive I think they were talking literally hundreds, if not thousands, and they were taking hours on end doing it.
Could a GPU really replace that, in real time?