unlimited detail graphics

Looks like interesting tech. And I didn't mind the guy so much, but yes, he needs to calm down and maybe take a public speaking course.

What got to me the most was whenever there was a glitch and he'd say, "There's a glitch! Please forgive us! We're young; there are going to be glitches!"

I mean, instead of saying all that, say "There's a glitch in our shadow mapper. Our shadow mapper is unique because it can compute from 0658400965230945 ;k325 light sources at once, and we're working very hard on getting it perfect".

But yeah, I'm sure the tech will filter down eventually, if it's solid. I mean, he didn't even say if we can mix that tech with current tech... What I was thinking while watching the whole thing is: can we have a standard polygon game, and use this tech to display static backgrounds or something like that? Stuff that just needs to sit there and look pretty. Or can it not be mixed and matched?
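For what it's worth, one plausible way to mix the two (purely my guess, the demo never explained it) is to render the polygon scene and the point-cloud background as separate passes and composite them with a shared per-pixel depth test, the same way deferred passes get merged today. A toy sketch of that idea:

```python
# Hypothetical sketch (not from the demo): compositing a polygon pass
# with a point-cloud background pass using a shared per-pixel depth test.

def composite(color_a, depth_a, color_b, depth_b):
    """Keep, per pixel, whichever pass is closer to the camera."""
    color, depth = [], []
    for ca, da, cb, db in zip(color_a, depth_a, color_b, depth_b):
        if da <= db:
            color.append(ca)
            depth.append(da)
        else:
            color.append(cb)
            depth.append(db)
    return color, depth

# Two pixels: the polygon pass wins the first, the point cloud the second.
poly_color,  poly_depth  = ["red", "red"],   [0.2, 0.9]
cloud_color, cloud_depth = ["blue", "blue"], [0.5, 0.5]
color, depth = composite(poly_color, poly_depth, cloud_color, cloud_depth)
print(color, depth)  # ['red', 'blue'] [0.2, 0.5]
```

If the point-cloud renderer can write a real depth buffer, this kind of merge would let static "sit there and look pretty" scenery coexist with animated polygon characters. Whether their renderer actually exposes depth that way is an open question.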

Interesting nevertheless... ;)
 
uuuuuuuuUUUUUUUUUUUUUUUUUUUnnnnLIIIII MIIII TEEEEDDDDDDDDDDDD

DARTARRRRRR :D

Hey, quit mocking his mastery of the English Language :p

Someone raised a good point above about character animations. I recently watched the Making of Oblivion documentary again, and you can see the lady who designed the horses showing off her work with 3D animated biped skeletons, and studying real horses and going riding to get the animation correct.

Now how do you do all that with point cloud data?
 
I would've thought animations would work similarly, but because of the much, much larger dataset involved, depending on how you made the animation it'd either be very computationally expensive or have an enormous file size.
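If it did work like mesh animation, the obvious candidate would be linear blend skinning applied per point instead of per vertex (that's an assumption on my part; the demo gave no details). A minimal 2D, translation-only version of the idea:

```python
# A minimal linear-blend-skinning sketch, applied to points instead of
# mesh vertices (an assumption about how point-cloud animation might
# work). 2D, translation-only bones for brevity; real skinning would
# blend full bone transforms, not just offsets.

def skin_point(point, bone_offsets, weights):
    """Blend each bone's translation by its weight and offset the point."""
    x, y = point
    dx = sum(w * bx for (bx, _), w in zip(bone_offsets, weights))
    dy = sum(w * by for (_, by), w in zip(bone_offsets, weights))
    return (x + dx, y + dy)

# One point influenced equally by two bones: one moves +2 in x, one +4.
bone_offsets = [(2.0, 0.0), (4.0, 0.0)]
weights = [0.5, 0.5]
print(skin_point((1.0, 1.0), bone_offsets, weights))  # (4.0, 1.0)
```

Storage-wise this would only need one rest pose plus the bone tracks, so file size stays sane; the catch is evaluating that blend every frame over billions of points, which is exactly the computational cost mentioned above.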
 
Still, this is in its early stages; if they can get this right, we might be seeing a massive improvement in graphics in the near future. If it were to become part of a bigger company, it wouldn't surprise me if Nvidia got there first...
 
I think to make a game with this "technologically advanced software" you would need to spend months on one model.

I bet it's all CPU-dependent, so people still running dual cores might lag a "little".
 