Hand tracking in VR

Full disclosure: I work at Ultraleap (we acquired Leap Motion last year) so I have strong opinions on this, but I'm keen to hear whether you guys think that well-implemented hand tracking and spatial interaction methods are something you'd be excited by.

I would love for games like Star Trek: Bridge Crew to have just used my actual hands, as half the fun of that game was the social expression it afforded you, and all the interfaces were just touchscreens. Similarly, I'd love to be able to take my hands off the controls in Elite Dangerous and navigate the menus with my hands instead of having to click through them.

What do you guys think? A great way to enhance some/all games, or a pointless gimmick? Let's assume it's done better than Oculus have /sassy
 
Board games, most notably Tabletop Simulator. There's a control binding for the Index controllers that goes some way towards replicating touch by using the natural grip and grab to pick up and move pieces, but with no finger support you're still reliant on pressing buttons for detailed movement.

Hand tracking totally makes sense for touch interfaces, your controls in Elite, or working the menus in Bigscreen, for example.
TTS is a good shout, actually. If anyone has any other ideas, they'd be really useful.

One of the things we've been working on is how you deal with interfaces when you don't have buttons. There are quite a few analogues with the work already being done in VR interfaces, of course, but I was just kinda brainstorming for use cases where it's already a natural fit without loads of new concepts being introduced.

For anyone interested, this is our idea on dealing with the tricky issue of browser navigation in AR, but I think it would fit reasonably neatly in VR too for complex navigation.

https://www.ultraleap.com/company/news/blog/vr-ar-content-browser/

I live in this stuff so it's always useful to hear opinions from end users to make sure we aren't drinking the Kool-Aid too much!
 
Some DCS modules offer a fully clickable cockpit already. As great as it is having every switch in the Black Shark working, it's cumbersome having a controller in hand when you want to pick up the flight stick.

That's precisely it for me. Or having to fumble for a controller that has idled out somewhere on your desk while your headset is on. I like the idea of just having my hands available, like in real life, without having to pick anything up or put it down.
 
I would like it. Nothing would be more natural than just picking stuff up rather than fumbling around with VR controllers. The question is, how do you provide feedback to the hands so you don't just keep jabbing them through things in the VR world, without some big old gloves?
Absolutely, feedback is really important. With gestures and other spatial interaction interfaces you try to design them so they are inherently tactile (e.g. a pinch has its own haptics because your fingers touch each other), but making virtual objects feel physically present is a tricky problem.

Our solution is to project sensations onto the hands using ultrasound. You can't see it or hear it, but you feel it. Essentially we vibrate your skin so you can feel it on your fingers or palm, and we can modulate the vibration to simulate different textures or draw shapes on your hand to better represent what you're touching or doing (imagine a tap-tap for a button actuation and release). It's never going to produce force though. That's just physics. I think you're stuck with gloves/suits for that kind of experience.

What's cool to me is how much your brain does to match up a vibration with what you're seeing and expecting to feel. For example, a random pattern of vibrations on the palm running up the fingers, plus the animation of shooting lightning out of your hands, and your brain goes "yep, that feels like lightning". I guess what I'm saying is that sometimes it doesn't have to be a perfect simulation, in the same way that the haptics in your phone don't have to feel like a physical keyboard to tell you that your key press worked without you having to think about it. It just needs to be enough to tell your brain the thing you were trying to do worked, without the mental load of confirming it visually.
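To make the pinch example a bit more concrete, here's a rough sketch of the kind of logic involved. To be clear, this isn't our SDK code: the HandFrame structure, the haptics interface and the threshold values are all made up for illustration. The idea is just to detect the pinch from the thumb-to-index fingertip distance and fire a short haptic "tap" when it engages and when it releases.

```python
import math
from dataclasses import dataclass


@dataclass
class HandFrame:
    """One frame of (hypothetical) hand-tracking data: just the two
    fingertip positions we need, in millimetres."""
    thumb_tip: tuple
    index_tip: tuple


class PrintHaptics:
    """Stand-in for whatever haptic output you have (ultrasound array,
    controller rumble, ...) - here it just logs the pulse."""
    def pulse(self, strength, duration_ms):
        print(f"haptic pulse: strength={strength}, duration={duration_ms}ms")


PINCH_ON_MM = 20.0    # engage when fingertips are closer than this
PINCH_OFF_MM = 35.0   # release threshold is larger -> hysteresis, no flicker


class PinchDetector:
    def __init__(self, haptics):
        self.haptics = haptics
        self.pinching = False

    def update(self, frame: HandFrame) -> bool:
        d = math.dist(frame.thumb_tip, frame.index_tip)
        if not self.pinching and d < PINCH_ON_MM:
            self.pinching = True
            # crisp "tap" on actuation, like a key bottoming out
            self.haptics.pulse(strength=1.0, duration_ms=20)
        elif self.pinching and d > PINCH_OFF_MM:
            self.pinching = False
            # softer "tap" on release
            self.haptics.pulse(strength=0.5, duration_ms=15)
        return self.pinching


# Example: a hand closing into a pinch and opening again
detector = PinchDetector(PrintHaptics())
for dist_mm in (60, 40, 25, 15, 18, 30, 45):
    frame = HandFrame(thumb_tip=(0.0, 0.0, 0.0), index_tip=(dist_mm, 0.0, 0.0))
    detector.update(frame)
```

The two thresholds are the important bit: using a larger release distance than the engage distance (hysteresis) stops the pinch state flickering at the boundary, which matters because the confirmation tap fires on the transition.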
 
Thanks everyone, and especially @RoyMi6, those are some really valuable insights which I'll definitely be passing on. I started this thread mainly out of curiosity to gauge opinion, but it's actually been really helpful as well.
 
Is this what Mike (the VR podcast guy) was talking about on his show a few months ago?

Everything he described at the time sounded extremely interesting, and the way you're describing it sounds fairly similar. Either way, it's awesome that people are working toward these kinds of goals; VR has such a bright future.
I can't say I'm familiar with Mike the VR podcast guy, but if you have a link I'd love to check it out. We had our lead UX researcher on Radio 4 the other week so maybe some other people picked it up as well?

I got to try this demo before they packed it up for CES (apologies for the elevator music), and I have to say I was pretty stoked. I want that to be the future. I also found it pretty funny to watch visitors doing it, because you can tell when they get to the "relaxation" bit: they're all swirling the stars around for ages lol. This particular one had haptics as well, which makes a surprising difference to the accuracy rate of gestures. Similar to the difference between Cherry MX Brown and Cherry MX Red switches in a keyboard, it just kinda helps you know it worked instead of wondering whether your half-keypress registered or not.



For AR, this concept has me super excited as well. I think this interface could totally work.

 