Hands-on: Oculus Quest hand tracking looks cool, and sometimes even works

18/12/2019

When a company unexpectedly adds a major new feature to a device after you’ve purchased it, and downplays that feature as a “bonus” that isn’t core to the product’s functionality, it’s tempting to accept it, flaws and all, as better than not having it at all. That thought has been on my mind as I’ve tested Oculus Quest’s new hand tracking feature over the past few days: I’m thrilled that the feature is here, and I fully understand its potential, but I’m not actually finding it useful for anything. Yet.

Brief recap: Facebook rolled out the feature last week via its frustrating software update mechanism, which requires people to leave their Quests on and keep checking back for days after the firmware first hits Oculus servers. Once it’s installed, you need to dig into a menu to activate hand tracking as an Experimental Feature, then manually select “use hands” in settings every time you want an alternative to the included Oculus Touch controllers.

Perhaps the single best part of the hand tracking feature is how it looks. Instead of representing your hands as skeletal lines and dots, Oculus uses shadowy gloves that so precisely mimic your actual movements that they might as well be real. Since Quest already nailed nearly photorealistic 3D controller representations, it’s not totally surprising that it did such justice to (currently genderless) human hands, but it’s amazing to see them move fluidly in space and respond with such low latency to pinches, pointing, and other gestures.

The key problem — and one I suspect will be solved at some point in the future — is that virtual finger gestures aren’t yet a reliable alternative to using physical controllers. You might expect that if your hands were inserted into a virtual space, such as Oculus’ Home interface, you could just touch or tap on anything and have it respond to your fingertips. As of now, that’s not how it works. Your fingers instead vaguely control a floating pointer that can be used to select things, like a Quest controller’s line-shaped selection tool, minus the line. You’re supposed to pinch your thumb and index finger to confirm a selection.

Above: A demo of the pinching gesture from Oculus Connect 6. (Image credit: Facebook)

On average, I’ve found myself “pinching” to select something three or more times before it actually registers. I’ll also occasionally hear a series of pinch confirmation sounds even when I’m not trying to select anything. If I watch what the hand recognition system shows while I’m pinching, it seems to sometimes confuse another finger with my index finger, and sometimes simply not acknowledge the pinch gesture at all. This seems to happen regardless of where I’m using Quest, but it’s possible the cameras would perform better against a more neutral background.
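
To make the interaction concrete, here is a minimal sketch, in plain Python rather than any real Oculus API, of how pinch-to-select detection is commonly built: compare the distance between the tracked thumb and index fingertips against a threshold, and use hysteresis plus a short hold time to filter out the kind of spurious or missed confirmations described above. The names and thresholds are assumptions for illustration, not Facebook’s implementation.

    from dataclasses import dataclass
    import math

    # Hypothetical per-frame hand sample: 3D fingertip positions in meters.
    # This is NOT the Oculus SDK, just an illustration of the pinch-to-select idea.
    @dataclass
    class HandFrame:
        thumb_tip: tuple  # (x, y, z)
        index_tip: tuple  # (x, y, z)

    class PinchDetector:
        # Hysteresis (separate press/release distances) plus a short hold time
        # is one common way to suppress spurious or missed confirmations.
        def __init__(self, press_dist=0.02, release_dist=0.035, hold_frames=3):
            self.press_dist = press_dist      # pinch begins below ~2 cm
            self.release_dist = release_dist  # pinch ends above ~3.5 cm
            self.hold_frames = hold_frames    # frames the pinch must persist
            self.pinched = False
            self.count = 0

        def update(self, frame):
            # Returns True only on the frame a selection should be confirmed.
            dist = math.dist(frame.thumb_tip, frame.index_tip)
            if not self.pinched:
                self.count = self.count + 1 if dist < self.press_dist else 0
                if self.count >= self.hold_frames:
                    self.pinched = True
                    return True
            elif dist > self.release_dist:
                self.pinched = False
                self.count = 0
            return False

Feeding one HandFrame per tracking update into PinchDetector.update() yields a single confirmation per pinch; tightening or loosening the two thresholds trades missed pinches against phantom ones.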

Facebook rolled out the feature in two stages. Users gained access through the version 12 software update last week, and third-party developers are officially getting SDK support this week. While hand tracking is presently limited to the Quest’s own interface, developers are already itching to release updated apps with preliminary hand support, which if properly implemented could be very impressive. For now, trying to load any app without hand tracking support will force you to switch back to controller input.
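
As for that fallback to controllers, a hedged sketch of the app-side logic might look like the following. The function and flag names are hypothetical, not part of the Oculus SDK; the point is simply that an app prefers hands only when it supports them and tracking is currently live, and otherwise drops back to Touch controllers.

    from enum import Enum, auto

    class InputSource(Enum):
        HANDS = auto()
        CONTROLLERS = auto()

    def choose_input_source(app_supports_hands, hands_tracked, controllers_connected):
        # Prefer hands when the app supports them and tracking is live;
        # otherwise fall back to the Touch controllers.
        if app_supports_hands and hands_tracked:
            return InputSource.HANDS
        if controllers_connected:
            return InputSource.CONTROLLERS
        # Neither source usable right now: stay on hands and wait for tracking.
        return InputSource.HANDS

    # A legacy app with no hand support always lands on controllers.
    assert choose_input_source(False, True, True) is InputSource.CONTROLLERS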

Being pushed back and forth between hands and controllers led me to an unexpected conclusion: The haptics in Quest’s controllers may give them a long-term advantage over direct hand control. Holding a controller lets you feel clicks, vibrations, and other sensory cues, and their absence quickly becomes noticeable when your hands are floating around in genuinely empty 3D space. Moreover, while Quest’s hand tracking system is remarkably capable of recognizing multi-finger positional data at this stage, the sensors in Quest’s controllers have been finely tuned for precision input. That makes the controllers easier to like, for now.

Given how Quest’s setup process currently works, and the direction Facebook has been going with its Oculus Quest Safety Video and recent updates to the Guardian system, I can very easily imagine a day when controllers become an afterthought rather than an integral part of Oculus onboarding. Open the box, put the headset on, turn on the power, and use your hands to move through the setup menus for everything; that clearly seems like the future of Quest (and VR headsets in general).

If and when that happens, the usage paradigm for VR will be even simpler than it is today: Turn it on, and you’re ready to interact in VR without the need to fumble for controllers. This will be ideal for social applications, where you’ll be able to wave to or high-five friends, and retail, where you’ll be able to point at a button to change the way a virtual car or sweater looks as you’re inspecting it.

But we’re not quite there yet, nor is the impact on games — Quest’s big selling point — totally clear. Anything requiring twitch-level precision isn’t going to work as reliably with hands as with a controller, and whether you’re swinging a sword or pointing a gun, you’re going to find that Oculus Touch is a better solution than using your forearm or a thumb-triggered index finger to fight off enemies.

Facebook and its developers may well change that over the next year. Regardless of how it performs today, I’m really looking forward to seeing where the hand tracking feature goes in 2020, and glad to have a chance to start playing with it now.
