Virtual wearables: The future of human-computer interaction

By Gopi Kuppuraj

Leap Motion, a startup based in San Francisco, has been developing hand-tracking software for virtual and augmented reality devices, and other companies are working on similar products. In the near future, such devices could work without handheld controllers or other physical input hardware.

Virtual reality (VR) and augmented reality (AR):

VR is a three-dimensional, computer-generated environment that a person can explore and interact with. Put simply, VR lets you experience situations through a computer that don’t really exist, such as living in an alternate world, like Mars, or reenacting a historical battle. To be considered VR, an experience must meet the following criteria:

  • Believable: It feels as though you are living in the virtual world.
  • Interactive: As you move around, the VR environment needs to move with you.
  • Explorable: A VR world needs to be large and detailed enough to explore.
  • Immersive: To be both believable and interactive, VR needs to engage both body and mind.

AR, on the other hand, is an overlay of virtual content on the real world through a computer program or interface. Through AR, one can augment the real-world environment with virtual content, such as information from the web. For example, pointing your smartphone at the Eiffel Tower might automatically “pop up” information about the landmark and surrounding areas.

Limitations:

Most VR and AR platforms are limited by current technology in two main ways: they rely on handheld controllers rather than natural hand interaction, and they restrict the user’s field of view.

For example, Oculus Rift, a VR device, comes with a dashboard and buttons that you push to pick up an object. This feels less natural than actually grabbing or holding an object with your own hands. In addition, some products are visually restrictive: Microsoft’s HoloLens 2, the most advanced AR product currently on shelves, has only a 70-degree field of view, whereas the human eye has a 114-degree field of view with depth perception.

Hands off:

Over millions of years, humans have evolved to interact with objects in the real world using our hands. Other interactive methods, like sound and voice, are used daily to communicate, but our hands are fundamental to any physical interactive experience. Leap Motion’s AR platform, dubbed the North Star, enables similarly natural interactions by tracking our hand movements.

Tap, another AR startup, designed a strap that sits at the base of the user’s fingers and senses gestures and finger taps to provide text input and control. An optical mouse sensor attached to the thumb guides the cursor and game controls, letting the fingers act as the controls by communicating with Bluetooth-connected devices. Users can tap out corrections, messages, and more with just one hand and on any surface. A physical connection to the device is no longer required, and neither is being able to see or feel a keyboard.
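Conceptually, each combination of fingers that taps the surface is decoded into a character and sent to the paired device. Below is a minimal Python sketch of that idea; the chord-to-letter mappings and function names are illustrative assumptions, not Tap’s actual alphabet or Bluetooth protocol.

```python
# Illustrative sketch only: a simplified chord-to-character decoder. The
# mappings below are assumptions, not Tap's real alphabet, and a real device
# would send its output over Bluetooth rather than decode it locally.
CHORD_TO_CHAR = {
    (1, 0, 0, 0, 0): "a",   # thumb only
    (0, 1, 0, 0, 0): "e",   # index only
    (0, 0, 1, 0, 0): "i",   # middle only
    (0, 0, 0, 1, 0): "o",   # ring only
    (0, 0, 0, 0, 1): "u",   # pinky only
    (1, 1, 0, 0, 0): "n",   # thumb + index together
    (0, 1, 1, 0, 0): "t",   # index + middle together
}

def decode_taps(chords):
    """Turn a sequence of finger-tap chords into text."""
    return "".join(CHORD_TO_CHAR.get(tuple(chord), "?") for chord in chords)

# Tapping thumb, then index, then thumb + index would type "aen".
print(decode_taps([(1, 0, 0, 0, 0), (0, 1, 0, 0, 0), (1, 1, 0, 0, 0)]))
```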

These immersive experiences allow users to manipulate objects as they would in real life. User testing showed that people of all ages could interact with virtual representations of familiar objects without any training. Moreover, when given something recognizable, such as a book or a ball, people achieved 99% accuracy on related tasks.

Virtual wearables:

The North Star platform also adds a virtual gadget around your hand or wrist that looks like a real display or interface. It is similar to a smartwatch, except that it exists in virtual space and can therefore change based on context. Instead of having to memorize many gestures to invoke different menus or buttons, users get a virtual wearable that looks and acts like a familiar interface: they can click, open, drag, turn, or swipe it, just as they would in real life.

Another similar platform, Magic Leap One, can pull up an internet browser in virtual space. It is optimized for content extraction and spatial browsing, enabling new ways to shop and explore 3D objects, much as we would on a laptop or smartphone. It also lets you open multiple screens and place companion content for work or entertainment into any physical space.

These platforms are customizable for various applications; that is, different wearables can have different functions. You could use one tailored wearable for painting, another for writing, another for taking pictures, and another for editing video. Each wearable’s interface is customized to the task at hand.
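To make the idea of task-specific wearables concrete, here is a minimal Python sketch of a context-sensitive layout registry; the task names, control labels, and function names are hypothetical and are not drawn from North Star’s actual interface.

```python
# Hypothetical sketch: the controls rendered around the user's wrist change
# with the current task. The layouts below are illustrative assumptions only.
WEARABLE_LAYOUTS = {
    "painting":      ["color picker", "brush size dial", "undo"],
    "writing":       ["font menu", "dictation toggle", "word count"],
    "photography":   ["shutter", "zoom dial", "gallery"],
    "video_editing": ["timeline scrubber", "cut", "export"],
}

def wearable_for(task):
    """Return the control layout to render on the virtual wrist wearable."""
    return WEARABLE_LAYOUTS.get(task, ["home", "settings"])  # fallback layout

# Switching from painting to photography swaps the wrist interface automatically.
for task in ("painting", "photography"):
    print(task, "->", wearable_for(task))
```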

What’s next?

The North Star interface could be the next step for smartphones and VR devices, as it lets users operate a device with nothing but a pair of hands in empty space. Other companies are not far behind in this space. Similar to Tap, Myo, developed by Thalmic Labs, is a gesture- and motion-control device that lets users control phones, computers, and appliances; it works by sensing changes in the muscles of the forearm. GestureTek is developing interactive visual projection display systems with gesture-controlled effects for advertising and games on interactive floors, walls, tables, windows, counters, bar tops, phones, and computers.
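As a rough illustration of how gesture-driven control can replace a screen, the Python sketch below routes recognized hand poses to device commands. The gesture names echo the kinds of poses a device like Myo reports, but this is not the Thalmic Labs SDK, and the command targets are assumptions.

```python
# Illustrative sketch: dispatching recognized hand gestures to device commands.
# The gesture-to-command table is an assumption for illustration only.
GESTURE_COMMANDS = {
    "fist":           ("media_player", "pause"),
    "wave_out":       ("media_player", "next_track"),
    "wave_in":        ("media_player", "previous_track"),
    "fingers_spread": ("smart_lights", "toggle"),
    "double_tap":     ("phone", "answer_call"),
}

def handle_gesture(gesture):
    """Send the mapped action to the mapped device, or ignore unknown poses."""
    target = GESTURE_COMMANDS.get(gesture)
    if target is None:
        return f"ignored: {gesture}"
    device, action = target
    return f"{device} <- {action}"

print(handle_gesture("fist"))        # media_player <- pause
print(handle_gesture("thumbs_up"))   # ignored: thumbs_up
```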

Integrated with smartphones, computers, laptops, and tablets, these 3D tracking and control platforms empower users to control all of their devices’ functions in the physical world without having to look at a screen. While the technology has shown great promise, these platforms are still at an early stage of the product life cycle. The startups developing these virtual platforms plan to work with VR/AR hardware manufacturers to launch such AR devices by 2030.


If you have any questions or would like to know if we can help you with your innovation challenge, please contact our High-Tech lead, Lee Warren, at lwarren@prescouter.com.
