Q&A with Thad Starner


Thad Starner is a wearable-computing pioneer. He founded the MIT Wearable Computing Project, is a professor in the School of Interactive Computing at the Georgia Institute of Technology, and is a technical lead on the Google Glass self-contained wearable computer. Thad has authored more than 150 peer-reviewed scientific publications with over 100 co-authors, covering mobile human-computer interaction (HCI), machine learning, energy harvesting for mobile devices, and gesture recognition. He is listed as an inventor on over 80 U.S. patents awarded or in process.

Where do you see people using mixed reality today?

Thad Starner: Hundreds of thousands of people already interact with mixed reality on a daily basis. The head-up displays in many cars, such as Camaros and BMWs, are a great example. The speedometer is overlaid on the world in front of you by bouncing its image off the windshield. Some cars also use this to warn of possible obstacles in the road or to show vehicle telemetry.

Many people use mixed reality on their cellphones in the form of handheld augmented reality. Google Maps overlays graphics on the world with the “you-are-here” spot, and it has other modes for traffic and terrain. There’s also a street-view mode where you can see what’s in front of you and align the street-view image with the real scene. Say the shop you want is on Level 4 of a 20-story skyscraper in front of you: you can use your phone to manually line up the street-view image with what you’re seeing and find it.

Mixed reality also can be exclusively audio. I have a bone-conduction headset on Glass that bypasses my eardrum and sends audio directly to my cochlea. This mixed-reality system is putting audio into my world, allowing me to hear messages with a simple nod of the head.

What are some of the enterprise or industry applications of mixed reality that most interest you?

Thad Starner: The military uses mixed reality to provide a remote expert for local needs. In a combat zone, if they have a patient who needs medical attention, they can remotely conference in a doctor or other expert. With a wearable camera, the remote expert can see the patient or situation just as the local caregiver does and give step-by-step instructions to the boots on the ground.

Task guidance is one of the big applications of mixed reality. Ubimax in Germany is using it to help with order picking in clients’ warehouses. Through the wearable device, employees are guided through the warehouse to find the right product. In most warehouses, someone is actually wandering through the stacks and picking up different items. This process is error-prone because products often carry 10-digit SKU (stock-keeping unit) numbers for inventorying, which employees must match against a piece of paper. With a mixed-reality system, the employee sees a cartoon overview of the aisles and is directed to the right shelf and bin. Using a head-worn display, you can eliminate errors and increase the speed of fulfillment.
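To make that flow concrete, here is a minimal, hypothetical Python sketch of a pick-by-vision loop: route the pick list through the aisles, show one pick at a time on the display, and confirm each pick by scan rather than by eyeballing a 10-digit SKU against paper. The names (`PickTask`, `route`, `confirm`) and the data model are illustrative assumptions, not Ubimax’s actual API.

```python
# Hypothetical sketch of a pick-by-vision workflow; not a real vendor API.
from dataclasses import dataclass

@dataclass
class PickTask:
    sku: str        # 10-digit stock-keeping unit number
    aisle: int
    shelf: int
    bin: int
    quantity: int

def route(tasks: list[PickTask]) -> list[PickTask]:
    """Order the picks to minimize walking: sweep aisles in order,
    then shelves and bins within each aisle."""
    return sorted(tasks, key=lambda t: (t.aisle, t.shelf, t.bin))

def display(task: PickTask) -> None:
    """Stand-in for the head-worn display: show where to go and what to take."""
    print(f"Aisle {task.aisle}, shelf {task.shelf}, bin {task.bin}: "
          f"pick {task.quantity} of SKU {task.sku}")

def confirm(task: PickTask, scanned: str) -> bool:
    """Confirm the pick with a barcode scan instead of a visual SKU match."""
    return scanned == task.sku

orders = [PickTask("4711000042", 3, 2, 7, 2), PickTask("1234567890", 1, 4, 1, 1)]
for task in route(orders):
    display(task)
    # In a real system the scanned code comes from a camera or barcode reader
    # on the wearable; here we simulate a correct scan.
    assert confirm(task, task.sku)
```

The error reduction comes from the confirmation step: a mismatch is caught at the bin, before the wrong item ever leaves the shelf.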

Firefighters also are using mixed-reality devices to help them fight fires. Using thermal scopes, they can see where the floor is hot and where there might be a dangerous situation underneath them or above them.

Mixed-reality devices are extremely useful in situations such as border security. Time and efficiency are crucial at border checkpoints. Being able to pull up people’s documentation on a head-worn display as they come through the line makes things faster and more efficient and, frankly, more friendly.

Of course, the medical field also is another important area. If you’ve been to the doctor recently, you’ll notice that a lot of doctors spend more time typing on their laptops than they do talking to the patient. The focus should be on the patient, but the physician is focused on documenting. In our research, we surveyed potential patients with doctors using just pen and paper, using a tablet or computer and using a head-worn display. We found that patients rated the doctors as giving better quality care when using the head-worn display. They saw the tablets and computers as a more interruptive technology. A head-worn device allowed the physicians to pull up the information they needed while maintaining a conversational rapport.

How do you see wearable devices evolving as usage continues to grow?

Thad Starner: When a lot of people talk about mixed reality, they are thinking about large field-of-view displays. We’ve found that people really don’t want these displays right in their line of sight. If you look at the head-up displays currently used in cars and airplanes, they’re always slightly off to the side. If you’re using mixed reality as guidance for a task, having the graphics right in your line of vision just gets in the way once you’ve seen what you need to do. So, one way we’re looking at making mixed-reality devices more reasonable for everyday use is by repositioning the display. The display might be above your normal line of sight or off to the side; when you need it, you just tilt your head to move the graphics into your line of sight. Once you get the instructions or the information you need, tilt your head back to see the world normally. This method allows you to get in and out of the overlaid graphics quickly.

If you’ve ever watched somebody with bifocals as they are trying to read, you’ll see them tilt their head up a little bit so they can get the text within the near-reading area of their bifocals. That’s how mixed reality is going to be used in everyday life—a virtual world that is accessible with a small tilt of the head.
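As a rough illustration, here is a minimal Python sketch of that head-tilt interaction, assuming a head-worn IMU that reports pitch in degrees. The thresholds and the hysteresis gap are illustrative assumptions, not how Glass actually implements this.

```python
# Hypothetical sketch of a "glanceable" overlay that lives above the normal
# line of sight and appears only when the wearer tilts the head up.

SHOW_ABOVE_DEG = 10.0   # tilt the head up past this pitch to reveal the overlay
HIDE_BELOW_DEG = 5.0    # drop back below this pitch to hide it; the gap between
                        # the two thresholds (hysteresis) prevents flicker when
                        # the head hovers near a single cutoff

class GlanceableOverlay:
    def __init__(self) -> None:
        self.visible = False

    def update(self, pitch_deg: float) -> bool:
        """Feed the current head pitch from the IMU; returns overlay visibility."""
        if not self.visible and pitch_deg > SHOW_ABOVE_DEG:
            self.visible = True     # head tilted up: bring graphics into view
        elif self.visible and pitch_deg < HIDE_BELOW_DEG:
            self.visible = False    # head back to level: see the world normally
        return self.visible

overlay = GlanceableOverlay()
for pitch in [0.0, 6.0, 12.0, 8.0, 4.0]:    # simulated IMU pitch samples
    print(pitch, overlay.update(pitch))      # turns on at 12.0, off again at 4.0
```

The same pattern mirrors the bifocals analogy: a small, deliberate head movement switches between the real world and the virtual layer.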

Can you talk about the role of IEEE in this space?

Thad Starner: Mixed reality will transform the human experience. It represents the biggest shift in human communication and collaboration of our modern era. IEEE members have been organizing the field with a conference on the subject since 1998.

Thad Starner: IEEE started the International Symposium on Wearable Computers, which spun off the International Workshop on Augmented Reality, which then became the International Symposium on Augmented Reality. That event joined with a mixed-reality conference in Japan, and they have now merged into ISMAR, the International Symposium on Mixed and Augmented Reality.

For more information on these emerging technology areas, please watch “Mixed Reality—The Future of Our World,” see The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems’ “Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems” and get involved with IEEE Future Directions and the IEEE Standards Association (IEEE SA). Also, please visit the IEEE SA at Augmented World Expo (AWE) 2018, 30 May to 1 June 2018, at the Santa Clara (California) Convention Center.
