Hands-free tactical drone controlled in AR

The Problem

Drones have become an integral part of a typical military tactical team’s equipment, helping with surveillance, image capture, and mission success. However, the pilot’s lack of situational awareness is alarming, and in some situations can even be deadly. This is because drones require pilots to work heads-down, navigating hands-on with a clunky controller and monitor.

The Solution

We proposed a system that lets the pilot both fly the drone and maintain situational awareness. Using AR, the application allows users to set navigation waypoints for the drone, monitor drone status, and view drone footage, all while remaining hands-free and heads-up.

My Process

User Flows and Wireframes

I started by creating a user flow and then wireframes for the system. The wireframes helped uncover AR-specific interactions. For example, one of the main actions was confirming the start of a new drone route. I identified that this needed to be a very intentional action, so instead of providing a tappable button, the interaction required the user to drag and slide the button.
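The intent behind that drag-and-slide confirmation can be sketched as a small state machine. This is an illustrative model only (the class name, track length, and threshold are assumptions, not the shipped implementation): a stray tap moves nothing, and releasing early snaps the handle back, so only a deliberate full drag confirms.

```python
class SlideToConfirm:
    """Hypothetical drag-to-confirm control: confirms only after a full drag."""

    def __init__(self, track_length: float = 1.0, threshold: float = 0.9):
        self.track_length = track_length
        self.threshold = threshold  # fraction of the track that counts as "confirmed"
        self.position = 0.0
        self.confirmed = False

    def drag(self, delta: float) -> None:
        # Clamp the handle to the track; small accidental movements do not confirm.
        self.position = min(max(self.position + delta, 0.0), self.track_length)
        if self.position >= self.threshold * self.track_length:
            self.confirmed = True

    def release(self) -> None:
        # Releasing before the threshold snaps the handle back, canceling the action.
        if not self.confirmed:
            self.position = 0.0
```

A half-drag followed by a release leaves the control unconfirmed at position zero, while a full drag locks in the confirmation.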

User Observation

For user observations, we were able to observe and participate in the typical drone training our end users receive. Before the visit, I worked with the PM to create a checklist of goals and questions. The visit accomplished the following goals:

  • Confirm that we provide the critical information users need for decision-making and monitoring of the system.
  • Learn how they currently train and what the common pitfalls are.
  • Determine whether they are expected to complete the kinds of actions we had conceptualized.
  • Get initial feedback on the UI wireframes.

At the training site, we were split into groups, and I learned how to deploy a drone alongside several end users. From this, we identified potential setbacks and concerns the users had. For example, during night training, users commented that the system would be more difficult to use because of NVG (night vision goggle) headgear. We had to recognize that this hardware limitation (using a HoloLens) could deter our end users from using our product at night.

Night Vision Goggle Drone Catching

“He’s used it, and he’s used it a lot. But you throw someone else in an Iron Man suit and it’s like, ‘What is this? I can’t even function correctly.’”

– Sgt. Roberts, Drone Instructor, on using the HoloLens

High-Fidelity Design

With the new information and feedback from our users and SMEs, I updated the designs and interactions.

One example was the interaction to open the menu. The HoloLens 1 offers little beyond the “tap” and “bloom” gestures, and bloom is reserved by the HoloLens itself to bring up the system menu. These limitations actually helped me find a creative solution! After sketching and ideating, I came up with the idea of a “long tap” to bring up the SenseAR menu. Because SenseAR did not have a consistent world-space UI, this was the best way to summon a menu in that space.

After testing this idea, we realized that while the interaction worked, it lacked visual feedback. In my second iteration, I added a circular progress bar that appeared as soon as the user made the tap gesture. This let the user know the system had recognized the gesture and showed how much longer they needed to hold it before the menu appeared. This feedback really helped to solidify the interaction.
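The hold-to-open behavior with its progress ring can be modeled as a per-frame timer. This is a sketch, not the shipped SenseAR code; the class name and the 1.5-second hold duration are assumptions. Each frame, the timer advances while the tap is held and returns a 0.0–1.0 progress value that a circular progress bar could render, giving the user immediate feedback that the gesture was recognized.

```python
class LongTapMenu:
    """Hypothetical hold-to-open menu timer with progress feedback."""

    HOLD_SECONDS = 1.5  # assumed hold duration before the menu opens

    def __init__(self):
        self.held_for = 0.0
        self.menu_open = False

    def update(self, dt: float, tap_held: bool) -> float:
        """Advance the timer each frame; return progress (0.0–1.0) for the ring."""
        if tap_held and not self.menu_open:
            self.held_for += dt
            if self.held_for >= self.HOLD_SECONDS:
                self.menu_open = True
        elif not tap_held:
            self.held_for = 0.0  # releasing early resets the ring
        return min(self.held_for / self.HOLD_SECONDS, 1.0)
```

Holding through the full duration opens the menu; releasing partway resets the progress ring to zero, which is exactly the feedback that told users how much longer to hold.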

What We Delivered

In the end, we delivered a hands-free HoloLens system that allowed military drone pilots to set routes and then monitor the status of their drone and its payload from the headset.

This has the potential to give drone pilots a safer environment: they can keep their vision focused on their surroundings (instead of heads-down at a drone controller and monitor) and keep a weapon in their hands. It also supports pre-planned routes using existing military symbology, unlike the current system, which requires the pilot to fly in real time.