The idea was to create digital guiding structures for controlling drones and other devices: a kind of virtual jig, anchored in the real world, that can be felt physically and remotely through a control device. This is what we call “haptic augmented reality”, and it lets you control a drone and perform maneuvers that would be virtually impossible when flying freehand.
For example, it is extremely difficult to manually keep a drone in a fixed position relative to a moving object. Simultaneously controlling the drone and a tool attached to it, such as a camera gimbal or a robot arm, is essentially impossible for a single person.
The research project demonstrated how a drone could be used to inspect a wind turbine blade while the blade is rotating: a virtual magnet is placed on the blade, and the pilot simply flies the drone towards it until it snaps on. While attached to the virtual magnet, the pilot can let go of the drone and focus on controlling the camera instead, while the drone follows the moving magnet. What makes this better than current techniques, which combine altimeters, downward-facing cameras and GPS, is that the virtual magnet can be attached to a moving object. The magnet could also be shaped, for example as a rail along which the drone can travel, and the virtual object could mimic many other things, shapes or materials.
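To make the mechanism concrete, here is a minimal sketch (plain C++, not the project's actual code) of the drone-side logic: each control tick, the motion capture system updates the magnet's anchor point on the tracked object, and the drone's position setpoint snaps to the anchor whenever the drone is inside a capture radius. The hard snap, the gains and the names are illustrative simplifications; a real implementation would blend smoothly and mirror the pull as force feedback.

```cpp
#include <cmath>
#include <cstdio>

// Self-contained sketch; radii and values below are made up for illustration.
struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    double norm() const { return std::sqrt(x * x + y * y + z * z); }
};

// One control tick: if the drone is within the magnet's capture radius, the
// position setpoint locks onto the (moving) anchor, which the motion capture
// system updates every frame; otherwise the pilot's commanded position is
// used unchanged. The position controller then does the work of following
// the moving anchor, freeing the pilot's hands for the camera.
Vec3 nextSetpoint(const Vec3& pilotCmd, const Vec3& dronePos,
                  const Vec3& magnetAnchor, double captureRadius = 0.3) {
    return ((magnetAnchor - dronePos).norm() < captureRadius) ? magnetAnchor
                                                              : pilotCmd;
}

int main() {
    Vec3 bladeAnchor{1.0, 2.0, 3.0};  // anchor on the rotating blade, from mocap
    Vec3 sp = nextSetpoint({0, 0, 1}, {1.1, 2.0, 3.1}, bladeAnchor);
    std::printf("setpoint: %.1f %.1f %.1f\n", sp.x, sp.y, sp.z);  // snapped on
    return 0;
}
```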
Haptics is a very interesting interaction design paradigm for augmented reality, as it taps into our fundamental experience of the physical world. The interaction is therefore understood with hardly any cognitive effort at all. It's also fun!

Technology
The prototype system was designed and developed in-house at Ericsson Research, building on a number of available open source components. The remote control and haptic feedback algorithms were developed using the Robot Operating System (ROS) and CHAI3D.
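As a rough illustration of how the haptic side fits together, the sketch below follows the typical CHAI3D pattern: open a device through the device handler, then run a high-rate servo loop that reads the device position, evaluates the virtual constraint, and writes a force back so the pilot feels it. The virtual-magnet anchor, radius and stiffness are assumptions for the example, not values from the project.

```cpp
#include "chai3d.h"

using namespace chai3d;

int main() {
    // Connect to the first available haptic device (e.g. a WoodenHaptics
    // build, given a suitable device driver).
    cHapticDeviceHandler handler;
    cGenericHapticDevicePtr device;
    if (!handler.getDevice(device, 0)) return 1;
    device->open();
    device->calibrate();

    cVector3d anchor(0.0, 0.0, 0.0);  // hypothetical magnet at workspace centre
    const double radius = 0.05;       // capture radius: 5 cm
    const double stiffness = 100.0;   // spring constant: N/m

    // Servo loop: real applications run this in a dedicated high-rate
    // (~1 kHz) thread until shutdown; here it runs a fixed number of ticks.
    for (int tick = 0; tick < 100000; ++tick) {
        cVector3d pos;
        device->getPosition(pos);     // device position in metres

        // Spring pull toward the anchor while inside the capture radius,
        // so the pilot literally feels the magnet grab hold.
        cVector3d toAnchor = anchor - pos;
        cVector3d force = (toAnchor.length() < radius) ? toAnchor * stiffness
                                                       : cVector3d(0, 0, 0);
        device->setForce(force);
    }

    device->close();
    return 0;
}
```

In the prototype, the same constraint that shapes this force would also shape the drone's position setpoint on the ROS side, so the device and the drone stay coupled through one shared virtual fixture.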
The drones we used were Crazyflie 2.0 quadcopters from Bitcraze, fitted with induction modules for charging. For positioning and tracking we used an Oqus 5+ motion capture system from Qualisys, who were also an outstanding technology partner for this project; without them the project would never have happened. Our haptic devices, a.k.a. joysticks, came from the open source project WoodenHaptics, developed by researchers at KTH and Stanford University.

A case for 5G
The whole system and prototype is a lab-only showcase. It doesn't work in the real world since the technology simply doesn't exist. Yet. For us, this makes it a good challenge for developing future technology: the combination of simultaneous haptic remote control, augmented reality with positioning and tracking, and streaming of ultra-high-definition multimedia adds up to some pretty hefty requirements on the network.
In addition, the drones are battery-driven and need their batteries for the motors, so it's not optimal if they also have to run heavy on-board computing. Hence it makes sense to run the entire operating software in the cloud, which is what we did in the prototype. This makes crazy-quick and reliable connectivity even more crucial.
The text is an adapted version of “Telehaptic Drone Control (Feel the Force)”, previously published on the Ericsson Strategic Design blog. Used with permission. Photos by Ericsson Strategic Design on Flickr, available under a Creative Commons BY-SA 2.0 license.
Joakim Formo