Human-Robot Interaction at Chalmers University
At Chalmers University in Gothenburg, Sweden, interaction design researchers in the Computer Science department use Qualisys motion capture as they apply the methods and philosophy of design within computer science and run research-through-design projects on social drones. Social drones, a term coined by the researchers themselves, refers to autonomous drones flying in environments shared with humans.
To break it down a bit more: across the space of everything you can do with a drone, the use cases fall into two distinct categories. There is piloted flight, in which a pilot controls the UAV with their hands on the joysticks, and there is autonomous flight driven by sensors and algorithms.
What the Chalmers researchers are specifically curious about is the people and environments that interact with the drone in some way, a field known as human-drone interaction. Scholars in this field work in areas such as sports, using drones as training devices or to enhance the spectator experience; others use drones to bring physical touch into virtual reality or to project digital information. These concepts all fall under human-robot interaction, within the broader context of human-computer interaction and computer science.
Their approach is to take philosophies and practices known to cultivate positive, healthy, benevolent qualities in ourselves, and to design experiences with digital and robotic materials that embody the essence of those practices. The Qualisys motion capture system captures the motion of both humans and drones, either together or separately, as the group builds these experiences.
The second example is Wisp, a prototype the Chalmers computer science researchers are developing for breathing exercises: wearable sensors detect the user's breathing, and the drone moves with the breath in real time. Led by PhD student Mafalda Gamboa, the project investigates how a drone-based interactive experience might motivate people of all ages to take up meditation and breathwork, and how a drone's movement can guide and enhance such exercises. In a truly interdisciplinary project, the Chalmers team has been performing engineering and design experiments to build a stable, enjoyable drone product, as well as participating in training sessions and experiences to better understand the practices and philosophies of meditation and breathwork.
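The article does not describe Wisp's signal pipeline, but the core idea, a breathing signal driving a drone's altitude in real time, can be sketched in a few lines. This is a minimal illustration, not the team's actual implementation: the sensor signal, sampling rate, and mapping are all assumptions, and the smoothing filter is a generic exponential moving average.

```python
import math

def breath_to_altitude(signal, base_alt=1.0, amplitude=0.3, alpha=0.2):
    """Map a raw breathing-sensor signal (assumed normalized to [-1, 1])
    to a smoothed drone altitude setpoint in meters.

    An exponential moving average removes sensor jitter so the drone
    rises and falls gently with the breath rather than twitching."""
    setpoints = []
    filtered = 0.0
    for s in signal:
        filtered = alpha * s + (1 - alpha) * filtered  # EMA smoothing
        setpoints.append(base_alt + amplitude * filtered)
    return setpoints

# Simulated calm breath: a 0.25 Hz sine wave sampled at 20 Hz
samples = [math.sin(2 * math.pi * 0.25 * t / 20) for t in range(200)]
altitudes = breath_to_altitude(samples)
```

In a real system the setpoints would be streamed to the drone's flight controller; here they simply stay within a gentle band around the one-meter hover height.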
The group uses motion capture both to control and to analyze the drone's motion, as well as to capture human movement. Using the real-time and recording functions in tandem, the motion capture data becomes at once a material for building an interactive experience and a source of information for their research on how these interactions unfold, and how we might design better ones in the future. They also experiment with different positioning systems and AI-based algorithms, aiming to eventually turn these prototypes into products all of us can enjoy in our homes.
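To make the real-time control loop concrete, here is a minimal sketch of how a streamed motion-capture position could feed a position-hold controller. The function names, gains, and velocity clamp are illustrative assumptions, not part of the Qualisys SDK or the Chalmers codebase; a proportional controller is simply the simplest technique that fits the description.

```python
def hold_position(measured, target, kp=0.8, v_max=0.5):
    """Proportional controller: turn one motion-capture position sample
    (x, y, z in meters) into a velocity command (m/s) toward the target,
    clamped to v_max so the drone never lunges at the setpoint."""
    cmd = []
    for m, t in zip(measured, target):
        v = kp * (t - m)                      # proportional term per axis
        cmd.append(max(-v_max, min(v_max, v)))  # clamp for safety
    return tuple(cmd)

# Drone measured slightly off its hover point at (0, 0, 1.5)
vx, vy, vz = hold_position((0.2, -0.1, 1.4), (0.0, 0.0, 1.5))
```

In practice this function would be called once per mocap frame (Qualisys systems stream at hundreds of frames per second), with the same position samples simultaneously logged for the group's later analysis.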
"The Qualisys system gives us unmatched precision and versatility for developing indoor drone applications that we may find soon enough in our homes. We get excellent real-time performance for control algorithms, and extremely rich datasets for our human-centered research, at the same time."
– Mehmet Aydın Baytaş
We want to thank the team at Chalmers University Computer Science Department and the T2i Lab, for sharing their work, and for using their Qualisys system to its full potential!
Mehmet Aydın Baytaş and Mafalda Gamboa's research is supported by the Wallenberg AI, Autonomous Systems and Software Program – Humanities and Society (WASP-HS), as part of the project "The Rise of Social Drones: A Constructive Design Research Agenda" led by principal investigators Morten Fjeld and Sara Ljungblad.
If you are interested in having your lab featured as a Qualisys customer story, please submit a write-up using the form below to share your story with us.