With augmented reality (AR) and virtual reality (VR) technology being used more in parallel with robotics, it is not surprising that the relationship between robot workers and their human counterparts is improving. In a paper entitled “Improving Collocated Robot Teleoperation With Augmented Reality”, roboticists from the University of Colorado Boulder explore how AR technology can help the working relationship between robots and humans grow through communication.
The paper, which went on to win one of the Best Paper Awards at HRI 2018, explores the challenging task of robot teleoperation, which typically requires a great deal of training. With a focus on AR technology, it looks into how advances in the field are creating new design spaces for mediating robot teleoperation by enabling novel forms of intuitive, visual feedback. The paper presents several aerial robot teleoperation interfaces using AR and evaluates the performance and benefits of each one.
“We don’t think the lack of past AR-HRI work is due to lack of interest (in fact there have been a few papers), but rather due to past barriers imposed by technology and hardware limitations,” said Dan Szafir from the University of Colorado, speaking with IEEE Spectrum. “Even only a few years ago, a lot of AR and VR work relied on custom hardware made in the lab, meaning you had to have expertise in many different specialties (optics, mapping and localization, ergonomics, design, graphics, etc.) just to get started. The refocus on AR and VR from industry (Microsoft, Google, Facebook, and all the startups) and subsequent release of new headsets and related technologies (trackers, controllers, etc.) is making it much easier for researchers in fields not traditionally linked with AR and VR to get a foot in the door and start exploring how these technologies might be of use.”
Szafir goes on to explain that other means of communication between robots and users are by no means ineffective. The team’s research explored many different methods to find which would be the most beneficial, and AR provided a unique opportunity to maintain the user’s attention while providing the needed information. This means the user never needs to take their gaze off the robot or drone to look at a second screen, as all the information is displayed within their line of sight.
“AR lets users see exactly where a robot will move just by looking at their surrounding environment; there is no need to translate a location from a map overlay to the real-world environment, which might incur an additional cognitive burden, as that work has already been done for them.”
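The core mechanism behind this kind of overlay is projecting a robot’s planned 3D position into the wearer’s current view so a marker can be drawn directly over the real scene. The paper does not publish its rendering code, but as a rough illustration, the sketch below projects a hypothetical drone waypoint into 2D display coordinates using a simple pinhole camera model; the function name, pose format, and camera intrinsics (`fx`, `fy`, `cx`, `cy`) are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def project_to_view(point_world, cam_pose, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Project a 3D world-frame point into 2D display coordinates
    using a simple pinhole camera model (illustrative only)."""
    R, t = cam_pose                # world -> camera rotation (3x3) and translation (3,)
    p_cam = R @ point_world + t    # point expressed in the camera/headset frame
    if p_cam[2] <= 0:              # behind the viewer: nothing to draw
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return (u, v)

# Example: a waypoint 2 m in front of the user, slightly right and above center.
identity_pose = (np.eye(3), np.zeros(3))
pixel = project_to_view(np.array([0.4, -0.2, 2.0]), identity_pose)
print(pixel)  # (420.0, 190.0)
```

Because the headset tracks its own pose, this projection is recomputed every frame, which is why the overlay appears anchored to the real environment rather than to a separate map.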
The team are continuing their research into AR-mediated communication and exploring ways to apply it within working environments. Though there is still a long way to go, the early research seems promising, and there is a short video demonstration below that showcases what the team have been up to so far. For more on this story in the future, keep reading VRFocus.