Remote working has become routine in our society. An increasing number of people do the majority of their work from home, telecommuting via Skype or other software to speak to colleagues. Work done by MIT is expanding the range of jobs available to remote workers by using virtual reality (VR) to control robots.
There have been previous experiments involving VR and robots, with two main methods emerging. One is a ‘direct’ model, where the user’s vision is connected to the robot’s state; the other is a ‘cyber-physical’ model, where the user interacts with virtual versions of the usual physical controls. Both approaches have disadvantages: with the direct model, signal interruptions can cause the dreaded simulation sickness, while cyber-physical systems need specialised set-ups and large amounts of space to use.
MIT are pioneering a third approach, where the user effectively acts as a pilot inside the robot’s head and uses the Oculus Rift Touch controllers to direct the robot’s actions. Tests of the system so far have been positive, showing that tasks such as picking up screws or stacking blocks could be completed with a higher success rate than with other VR remote operating models.
The MIT system, developed at the university’s Computer Science and Artificial Intelligence Laboratory (CSAIL), is currently being assessed for compatibility with other robot and automation types, as well as being tested to see how it can be made scalable for multiple users. The CSAIL system has potential for allowing factory workers to work from home, and also for providing employment and educational opportunities for wheelchair users and people with limited mobility.
You can watch a video of the CSAIL system being tested below.
VRFocus will continue to report on new and innovative uses of VR technology.