Emotion Sensing In VR: Getting Social in VR

Emteq's Charles Nduka returns to discuss social and emotional interaction within virtual reality.

Evidence for the human need to share experiences stretches back to the earliest cave paintings. Scenes from real life or the artist’s imagination were recreated and displayed for others to share. Social interactions have accompanied almost all communication platforms: reading and writing facilitated theatre and the formal play, early movies gave rise to the cinema industry, and radio and television broadcasts saw families and friends huddled around a single device for sports and entertainment. This article explores how innovators are making Virtual Reality (VR) a social experience.

Social interaction within VR can be distilled into three core elements – speech, movement, and emotional expression. Speech is easily captured and communicated using a microphone and VoIP. All VR devices capture head movement, and many capture arm/hand movement too. A surprising amount of non-verbal communication can be inferred from these movements, particularly gestures and gesticulations. But whilst elements of body language can be communicated in VR, the communication of emotional expression is lacking. As a workaround, some VR social apps rely on user-triggered emoticons and arm-movement tracking to imply feelings and reactions. We’ve seen some interesting demos from Oculus and others using cameras to capture mouth movements, and eye tracking will improve face-to-face interaction, but eye tracking by itself is not sufficient.
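To make the idea of inferring non-verbal cues from tracking data concrete, here is a minimal sketch (not any shipping product's code) of how a head-nod gesture might be recognised from a stream of head pitch angles, as reported by an HMD. The threshold and window size are illustrative guesses.

```python
from collections import deque

class NodDetector:
    """Toy sketch: infer a head-nod gesture from a stream of head
    pitch angles (degrees), as a tracked HMD might report them."""

    def __init__(self, threshold_deg=10.0, window=20):
        self.threshold = threshold_deg
        self.history = deque(maxlen=window)

    def update(self, pitch_deg):
        """Feed one pitch sample; return True when a nod (a downward
        dip followed by a recovery, both exceeding the threshold)
        is seen within the recent window."""
        self.history.append(pitch_deg)
        if len(self.history) < 3:
            return False
        lowest = min(self.history)
        dipped = (self.history[0] - lowest) > self.threshold
        recovered = (self.history[-1] - lowest) > self.threshold
        return dipped and recovered
```

A real system would also debounce repeated detections and distinguish nods from shakes, but even this crude rule shows why head tracking alone already carries social signal.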

Whilst we await the release of emotionally expressive VR, there are still a number of companies creating social platforms. One of the largest is AltspaceVR, founded by former SpaceX engineer Eric Romo. AltspaceVR is freemium software that supports high- and low-end VR headsets, as well as a 2D experience on computers and mobile. It allows users to chat, watch videos, and join a range of special events, from NBC News Q&A sessions to live music. Like many early social VR spaces, it’s similar to a VR-based Second Life – built less around sophisticated communication, and more around sharing experiences.

AltspaceVR focuses on simplicity and shared experiences.

For emotional interaction, Altspace focuses largely on voice and physical movement. As platform-agnostic software, it features many ways to communicate physical movement for social interaction – everything from simple controller-based movement, through to full-body motion tracking with Microsoft’s Kinect. However, this approach limits the sophistication of social interaction between platforms – users won’t often have equally elaborate set-ups, and so some modes of interaction might not be reciprocated. In terms of emotional expression, Altspace supports a range of emoticons, largely selected by the user through a menu. It also supports eye tracking but, again, this depends on the underlying VR platform supporting it. The main focus appears to be on connecting with friends and sharing experiences like live events or streamed video in a VR setting, which it does very effectively.

In real life, gaming is naturally a social experience, so it’s inevitable that social spaces are being built that enable playing together. The gaming community has always been quick to embrace new technology that allows them to share play time in new ways. As such, many social applications for VR are heavily game-based, offering up a variety of minigames and tools for users. For example, Sports Bar VR offers competitive pool, darts, and skeeball, Anyland invites users to add and tinker with anything (really, pretty much anything) to their avatars or environment, and Rec Room has online multiplayer paintball, dodgeball, disc golf, charades, and more. These games have simple avatars, often cartoony and without arms, but all players can communicate through voice, movement, emoticons, and hand-gestures. In Rec Room, a fist bump results in an explosion of light – physical interaction is used to perform actions, and suddenly you’ve formed a private party to go play paintball.
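Detecting an action like a fist bump comes down to the two players' tracked hand positions. The sketch below is a hypothetical rule of thumb – not Rec Room's actual implementation – requiring the hands to end up within a contact radius while still closing at some minimum speed; all of the constants are illustrative guesses.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points (metres)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_fist_bump(my_hand, their_hand,
                 contact_radius=0.08, min_closing_speed=0.5, dt=1 / 60):
    """Toy sketch: given two short per-frame position tracks (lists of
    (x, y, z) tuples in metres), report a fist bump when the latest
    samples are within contact_radius and the hands were closing faster
    than min_closing_speed (m/s) over the last frame."""
    if len(my_hand) < 2 or len(their_hand) < 2:
        return False
    d_prev = distance(my_hand[-2], their_hand[-2])
    d_now = distance(my_hand[-1], their_hand[-1])
    closing_speed = (d_prev - d_now) / dt
    return d_now < contact_radius and closing_speed > min_closing_speed
```

The closing-speed check matters: without it, two hands idly resting near each other would trigger the gesture.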

Rec Room’s use of the game charades is great for showing the capacity for fun brought with physical interaction in the digital world; getting someone halfway across the world to correctly guess that you’re acting out the movie Jaws in your office is a strange but compelling pastime. VR gaming social spaces focus on the fun of physically interacting and exploring the world and other users around you, and anything they miss in the subtleties of communication is often compensated for with absurdity and silliness from fellow players.

Gaming spaces like Rec Room revel in communicating through exaggerated avatars and situations.

In April Facebook finally launched its own foray into social VR with Facebook Spaces. In Spaces, users are represented by a cartoon avatar, with customised hair, face, and clothing. Spaces integrates Facebook services heavily – users can share photos and videos, take their own inside the space (to share on Facebook, of course), play simple games, or call non-Spaces users through Messenger.

Interaction in Facebook Spaces is simple, but effective.

Facebook Spaces is part of a third subset of social VR applications – one step beyond sandboxes like AltspaceVR that focus on sharing content, Spaces is a polished experience built around all aspects of communication. Spaces is sophisticated and modern, and pays a great deal of attention to conveying authentic interaction. The Oculus Rift headset’s tracking communicates head, arm, and hand gestures to others in the social space reliably and universally. Facebook also invested time in making human-like avatars. Development lead Mike Booth stated, “Facebook is about authentic identity, which is fundamentally about humans”, and this ethos is carried through to Spaces’ characters, who are stylised, but also authentically human and full of emotional range.

Facebook Spaces’ avatars, though stylised, look and feel human in their actions.

What makes Facebook Spaces interesting is the focus on communicating the emotional aspects of conversation. Like many similar applications, avatars’ mouths move in time with microphone output. In addition, the eye positioning of all users is interpreted, creating “eye contact” with others. Given that eye contact is a key form of nonverbal communication, this is a very important development. Spaces also integrates a wide range of emoticons, triggered by movement and by buttons on the Oculus Touch controllers. Movement-based emoticons enable some spontaneity in the conversation, but, as Booth says, “You have to invoke them. They’re not supposed to be accidental.” Having to deliberately remember to respond in a certain way makes emotional communication feel one step removed. Nevertheless, interaction-focused social spaces in VR are making big steps towards providing authentic human communication.
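Driving an avatar's mouth from microphone output is, at its simplest, a loudness mapping. The sketch below (an assumption about the general technique, not Spaces' actual code) maps a frame of audio samples to a mouth-open value via RMS level; the noise floor and gain constants are illustrative guesses.

```python
import math

def mouth_openness(samples, noise_floor=0.01, gain=8.0):
    """Toy sketch: map one frame of microphone samples (floats in
    [-1, 1]) to an avatar mouth-open value in [0, 1].
    Uses RMS loudness, subtracts a noise floor so background hiss
    doesn't flap the mouth, then scales and clamps the result."""
    if not samples:
        return 0.0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    level = max(0.0, rms - noise_floor) * gain
    return min(1.0, level)
```

More convincing lip sync analyses the audio for phoneme shapes (visemes) rather than just loudness, but amplitude mapping is enough for the "mouth moves when you talk" effect the article describes.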

Bigscreen VR has an interesting approach. Here, the social element largely revolves around sharing 2D content within VR. Users can share their work, games, or entertainment content by allowing others to view their PC screen. Lip sync and inferred gaze tracking add to the interactivity of the cartoony avatars. According to CEO Darshan Shankar, engagement levels have been impressive, and to show its commitment to this new way of collaborating, the company holds its meetings in VR.

Most VR platforms can be divided into these three subsets – sharing experiences, gaming, or authentically communicating. In the fledgling VR industry, developers largely haven’t yet looked to tackle more than two of these at a time. Sharing experiences and gaming in VR are natural fits that have seen massive growth, but authentic communication in VR is still difficult to implement successfully. While almost all platforms support good interaction through speech and movement, emotional expression is still largely based on emoticons that have to be purposefully triggered by users.

At Emteq, we are working to deliver a virtual reality experience that can interpret and respond to a user’s emotional state. Our Faceteq™ technology allows user avatars to react in conjunction with the user’s own facial expressions – essential to truly authentic communication. Our expression recognition solution will integrate with common headsets and capture the wearer’s expressions accurately. We believe this affective computing is the key to authentic VR and AR social interaction, and will open up new avenues in digital social spaces. If you’re interested in learning more, do get in touch.
