Virtually violated: How Facebook is trying to fix abuse in social VR before it goes mainstream
Social VR has yet to go mainstream, but people have already had their personal space invaded in virtual reality, and Facebook is working on security tools to prevent it. The company described the first of those tools at its F8 developers conference on Wednesday.
For example, when one user avatar’s “safety bubble” is breached by another’s in a Facebook social VR app (such as Facebook Spaces or Oculus Venues), the two avatars become invisible to each other, explained Lindsay Young during Wednesday’s F8 keynote. “Pause” lets a user stop the action in a VR setting if they feel uncomfortable. A user can also “mute” another user’s avatar to make it disappear completely. The apps also have live moderators to help police bad behavior.
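To make the mechanic concrete, here is a minimal sketch of how a proximity-based safety bubble might work. It is only an illustration built on assumptions: the `Avatar` class, the bubble radius, and the `update_visibility` function are hypothetical, not Facebook’s actual implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Avatar:
    user_id: str
    position: tuple             # (x, y, z) in meters, world space
    bubble_radius: float = 0.5  # personal-space radius; value is illustrative

def update_visibility(a: Avatar, b: Avatar) -> bool:
    """Return True if the avatars should stay visible to each other.

    If the two safety bubbles overlap, both avatars are hidden from
    each other -- the effect is mutual, as described at F8.
    """
    return math.dist(a.position, b.position) > (a.bubble_radius + b.bubble_radius)

# Two avatars 0.6 m apart with 0.5 m bubbles overlap, so each is hidden.
alice = Avatar("alice", (0.0, 0.0, 0.0))
bob = Avatar("bob", (0.6, 0.0, 0.0))
print(update_visibility(alice, bob))  # False -> hide each avatar from the other
```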
Young showcased an even more advanced safety measure from one of Facebook’s developer partners, Harmonix. In the multiplayer lounge of its Dance Central VR title, avatars gather in a virtual club just like humans do in real clubs. And sometimes jerks invade the personal digital space of others, just like they do in real clubs. In the game, a threatened user can make a double-thumbs-down gesture at the offending avatar, which is then muted, frozen, and relocated to another part of the dance floor.
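A rough sketch of that gesture-triggered flow is below. Everything here is an assumption loosely modeled on the description above: the handler name, the per-reporter mute list, and the spawn points are invented for illustration, not Harmonix’s code.

```python
import random

# Hypothetical spots an offending avatar could be moved to.
DANCE_FLOOR_SPAWN_POINTS = [(-4.0, 0.0, 3.0), (5.0, 0.0, -2.0), (0.0, 0.0, 6.0)]

class LoungeSession:
    def __init__(self):
        self.muted_pairs = set()  # (reporter_id, offender_id): offender silenced for reporter
        self.frozen = set()       # offenders whose avatars are frozen

    def on_double_thumbs_down(self, reporter_id: str, offender_id: str) -> tuple:
        """Mute and freeze the offender, then pick a new spot on the floor for them."""
        self.muted_pairs.add((reporter_id, offender_id))
        self.frozen.add(offender_id)
        return random.choice(DANCE_FLOOR_SPAWN_POINTS)

session = LoungeSession()
print(session.on_double_thumbs_down("alice", "troll42"))  # new position for the offender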
Social VR games and other apps tend to sell better than single-player experiences. And as VR moves toward the mainstream, user security will become more important.
Already happening
Actually, invasion of personal space is already well documented in VR. “There are already problems with VR multiplayer games where players can get very close to each other–close enough to virtually touch each other–and you had cases where one player would touch the genitalia of the other,” UX strategy consultant Celia Hodent told me. “As soon as you have an open social space, there is always a very small minority that try to pull that kind of crap, thereby poisoning the experience of the majority.”
“We are witnessing this on online multiplayer games and on social media already,” she said. “VR would only make these issues worse because of its increased feeling of presence.”
This may be hard to imagine if you’ve never experienced really good VR. It’s immersive. You begin to identify with your avatar, and you begin to see personhood in the avatars of others.
Getting out in front
Facebook’s vision for social VR has only slowly emerged since it bought the VR company Oculus for $2 billion in 2014. But judging by CEO Mark Zuckerberg’s acquisition track record, social VR could be a big deal. Instead of just chatting or trading links or pictures on Facebook or Messenger, we might all throw on our VR headsets and sit in a virtual living room with our Facebook pals.
That’s a whole new kind of social networking, of course, and a whole new set of social do’s and don’ts will have to evolve for it. If Facebook wants to be the host for that experience, it must provide tools to help users uphold those rules of the road. The same thing goes for the developers of VR experiences that work on Oculus.
That’s why Facebook is focusing on this now, instead of later.
“When the product gets to anywhere near mainstream, you have most of the basics of this figured out, rather than, ‘Oh, you know, we never thought about physical comfort and safety in VR–now I have to go fix all the problems as this thing is live,’” Facebook CTO Mike Schroepfer told my colleague Harry McCracken in a recent interview.
Facebook’s proactive approach to security in the social VR space is notable. The company has been known for its “move fast and break things” ethic–for its willingness to release products or updates quickly and deal with the real-world implications later. At the conference, Zuckerberg said his company would retire that approach in favor of a more thoughtful one, where the opinions of experts would be considered in advance of major product changes.
In the meantime, Hodent thinks Facebook still has a long way to go to ensure safety and user confidence.
“The safety bubble is a good start, but I remain skeptical, especially regarding moderation,” Hodent said in an email. “Algorithms are not very good at this yet, and it’s a dreadful job to do for humans, who also are too slow.”
“We already aren’t good at tackling these [security] issues on ‘regular’ social platforms, so there is no reason to believe Facebook is going to solve this considerable issue on VR social platforms anytime soon,” Hodent said. “I would recommend them to solve this on their main social platform before moving forward.”