Creating inclusive and safe social experiences in VR, AR, and MR

Written by jeffreylin | Published 2017/07/04
Tech Story Tags: virtual-reality | augmented-reality | design | diversity | mixed-reality


My opinions are my own and do not reflect those of my company.

Most developers exploring virtual reality (VR), augmented reality (AR), and mixed reality (MR) are focused on finding the next medium-defining product, and one of the first things they discover is that making a VR, AR, or MR experience social increases immersion and engagement by leaps and bounds. Years of research from the video games industry have explored why social connections and experiences are critical to extending the life of a product, and why social gameplay is a reason some of the most popular games are still played 10 years after their release (like DOTA, League of Legends, World of Warcraft, or a personal favorite, _Ultima Online_). So it is no surprise that many developers are trying to make the next great multiplayer game or social experience for VR, AR, and MR.

However, most developers trying to create a social experience do so within the confines of their internal playtests, or friends-and-family “alphas” and “betas.” Through the entire development cycle of many VR, AR, and MR products, most of the testers are friends, friends of friends, or coworkers who share an expected code of conduct and mutual respect.

When these products are released to consumers outside of the developers’ personal social networks, the developers are instantly exposed to the extremes of online behavior, both positive and negative. It only takes a few searches to see that online harassment in VR (Example 1, Example 2, Example 3), AR, or MR can be rampant, and because of the immersion, more damaging than in any prior medium. Developers are then faced with the challenge of retroactively redesigning their social experience to be inclusive and safe before the product reaches a point of no return and fails due to a reputation as an experience prone to toxic or harassing behavior.

So what can developers do to proactively design a more inclusive and safe social experience?

  1. Verbal Commitment and a Community Code. In social psychology, there has been research on a phenomenon called verbal commitment. The basic idea is that if you explicitly ask someone about an issue or task and get their affirmation, they are more likely to commit to it and follow through. So, if you want your kids to clean their room, you should ask, “Can I have your word that you will have your chores done by 5 PM tonight?” If the child verbally and publicly says, “Yes, I can,” they are more likely to actually do the task. One of the easiest ways developers can leverage the science behind this phenomenon is to present a “Community Code of Conduct” before users enter the online social experience. This is not a typical Terms of Service agreement! Users rarely read those, and they have no impact on online behavior in a social experience. The Community Code of Conduct has to be simple, easy to read (no legalese!), and should use “I” language. For example, “I will treat every user with respect, and failing to do so, will give up my privilege of being a part of this community.” Depending on how much you want to trade off between the user experience and the strength of the verbal commitment, you can then utilize three options of affirmation: (1) the weakest option is simply having the user click “I Agree” after reading the Code; (2) the medium option is having the user type out the letters of “I Agree”; (3) the strongest option, but the most unfriendly to the user, is to have the user type out the entire sentence. (A minimal sketch of this flow appears after this list as Sketch 1.) Depending on the product, a simple feature like this can reduce online harassment by over 20%.
  2. Record Better Data. With engines like Unity streamlining the implementation of data analytics, there is no longer an excuse not to have analytics in a project; however, developers should not focus only on standard key performance indicators (KPIs) like CCU (concurrent users) or MAU (monthly active users). It is critical that developers specifically record data related to the social experience. Many developers will only record the basics, such as how often a user is reported for offensive behavior; this is not good enough anymore. In new mediums like VR, AR, and MR, users are more likely to quit the experience and never try the product again than to file an accurate report against the offending user. After all, there are 1,000 other products to try; why should they have to put up with offensive behavior in one? Instead of just implementing a “Report” feature and recording basic reporting metrics, developers should also record secondary metrics. For example, for a given user, how often does another user “Block” or “Mute” them after an interaction? If a user in your social experience is getting Blocked or Muted often, that could be a red flag. For those who want to dig a little deeper into this space, consider learning about social network analysis, which makes it possible to analyze interactions between users and see whether some users are negatively impacting others. For example, imagine how powerful it would be if your developers could identify users who, after interacting with another user, increase the odds that the other user will log off and never return to the product. (Sketch 2 after this list shows what recording these secondary metrics might look like.) Nearly every developer records some form of data these days, but to optimize for a more inclusive and safe social experience, developers need to start recording the right data.
  3. Feedback. When it comes to designing an inclusive and safe community in VR, AR, or MR, one of the most critical aspects is how you design the feedback loops. You cannot just implement features that allow a user to stop harassment after the fact, such as “Report” or “Block”; these are the basics. If a user has to use these features, the damage is already done, and the user might disengage from your product and never return. In addition, if you never give feedback to the offending users, your product or platform is effectively telling them that their behavior is OK and that the problem is the victim. As a general rule, the faster and more specific the feedback, the better a user typically responds. In video games and on social media platforms, companies used to “ban” users for offensive behavior, but the execution was generally poor. For example, a user might be extremely racist one day, but weeks later the company would send them an e-mail that looked something like this: “You have been banned for violating the Terms of Service. Please contact Customer Support for more details.” This is terrible feedback because it was delivered weeks after the offensive incident and contains no specifics. Humans can barely remember what they had for lunch last week; we cannot expect users to remember which part of the Terms of Service they violated. Instead, these users will most likely blame the company for falsely banning them. This is a lose-lose situation for most products because the user ends up angry, has no idea which behaviors to change, and simply makes a new account and rejoins the online community, because the barrier to entry is pretty low for most products. When designing your customer service loops or reporting systems, make sure you have the fastest and most specific feedback loops possible, for both the offending user and the victim. (Sketch 3 after this list shows one way to build a specific feedback message.) However, when creating fast feedback loops, be aware of the risk described in the next item.
  4. Feedback, Part 2. Another critical aspect of feedback is how you incorporate it into the user experience (UX) design. In many products today, when a user “Blocks” or “Mutes” another user, the offending user gets a big flashing sign: “You have been muted by [User].” “You have been blocked by [User].” This is poor product design because this type of feedback can trigger and instigate even more negative behavior. When a user is harassing someone and gets a message that the other user has blocked them, it is like a boost of dopamine letting the harasser know “they have won” and “their behavior had an impact.” When designing feedback systems, always make sure that you are not rewarding offending users with any information about their victim that could trigger more negative behavior. One solution is to delay the feedback and have it come from “the community,” so the offending user cannot pinpoint which of their targets reported, blocked, or muted them and triggered the feedback. (Sketch 4 after this list illustrates this delayed, anonymized delivery.)
  5. The Space Bubble. Products like AltspaceVR have begun exploring features specific to VR that can create a more positive online experience, such as a Personal Space Bubble that can be toggled on or off to allow or block other users from entering your personal space. This is a great design that can improve online experiences; however, developers can push it further. For example, instead of a toggle, consider tying the Personal Space Bubble to the Friends List, which makes it a little easier to be choosy about who can enter your personal space. Or adjust the Personal Space Bubble based on cultural context, such as using a more restrictive bubble for users from cultures with stricter norms around personal space. You could also tie the Personal Space Bubble to the features noted earlier in this article, such as automatically turning it on against any user who has been blocked or muted by a high number of other users. (Sketch 5 after this list combines these ideas.)
  6. Take Risks. On PC or mobile platforms, banned users can typically make a new account and rejoin the online experience in a few minutes; however, VR, AR, and MR are not like a regular mobile or PC experience: identities are tied to expensive hardware in most cases, and the workarounds are less obvious. Platform companies can take risks and completely remove users from their ecosystems for egregious offenses. (Sketch 6 after this list shows the basic mechanism.) It may sound ridiculous to remove potentially paying consumers from an ecosystem, but some research has shown that for each egregious offender you remove from a platform, many more users stay engaged.
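
Sketch 1 is a minimal, hypothetical Python sketch of the three affirmation strengths for a Community Code of Conduct described in tip 1. The function name, the console-style prompts, and the exact Code text are my own illustrative assumptions, not from any real product or SDK.

```python
# Sketch 1: three strengths of affirmation for a Community Code of Conduct.
# All names and prompts are illustrative assumptions, not a real SDK.

CODE_OF_CONDUCT = (
    "I will treat every user with respect, and failing to do so, "
    "I will give up my privilege of being a part of this community."
)

def collect_commitment(strength: str = "medium") -> bool:
    """Show the Code and collect an explicit verbal commitment."""
    print(CODE_OF_CONDUCT)
    if strength == "weak":
        # Weakest: a single confirmation (modeled here as a y/n press).
        return input("Press 'y' to agree: ").strip().lower() == "y"
    if strength == "medium":
        # Medium: the user types out the letters of "I Agree".
        return input('Type "I Agree": ').strip().lower() == "i agree"
    # Strongest (least user-friendly): retype the entire sentence.
    return input("Retype the sentence above: ").strip() == CODE_OF_CONDUCT
```

The trade-off is visible in the code: each branch adds friction but strengthens the commitment, so the right choice depends on how much onboarding friction your product can absorb.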
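Sketch 2 shows what recording the secondary social metrics from tip 2 might look like. The event names, minimum sample size, and flag thresholds are assumptions for illustration; real values would come from your own data.

```python
# Sketch 2: secondary social metrics beyond basic "Report" counts.
# Event names and thresholds are illustrative assumptions.
from collections import Counter

class SocialMetrics:
    def __init__(self):
        self.blocks_received = Counter()  # user_id -> times blocked/muted by others
        self.interactions = Counter()     # user_id -> total interactions
        self.churn_after = Counter()      # user_id -> partners who never returned

    def record_interaction(self, user_id: str) -> None:
        self.interactions[user_id] += 1

    def record_block_or_mute(self, offender_id: str) -> None:
        self.blocks_received[offender_id] += 1

    def record_partner_churn(self, user_id: str) -> None:
        # Called when a user's interaction partner logs off and never returns.
        self.churn_after[user_id] += 1

    def red_flags(self, block_rate: float = 0.10,
                  churn_rate: float = 0.25) -> list[str]:
        """Users whose partners disproportionately block, mute, or quit."""
        flagged = []
        for uid, total in self.interactions.items():
            if total < 20:  # ignore small samples
                continue
            if (self.blocks_received[uid] / total > block_rate or
                    self.churn_after[uid] / total > churn_rate):
                flagged.append(uid)
        return flagged
```

The churn metric is the social-network-analysis idea in miniature: it looks at what happens to the *other* user after an interaction, not just at reports filed against the offender.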
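Sketch 3 is one way to build the fast, specific feedback message from tip 3, in contrast to the vague "you violated the Terms of Service" e-mail. The message template and function are hypothetical.

```python
# Sketch 3: fast, specific feedback for the offending user.
# The template and field names are illustrative assumptions.
import datetime

def build_feedback(offense: str, quoted_behavior: str, action: str) -> str:
    """Timely, specific feedback beats a vague notice weeks later."""
    ts = datetime.datetime.now(datetime.timezone.utc).isoformat(timespec="seconds")
    return (
        f"[{ts}] Your account received a {action} for {offense}.\n"
        f'Specifically: "{quoted_behavior}"\n'
        "This violates the Community Code of Conduct you agreed to. "
        "Repeating this behavior will escalate the penalty."
    )

# Example: delivered minutes after the incident, quoting the actual behavior.
print(build_feedback("verbal harassment", "example offending phrase",
                     "24-hour suspension"))
```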
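Sketch 4 illustrates the mitigation from tip 4: delay the feedback by a randomized window and attribute it to "the community," so the offender cannot connect it to a specific victim. The queue mechanics and delay range are assumptions.

```python
# Sketch 4: delayed, anonymized feedback so the offender cannot tell
# which victim triggered it. Delay window is an illustrative assumption.
import random
import threading

def notify_offender(offender_id: str) -> None:
    # Attributed to "the community", never to a specific victim.
    print(f"To {offender_id}: Members of the community have reported "
          "your recent behavior. Please review the Community Code of Conduct.")

def queue_anonymous_feedback(offender_id: str,
                             min_delay_s: float = 600.0,
                             max_delay_s: float = 3600.0) -> None:
    """Deliver feedback after a randomized delay instead of instantly."""
    delay = random.uniform(min_delay_s, max_delay_s)
    threading.Timer(delay, notify_offender, args=(offender_id,)).start()
```

Note the tension with tip 3: the delay sacrifices a little speed to avoid rewarding the harasser, which is why the feedback text stays specific even though its timing and source are blurred.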
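Sketch 5 combines the Personal Space Bubble ideas from tip 5: friendship, a cultural default, and the block/mute history from Sketch 2. All radii and thresholds are invented for illustration.

```python
# Sketch 5: a Personal Space Bubble whose radius adapts to friendship,
# cultural preference, and the other user's block/mute history.
# All radii and thresholds are illustrative assumptions.

def bubble_radius_m(other_id: str,
                    friends: set[str],
                    blocks_received: dict[str, int],
                    cultural_base_m: float = 1.0,
                    flag_threshold: int = 10) -> float:
    """Return how close `other_id` may approach, in meters."""
    if other_id in friends:
        return 0.3                 # friends may come close
    if blocks_received.get(other_id, 0) >= flag_threshold:
        return 3.0                 # frequently blocked users are kept far away
    return cultural_base_m         # default from the user's cultural setting

def may_approach(distance_m: float, radius_m: float) -> bool:
    return distance_m >= radius_m
```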
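Sketch 6 shows the basic mechanism behind tip 6: keying a platform-level ban to a hardware identifier rather than an account, so a fresh account does not evade it. How the identifier is derived and stored is entirely hypothetical here.

```python
# Sketch 6: a platform-level ban keyed to a hardware identifier rather
# than an account. Identifier source and storage are hypothetical.

BANNED_DEVICE_IDS: set[str] = set()

def ban_device(device_id: str) -> None:
    """Remove an egregious offender from the ecosystem entirely."""
    BANNED_DEVICE_IDS.add(device_id)

def can_create_account(device_id: str) -> bool:
    # Checked at sign-up: banned hardware cannot rejoin with a new account.
    return device_id not in BANNED_DEVICE_IDS
```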

Designing social experiences to be inclusive and safe is not easy, and designers are still figuring out the best approaches to these issues. Hopefully, these tips help your product team create more inclusive and safe products for VR, AR, and MR by considering these issues in their product designs from Day 1. It is almost always more difficult to retroactively redesign a product for inclusiveness and user safety. If you are interested in more on this topic, check out my previous talk below:

  1. More Science Behind Shaping Player Behavior in Online Games.

ABOUT THE AUTHOR:

Jeffrey Lin, Ph.D.

Dr. Jeffrey Lin was a Lead Product Owner and Lead Designer of the award-winning PC game League of Legends at Riot Games, one of Fortune’s Best Companies to Work For. He was also a Research Scientist at Valve Software, makers of the award-winning PC game Portal 2 and creators of the Steam platform. He obtained his PhD in Cognitive Neuroscience from the University of Washington, where he was funded by the Howard Hughes Medical Institute. His design work has been featured in Wired Magazine, MIT Tech Review, The Verge, Scientific American, Times Health & Science, and Re/code. His research has been featured in numerous peer-reviewed journals, including Nature.

