Discover more from Other Feminisms
Rules in Lieu of Virtues
How online communities disclaim responsibility for character
I’ve never tried a VR game, and I’m skeptical of plans for the metaverse. (Incidentally, VR headsets are much more likely to cause motion sickness and nausea in women than in men.) But for the women who do venture into these virtual spaces, there are new problems, including new forms of sexual harassment.
MIT Technology Review had a recent story on efforts to curtail virtual groping in Facebook/Meta’s VR environments. The pitch for VR is the successful illusion of presence; when it works, that illusion can make intrusive virtual contact feel more “real” and more threatening than slurs in a chatbox. Facebook/Meta had a plan, which didn’t work out so well:
Meta’s internal review of the incident found that the beta tester [the woman who was groped] should have used a tool called “Safe Zone” that’s part of a suite of safety features built into Horizon Worlds. Safe Zone is a protective bubble users can activate when feeling threatened. Within it, no one can touch them, talk to them, or interact in any way until they signal that they would like the Safe Zone lifted.
The Facebook employee quoted in the article emphasized the way Facebook is fulfilling its responsibility—users are introduced to Safe Zone in onboarding, and reminders about the feature turn up on posters within the virtual world.
It sounded a little like the instructions women receive to navigate the world: don’t leave your drink alone; walk with a friend; have your keys out before you reach your car. “You’re in control,” the Facebook posters say, implying that, if something bad happens, you may be the one responsible for messing up your precautions.
At the time of the MIT article’s publication, Safe Zone just gave users the chance to exit an encounter and block the person bothering them. The instigator could push the person they were harassing out of a space. A different virtual game, Quivr (in which you fight zombies with a bow and arrow), had a virtual gesture users could make to push another avatar away from them. In mid-March, Facebook added a new feature, Personal Boundary, which users could toggle on and off to keep other avatars at arm’s length.
It’s not that I don’t want companies to build these safety features. Each one can help limit harm. But I do think they reveal a real abdication of responsibility on the part of the companies.
Each tool is about creating a rule, not inculcating a virtue.
The companies are coming up with ways to make certain behaviors impossible or to let users escape bad behavior. But they aren’t asking what in their platform encourages people to behave badly in the first place. The sites act as though they are neutral: if people behave badly online, they are simply revealing who they already were. They might have behaved badly anywhere; it just happened to be on Facebook.
However, it’s clearer and clearer that the choices sites make can encourage or discourage virtuous behavior, and the sites know it. Facebook knows that Instagram is bad for young women. Twitter knows that its site makes sending a death threat feel casual and normal, even though many of those users would probably never have mailed a letter expressing the same thoughts.
Their job isn’t just to stop doing harm, but to think a little about how to actively do good. Putting large numbers of strangers together without guardrails isn’t neutral and isn’t responsible.
The search for the perfect rule or set of safety settings does remind me of Christine Emba’s Rethinking Sex. As she told me during our conversation, the modern culture around sex is marked by a broken promise. Many of her interviewees had a sense that, if you find the right rules, sex can only be good, and you and a stranger will never have to know each other or reveal yourselves to each other in order to feel good about what you do with each other. The rules (“two enthusiastically consenting adults”) will keep you safe.
But there’s no end run around character formation, and no checklist of consent items that lets us get around the fact that we are interacting with another human being, not a preference menu.
Rules are the minimum, and a good rule can be a teacher when we inquire into it. But safety can’t come from rules alone; it requires active work to build a culture that forms character rightly. Every site and every culture is already shaping character; the question is how deliberately, and in what direction.