Swimming Against the Algorithms
Your stories about trying to cultivate virtue online
This week, I’m sharing your stories about going beyond merely banning bad behavior online and trying to actively cultivate character. Next week, I’ll be sharing your thoughts on the push to help women by eliminating periods.
The allegedly burgeoning metaverse is already having trouble with new forms of (immersive!) harassment. And the companies behind the VR tech are more focused on rules that will keep avatars spaced apart than on whether their virtual worlds make people feel morally weightless.
I asked you all about whether you’ve found online communities that actively cultivate virtue, rather than just setting a few community rules, and most of you were pessimistic.
Claire sets her own rules, which frequently mean logging off.
I have found that I can’t really look for strong community online. I can use Twitter for news and chatting with friends and friends-of-friends. I can use email and Facebook for coordinating local stuff like giveaways and park meetups. I can use Reddit to crowdsource car repair ideas. But I can’t join non-local Facebook groups geared at moms or Catholic women, for example, because I find that it quickly becomes unhealthy for me. Social media groups, in particular, often seem to encourage me and others to stake out really aggressive positions on decisions that we ourselves are insecure about.
Several people talked about working to avoid the algorithmic biases of social media, which, by rewarding posts for “engagement,” frequently wind up amplifying controversialists or snide jokes.
People used Chrome extensions to hide the algorithmically weighted feeds so they could make deliberate choices about what to seek out. It would be interesting to see what weights we all might assign to these feeds if we could get under the hood!
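To make that "under the hood" idea concrete, here is a minimal, purely hypothetical sketch of how an engagement-weighted feed might rank posts. The field names and weights are invented for illustration; real platforms do not publish theirs.

```python
# Hypothetical sketch: rank posts by a weighted engagement score.
# All field names and weights below are invented, not any platform's real model.

def rank_feed(posts, weights):
    """Sort posts by a weighted sum of their engagement counts, highest first."""
    def score(post):
        return sum(weights[k] * post.get(k, 0) for k in weights)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "prayer", "likes": 12, "replies": 2, "quote_tweets": 0},
    {"id": "hot_take", "likes": 40, "replies": 90, "quote_tweets": 55},
]

# Rewarding replies and quote-tweets (often markers of controversy)
# pushes the hot take to the top...
engagement_weights = {"likes": 1, "replies": 3, "quote_tweets": 5}
print([p["id"] for p in rank_feed(posts, engagement_weights)])
# → ['hot_take', 'prayer']

# ...while a reader's own weights might invert the ranking entirely.
my_weights = {"likes": 1, "replies": -1, "quote_tweets": -2}
print([p["id"] for p in rank_feed(posts, my_weights)])
# → ['prayer', 'hot_take']
```

The same posts, two different rankings: the only thing that changed is whose values the weights encode.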
I follow several prayer-focused Twitter accounts, and when they turn up in my Twitter feed, they usually prompt me to close the site, pray, and move on to something else. Definitely bad from the algorithm’s view, but good for my purposes.
Analisa has had a better experience in a community that commits to heavy moderation:
Like a couple of other commenters, the community I belong to that inculcates virtue is a heavily moderated Facebook group. I'm one of the moderators. It's for Catholic homeschooling mothers. The thing is, keeping the 4,300-member group positive and encouraging and on-topic and not braggy requires five of us! It's a lot of work, and it's discouraging to me that so much is required when members have already agreed to our list of rules, and we have screened them to ensure they are indeed Catholic and homeschooling and mothers.
And Lawrie had one of the only examples of an algorithm helping to shape behavior:
My husband played Overwatch online for a while. They had a system called Endorsements (https://overwatch.fandom.com/wiki/Endorsements) where you could reward other players for good/prosocial behavior. The matchmaker system used these ratings as part of the process to match people for games, and he said that the games were generally more relaxed/easygoing than he was used to in other online games. I think it worked more by separating the toxic players from the rest of the community (which perhaps is good by not allowing them to teach new players toxic behavior), but I would be curious whether it helped to reform the toxic players by teaching them what behavior was deemed acceptable by the community.
GeekLady was skeptical that externally-imposed rules could foster virtue:
So, I actually think lots of rules are probably diametrically opposed to fostering an environment where people can grow in virtue.
Growing in virtue is interior, it’s something the individual has to pursue. You can’t choose virtue for another.
Rules are exterior, aimed at enforcing a behavioral set as a norm.
I’d say that rules do the most good when they’re set by a community or a person that you admire and you want to grow to imitate. Then, the rules are a tutorial in assimilating to the community, something you want to internalize.
A maelstrom like Twitter isn’t a coherent enough community for there to be a shared archetype to aspire to. The tutelary norms aren’t set by the community standards team, but by the figures people choose to follow.
And Joyous Thirst came back to this question of trust, at the scale of the country, not just the internet:
We keep coming around to the notion that good rules make safe citizens and good politicians: basically we can’t trust our politicians to do the right thing, but we can trust rules to ensure they don’t do too much damage… As a former American Government teacher, I’m well aware that this principle was at least a part of the reason for the system of government chosen by the Continental Congress. But more and more I see that rules can’t make people good—something those same sage Founders also frequently warned about!
Rules do have a place, but it’s a far smaller place than we collectively hope it will be. Rules can’t make up for the loss of ability to trust in each other’s character.