Social media safety tools suck
Have you ever thought about the safety tools on social platforms? More specifically, have you thought about how insufficient & unreliable they are? I’ve thought about these tools a lot. I think about them as we’re building Diem. I think about what building a “safe space” online really means in practice — who it entrusts with power and who it’s designed to keep out. If I’m being honest, I’ve been thinking about this for almost a decade, since I was 19, when I first started amassing a significant following on social media.
When we talked with Laurie Segall last week in Diem (listen here), we discussed the importance of building ethical and inclusive practices into the next generation of immersive platforms. A veteran tech journalist, Laurie was one of the first reporters to interview folks like Mark Zuckerberg and Jack Dorsey, so she’s also been thinking about social platform safety for…a long time. For her, one of the biggest red flags amongst the Facebooks and Twitters of the world was their approach to protecting their user base. Historically, it’s been somewhat careless. And it’s not really changing, as we’re seeing in real time with the inadequate safety features in Meta’s new metaverse.
I’ve often likened the current “safety” on big social media platforms to the advice commonly given to women walking alone at night. You know what I mean…
“Wear shoes you can run in.”
“Don’t wear a short skirt.”
“Put your keys between your fingers.”
In other words, much like this largely unhelpful advice, modern safety features in social spaces typically place the onus on victims to solve the problem inflicted on them. Victims of harassment have to self-report, remove themselves from online communities, and limit their overall experience on a platform when they’ve done nothing wrong. Like the advice given to women walking home alone, this approach fails to address the root cause: persistent violence against women is baked into modern digital societies, and it keeps us on high alert at all times. The safety features in Facebook’s metaverse allow harassment victims to put a protective bubble around themselves so other avatars can’t come close. Why can’t you put a bubble around the person harassing you? Why are you the one who has to experience a pared-down social world because of a harasser’s actions? Surely the better approach is to put a warning sign on the bad actor and limit their experience?
Sadly, the problem doesn’t stop at safety features. “Catcalling” is a thing on social media, and it’s happening all the time in your DMs. The language we commonly use on social channels is also questionable — isn’t it creepy that we “follow” people, given the lived experience of so many women in the real world? Being “followed” home is not something anyone wants.
There is a dire need for online spaces that are designed with safety, inclusivity, and ethics at the forefront. This doesn’t mean designing reactionary safety features after things go very, very wrong. It means designing digital worlds where safety is baked into every nook and cranny of the platform itself. Where does this start? I think we could all benefit from a nod to real-world community management. For example, there’s a nightclub in Brooklyn called Nowadays where you get a speech on community and principles when you enter the venue. On the dance floor, they also have samaritans, identifiable by a green band, who will help you deal with incidents. From the moment you enter, everyone in the club is equally incentivized to look out for each other. Wouldn’t it be amazing if digital spaces did the same thing?
One of the biggest problems with moderation is that it’s currently centralized, which lends itself to bias and simply doesn’t scale, as we’ve seen time and again. Putting that power into the hands of engaged, diverse community members helps solve both problems and gives the humans in your community more reasons to participate.
What do you think? If you could design a safe space online, how would you design it? No bad ideas.
This article was originally published in Diem’s weekly newsletter on March 15th, 2022. Subscribe here.