Algospeak: The Simplest Way To Bypass Censorship on Social Media
Algorithms end up suppressing the wrong conversations, especially since they overlook the context.
A portmanteau of 'algorithm' and 'speak', the term "algospeak" refers to replacing words or phrases that are disfavoured by social media algorithms with seemingly innocuous ones.
Platforms like YouTube and Instagram, which rely on advertisements for revenue, have strict and ever-improving algorithms that watch out for content that's taboo or not brand-friendly.
Some examples of algospeak:
'Unalive' replaces the word 'dead' or 'kill'.
The corn emoji or 'pron' replaces the word 'porn'.
'Ouid' replaces the word 'weed'.
'Leg booty' replaces the term 'LGBTQ'.
'The Vid' or 'Backstreet Boys reunion tour' replaces 'COVID-19'.
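The substitutions above amount to a simple lookup table. As a toy illustration (not any creator's or platform's actual tool), the mapping could be sketched like this:

```python
# Toy lookup table of the algospeak substitutions listed above.
# This is an illustrative sketch, not a real moderation-evasion tool.
ALGOSPEAK = {
    "dead": "unalive",
    "kill": "unalive",
    "porn": "pron",
    "weed": "ouid",
    "LGBTQ": "leg booty",
    "COVID-19": "the Vid",
}

def to_algospeak(text: str) -> str:
    """Swap each flagged term for its algospeak euphemism."""
    for term, euphemism in ALGOSPEAK.items():
        text = text.replace(term, euphemism)
    return text

print(to_algospeak("The video discussed COVID-19."))
# -> The video discussed the Vid.
```

In practice creators apply these swaps by hand, and the vocabulary shifts constantly as moderation systems catch on.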
Using phrases like these can help content creators bring up controversial topics without getting their posts removed or suppressed by content moderation systems.
Voldemorting and Leetspeak
Though the term 'algospeak' seems to have first appeared in this context in a tweet by user LokiJulianus in December 2021, its usage overlaps with that of 'Voldemorting' – a term that has been around for several years.
The term refers to Lord Voldemort, a character in the Harry Potter series, who is referred to by other characters with euphemisms like "he who must not be named" or "you know who".
While Voldemorting was primarily used to refer to avoiding search engine algorithms by using alternate phrases, algospeak applies to social media platforms in particular.
Leetspeak, in which letters are often replaced by numerals or special characters – rendering 'hacker' as 'H4X0R', for instance – emerged in the 1980s and is a predecessor to both Voldemorting and algospeak.
It was likely developed to bypass text filters on internet bulletin boards which forbade certain terms and topics of discussion.
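Those early filters typically matched forbidden words exactly, which is why a single character swap was enough to slip past them. A minimal sketch (the blocklist contents are invented for illustration):

```python
# Toy sketch of a bulletin-board-style word filter.
# Exact string matching is easily defeated by leetspeak.
BLOCKLIST = {"hacker", "warez"}

def is_blocked(message: str) -> bool:
    """Flag a message only if a word matches the blocklist exactly."""
    words = message.lower().split()
    return any(word in BLOCKLIST for word in words)

print(is_blocked("a hacker was here"))  # True
print(is_blocked("a h4x0r was here"))   # False: exact match misses 'h4x0r'
```

Modern moderation systems are far more sophisticated, but the cat-and-mouse dynamic – filters tighten, users respell – is essentially the same.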
A Tool For Free Speech
In a bid to be brand friendly or combat misinformation and hate speech, social media content moderation algorithms end up suppressing the wrong conversations, especially since they overlook the context in which words are used.
LGBTQ creators, for instance, have alleged that their videos have been demonetised for saying the word "gay" on YouTube. TikTok users now use "cornucopia" to refer to "homophobia," according to a report by The Washington Post.
"It disproportionately affects the LGBTQIA community and the BIPOC community because we’re the people creating that verbiage and coming up with the colloquiums.”Sean Szolek-VanValkenburgh to The Washington Post
Women’s health, pregnancy and menstrual cycles are also topics that are consistently down-ranked on TikTok, 23-year-old content creator Kathryn Cross told the publication.
She added that she replaces the words for 'sex,' 'period' and 'vagina' with euphemisms or spells them using symbols.
People from marginalised communities are also affected by aggressive algorithms which can restrict them from openly discussing oppression. They often use algospeak to circumvent this.
Black people often swap out words for “white” or “racist” or hold their palm toward the camera to refer to White people, according to the report.
"Content’s getting flagged because they are someone from a marginalised group who is talking about their experiences with racism. Hate speech and talking about hate speech can look very similar to an algorithm.”Casey Fiesler, content creator, to MIT Technology Review
Prevalent in Harmful Communities
Algospeak is also often used in harmful or radical communities.
Some anti-vaccination groups on Facebook changed their names to euphemisms like "Dance Party" or "Dinner Party," and began using code words to avoid getting banned by Facebook, according to NBC News.
Phrases like "dancing” or “drinking beer” mean getting the vaccine, while 'pizza' and 'Moana' are used to refer to vaccine-makers Pfizer and Moderna.
Anti-vaccine influencers on Instagram also reportedly subverted the algorithm's misinformation-tracking abilities, by referring to vaccinated people as “swimmers” and vaccination as joining a “swim club.”
Pro-eating disorder and pro-anorexia communities, which encourage members to adopt unhealthy eating habits, have also used algospeak to continue operating undetected on social media platforms.
A study from the School of Interactive Computing, Georgia Institute of Technology, found that the "pro-ED community has adopted non-standard lexical variations of moderated tags to circumvent these restrictions".
"In fact, increasingly complex lexical variants have emerged over time," it said, adding, "Despite Instagram’s moderation strategies, pro-ED communities are active and thriving."
(With inputs from The Washington Post, NBC News, and MIT Technology Review)