Digital Delights and Disturbances: Poledance and Censorship in Aula Magna

The JCU Department of Communications welcomed London-based Italian researcher, activist and blogger Carolina Are for a talk on online censorship called “Pole Dancing Against the Algorithm” on October 20, 2022. Are is a professional pole dancer with a Ph.D. in content moderation, and she is also an Innovation Fellow at Northumbria University’s Centre for Digital Citizens in the U.K., where she conducts leading research on Instagram and TikTok’s algorithmic censoring of pole dance. The discussion was part of the Digital Delights and Disturbances lecture series.

Are said she has experienced algorithmic censorship of her social media content firsthand. Her Instagram account features her performing and teaching pole dance. She described how Instagram and TikTok use “shadow banning” to keep such content from gaining exposure, even when it complies with the platforms’ community guidelines. Shadow banning refers to a social media platform’s practice of hiding content without notifying the creator.

“People realized that shadow banning was a thing, that it was happening to people near them and to people like them, not just pole dancers. That kickstarted a campaign called ‘Everybody Visible’, which is a group dedicated to demanding that META provide more transparency between user and platform,” Are revealed.

Are described how her Instagram account was deleted without any warning, resulting in the loss of her main source of income. She noted that tech companies like META, which owns Instagram, lack transparency when deciding which accounts to delete. While she acknowledged that she was fortunate enough to speak with individuals within META to restore her account, she also revealed that most users never recover theirs.

“I’m currently being an intermediary between META and a few users who had their accounts deleted. Not all these accounts have been recovered and these people do not know why or what happened,” Are stated.   

She outlined how platforms use vague language in their guidelines to limit company liability in content disputes, which inherently ostracizes some creators. She said that while safety is at the forefront of platform policy, platforms lack coherent rules for enforcing their guidelines. Her pole dance TikTok account was deleted four times and she was prohibited from posting on the platform. TikTok explained that her account featured “implied nudity,” which violates the app’s terms and conditions.

“If we are running content moderation on implications, that can become a very worrying scenario,” Are said.

When users’ content or accounts are removed from a platform, they must undergo a series of appeals to restore their data. Are pointed out that META’s Instagram appeals process is virtually nonexistent for average users. She said that “without friends in high places,” most users have no ability to “redeem themselves” after removal.

“An infrastructure that allows de-platforming but no rehabilitation or no recovery is inadequate. It’s not how a democratic society works; it makes you feel powerless as a user,” she said.   

She also noted social media platforms’ tendency to tolerate content violations by some users but not others, depending on societal status. Using examples posted on Instagram by Kim Kardashian, Are explained how celebrities are allowed to post images alluding to nudity without repercussions, while smaller accounts that post similar content are shadow banned or removed. She connected this to a larger discussion about “whorephobia” on social media, where sex workers are stigmatized while celebrities posting more explicit content are accepted.

This double standard within social media raises the question of how to operate “platform governance” ethically for all users. Students were invited to propose their own ideas on how companies like META can improve their moderation practices to promote equity for all users; suggestions included diversifying moderators, adjusting algorithms, and increasing transparency.

“We need to ask safety for whom? Who do you want to be safe on your platform? If you really want everybody to be safe, you need to be more transparent about your decisions,” Are concluded.