This paper contributes to the social media moderation research space by examining the still under-researched “shadowban”, a form of light, secret censorship targeting what Instagram defines as borderline content, which particularly affects posts depicting women’s bodies, nudity and sexuality. “Shadowban” is a user-generated term for the platform’s “vaguely inappropriate content” policy, which hides users’ posts from its Explore page, dramatically reducing their visibility. While research has already addressed algorithmic bias and social media moderation, there are, at present, no studies of how Instagram’s shadowban works. This autoethnographic exploration of the shadowban provides insights into how it manifests from a user’s perspective, applying a risk society framework to Instagram’s moderation of pole dancing content to show how the platform’s preventive measures affect user rights.
Number of pages: 18
Journal: Feminist Media Studies
Early online date: 19 May 2021
Publication status: Published - 17 Nov 2022