‘Dysfunctional’ appeals and failures of algorithmic justice in Instagram and TikTok content moderation

Carolina Are*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

This article explores users’ perceptions of justice when using appeals on Instagram and TikTok, focusing on the barriers de-platformed users across fields like activism, sex work, sex education and LGBTQIA+ self-expression face when using these platforms’ automated appeals to recover their de-platformed content and/or accounts. Examining appeals from a platform governance standpoint and drawing on fairness and due process literature, this study finds concerning loopholes within these platforms’ appeals, leaving room for discrimination, fraud and scams, and leading to user disempowerment. Through interviews with de-platformed users, this paper reveals significant barriers faced particularly by transgender and sex-working users when recovering their de-platformed accounts through in-platform appeals. Using metaphors of an ‘algorithmic cop, jury and judge,’ this paper concludes that the needs of marginalised users have been designed out of content moderation and of platforms’ processes, leading them to experience the appeals system as opaque, unfair and unjust.
Original language: English
Pages (from-to): 1-18
Number of pages: 18
Journal: Information Communication and Society
Early online date: 30 Aug 2024
DOIs
Publication status: E-pub ahead of print - 30 Aug 2024

Keywords

  • Algorithms
  • Appeals
  • Content creators
  • Content moderation
  • De-platforming
  • Fairness
