Gillett, Gray and Kaye (2023)
Source Details
Title: ‘Just a little hack’: Investigating cultures of content moderation circumvention by Facebook users
Author(s): Gillett, R., Gray, J. E., Valdovinos Kaye, D. B.
Year: 2023
Citation: Gillett, R., Gray, J. E., & Valdovinos Kaye, D. B. (2023). ‘Just a little hack’: Investigating cultures of content moderation circumvention by Facebook users. New Media & Society, 0(0).
Link(s): Definitive
Key Related Studies:
Discipline:
Linked by:
About the Data
Data Description: Using the Tor Browser, the authors collected YouTube videos and Reddit threads matching a list of 113 moderation-related keywords and phrases, in order to capture discussions of content moderation on both platforms. They then analysed the dataset through an iterative grounded approach combining manual and computational coding (see the illustrative sketch after this table).
Data Type: Primary data
Secondary Data Sources:
Data Collection Methods:
Data Analysis Methods:
Industry(ies):
Country(ies):
Cross Country Study?: No
Comparative Study?: No
Literature review?: No
Government or policy study?: No
Time Period(s) of Collection:
Funder(s):
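The paper does not publish its collection or coding scripts, so the following is only a minimal illustrative sketch of the keyword-filtering step described in the Data Description above. The example phrases, item structure, and function names are all assumptions for illustration, not the authors' actual method or phrase list.

```python
# Illustrative sketch only: keyword-based filtering of collected posts,
# of the kind the Data Description implies. The phrase examples, item
# structure, and function names are hypothetical; the study used 113
# moderation-related keywords and phrases in total, not reproduced here.

EXAMPLE_PHRASES = [
    "facebook jail",          # hypothetical examples of moderation-related
    "banned from facebook",   # phrases; the actual 113-phrase list is not
    "bypass the filter",      # available in this record
]

def mentions_moderation(text: str) -> bool:
    """Return True if the text contains any tracked phrase (case-insensitive)."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in EXAMPLE_PHRASES)

def filter_items(items: list[dict]) -> list[dict]:
    """Keep only items (e.g. Reddit threads, video metadata) that match."""
    return [
        item for item in items
        if mentions_moderation(item.get("title", "") + " " + item.get("body", ""))
    ]

# Example usage:
# filter_items([{"title": "How I got out of Facebook jail", "body": "..."}])
```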
Abstract
“As social media platforms adapt their rules to limit the presence, spread, and amplification of harmful content on their services, users develop strategies to circumvent content moderation policies. To better understand cultures of content moderation circumvention, including the types of rules that Facebook users seek to circumvent, we analysed a sample of YouTube videos and Reddit threads in which users discuss content moderation circumvention. We show how Facebook users turn to others across platforms to obtain information about circumvention methods. We observe that these users often discuss overcoming Facebook’s content moderation policies in terms that downplay the significance of their intended actions. We suggest that where Facebook’s policies and enforcement measures fail to deter rule violations that may facilitate harm, Facebook should consider new culture-driven approaches to platform governance that foster prosocial environments and engender compliance with platform rules”.
Main Results of the Study
Although content moderation is central to how platforms are governed, some users do not recognise or accept Facebook as a legitimate governing authority over their content. Moreover, despite the general assumption that social norms drive compliance with rules, users tend to downplay platform rules and their own violations, for instance by justifying hate speech as humour or by describing previously blocked interactions with women as “just talking”. Deterrence models of the kind that underpin platform content moderation are therefore often ineffective where the “culture of compliance” is weak, and platforms should consider alternative interventions that do not focus exclusively on punishment.
Policy Implications as Stated By Author
In line with these findings, the authors advocate alternative approaches to platform governance that move beyond moderating individual pieces of content and threatening users with punishment. Instead, they propose a system centred on “responsibility, risk management and care at all levels of platform governance”.
Coverage of Study
Datasets