Foster, Bradley Sims (2023) ‘Fake News’: legal and psychological challenges in the regulation of Social Media. Doctoral thesis, University of East Anglia.
PDF, 190MB
Abstract
The role of social media news use in recent elections in the United States and the United Kingdom has prompted the introduction of various policy proposals and interventions by legislatures and social media companies. However, the scale of global social media enterprises, ranging from hundreds of millions to billions of users worldwide, has produced voluntary and co-regulatory approaches that rely on the technical capabilities of the social media platforms to implement policymakers' objectives. YouTube, Twitter, and Facebook use a combination of rules or guidelines and algorithmic automation to make decisions about user-generated content at scale. A substantial part of the data social media platforms rely on to moderate content comes from the platforms' users, especially through content flagging tools that allow users to report instances of fake news.
Therefore, a key consideration in evaluating policy proposals to regulate social media should be the quality of the user-data inputs on which regulation may rely for its implementation. I designed my research to examine the relationship between political group identification and the judgments users make about the political news stories they encounter on social media. My study assessed American participants' level of identification with political parties ahead of the 2020 U.S. election and then tasked participants with evaluating genuine and fictitious political news and rating how likely they would be to report the stories as false. I found that the most highly identified participants in each group tended to report as false genuine stories that denigrated their own side.
I further contend that proposals to regulate social media content should only proceed with a robust understanding of the trade-offs inherent in technical solutions to human circumstances. In addressing news content on social media, we should be wary about institutionalizing algorithms as the arbiters of truth, especially where they mimic and reinforce our group psychological tendencies.
| Item Type: | Thesis (Doctoral) |
|---|---|
| Faculty \ School: | Faculty of Social Sciences > School of Law |
| Depositing User: | Nicola Veasy |
| Date Deposited: | 11 Jul 2024 09:44 |
| Last Modified: | 11 Jul 2024 09:44 |
| URI: | https://ueaeprints.uea.ac.uk/id/eprint/95864 |
| DOI: | |