Social media has become a key source of online news consumption. At the same time, social media users are not passive news consumers: they can further distribute online information to their networks and beyond. It is easy, then, to see how information that is not factual, or that promotes hate and violence, can be amplified on social media.
This use of social media has a profound impact on the real world, as many recent events have shown. One such example is the deadly attack on the US Capitol in January 2021, in which social media played a major role. Misinformation regarding the transparency of the election process had been spread through social media, generating distrust and anger against the newly elected president. The riots are also reported to have been organised through groups on social media.
It is evident that social media misinformation can be harmful and, from a political perspective, can threaten democracy. Through our work in the EUNOMIA project, we adopted an interdisciplinary approach to examine political bias in engagement with false information.
EUNOMIA, a 3-year EU-funded Innovation project, aims to shift the culture in which we use social media by focusing on trust, nudging social media users to engage critically with online information before they react to it. To this end, it provides a toolkit that helps social media users assess the trustworthiness of information. Developing effective solutions requires understanding the human and societal factors of misinformation.
Our interdisciplinary approach
Within the project, our interdisciplinary team at Trilateral Research leads the work of understanding the social and political considerations in the verification of social media misinformation, and the findings directly feed into further development of the tools. Our approach involved three key stages:
- Stage 1 – TRI’s social scientists undertook desk-based research to understand the political challenges associated with verifying social media information. This provided insights into how political affinity can influence engagement with misinformation.
- Stage 2 – Social scientists conducted 19 interviews with citizens, traditional media journalists, and social media journalists. The interviews highlighted how the language used on social media can signal political bias, and that information and sources perceived as politically biased or radicalised are not considered trustworthy.
- Stage 3 – Building on the findings from Stages 1 and 2, Trilateral’s technical team undertook a social network analysis to gain insight into the role of political bias in engagement with misinformation on social media.
In Stage 3, the team examined a network of 579 influential Twitter accounts of UK Members of Parliament and a sample of 49 accounts labelled as sources of false information. Using UK politics as a case study enabled the technical team to complement existing, heavily US-focused research.
The analysis was conducted using a step-by-step approach.
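While the project’s exact pipeline is not detailed here, the core of such a network engagement analysis can be sketched in a few lines. The sketch below is purely illustrative: all account names, political labels, and interaction records are hypothetical, and it simply counts interactions with accounts labelled as false-information sources, grouped by the interacting account’s political leaning.

```python
from collections import Counter

# Hypothetical political leaning of a handful of MP accounts.
# In a real study these labels would come from party affiliation data.
leaning = {
    "mp_a": "Conservative",
    "mp_b": "Conservative",
    "mp_c": "Labour",
    "mp_d": "Labour",
}

# Hypothetical interactions (e.g. retweets or replies) between MP accounts
# and accounts labelled by fact-checkers as false-information sources.
interactions = [
    ("mp_a", "fake_1"),
    ("mp_a", "fake_2"),
    ("mp_b", "fake_1"),
    ("mp_c", "fake_2"),
]

# Aggregate engagement with false-information accounts by political leaning.
engagement_by_leaning = Counter(leaning[acct] for acct, _ in interactions)
print(engagement_by_leaning.most_common())
```

On this toy data, the tally shows more engagement from Conservative-leaning accounts, which is the kind of aggregate pattern (not the actual project result) that such an analysis surfaces.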
Within the UK context, the findings suggest that most of the accounts engaging with false information have a Conservative leaning. This can be explained in two ways:
- False information can be generated and spread mainly by Conservative-leaning accounts, or
- There is bias in the way fact-checkers label the false information accounts.
The insights emerging from Trilateral’s interdisciplinary approach can inform the design and development of tools for tackling misinformation. They also invite fact-checkers and data scientists to examine potential bias when labelling accounts as sources of false information. Furthermore, they contribute to media literacy by raising social media users’ awareness of trustworthiness assessment and of how they engage with online information.
The findings encourage social media users to examine the characteristics of accounts that generate and promote content, especially with regard to political bias.
For more information, please contact our team: