Misinformation, communication, and challenges: how technology can be used to support misinformation correction within UK family networks

  • Lauren Scott

Abstract

As society moves into an increasingly digital world, individuals are regularly exposed to false information, which can ultimately change behaviour and identity, with detrimental effects on health and well-being. Research to date has focused on platform-based interventions to correct misinformed beliefs. This thesis explores how technology can support trusted individuals (such as family members) in challenging misinformation, as they are in a unique position to do so.
This thesis comprises five studies, with a total of 163 participants, exploring the role that technology can play in supporting discussions about misinformation. The first study uses interviews to explore how misinformation is discussed within a family setting, outlining specific motivations, barriers, resources, coping mechanisms, and outcomes related to these discussions. The second study uses a survey built around fictional scenarios to explore how relationship tie-strength and cultural background shape these interactions; both are shown to influence how individuals behave. The third study examines culture and misinformation challenging through a further survey, finding that existing cultural metrics may not apply to this setting, and that seeking support from others can itself spread misinformation further. The fourth study explores how technology can support these discussions, showing that although technology can assist, many additional social constructs have a role to play. The final study explores how people describe different types of false information, identifying steps to be taken in future misinformation research.
This thesis makes contributions to knowledge, design, and research methods. Firstly, it contributes empirical understanding of how misinformation discussions are undertaken in practice, how relationship tie-strength and cultural background affect these discussions, and how technology can support social misinformation correction. Secondly, it contributes design recommendations for creating support tools that cue social correction. Finally, it recommends changes to research approaches, including a need to understand how participants themselves define misinformation.
Date of Award: 27 Feb 2025
Original language: English
Awarding Institution
  • Northumbria University
Supervisors: Marta Cecchinato (Supervisor), Mark Warner (Supervisor), Sheep Dalton (Supervisor) & Lynne Coventry (Supervisor)

Keywords

  • False information
  • Propaganda
  • Disinformation
  • Fake News
  • Close-ties
