About

Tools and online services are continuously being developed to support social media users, journalists, and policymakers to limit the harm caused by misinformation.

As we have experienced, the expected impact of online solutions is often hampered by limitations such as a lack of explainability, complex user interfaces, limited datasets, restricted accessibility, and biased algorithms, among other issues that can confuse, overwhelm, or mislead users. These limitations carry ethical implications that are typically neglected when new digital solutions to tackle misinformation are conceived. As a wicked problem, limiting the harm caused by misinformation requires bringing multiple perspectives to the design of digital interventions, including an understanding of human behaviour and motivations in judging and promoting false information, as well as strategies to detect and stop its propagation without unduly infringing on rights or freedom of expression.

This hands-on workshop unpacks the state of the art in social, societal and political studies, as well as socio-technical solutions to stop misinformation. Participants will first critically reflect upon the limitations of existing approaches, and then co-create a future that integrates these perspectives, focusing on ethical aspects and societal impact.

A socio-technical issue with multiple perspectives

Among other social transformations, responses to the COVID-19 pandemic have modified the workflow adopted by major social platforms to assess misinformation, shifting from human judgement to heavy reliance on automated, machine learning-based detection. If on the one hand this shift copes better with the volume and speed of information propagation online, on the other hand it exposes the fragility and limitations of existing technical solutions, even those that include humans at multiple places in the loop.

Conversely, some social platforms have also started challenging power structures, for example by promptly and boldly deleting posts by world leaders that violated their policies on disinformation regarding COVID-19. These ad-hoc approaches to regulating speech may have unintended consequences for overall trust in the governance of media and authority of any kind.

This recent global scenario clearly shows that purely technical solutions are not only limited but can also be harmful by obfuscating the problem, particularly when they disregard the social structures that lead to the spread of misinformation, ignore how differently societal groups are affected (for instance, according to digital skills), and provide information assessment without promoting digital literacy.

More comprehensive answers to the problem can only emerge through an articulation of diverse perspectives and ideas, requiring an interdisciplinary approach that includes social scientists, computer scientists and technology designers in the co-creation of features and delivery methods considering both local and global contexts.

Who we are

The team of organisers brings together three EU-funded projects on misinformation: Co-inform, EUNOMIA and HERoS. With similar socio-technical approaches but distinct perspectives, the four organisers share the common challenge of building technology to support credibility assessment and foster critical thinking and media and information literacy.

Lara Schibelsky Godoy Piccolo

Lara Schibelsky Godoy Piccolo is a Research Fellow who investigates interaction design from a socio-technical and inclusive perspective, considering how technology can trigger a positive impact on people's lives. For Co-inform, she is investigating how to communicate credibility signals.

Tracie Farrell

Tracie Farrell is a social scientist studying the proliferation of malicious content online, including misinformation and hate speech, at The Open University (as part of the HERoS project). With a background in critical education, her research interests focus on how technology can be used to promote critical awareness and reflection, at individual and societal levels.

Pinelopi Troullinou

Pinelopi Troullinou is a Research Analyst with an interdisciplinary background in the social sciences. Her work focuses on the intersection of technology and society, aiming to explore and foster awareness of socio-technical issues such as privacy. For EUNOMIA, she leads the privacy, social and ethical impact assessment, user engagement, and liaison with stakeholders.

Diotima Bertel

Diotima Bertel is a Researcher and Project Coordinator with a background in social sciences and philosophy. In her work, she focuses on the individual, social and societal impacts of technology. In EUNOMIA, she is responsible for analysing user behaviour, as well as pilot evaluations.

Goals

The goal of the workshop is to propose an agenda for interdisciplinary research that incorporates robust knowledge of societal, political and psychological models, which can help not only to explain reactions to the tools that we build, but also to design them to be useful and ethical.

Participants will be invited to:

  • Discuss challenges and obstacles related to misinformation from human and socio-technical perspectives;
  • Challenge existing approaches to tackling misinformation and identify their limitations in socio-technical terms, including underlying assumptions, goals (e.g., preventing vs. correcting misinformation) and targeted users;
  • Co-create innovative future scenarios with socio-technical solutions.

Aiming for a small and diverse group of participants from different disciplines to foster interaction and the exchange of ideas, the agenda proposes engaging activities that challenge the status quo and promote creative thinking towards effectively advancing the state of the art. Participants will be encouraged to experience the workshop as an opportunity to initiate synergies.

Examples of challenges and obstacles related to source data

Misinformation Challenges - Things to start striving for now!

Call for Papers and Participation

We invite researchers and practitioners aiming to actively engage with the social, societal and ethical problems of current socio-technical solutions to tackle misinformation to join us at the CHI 2021 Workshop - Opinions, Intentions, Freedom of Expression, ... , and Other Human Aspects of Misinformation Online.

Information disorder online is a multilayered problem: there is no single, comprehensive solution capable of stopping misinformation, and existing approaches are all limited for different reasons, such as poor end-user engagement, inadequate interaction design, or lack of explainability. Purely technical solutions that disregard the social structures that lead to the spread of misinformation can be harmful by obfuscating the problem.

More comprehensive, ethical and impactful answers to the problem can only emerge through an interdisciplinary approach that includes computer scientists, social scientists and technology designers in the co-creation of features and delivery methods. Topics of interest include, but are not limited to:

  • Censorship and freedom of speech
  • Social and political polarisation, partisanship
  • Disinformation campaigns and propaganda
  • Conspiracy theories and rumours
  • (Limitations of) Automated tools for misinformation detection and notifications
  • Nudging strategies and persuasion
  • Social network analysis
  • Impact on communities or social groups
  • Fact-checking
  • Explainable AI
  • Credibility of online content
  • Behavioural studies
  • Human values
  • Legal and ethical aspects of socio-technical solutions

Participants will actively engage in activities for:

  • Identifying challenges and obstacles related to misinformation from human and socio-technical perspectives;
  • Challenging existing approaches and identifying their limitations in socio-technical terms, including underlying assumptions, goals, and targeted users;
  • Co-creating innovative future scenarios with socio-technical solutions addressing impact and ethical aspects.

How to submit

Prepare a motivation statement describing your approach towards fighting misinformation, its acknowledged limits, or an envisioned future in which societies are resilient to mis/disinformation supported by socio-technical solutions. The 2-4 page paper should follow the ACM Master Article Submission Template. Send your submission to the email: Misinformation 2021 Group.

Submissions will be reviewed by the organisers according to their relevance to the problem and their motivation to address ethical aspects and societal impact. Accepted papers will be published on the workshop website.

Important Dates

First Submission Date: 7th February 2021 (for early bird registration)

Notification: 21st February 2021

Second Submission Date: 15th March 2021

Notification: 1st April 2021

Workshop date: 7th May 2021 CET 15:00-19:00 / EST 09:00-13:00 / JST 21:00-02:00 (next day).

For more information, email the Misinformation 2021 Group

Workshop Programme (15:00 - 19:00 CET)

Workshop introduction

Community and Values Cluster (15:20 - 16:15)

The Gossip Economy of Online Social Media

Immigrant families' health-related information behavior on instant messaging platforms

Surfacing Trust, Assessment, and Provenance for Better News Sharing

Fighting Beliefs: Cognitive and logical problems with how we believe and what we can do about that

Discussions

15 Minute Break

Policies and Regulations Cluster (16:30 - 17:00)

Disinformation Hub

The Unanticipated Use of Platforms in Disseminating Misinformation

Discussions

Technical Cluster (17:00 - 17:45)

Helping People Deal With Disinformation - A Socio-Technical Perspective

Harmonization Challenges in Data Collection of COVID-19 Misinformation

Challenges in Automated Detection of COVID-19 Misinformation

Discussions

15 Minute Break

Future Making (18:00 - 19:00)

How to achieve more ethical and impactful socio-technical answers? Hands-on. Looking at tools and defining priorities.

Building an outlook and workshop outcome

Wrap-up: round of thoughts


Accepted Papers

The Gossip Economy of Online Social Media

Brett Bourbon - Constantin College of Liberal Arts, University of Dallas
Renita Murimi - Gupta College of Business, University of Dallas

Immigrant Families' Health-Related Information Behaviour on Instant Messaging Platforms

Lev Poretski - University of Haifa
Taamannae Taabassum - University of Toronto
Anthony Tang - University of Toronto

Helping People Deal With Disinformation - A Socio-Technical Perspective

Hendrik Heuer - University of Bremen

Fighting Beliefs: Cognitive and logical problems with how we believe and what we can do about that

André Martins - University of São Paulo

The Digital Disinformation Hub

Clara Iglesias Keller - Leibniz Institute for Media Research - Hans-Bredow-Institute, Hamburg

Harmonization Challenges in Data Collection of COVID-19 Misinformation

Joshua N. Grant - Oak Ridge National Laboratory
Varisara Tansakul - Oak Ridge National Laboratory
Bryan M. Eaton - Oak Ridge National Laboratory
Gautam Thakur - Oak Ridge National Laboratory
Martin Smyth - Stony Brook University
Monica Smith - National Geospatial Intelligence Agency

Challenges in Automated Detection of COVID-19 Misinformation

Drahomira Herrmannova - Oak Ridge National Laboratory
Olivera Kotevska - Oak Ridge National Laboratory
Jordan Burdette - Oak Ridge National Laboratory
Gautam Thakur - Oak Ridge National Laboratory
Joshua N. Grant - Oak Ridge National Laboratory
Varisara Tansakul - Oak Ridge National Laboratory
Bryan M. Eaton - Oak Ridge National Laboratory
Martin Smyth - Stony Brook University
Monica Smith - National Geospatial Intelligence Agency

The Unanticipated Use of Platforms in Disseminating Misinformation

Ava Lew - University of Toronto

Surfacing Trust, Assessment, and Provenance for Better News Sharing

Farnaz Jahanbakhsh - Massachusetts Institute of Technology Cambridge
Amy X. Zhang - University of Washington
David R. Karger - Massachusetts Institute of Technology Cambridge