About

The manipulation of information and the dissemination of "fake news" are practices that trace back to the earliest records of human history and have long influenced societies and their power structures.

Yet the spread and impact of misinformation in an increasingly connected world call for immediate action. The acknowledged influence of social media on the results of the UK’s Brexit referendum and Donald Trump’s election in the US, for example, has demonstrated the magnitude of the power the online world holds to transform reality.

Both misleading information, so-called misinformation, and disinformation, which is created deliberately to confuse or manipulate people, harm individuals and societies every day by threatening democratic political processes and distorting the values that shape public opinion in sectors such as health and science, now on a global scale.

The fact that this wicked problem is now deeply intertwined with digital and social media gives Human-Computer Interaction (HCI) researchers and practitioners a responsibility to join this fight. However, neither computer-based solutions, nor journalists and fact-checkers, policymakers, educators, or a few empowered citizens will succeed on their own.

A key aim of this workshop is to promote innovative, concrete research, potentially of an interdisciplinary nature, that focuses on technologies, critiques, techniques, and contexts for digital solutions that establish fundamental limits to misinformation.

A longer-term goal is to build a community around relevant topics and research on how misinformation resilient societies can be designed. From an HCI perspective, we hope to impact society through the design and development of socio-technical systems that respond to the social media context and its current struggles between what is considered fact and fiction.

A social issue with multiple perspectives

Note: Although the term 'fake news' has been used extensively, it has multiple and controversial meanings. For this reason, official stakeholders have suggested avoiding it in research and policies referring to information disorder. As a simplification, we use misinformation to represent the complexity of the problem of information disorder, which includes both misinformation and disinformation.

Who we are

Lara Schibelsky Godoy Piccolo is a Research Fellow at the Knowledge Media Institute at the Open University. Lara investigates interaction design from a socio-technical and inclusive perspective, considering how technology can trigger a positive impact on people’s lives. Community engagement, motivations and values are important drivers of her research. She is currently investigating how voice-based systems can be used to raise awareness of misinformation. Lara is also an Associate Lecturer in Interaction Design and User Experience.
Somya Joshi is an Associate Professor at Stockholm University with expertise in the field of Sustainable Human-Computer Interaction (SHCI). Her specialization falls within the applied context of technological innovation, particularly in how it translates into transparency in governance, environmental conservation and citizen engagement. She has experience working with a range of partners from academia, industry, NGOs, as well as international development organizations towards the common goal of facilitating inclusive development. Currently, Somya is Head of Research at eGovernance-Lab at the Department of Computer Systems Science (DSV) at Stockholm University.
Evangelos Karapanos is an Assistant Professor at the Cyprus University of Technology where he directs the Persuasive Technologies Lab. Evangelos’ expertise is in experience-centered design of interaction with technology. His ongoing work explores technology-mediated nudging interventions for misinformation-resilient societies.
Tracie Farrell is a Research Associate at the Knowledge Media Institute at the Open University in the UK. Tracie’s professional background and research interests focus on awareness and reflection in learning. In particular, she examines how technology can trigger metacognitive activity.

Goals

The workshop aims to promote innovative and concrete research, of an interdisciplinary nature, focusing on technologies, critiques, techniques, and contexts for digital solutions that establish fundamental limits to misinformation. More specifically, the workshop will engage the participants in:

  • Discussing challenges, obstacles or problems related to misinformation
  • Challenging current perceptions and their limitations
  • Mapping stakeholders, questioning the relationships between them
  • Co-creating future scenarios where digital platforms would support misinformation resilience
  • Identifying criteria for assessing the potential of the innovation for impact

As a longer-term goal, the workshop aims to build a multidisciplinary research community focused on the design of misinformation resilient societies.

A social issue with multiple perspectives

Call for Papers

We invite position papers for the CHI 2019 Workshop on Exploring the Limits of Misinformation. This one-day workshop will offer an interdisciplinary forum for researchers and practitioners who aim to actively challenge the limits of current approaches to tackling misinformation and to foster the design of misinformation resilient societies.

Interaction designers, journalists, educators, policymakers, or other related stakeholders are invited to submit a position paper describing their approach towards fighting misinformation, the acknowledged limits of that approach, and how they envision a future in which societies are resilient to mis/disinformation.

Topics of interest include, but are not limited to: socio-technical empirical studies, motivational studies, human values, persuasive technology, games, gamification, information and media literacy, fact-checking, social media policies and regulation, automated tools for misinformation detection and notifications, legal and ethical aspects.

Submissions will be reviewed by the workshop organizers based on their relevance to the problem and their motivation to advance towards a more systemic socio-technical solution.

Papers should be 2 to 4 pages in the CHI Extended Abstracts format and submitted via the EasyChair system.

Accepted papers will be available on the workshop website. Workshop outcomes will be presented as a poster at the main conference and potentially published as an article in ACM Interactions. The possibility of a special journal issue will be discussed with the participants.

Participants will be invited to actively engage in a series of co-creation activities for:

  • Discussing challenges, obstacles or problems related to misinformation
  • Challenging current perceptions and their limitations
  • Mapping stakeholders, questioning the relationships between them
  • Co-creating future scenarios where digital platforms would support misinformation resilience
  • Identifying criteria for assessing the potential of the innovation for impact

Submission date: 12-Jan-2019

Notification date: 01-Mar-2019

How to submit

A link to Easychair will be provided soon.

Programme

This one-day workshop will offer an interdisciplinary forum for researchers and practitioners who aim to actively challenge the limits of current approaches to tackling misinformation and to foster the design of misinformation resilient societies.

Workshop Agenda

Ice-breaker and introductions

Setting the Stage: Inspirational talk

Mapping the Terrain: Mapping stakeholders

Role Play: Brief exercise to change perspectives, lenses and orientations

Lunch break

Lightning Talks: Short presentations by participants on their areas of research

Future Making: Exercise to identify criteria for assessing the potential impact of an innovation, such as a fact-checking tool or a technical platform that would support misinformation resilience.

Wrap up and Next Steps

The inspirational speaker

Allan Leonard is co-founder and editor-in-chief at FactCheckNI. He is a peace journalist and editor-in-chief of Shared Future News, which reports on peacebuilding in Northern Ireland. He served as managing director at the Northern Ireland Foundation, an independent charity established to promote peace and reconciliation, where he also gained international experience supporting programmes in other divided societies. His professional background includes project and staff management, fundraising, marketing and communication.