Workshop website

Draft skeleton structure for manifesto

Manifesto materials

Polis 1 | Polis 2

Contents of Notes (Notes from day 1 panel are under Keynote, not sure why they won't show on the TOC ☹️)

Day 1 (18 Nov)

We will introduce the workshop and why we are focusing on science communication and collective intelligence. We will hear from Prof Kai Spiekermann on the need for science communication and how it supports the pivotal role of knowledge in a functioning democracy. We will discuss the limitations of traditional models of science communication for reaching the public and policy-makers, and what collective intelligence has to offer. We aim to explore some examples of collective intelligence in science communication during the pandemic and learn from their lessons.

https://youtu.be/OyRtcHhxw58

Notes from Keynote

  1. The role of science in a democracy:

    1. Science and democracy
      • 2 systems of knowledge creation, each with distinct set of epistemic standards and norms, procedures and dynamics
        • a scientific system
        • public sphere
      • information exchange between these systems
        • controlled by gatekeepers (mainstream media)
        • selective info flow (those who are allowed to speak for scientists, picked for reasons that can be problematic)
        • characterised by scepticism about representativeness (public sceptical that those who speak are representative of science)
    2. Pathways to distortion of science:
      • Representation—the public sphere often draws on science by identifying and consulting “the expert”, who is often a single individual, specifically selected according to (opaque?) criteria.
        • Problem of undervaluing reports from single scientists
          • Does the audience believe they are representative of the whole field?
          • Single expert mental model: is the selected expert representative of a large majority, or a random draw from a divided group? If the public's working hypothesis is that science is divided, a genuinely representative scientist may not get recognised as such (and vice versa)
          • Problem for the public is that they don't know what unselected experts really think.
        • Problem of perceived cancelling
          • Media selects for "balance", but this can distort the representation of a large majority, such that the public falsely concludes that scientific opinion is divided.
          • "Balance" is taken to mean that different opinions are presented, with no weighting of how much support there is for each opinion.
        • Problem of aggressive debunking arguments
          • We have rational reasons to discard the views of others when we have evidence that (i) their views are biased; (ii) not based on evidence; (iii) based on a fallacy or systematic error (Ballantyne, 2015)
          • Especially in political debates. People are now quite good at debunking plausible views as being based on false evidence or systematic error.
          • Aggressive debunking is a concern if these strategies are used such that they (i) pertain to many or all beliefs the opponent holds; (ii) in settings in which such a wide debunking judgement is unjustified.
          • The aggressive debunker might have a mental model that the lone expert has true evidence but doesn't make it into the media, and the scientist communicator is drawn from a pool of experts with "false evidence".
            • Scientists may be discredited by stories about bias
            • Or about alleged ulterior motives, funding, etc.
      • Incentives
        • Problem of differentiation incentives
          • Scientists overemphasise differentiating themselves from one another; in this environment it is part of the business.
          • But this hampers science communication by failing to explain areas of consensus. When scientists engage in science communication, they publicly disagree over details but don't explain that they agree on most other points.
        • The incentive to predict rather than to explain
          • media and politicians request predictions from science
          • this can lead to too many predictions and not enough explanation in the discourse between the public and science
            • physics envy (predictive science)
            • biology/epidemiology are dangerous areas for predictions
              • better to engage with scenarios
            • Application to COVID pandemic
              • people were easily drawn into offering predictions without really explaining much
            • Better to talk about scenarios rather than talking about predictions (but this is hard when put on the spot!)
      • Strategic interactions
        • Strategising in the policymaker-scientist interaction
          • when scientists advise policymakers on problems that are characterised by
            • large risks
            • urgency to mitigate risks
          • then scientists may be tempted to give very specific policy recommendations
            • problem with this: we blur the boundary between giving advice (cause-and-effect considerations) and entering the field of normative judgements (prioritisation of urgency)
              • Important question: who is qualified to make these judgements about priorities?
                • E.g. SAGE was very careful not to make too many normative comments, but felt an urgency to be clearer and make their opinion known, mentioning certain options and specific recommendations
        • Journalistic selection problem
          • media actors do not choose scientists because they are the most competent or representative
            • rather, they choose content that increases audiences, circulation, and the likelihood of clicks
    3. Conclusion:

Notes from panel session

Inherent tension between open science and the need for gatekeeping. Science is better when it's diverse; the more disagreement, the better. How do we make use of the collective intelligence in that diversity without inviting too much white noise into the conversation?

Fundamentally, the principles of what science is supposed to be and the reality are entirely different.

  • Special interests co-opt the very nature and fundamental principles of science and its terminology to undermine that science in the public domain.

How can publishers handle this?