Automated tackling of disinformation

Cover

Cover of the book 'Automated tackling of disinformation'

Metadata and description

Title
Automated tackling of disinformation
Editor
European Parliamentary Research Service
Publisher
Scientific Foresight Unit (STOA)
Date
2019
Language
English
ISBN
978-92-846-3945-8
Size
21.0 x 29.7 cm
Pages
116
Categories
Conference proceedings; Scientific Foresight Unit (STOA)

Table of contents

1. Problem Definition and Scope 1
  1.1. Public Perception of the Problem 2
  1.2. Definitions and Conceptual Frameworks 5
    1.2.1. Terminology 5
    1.2.2. Propaganda techniques 7
    1.2.3. Conceptual Framework 8
2. Social Platforms and Other Technological Factors Helping the Spread of Online Disinformation 10
  2.1. Social Platforms and Web Search Engines: Algorithms, Privacy, and Monetisation Models 10
    2.1.1. Fake Profiles and Groups 14
    2.1.2. Online Advertising and Clickbait 14
    2.1.3. Micro-Targeting and Third-Party Analysis of User Data 18
  2.2. Genuine Amplifiers: Online News Consumption Habits, Confirmation Bias, and Polarisation 21
  2.3. Fake Amplifiers: Social Bots, Cyborgs, and Trolls 22
  2.4. Artificial Intelligence, Synthetic Media, and “Deepfakes” 23
  2.5. Outlook 25
3. Technological approaches to fighting disinformation 27
  3.1. Fact Checking and Content Verification 28
  3.2. Detecting Computational Amplification and Fake Accounts 33
  3.3. Detecting Mis- and Disinformation Campaigns 35
    3.3.1. Agents: Source Trustworthiness and Information Laundering 35
    3.3.2. Message Credibility: Beyond Fact Checking and Content Verification 36
    3.3.3. Interpreters 37
  3.4. Malinformation: Hate Speech, Online Abuse, Trolling 38
  3.5. Accuracy and Effectiveness 39
4. Legal responses 43
  4.1. Self-regulation 43
    4.1.1. Risks and opportunities of self-regulation 44
  4.2. Co-regulation 44
    4.2.1. European Commission approach 44
    4.2.2. Belgian platform 45
    4.2.3. Denmark 45
    4.2.4. Opportunities and risks of co-regulation 46
  4.3. Classic regulation 46
    4.3.1. German regulation 46
    4.3.2. French regulation 46
    4.3.3. UK regulation 47
    4.3.4. Risks and opportunities of regulation 47
  4.4. Comparative approach to regulation 48
  4.5. National and International Collaborations 48
5. Social and Collaborative Approaches 50
  5.1. Media Literacy 50
  5.2. From Amplifiers to Filters: Citizens’ Role in Disinformation Containment 52
  5.3. Journalist-Oriented Initiatives 54
6. Initiatives Mapping 56
  6.1. Survey of initiatives 56
    6.1.1. Obstacles initiatives face in achieving their objectives 56
    6.1.2. Collaboration amongst the initiatives 57
    6.1.3. Legislation as a measure to fight disinformation 57
    6.1.4. Policy actions as a measure to fight disinformation 58
  6.2. Roadmap of initiatives 59
7. Case studies 60
  7.1. Case Study 1: The InVID verification plugin 60
    7.1.1. What is the InVID plugin? 60
    7.1.2. Who is using the InVID plugin? 60
    7.1.3. How is the InVID plugin being used? 61
    7.1.4. Using the InVID plugin to verify video footage 61
    7.1.5. Technical dependencies and limitations 62
  7.2. Case Study 2: Disinformation during the 2016 UK EU membership referendum 62
    7.2.1. Introduction 62
    7.2.2. Description of the dataset 63
    7.2.3. Russian involvement in social media during the referendum 63
    7.2.4. Russia-sponsored media activity in social media 64
    7.2.5. Impact of Russia-linked misinformation vs impact of false claims made by politicians during the referendum campaign 65
  7.3. Case Study 3: Mis- and Disinformation during the French elections: #MacronLeaks 65
    7.3.1. How did #MacronLeaks start? 65
    7.3.2. Sourcing #MacronLeaks 66
    7.3.3. How is sourcing different from fact checking? 68
    7.3.4. How does sourcing identify potential disinformation content? 68
    7.3.5. Why and how is sourcing useful? 69
8. Policy options 71
  8.1. Option 1: Enable research and innovation on technological responses 71
    8.1.1. Preserving Important Social Media Content for Future Studies 71
    8.1.2. Fund open-source and multidisciplinary research on automated methods for disinformation detection 72
    8.1.3. Measure the effectiveness of technological solutions implemented by social media platforms and news media organisations 72
    8.1.4. Outcomes for this option: Ethical implications of tech solutions 73
  8.2. Option 2: Improve the legal framework for transparency and accountability of platforms and political actors for content shared online 73
    8.2.1. Build a transnational legal framework and support strong privacy protection 73
    8.2.2. User-centric moderation and fiduciary responsibilities of social platforms 74
    8.2.3. Strengthening trust in public institutions and political discourse online 75
    8.2.4. Outcomes: a human rights approach to tech solutions 75
  8.3. Option 3: Strengthening Media and Improving Journalism and Political Campaigning Standards 76
    8.3.1. Support and promote high-quality journalism and political campaign standards 76
    8.3.2. Promote Fact Checking Efforts 76
    8.3.3. Outcome: fact-checking on its own is not enough to combat disinformation 77
  8.4. Option 4: Interdisciplinary approaches and localised involvement from civil society 77
    8.4.1. Support interdisciplinary approaches and invest in platforms for independent evidence-based research 77
    8.4.2. Empower civil society to multiply efforts 78
    8.4.3. Promoting Media Literacy and Critical Thinking for Citizens 78
    8.4.4. Outcomes for this option: the challenge of scaling up the action and overcoming cognitive bias 78
9. ANNEX I: Survey Questions 93
  9.1. Use of the information you provide 93
  9.2. Personal Details 93
  9.3. Participation in initiatives related to fake news, misinformation or disinformation 93
  9.4. Problem addressed 94
  9.5. Technical Solutions used 94
  9.6. Legislation related to fake news, misinformation or disinformation 95
10. ANNEX II: EU initiatives roadmap 96
11. ANNEX III: Initiatives in Member States roadmap 98