Thematic report

Artificial Intelligence, Information Integrity and Democracy (chapter 3)

This report examines research on the properties of AI systems (specifically machine learning algorithms) and their embeddedness in online content governance systems.
How is ‘artificial intelligence’ (AI) defined, and what are the relationships between AI systems development and internationally protected human rights?
What are the interdependencies between AI systems development, the use of automated tools and democratic processes?

The analysis covers the relationships between AI systems and human rights, the use of AI systems in content governance (content generation and moderation), and how these developments relate to changes in democracy, societal resilience and cohesion.

It shows how:

  • States need to ensure that human rights protection frameworks are adapted to the challenges posed by new technologies, their actors and power dynamics.
  • AI cannot be free of bias, hence the need for transparency and specificity in evaluating algorithms and models, and for regulating algorithmic personalization systems.
  • Solutions to rights violations by AI-driven content governance require multifaceted, ethical approaches that address the root causes of polarization, and legal frameworks that define illegal content and enforce transparency.
  • Research must prioritize understanding regional applications of human rights law, improving data diversity, conducting independent algorithmic audits, and addressing AI divides.

The report also highlights the contribution of AI systems to changes in information ecosystems with a section on AI in the news media industry. The governance of legacy and online news media is examined in chapters 6 and 7 of this report, and the role of non-mainstream news media is examined in chapter 8.

Thematic reports

Other chapters

Information Ecosystems and Democracy (chapter 1)

This chapter begins with an introduction that frames the central themes of the report, covers key concepts and definitions, delves into the challenges facing democracies with a focus on mis- and disinformation, acknowledges the report's limitations, and provides an outline of the report.

News Media, Information Integrity and the Public Sphere (chapter 2)

What does research tell us about changes in legacy and online news media and what can be done to promote information integrity and a democratic public sphere?

  • The analysis shows how platform dominance and advertising market concentration impact news media finances and public trust, with variations across countries.
  • It highlights inconsistent findings on mis- and disinformation and urges strengthening news organizations’ bargaining power.
  • The study emphasizes government roles in information manipulation and the protective role of filter bubbles for marginalized groups. 
  • It calls for global studies on media trust, polarization, and news sustainability.

Big Tech and Governing Uses of Data (chapter 4)

What does research tell us about the power of big tech companies, approaches to governing data extraction and use, and their influence on political deliberation?

  • This report reveals injustices associated with the interplay of data extraction and data brokering.
  • It underscores the role of monopolistic actors and digital platforms in shaping data production and governance, replicating injustices and exacerbating inequalities.
  • It explores how permissive legislation and platforms’ business models fuel mis- and disinformation, global data dependencies, and exploitative online labor markets.
  • It highlights resistance strategies from Global Majority World countries while addressing their vulnerability to misinformation campaigns and the harmful consequences of datafication, and calls for more research into these phenomena.

 

Awareness of Mis- and Disinformation and the Literacy Challenge (chapter 5)

Governing Information Ecosystems: Legislation and Regulation (chapter 6)

Combating Mis- and Disinformation in Practice (chapter 7)

What does research tell us about specific measures to combat mis- and disinformation by civil society organizations and governments?

  • This report highlights the risks to human rights posed by some measures against mis- and disinformation, emphasizing the need for diverse, context-sensitive approaches rather than a single solution. 
  • It calls for balancing economic growth, innovation, and human rights protections while avoiding regulatory overreach, particularly by authoritarian regimes. 
  • It underscores the limitations of overrelying on technical tools and stresses the need for adaptable practices like fact-checking. 
  • It also highlights global differences in protecting press freedom and countering disinformation, urging research with real-world data beyond the Global North, mixed methods to capture diverse experiences, and monitoring of platform practices that suppress dissenting voices.

Artificial Intelligence, Information Integrity and Democracy (chapter 3) – Interactive Map

Developed by the OID using GarganText, in partnership with the CNRS Institute for Complex Systems.

This map represents a statistical summary of the thematic content of the report. The network graph represents relations between the words in the report, placing words closer together the more strongly they are related. The bigger the node, the more frequently the word appears, signalling its role in defining what the report is about. Each color groups words that are closely related to each other and can be interpreted as a topic.

The map is generated by the OID using GarganText – developed by the CNRS Institute for Complex Systems – on the basis of the report’s text. Starting from a co-occurrence matrix generated from the report’s text, GarganText forms a network where words are connected if they are likely to occur together. Clustering is conducted using the Louvain community detection method, and the visualisation is generated using the Force Atlas 2 algorithm.
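The same pipeline can be sketched in a few lines of Python. The snippet below is a minimal illustration, not GarganText's actual code: it uses networkx for the graph and its built-in Louvain implementation, substitutes networkx's spring layout for Force Atlas 2, and runs on placeholder sentences rather than the report's text.

```python
# Illustrative sketch only: build a word co-occurrence network from text,
# cluster it with Louvain community detection, and compute a force-directed
# layout. This is not GarganText's implementation; it only mirrors the steps
# described above. Requires networkx (>= 2.8 for louvain_communities).
import re
from collections import Counter
from itertools import combinations

import networkx as nx
from networkx.algorithms.community import louvain_communities


def cooccurrence_graph(sentences, min_count=2):
    """Connect two words with a weighted edge if they co-occur in at least
    `min_count` sentences (a toy stand-in for a co-occurrence matrix)."""
    pair_counts = Counter()
    for sentence in sentences:
        words = sorted(set(re.findall(r"[a-z]+", sentence.lower())))
        pair_counts.update(combinations(words, 2))
    graph = nx.Graph()
    for (w1, w2), count in pair_counts.items():
        if count >= min_count:
            graph.add_edge(w1, w2, weight=count)
    return graph


# Placeholder sentences standing in for the report's text.
sentences = [
    "AI systems shape content moderation on online platforms.",
    "Content moderation by AI systems affects human rights.",
    "Human rights frameworks should govern AI systems and platforms.",
]

graph = cooccurrence_graph(sentences, min_count=2)

# Louvain clustering: each community of words can be read as a "topic".
topics = louvain_communities(graph, weight="weight", seed=42)

# Force-directed layout (spring layout as a stand-in for Force Atlas 2).
positions = nx.spring_layout(graph, weight="weight", seed=42)

for i, topic in enumerate(topics):
    print(f"topic {i}: {sorted(topic)}")
```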