Summary of the public round table on Public Interest Research

More than four billion people around the world use social media and other digital infrastructures every day, and yet knowledge of how algorithms control information and communication on these platforms is limited.

It remains largely opaque which data is shared with which users, and on the basis of which decisions. The importance of scientific and legal access to data for producing knowledge in the public interest has become clear in light of various scandals, such as the Cambridge Analytica case.

On 24 February 2022, the University of Vienna’s Research Platform Governance of Digital Practices held a roundtable to discuss challenges for public interest research and future research priorities. The public round table was part of the Virtual Winter School “Taming the iMonster: Regulating digital platforms”. Bringing together experts from the social sciences, law and operations research, the discussion covered themes and topics relevant for science, policy and civil society. The panellists were John Albert, policy and advocacy manager at AlgorithmWatch; Wolfie Christl, digital rights activist and investigative researcher at Cracked Labs; Jonathan Gray, associate professor at King’s College London and co-founder of the Public Data Lab; Jürgen Pfeffer, professor of computational social science and big data at the Technical University of Munich; and Theresa Züger, head of the Public Interest AI research group at the Alexander von Humboldt Institute for Internet and Society. They discussed their experiences and visions for research in the public interest with Katja Mayer (Science and Technology Studies, University of Vienna) and Meropi Tzanetakis (Criminology, University of Innsbruck).

While there is no universal definition of “public interest research,” it can be broadly described as research that serves a public (rather than private) interest, whose beneficiaries are often unable to carry out research on their own behalf (e.g., establishing what counts as evidence, or finding evidence of corporate misconduct for a lawsuit), and whose resulting data and information and communication technologies may be made freely available under certain circumstances.

In the German-speaking world in particular, the term public interest research has yet to attract attention and find its definition. Public interest can be delineated from private, profit-maximizing interests, says Theresa Züger; it can thus be positioned either in opposition to or in conjunction with private interests, right at the interface between research, industry, and civil society, to build services and technologies for social good. In the field of Artificial Intelligence, for example, public interest research generates new knowledge for the creation and implementation of ethical AI and could help to establish monitoring and auditing processes. Of course, this kind of research is not limited to these topics, or even to the digital sphere alone. On the contrary, by using innovative methods and forms of collaboration, such as Citizen Science, problem areas can be understood in entirely new ways and approached differently.

The speakers at the roundtable debated the preconditions and resources needed for public interest research: from tapping open-source intelligence through meticulous online and offline research of publicly available information to reverse-engineering algorithms for public audits, these efforts should not remain in niches, nor be based solely on volunteering and precarious working conditions. Instead, if sustainably organised, public interest research offers many opportunities to systematically engage with pressing issues for the benefit of science, policy, industry and society.

 

Themes and topics covered by the round table include:

 

  • The notion of publics needs to be problematized beyond the law: what counts as public? Instead of taking the public for granted, we should look at the emergence of publics in relation to issues and in relation to how people are affected by algorithms, artificial intelligence, platforms, etc.
  • Democratic participation in the way technology is produced and implemented is needed, as are standards for designing development processes that serve the public interest.
  • A culture of collective learning and critical empirical research needs to be encouraged, including producing better factual representations, documenting how knowledge is produced and how expertise is configured, and examining the political implications of practices of power and intervention (e.g., introducing a measure or a tax).
  • Currently, Explainable Artificial Intelligence (XAI) is intended only for technology developers and expert users. The concept needs to be extended to include citizens in general, and non-profit research and advocacy organizations in particular, in order to communicate to people what technologies are doing and what implications they have.
  • Basic communication infrastructure such as social media platforms is in private hands, so technology companies need to be held accountable and supported in complying with regulations.
  • Access to data for research purposes is needed, while at the same time researchers should meet high regulatory standards, including procedural safeguards protecting sensitive data. This includes addressing who gets to be a public interest researcher: only academic researchers, or civil society organisations as well? Another aspect is the operationalisation of data access. In addition, a legal distinction should be made between commercial interests and public interest research.
  • Institutions and laws are needed that protect the public interest research (e.g., into how digital platforms work) of non-profit research and advocacy organizations like AlgorithmWatch against threats and legal action from big internet companies.
  • There might therefore be a need for a government institution, akin to the Robert Koch Institute (responsible for disease control and prevention), with legal access to the data and algorithms of technology companies. This would provide a semi-safe space for companies to meet with researchers and government officials in light of new accountability requirements, and would also increase credibility and corporate social responsibility.

 

Public Interest Research has the potential to engage technology companies, governments, researchers, and civil society in dialogue about urgent challenges arising from ongoing developments in information and communication technologies. Further discussion is needed on how to build infrastructures and institutions that better integrate top-down (e.g., legislation) and bottom-up (e.g., empowering users) approaches to advance knowledge production in the public interest. Finally, it will also be necessary, especially in Europe at this time, to study the new legislation on digital activities closely and to continuously analyse its impact from now on. Who will benefit from these new legal frameworks? How are behaviours and markets changing in response to the new regulation? Especially in connection with the pandemic, this could open up new fields of research.

Photo by Daria Nepriakhina on Unsplash.com