A new study published by AlgorithmWatch in cooperation with the European Policy Centre and the University of Amsterdam’s Institute for Information Law (IViR) shows that the GDPR need not stand in the way of meaningful research access to platform data, and looks to the health and environmental sectors for best practices in privacy-respecting data-sharing frameworks.
In February 2020, after heated discussions between Facebook and the research community, the platform announced that it would release nearly a billion gigabytes of raw data through its partnership with Social Science One. The partnership, which allows select researchers to investigate the effects of social media on elections and democracy, was delayed by nearly two years after Facebook raised concerns about compliance with the EU’s General Data Protection Regulation (GDPR). However, a new AlgorithmWatch study examining best practices in research access frameworks shows that this claim, used by Facebook and other platforms in apparent attempts to thwart external scrutiny, doesn’t hold water. Drawing on examples from the environmental and health sectors, the study shows how data protection concerns can be mitigated through the introduction of mandatory data-sharing frameworks with an independent EU institution at their center.
“The Social Science One drama is emblematic of a much larger dilemma—it’s really just the tip of the iceberg,” says Jef Ausloos, a data protection expert and lead author of the study Operationalizing Research Access in Platform Governance: What to Learn from Other Industries? “As the COVID-19 crisis has shown, platform intermediaries play a central and ever-expanding role in modern society. They are inextricably linked to how we coordinate remotely at school or work, how we find and consume information, and how we organize our social movements or exercise key democratic rights.” “Due to their influence, scale and complexity, a wide range of research is necessary to understand their impact and hold them accountable,” write the authors, “but the concentration and privatization of data has a deep impact on independent investigations.”
The study’s findings come at a critical time. The European Commission is moving forward with its plans to increase and harmonize the responsibilities of online platforms and reinforce oversight over platforms’ content moderation policies through the announced Digital Services Act (DSA) package. The Commission is also considering special rules for large platforms with significant network effects, like Facebook or Google.
Noting that self-regulatory efforts have failed to bring about true accountability, the study argues that such large platforms in particular should be subject to binding disclosure requirements, in the same way that large polluters are required to report emissions data to member-state and EU-level authorities. Mindful of the sensitivity of certain kinds of data, as well as legitimate concerns about user privacy, the authors recommend that an independent EU body act as an intermediary between the disclosing corporations and the recipients of their data. Such an institution would maintain the relevant access infrastructures, including virtual secure operating environments, public databases, websites and forums. It would also play an important role in verifying and pre-processing corporate data to ensure it is suitable for disclosure.
“When we think about what transparency measures should look like for the DSA, we don’t need to reinvent the wheel,” says Mackenzie Nelson, project lead for AlgorithmWatch’s Governing Platforms Project. “The report provides concrete recommendations for how the Commission can design frameworks that safeguard user privacy while still enabling critical research access to dominant platforms’ data.”
Operationalizing Research Access in Platform Governance: What to Learn from Other Industries?
Jef Ausloos, Paddy Leerssen, Pim ten Thije
Mackenzie Nelson (AlgorithmWatch), Project Manager, Governing Platforms, firstname.lastname@example.org
Jef Ausloos, email@example.com