
How to deal with the datafication of our society?

Autonomy and data protection in research and teaching
15 April 2025
Oskar Gstrein
Oskar Gstrein, associate professor of Human Dignity in the Digital Age. (Photo: Piet Douma)

Our data usage is at an all-time high, but concerns about our online presence are also on the rise. Who governs the spaces we access online, and how do they do it? How can government regulation help manage the consequences of those governance decisions for the privacy and safety of internet users? These are the central questions in the research of Oskar Gstrein, associate professor of Human Dignity in the Digital Age.

Text: Britt Corporaal 

In the past few decades, one message has often been passed down to children: anything posted on the internet stays there forever. The idea behind this is that even if someone deletes a post, a copy may already have been made without the original user’s knowledge or consent. In response to this issue, the GDPR (General Data Protection Regulation) grants individuals the ‘right to be forgotten’. While this is more complicated in practice, the underlying idea is that there should be a mechanism giving internet users control over their personal information, so that something can be fully erased from the digital world.

Early in his career, it was this ‘right to be forgotten’ that inspired Dr Oskar Gstrein. During his PhD, he explored whether this should become a more universal human right. The question led Gstrein to more complex issues surrounding digital identities, and the expertise he gained brought him, as a senior researcher, to support the United Nations Special Rapporteur on privacy. Gstrein currently works at the University of Groningen’s Campus Fryslân as an associate professor of Human Dignity in the Digital Age, as the Programme Director of the Data Science and Society Bachelor’s programme, and as Theme Coordinator for Data Autonomy at the Jantina Tammes School.

Reconstructing the discourse

During his work with the Special Rapporteur, Gstrein co-organized workshops around the world to ask people about privacy and data protection. In these conversations, he noticed a trend: ‘There seems to be a hunger to use more data, but nobody knows how to do that in a way that allows us to respect individual rights or take into account what it does to people who live in groups that are easily stigmatized.’ We therefore need new ways of thinking about data protection and about our use of data in general.

Gstrein identifies two main problems with the current discourse. One is that data protection, as currently organized, requires data to be labelled statically: data must be assigned a specific purpose, and people are responsible for adhering to that purpose. For instance, when audio is recorded for an interview, the interviewer and interviewee have to agree on the audio’s purpose, and both are responsible for not using the audio for anything else, such as producing a podcast. As a result of this static labelling, reusing data becomes irresponsible, because the data would then serve purposes other than those originally intended. Yet many actors in the political and economic context are pushing for more flexibility in data reuse, creating tension with the legislative framework: reused data can train AI systems such as large language models, or enable other forms of innovation.
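The ‘static labelling’ described above can be pictured as data carrying a fixed list of agreed purposes that every later use must match. The following minimal Python sketch is purely illustrative (the class and names are invented here, not part of the GDPR or of any real compliance library):

```python
from dataclasses import dataclass, field


class PurposeViolation(Exception):
    """Raised when data is used for a purpose that was never agreed."""


@dataclass
class PurposeBoundRecord:
    """A piece of data tagged, at collection time, with its agreed purposes."""
    payload: str
    allowed_purposes: set = field(default_factory=set)

    def use_for(self, purpose: str) -> str:
        # Purpose limitation: any use outside the static label is rejected.
        if purpose not in self.allowed_purposes:
            raise PurposeViolation(
                f"'{purpose}' is not among the agreed purposes: "
                f"{sorted(self.allowed_purposes)}"
            )
        return self.payload


# The interview audio may be used for the interview only.
audio = PurposeBoundRecord("interview-audio.wav", {"interview"})
audio.use_for("interview")   # permitted: matches the agreed purpose
try:
    audio.use_for("podcast")  # reuse for a new purpose is blocked
except PurposeViolation as err:
    print("blocked:", err)
```

The sketch also makes the tension visible: allowing reuse (say, for training a model) would mean either widening `allowed_purposes` after the fact or bypassing the check, which is exactly the flexibility that current legislation does not easily accommodate.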

Another problem with the current discourse lies in data protection’s focus on the individual. Gstrein explains: ‘People who are very attuned to technology tend to forget its actual effect on society.’ Technological applications can cause serious harm to specific groups of people, as recently became apparent in the discrimination by automated decision systems in the Dutch childcare benefits scandal (‘toeslagenaffaire’). Because such threats do not relate directly to a single individual, it is harder to make a case for an individual’s right to defend themselves against them. The discourse around data protection therefore needs to be reconstructed to address these new threats.

Data Science and Society: a mission-driven programme

Gstrein explores this reconstruction in his different roles at the University, for example in the Bachelor’s programme Data Science and Society, where he conveys the urgency of these matters to students who will eventually enter the workforce. ‘All of our programmes at Campus Fryslân are mission-driven instead of discipline-based. What they have in common is that they look at global challenges and work on resolving them in local contexts. They provide local solutions to global challenges.’

In the Data Science and Society Bachelor’s programme, this works both ways. On the one hand, the programme puts the North of the Netherlands, and Leeuwarden in particular, on the map as a place to research these topics and to build a network between knowledge institutions and stakeholders: ‘We call this educating interdisciplinary bridge builders.’ On the other hand, the programme takes an approach to data science rooted in European and Dutch values, in which data science is informed by current societal concerns such as ethical, governance, and regulatory issues.

Cloud dependence & digital identities

While Gstrein thinks of his students as bridge builders, the term also applies to his own work: ‘The field I am working in is rapidly evolving. We do this very well at Campus Fryslân: we are always working on developing issues. Nobody has the answers on how to solve climate change, how to make the economy sustainable, or how to solve the issues surrounding data. These issues require insights from different fields.’

One of these developing issues is that of changing digital identities. Whereas recorded identities – think of your passport, driver’s licence, or birth certificate – are usually assigned by governmental institutions, in recent years the power to assign identities has also shifted to companies such as Meta and Microsoft, for example through data collected via social media accounts or email inboxes. We also increasingly rely on their cloud infrastructures to manage our digital identities: think of photos stored in the cloud of a big tech corporation, which are almost impossible to migrate away from even if you wanted to. Who assigns your identities, and who defines how they are constructed in digital space? When one person simultaneously has multiple identities, can these be connected, or should they be kept separate? These are the questions Gstrein hopes to answer: ‘How do we develop a value-based approach to dealing with this datafication of society, promoting human dignity on both individual and societal levels? I want the world I live in to be looking outwards, but also for it to be a place where engagement is based on values.’

More information

If you want to know more about current research on data autonomy, the Jantina Tammes School hosts the Data Autonomy project. The project has also created a Data Autonomy Index to open up the discussion about data autonomy within institutions.

More information on the research of Oskar Gstrein can be found on his staff page.

Last modified: 14 April 2025, 09.47 a.m.