
Data Autonomy and Data Privacy Workshop at Noorderzon Festival

Date: 10 August 2023
Author: Dali Fekete

Introduction

I don’t care about my data, I’ve got nothing to hide!
We have probably all heard someone around us say this, or even said it ourselves before learning more about data privacy and data autonomy. Yet many people still turn a blind eye to privacy issues. In an era where information has become a commodity, we ought to protect ourselves (and those around us!) from the misuse of our private information. After all, there are certain things we would much rather keep hidden - if for nothing else, then because exposing these attributes, these 'secrets' if you will, would affect our lives to a great extent.

AI as a tool for discrimination

Imagine a scenario where being from Groningen would affect every aspect of your life: you pay more taxes and insurance than anyone else in the Netherlands, your studies and career advancement are significantly harder, you pay more for medical care, and you are more likely to be incarcerated - simply because you listed Groningen as your residence on official forms. From next year, however, geolocation and social media activity will also count: if you spend more than 35% of the year in the province, you will face the same consequences as those who officially register as residents.
Certainly, this is a bizarre example. We might believe that discrimination to this degree does not happen in the Netherlands. Yet we would not be entirely right.

Real life examples

On the one hand, we know that Target's infamous 2012 case of predicting teenage pregnancies was likely not true - but it still shed light on how a change in our consumption habits can signal larger changes in our lives. In that story, Target sent coupons for baby items to a high school girl. Her angry father demanded an apology from Target, only to learn later that his daughter was indeed pregnant.
It almost sounds like a Black Mirror episode - not too far from our current lives, yet unsettling to think about how much information we give away every time we scan our loyalty cards, and what that information could be used for. Stores likely know much more about us than we think, and this data may also be shared with other parties, for advertising purposes for example. Yet ordinary customers have hardly any say in this: one either scans and gets a discount, or keeps one's shopping data private and pays full price.

On the other hand, the recent fiasco with the UWV was very much real, and it revealed that welfare recipients were tracked via internet cookies for much longer than previously thought. By placing tracking cookies in beneficiaries' browsers - some of which were connected to their BSN numbers - UWV could gather information about its clients without needing them to log in via DigiD. One welfare applicant told NOS that he received a letter from UWV asking why he was traveling to Germany when he should be looking for a job in the Netherlands. UWV stopped using this system earlier this year after significant pushback. Still, its existence is proof that we Europeans are sometimes watched, too.
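How does cookie-based tracking like this work in principle? Below is a minimal, hypothetical sketch - not UWV's actual system; the cookie name, identifiers and server logic are invented for illustration - of how a website can assign a persistent identifier in a cookie and then recognise the same visitor on every later request, without any login at all.

```python
# Hypothetical sketch: re-identifying a visitor across requests via a
# persistent cookie, without any login. Not UWV's actual system.
from http.cookies import SimpleCookie
import uuid

COOKIE_NAME = "visitor_id"      # hypothetical cookie name
known_visitors = {}             # visitor_id -> list of pages seen (server-side log)

def handle_request(path, cookie_header=""):
    """Process one incoming request; return (response headers, visitor_id)."""
    cookies = SimpleCookie(cookie_header)
    if COOKIE_NAME in cookies:
        visitor_id = cookies[COOKIE_NAME].value      # returning visitor
    else:
        visitor_id = str(uuid.uuid4())               # first visit: assign an ID
    known_visitors.setdefault(visitor_id, []).append(path)
    # The Set-Cookie header makes the browser send the same ID back on every
    # later visit, so all of this person's page views can be linked together.
    headers = {"Set-Cookie": f"{COOKIE_NAME}={visitor_id}; Max-Age=31536000"}
    return headers, visitor_id

# Two requests carrying the same cookie end up in one profile:
hdrs, vid = handle_request("/vacancies")
hdrs2, _ = handle_request("/contact", cookie_header=f"{COOKIE_NAME}={vid}")
print(known_visitors[vid])      # ['/vacancies', '/contact']
```

If such an identifier is ever linked to a real-world number such as a BSN, the anonymous browsing profile becomes a profile of a named person - which is precisely what made the UWV case so troubling.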

Ethical implications

One interesting thing about 'big data', however, is that it makes it possible to identify patterns humans cannot find - it sometimes 'sees' patterns we do not even notice. And while acquiring the good-quality datasets that power these systems is hard, building AI that works with the data is relatively easy, especially for a big tech company.
How would you react if I told you there was an AI that could tell gay people apart from their heterosexual peers better than humans can? What are your opinions on creating and deploying algorithms like this? What are the ethical implications? Is this what we want AI to do?

Continue this conversation at Noorderzon?

If you are interested in having a conversation about AI ethics, privacy, data autonomy and data ownership, join our workshop, What Do We Want AI to Do? at Noorderzon on 20 August at the Nova stage. See you there?

Illustration: Dali Fekete

About the author

Dali Fekete
