Lustrum: making connections
Coaster campaign

Are drugs legal in the Netherlands?

As read on a coaster
Answer by Michelle Bruijn, associate professor at the Faculty of Law of the University of Groningen

Contrary to what many people think, drugs are not legal in the Netherlands. However, we are tolerant of soft drugs, such as cannabis. The government tolerates the possession of five cannabis plants and five grams of soft drugs, and cannabis cafés (coffee shops) are allowed to sell small quantities under strict conditions.

Dutch tolerance of soft drugs is known as the policy of tolerance. Under this policy, certain offences under the Opium Act are not actively prosecuted, such as possession of a user quantity of drugs: five grams of soft drugs, five cannabis plants, half a gram of hard drugs, or one pill. One of the best-known parts of the policy of tolerance is the sale of small amounts of cannabis in coffee shops. Coffee shops are allowed to sell a maximum of five grams of cannabis per person per day to people over the age of 18, and to stock a maximum of 500 grams of cannabis. The policy of tolerance was introduced in the 1970s to separate the markets for hard and soft drugs and thereby reduce the health risks of drug use, reduce crime, and prevent public nuisance.

But there is a strange side to this policy of tolerance. Sales in coffee shops are tolerated, but growing cannabis is prohibited. This means that coffee shop owners are currently forced to stock their shops through illegal channels. For years, a debate has been going on in the Netherlands about whether cannabis cultivation should be legalized. Proponents argue that legalization protects health because it makes it easier to monitor quality, and that it reduces crime because coffee shop owners no longer need to deal with criminals. Opponents, on the other hand, say that legalization poses health risks and leads to an increase in drug use.

In 2020, the government launched an experiment called ‘the closed coffee shop chain’. This is an experiment in 11 municipalities aimed at investigating whether it is possible to control the cultivation and supply of cannabis to coffee shops. Legislation has been amended for this purpose, so that production, distribution, and sale within the 11 municipalities are no longer subject to prosecution. The experiment had a long preparation phase, but the first regulated cannabis has been on sale in coffee shops in Breda and Tilburg since the end of 2023. The other participating municipalities will follow during 2024.

Although the Netherlands was long known for its liberal soft drug policy, other countries have since overtaken us. Cannabis cultivation, for example, has been legalized in several US states, as well as in Canada and Uruguay. Our neighbours to the east have also taken an important step towards a more liberal cannabis policy: since 1 April 2024, adults in Germany have been allowed to possess 50 grams of cannabis and grow three plants for their own use.


Who is Michelle Bruijn?

Michelle Bruijn is an associate professor at the Faculty of Law of the University of Groningen. Her main research areas are public order, security, and health, with a particular focus on the regulation of cannabis, tackling drug-related (organized) crime, drug-related home evictions, and protecting the right to housing.

Want to read more science quotes?

Am I being unwittingly politically influenced?

Political parties use personalized ads on social media to get your vote: so-called microtargeting. This means that you might get to see a different message than your neighbour. While this is not done as surreptitiously here as in the United States (think Cambridge Analytica), regulation may still be wise to ensure a transparent debate.

Political parties will stop at almost nothing to win votes. These days, personalized ads on social media are one way in which they do this. The underlying technique is known as microtargeting: using data analytics to present a tailored campaign message to a specific target audience. For example, a senior citizens’ party might specifically target the over-50s with a message about the desirability of a new pension system. In this way, there is something for everyone.
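To make the idea concrete, the sketch below shows, in Python, the kind of audience selection that underlies microtargeting. It is purely illustrative: the voter profiles, field names, and selection criteria are hypothetical and not drawn from any real campaign tool.

# Minimal, hypothetical illustration of microtargeting: selecting an
# audience segment from fictional voter profiles and attaching a
# tailored campaign message to it.

voter_profiles = [
    {"id": 1, "age": 67, "interests": ["pensions", "healthcare"]},
    {"id": 2, "age": 24, "interests": ["housing", "climate"]},
    {"id": 3, "age": 55, "interests": ["pensions", "taxes"]},
]

def select_segment(profiles, min_age, interest):
    """Return the profiles that match a simple age-and-interest filter."""
    return [p for p in profiles if p["age"] >= min_age and interest in p["interests"]]

# A fictional senior citizens' party targets the over-50s who care about pensions.
segment = select_segment(voter_profiles, min_age=50, interest="pensions")
message = "Our party will protect your pension."

for voter in segment:
    print(f"Show ad to voter {voter['id']}: {message}")

Real microtargeting systems operate at a far larger scale and with much richer data, but the principle is the same: different groups see different messages.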

Since the scandal surrounding the British data analytics firm Cambridge Analytica, concerns about microtargeting have increased. The company allegedly leveraged the personal data of millions of unwitting Facebook users to take advantage of the psychological characteristics of voters, providing services to both Trump and the Brexiteers. In the Netherlands, no such covert influence operations have yet taken place, for several reasons. For instance, political parties in the Netherlands have modest campaign coffers compared to those in the United States, there are strict European data protection rules, and one vote in our multi-party system does not make the difference that it does in the duel for the White House. So smaller budgets, limited opportunities, and reduced incentive. No need for panic then.

Nevertheless, targeted political ads on social media are also a popular tool in the Netherlands. This does carry risks: there is a danger that voters will only hear one side of the story, and that political opponents will not have the opportunity to contradict it. Part of the debate is therefore conducted behind the scenes. In the worst case, conflicting promises may even be made to different voters that cannot be kept at the same time. Microtargeting can thus encourage the deception of voters.

The question is, therefore, whether the government should introduce rules to regulate this. These should not go so far as to impact the freedom of expression of political parties, and so restraint is called for, but at the same time, the code of conduct jointly adopted by political parties and platforms seems a bit too informal. It is up to the legislator to find an effective solution that also respects fundamental rights. In any case, transparency requirements could be considered: microtargeting is allowed, as long as parties are honest about it. It should then be clear to voters who is reaching out to them, why, and at what cost. However, the announced Dutch law may come a little late – the European Union has moved a lot faster in this area.

Should the government do more to protect my data?

In a world increasingly shaped by technology, governments must step up their efforts to secure and enforce the protection of your data. As digital divides emerge, it is vital to ensure fairness in the distribution of resources and power. Laws help embed societal values in technology, foster responsible innovation, and enhance broad prosperity.

Technology and data play an increasingly important role in our lives. The speed with which cutting-edge applications based on artificial intelligence and similar breakthroughs become part of our everyday lives is hard to keep up with. This also has an impact on the make-up of our society – there are those among us who enjoy the benefits and opportunities of new devices and services, but there are also those who prefer to stick to the old ways of doing things.

In the public sector, this division has implications for how we interact with the government, for example by using digital identity services such as DigiD, or by filing our income tax online. In the private sector, the last three decades have seen the meteoric rise of a handful of companies that dominate the digital society, cloud infrastructures, or the machines needed to make computer chips. All of this has implications for the distribution of power and our autonomy, for who has influence and who does not, for how wealth is generated and distributed, and for the prospects we have as individuals and as a society.

Laws and regulation are one way of responding to these developments. But they can also be used to shape a better future. In democratic countries (those that respect human rights and the rule of law), they allow for a legitimate, multi-stakeholder debate that leads to binding rules. These rules define clear obligations to handle these technologies responsibly, as well as rights for those who are potentially harmed. In this sense, the question is not whether regulations and laws promote or stifle innovation. We should use them to negotiate the principles we want to see embedded in technology, the process by which innovations become part of our everyday lives in a responsible and enabling way, and how the opportunities that technology and data bring can create prosperity for all.

If the question is understood in this way, then yes: the government should do more to protect (y)our data.

Will AI steal my job?

The short answer is “maybe”. Artificial intelligence (AI) is at the core of a transformative revolution affecting all aspects of society. AI-based tools are already integrated into our daily lives (look at your phone!). AI will have disruptive effects, but it will also contribute to creating new jobs. The impact of this transformation is ultimately a matter of governance and politics, in which we (the users and society) are put first, rather than the interests of a few private corporations.

The most honest answer is “maybe”. Alarmist claims and exaggerations about the potential and impact of AI have appeared regularly in the press since the 1970s, yet most jobs are still here, and many new types of jobs have been created.

The most recent wave of AI tools, such as ChatGPT or DALL-E, is based on Large (Language) Models, also called Foundation Models. These tools have achieved impressive results and are already part of our daily lives: for instance, Google has integrated a language model (BERT) into its search engine to improve its results since 2019. The anthropomorphization of these machines, described as “intelligent”, “superhuman”, or able to “understand”, is more marketing than reality. Their strength is also their weakness: as soon as they are applied to data or tasks that deviate from the purpose they were originally trained for, their performance drops drastically.

We are in the midst of a transition driven by AI technologies, and transitions have disruptive effects. When automation was introduced in the car industry, jobs were lost but new ones were created. The integration of AI into our lives must be planned so that it supports us rather than replaces us, by deciding which skills we want to preserve and which ones we are willing to concede to AI. A recent paper in Science on the use of ChatGPT for professional writing tasks showed that those who used ChatGPT (in this experimental context) increased their productivity, reducing the time spent on “boring” tasks and shifting it to more creative ones. However, the study did not show a leveling effect: mediocre writers did not improve, and neither did the good ones.

Rather than “stealing” our jobs, AI will certainly change them in ways that are impossible to predict now. Like any other innovation, AI may make some jobs obsolete, but we must work towards a positive outcome: even coding will change, meaning that today's software engineers may be out of a job in the future. We need to remember that every transition has a cost, and it is our duty as a society to decide how this transition happens and to protect the most vulnerable. In the words of Prof. Luciano Floridi, we must make AI “work against wrongdoing, support human responsibility, and make us more humans”.

UG Lustrum:

Celebrate 410 years of science with us!

This campaign is part of the UG Lustrum. The University of Groningen turns 410 this year, and we are marking the anniversary with a Lustrum celebration.
