
The Digital Services Act: how the EU is tackling Big Tech

Online platforms like Instagram, TikTok and Amazon have become integral to our daily lives. But their growing influence brings significant challenges: the spread of misinformation, harmful content and the collection of vast amounts of user data. How can we ensure that these platforms act responsibly?

The Digital Services Act (DSA) aims to bring about change. Sarah Eskens, expert in digital legislation at VU Amsterdam, explains what this law entails and why it’s so important.

What does the DSA regulate?

The DSA is a law that imposes rules on online platforms (such as online marketplaces, social networks and video platforms) to create a safer and more transparent online environment. 'Its goal is not for the EU to dictate what people can or cannot say,' explains Eskens. 'But the law requires platforms to act fairly and consistently when moderating content.' This means, for example, that users who repeatedly share illegal content can be temporarily suspended, and that large platforms must follow clear procedures when removing posts.

For “very large online platforms” (like Facebook, Amazon and TikTok), additional rules apply. 'They must annually assess whether their services pose risks, such as to elections or the mental health of young people,' says Eskens. 'If such risks exist, they must take measures.'

The DSA does not require platforms to remove all harmful or misleading content. 'The law only includes a few rules regarding illegal content,' Eskens emphasises. 'And not everything that is harmful or misleading is illegal. Moreover, the DSA does not impose an immediate obligation to remove illegal content. If a platform is notified of illegal content and fails to remove it, the platform may be held liable under national law – not under the DSA itself. The DSA functions more as an incentive than a strict obligation.'

Protection against harmful content and AI moderation

An important difference from previous legislation, such as the General Data Protection Regulation (GDPR), is that the DSA is not focused on privacy, but on content moderation. 'The law does not determine which content is banned, but forces platforms to be transparent about how they moderate it. This is particularly relevant now that AI systems are increasingly being used to automatically detect and remove content.'

'If platforms use AI for moderation, they must be transparent about how it works,' Eskens explains. 'Users and other third parties must be able to verify whether AI is being used fairly. Furthermore, the DSA requires platforms to consider freedom of expression in content moderation. AI should not lead to the disproportionate removal of content posted by certain groups. Platforms must therefore actively prevent their systems from being discriminatory.'

Personalised ads and fair online advertising

Another notable aspect of the DSA is the ban on personalised advertisements based on sensitive data, such as religion, political preferences or sexual orientation. 'The idea behind this rule is that using such sensitive data allows people to be targeted based on their vulnerabilities. That may improve an advert's "hits", but it can also manipulate someone into buying or believing something that is not in their best interest.'

Regulation of Big Tech: sufficient or not enough?

Although the DSA is a significant step forward, it’s not a comprehensive solution. 'That’s why there’s also the Digital Markets Act (DMA),' says Eskens. 'This law focuses more on the powerful market position of Big Tech, and includes rules to ensure that they don’t abuse their power. For instance, measures against favouring their own products in search results.'

But challenges remain. 'Enforcement is crucial,' emphasises Eskens. 'Recent incidents, such as the manipulation of TikTok's algorithm during the Romanian elections – where the election result was declared invalid – show that online platforms are far from ready to implement the DSA properly.'

The role of scientific research

Research at universities, such as Vrije Universiteit Amsterdam, helps identify potential pitfalls in the DSA. 'One important point, for example, is the approach to disinformation,' Eskens says. 'Disinformation can be harmful, but it’s not always illegal. If platforms are required to take action against it, this may pose risks to freedom of expression.'

Scientists are analysing how platforms maintain this balance, and whether the DSA works as intended in practice. The coming years will show whether the law is effective enough or whether additional regulation is needed. 'I suspect that more rules will be added in the future. But for now, it’s especially important that existing laws are properly implemented and enforced.'

Copyright © 2025 - Vrije Universiteit Amsterdam