Virtualis

Politics and Algorithms: How a recent Dutch Court Case Shines Light on Social media’s Threat to Democracy

Sophia Welch

December 8, 2025

Introduction 

In October of this year, a notable case came before the Amsterdam District Court (1). The plaintiff, a Dutch non-profit advocacy group named ‘Bits of Freedom,’ challenged the defendant, Meta Ireland, over Meta’s recommender systems and default algorithms in light of the recent Dutch elections. The group claimed that the unclear choice between an engagement-based algorithm and a reverse-chronological feed violated Article 25 of the EU Digital Services Act (DSA) (2), which states that companies should not design their platforms in a way that would “materially distort or impair the ability of” users “to make free and informed decisions.” But why is it significant that Bits of Freedom chose to bring this case specifically in the light of the elections? And will the outcome of the case resolve the concerns that the group and wider society have raised?

The Harms of Engagement-Based Algorithms, Particularly in Politics

The main issue in the Meta Ireland case was the automatic, default use of an engagement-based algorithm. Such algorithms are common across social media platforms and, as the name suggests, promote the content that attracts the most engagement, whether likes, comments, dislikes or reposts. The problem with this method of ranking content is that it amplifies “emotionally charged or polarizing content” (3). This is argued to have further effects, such as the spread of misinformation and the creation of “filter bubbles” that limit exposure to diverse and opposing views (4). The harms of an engagement-based algorithm in politics are not observed only from an external perspective: Facebook’s own leaked research suggested that 64% of extremist group recruits joined as a result of Facebook’s own recommendations (5). This is a harrowing insight, particularly in a society where new studies show that social media is an increasingly popular way of following current events and news, especially amongst young adults: roughly 84% of 18–24 year-olds say that social media is their main source for following the news (6).
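The ranking logic at issue can be sketched in a few lines. The following is an illustrative toy, not Meta’s actual system; the `Post` fields and signal weights are invented for the example:

```python
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    likes: int = 0
    comments: int = 0
    reposts: int = 0


def engagement_score(post: Post) -> int:
    """Weighted sum of interaction signals (the weights here are hypothetical)."""
    return post.likes + 3 * post.comments + 5 * post.reposts


def engagement_feed(posts: list[Post]) -> list[Post]:
    """Show the most-engaged-with content first, regardless of recency or source."""
    return sorted(posts, key=engagement_score, reverse=True)
```

Because reactions of any kind raise a post’s score, content that provokes comments and reposts tends to float to the top regardless of who posted it or when, which is precisely the amplification dynamic described above.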

It is from this perspective that it becomes clear why Bits of Freedom were particularly concerned about Meta’s default use of an engagement-based algorithm ahead of the recent Dutch elections, given the algorithm’s particular bearing on how people consume political content and information at scale. At a recent European Parliament committee meeting, the German federal elections were used as an example of the pressing issues surrounding Meta’s algorithms, politics and elections: it was found that 49% of the first five political posts shown to users were from the AfD (7), which, according to Sasha Havlicek, suggests algorithmic amplification (8).

In a stirring decision, the Amsterdam court appeared to share Bits of Freedom’s concerns about the harms of an engagement-based algorithm. The court found that by not giving users a clear choice between engagement-based and reverse-chronological feeds, and by reverting to the engagement-based feed whenever the app was closed, Meta Ireland had violated the DSA, particularly Article 38, which requires very large online platforms that use recommender systems to provide at least one option that is not based on profiling. The Netherlands is the first country to deliver a verdict like this, but could such a stipulation come to be required across Europe and internationally?

An Algorithm that Authorizes Genocide? 

Internationally, we have seen the dire consequences of engagement-based algorithms. In the most heinous case, we can look to Myanmar (Burma) and a minority Muslim group within the country known as the Rohingya people. It is now widely acknowledged, and admitted by Facebook itself (9), that hate and misinformation spread online empowered a movement of hate so powerful that it led to what the UN has determined to be a genocide of the Rohingya people (10). This was perpetuated by the familiar symptoms of an engagement-based algorithm pushing polarizing and reactionary content, with posts using hate speech comparing the Rohingya people to “pigs” and “dogs” (11). A 2018 report commissioned by Facebook found that the platform had created an “enabling environment” (12) for the proliferation of human rights abuses. This severe case demonstrates that such algorithms can contribute to political atrocities internationally. It therefore raises the question of whether other countries will look to the extensive examples of harm that the engagement-based algorithm can cause, and to the precedent set by the Netherlands, and implement similar measures to ensure the right to a choice of algorithm.

The Chronological Algorithm: The Solution?

The outcome achieved in the October case against Meta Ireland is that, within the Netherlands, users now have the right to choose between a reverse-chronological feed (one that simply shows content from accounts the user is subscribed to, in order of most recent post) and an engagement-based feed, a choice which must now be presented clearly and easily, with no default. So, will this solve the evident issues in the relationship between social media and democracy? Perhaps not.
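By contrast with an engagement-ranked feed, the reverse-chronological option the court requires uses no ranking signals at all. A minimal sketch, with a data model invented purely for illustration:

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Post:
    author: str
    posted_at: datetime


def reverse_chronological_feed(posts: list[Post], subscriptions: set[str]) -> list[Post]:
    """Only posts from accounts the user follows, newest first.

    No engagement data is consulted, so the platform has no lever
    with which to amplify any particular post.
    """
    return sorted(
        (p for p in posts if p.author in subscriptions),
        key=lambda p: p.posted_at,
        reverse=True,
    )
```

The contrast is the point: here the ordering is fully determined by the user’s own subscriptions and the clock, which is why the court treated it as a meaningful alternative to profiling-based recommendation.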

Josh Tucker of NYU was part of a group that conducted a study, in the context of the 2020 US elections, on the difference between engagement-based and chronological algorithms (carried out with Meta’s cooperation, though crucially not funded by the company). Surprisingly, when comparing the two, the study found that the chronological algorithm had no discernible impact on polarization, on people’s belief in the legitimacy of the election, or on their participation in it (13). Indeed, Laura Edelson believes that framing the question as engagement versus chronological creates a “false choice” for users (14). She argues instead that there is a much wider range of algorithmic options through which we could consume media and, hopefully, reduce political harm. She has created a typology of algorithms with characteristics such as (in)direct connection to users, intensity of content, timeliness of content, novelty, and so on. Surely, then, the solution to the democratic problem must lie in one of these combinations of algorithm characteristics. Unfortunately, as Edelson states, “we do not necessarily know how to make things better” (15). And perhaps this can partly be blamed on social media companies themselves.

The so-called KGI recommendations, published earlier this year, put forward “detailed transparency” as one of the major requirements for resolving algorithmic harms (16). This call has been echoed by Sasha Havlicek within the European Parliament: “we urgently need standardised metrics, benchmarks and transparency around reporting methodology” (17). How would this transparency help? Detailed and thorough reports of platform metrics and data could help academics such as Tucker and Edelson to research and conduct studies more expansively, and so move towards a solution to social media’s harms to democracy. Edelson specifically stated that in building her typology, due to a lack of resources, she had to rely heavily on documents such as the leaked Facebook files referred to earlier in this piece. This suggests that if all platforms were required to disclose such information, research could flow much more freely and a solution be found more easily.

Conclusion 

While the recent case in Amsterdam is a positive development in acknowledging the harms that arise from an engagement-based feed, particularly around politics, recent research suggests that its outcome may not, by itself, resolve the harms and issues raised by Bits of Freedom. If we are to see meaningful change, it seems it must come from the social media companies themselves, and from stricter transparency rules in the legislation governing them, such as the DSA.


Bibliography

1. C/13/774725 / KG ZA 25-687 MK/JD

2. Digital Services Act 

3. S. Havlicek, European Parliament, How is the manipulation of algorithms and online platforms affecting democracy?, (EUDS committee meeting, 12th May 2025)

4. N. Alnemr, Deliberative Democracy in an Algorithmic Society: Harms, Contestations and Deliberative Capacity in the Digital Public Sphere, (Taylor and Francis Group, 2nd July 2025)

5. S. Havlicek, European Parliament, How is the manipulation of algorithms and online platforms affecting democracy?, (EUDS committee meeting, 12th May 2025)

6. Members’ Research Service, Young People and the News, (European Parliament, 9th November 2024)

7. Alternative für Deutschland, German Political Party

8. S. Havlicek, European Parliament, How is the manipulation of algorithms and online platforms affecting democracy?, (EUDS committee meeting, 12th May 2025)

9. BBC, Facebook admits it was used to ‘incite offline violence’ in Myanmar, (6th November 2018)

10. A/HRC/39/CRP.2 at ¶1441

11. Neema Hakim, How Social Media Companies Could Be Complicit in Incitement to Genocide, 21 Chi. J. Int’l L. 83 (2020)

12. Alex Warofka, An Independent Assessment of the Human Rights Impact of Facebook in Myanmar, (November 5, 2018)

13. Joshua Tucker, How Algorithms and AI are Reshaping Democracy- Antitrust and Competition Conference, (11th April 2025)

14. Laura Edelson, speaking at ibid.

15. Ibid

16. Moehring et al., Better Feeds: Algorithms That Put People First, (Knight Georgetown Institute, March 2025)

17. S. Havlicek, European Parliament, How is the manipulation of algorithms and online platforms affecting democracy?, (EUDS committee meeting, 12th May 2025)



Become a Member

Partners

KVK number: 86554336

© 2025 DSLA All rights reserved.
