How social networks have become radicalization machines favoring the extreme right

By: Elora Bain

On June 10, 2025, President Emmanuel Macron declared that he wanted to ban social networks for those under 15, calling for European legislation on the subject. The association between youth violence and access to digital technology is an inexhaustible commonplace, revived by each generation in its turn and ripe for political exploitation of every stripe; yet the current global situation does not make concerns about online radicalization illegitimate.

Our digital lives are governed by algorithms whose rules often escape us. The platforms we use decide what we see and what we don't, try to guess what we would like to see, and above all promote emotionally charged, polarizing, sensational and generally radical content. Users find themselves trapped in echo chambers tailored to their tastes, which reinforce confirmation bias.

This breeding ground is particularly fertile for populism, conspiracy theories and the far right. The Covid-19 pandemic and the explosion of anti-vax discourse in its wake is a textbook case: it marked a massive wave of online radicalization among people whose initial fears and uncertainties were exploited, pushing them little by little into increasingly conspiratorial digital bubbles that quickly aligned with the far right.

In fact, algorithms are not designed to educate us, inform us or balance points of view, but to hold our attention and keep us online as long as possible, explains The Insider. Cynically, the content circulating on the platforms is none of their concern as long as the click machine keeps running, and it turns out that what makes people click and scroll best is excess.

Creating radicalization bubbles

Even before far-right entrepreneur Elon Musk bought Twitter, the company conducted an internal study showing that tweets from right-wing politicians and media outlets were more likely to be amplified than those from left-wing sources, sometimes by as much as a factor of four. The gap has widened considerably since Musk took control of the platform, now a stronghold of the global far right that many media outlets have correspondingly deserted.

During Germany’s last federal election, TikTok and X were found to steer users disproportionately toward far-right content. “This can influence public opinion: repeated exposure to ‘radical content’ actually leads to adhering to similar radical ideas,” notes The Insider.

Above all, this shows that the algorithms do not merely feed on our personal preferences: they are indexed to editorial lines drawn up by the platforms, in complete opacity.

Online extremism naturally spills over into real-world violence, as many incidents, such as the 2022 Buffalo shooting, demonstrate.

An investigation published by Bellingcat in 2021 had already revealed that platforms like YouTube and Telegram serve as recruitment centers for extremist communities. Violent far-right groups, neo-Nazis, anti-immigration movements, incels and others each benefit from the gradual drift of an audience that initially comes for seemingly innocent content and ends up acclimatized to the most extreme ideas.

The European Digital Services Act (DSA), which entered into force in February 2024, aims to regulate the platforms' activities and increase the transparency of their algorithms, but given the speed at which AI systems evolve, legal reform often lags well behind current practice.

Elora Bain

I'm the editor-in-chief here at News Maven, and a proud Charlotte native with a deep love for local stories that carry national weight. I believe great journalism starts with listening — to people, to communities, to nuance. Whether I’m editing a political deep dive or writing about food culture in the South, I’m always chasing clarity, not clicks.