
What if Russia Wasn't the Biggest Threat to Our Democracies?

Updated: Apr 5

By Manoel Chavanne



A quick warning to begin: I could not find today's book in English, so I'm not sure it has been translated into Shakespeare's language yet, but it certainly should be. The book is called Toxic Data – How The Networks Manipulate Our Opinions by David Chavalarias, and it was originally published in French in 2022. Chavalarias is a mathematician and research director at the CNRS (National Center for Scientific Research – the acronym comes from the French name), the largest fundamental-science agency in Europe, employing tens of thousands of scientists. In the prestigious SCImago Institutions Rankings, the CNRS has consistently been ranked in the global top 3 and was even number 1 for 8 consecutive years in the past decade, ahead of Harvard, Stanford, MIT, and other notable institutions.


As its title suggests, the main thesis of the book is that social media is manipulating our opinions in ways that favor the ideas, values and parties of illiberal democracy, as well as alt-right populists. These dynamics have already helped some of these leaders win elections, and the author predicts that, unless strictly controlled, these networks will lead to the end of liberal democracies as we know them. To reach such a scary conclusion, the author draws on rigorous empirical analyses he conducted between 2016 and 2022, reviewing millions of social media posts in various countries on several continents. The book is a walk-through of his findings after those meticulous years of research.


Chavalarias also demonstrates that because the networks aren't geographically bound, foreign actors can influence national elections. A whole chapter is dedicated to the US alt-right, emboldened by Trump's 2016 victory, making a big push to influence the French presidential election in 2017. In this specific case, those efforts failed, but this astroturfing (Merriam-Webster definition coming up) significantly increases the probability of success, and as mathematics dictates, a probable event becomes a near-certainty given enough attempts (a short numerical sketch follows the graph below)!

Astroturfing: “organized activity that is intended to create a false impression of a widespread, spontaneously arising, grassroots movement in support of or in opposition to something (such as a political policy) but that is in reality initiated and controlled by a concealed group or organization (such as a corporation)”.


The above graph comes from the book; I've added the red circle to highlight the astroturfing by the US alt-right. The top of the graph shows the percentage of tweets coming from America, the middle those originating in France, and the bottom those stemming from the rest of Western Europe.
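Back to the "probable event becomes a near-certainty" point above, here is the short numerical sketch I promised. It is my own back-of-the-envelope illustration, not a figure from the book, and the 20% success rate per election cycle is an assumption picked purely for the example:

```python
# The "probable becomes near-certain" intuition: if each election cycle
# gives a manipulation attempt some fixed, independent chance of success,
# the chance that it never succeeds shrinks toward zero as attempts pile up.
p = 0.2  # assumed chance of success per election cycle (illustrative only)

for attempts in (1, 5, 10, 20):
    at_least_once = 1 - (1 - p) ** attempts
    print(f"after {attempts:2d} attempts: {at_least_once:.0%} chance of at least one success")
```

Even at a modest 20% per attempt, the chance of at least one success passes 89% within ten attempts, which is the whole point of playing the long game.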


In further chapters, the author explains a few important scientific concepts such as homophily, the tendency of humans to bond with similar others. This wouldn't be a problem per se, but it becomes one under the influence of social media. Previously, we found similar others in real life, meaning people who played the same sport, enjoyed similar art forms or had the same job. In other words, these interactions were bounded by geography and based on socio-demographic factors. With social media, people instead align on values, and therefore stop having real friendships, or any real positive relationships, with people who hold different values. This leads to the creation and reinforcement of echo chambers and filter bubbles. If you want a good joke explaining why these chambers and bubbles are a problem, give Australian comedian Jim Jefferies 45 seconds of your time and click here, starting the video at 5:00 – language warning though.
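To see how homophily alone can carve a network into echo chambers, here is a toy simulation of my own (not from the book): agents repeatedly drop their most dissimilar contact and follow their most similar stranger, and the opinion gap between connected users collapses.

```python
import random

# Toy model of homophily: agents drop dissimilar contacts and add similar
# ones, so the network drifts into like-minded clusters (echo chambers).
random.seed(42)

N = 40
opinions = [random.uniform(-1, 1) for _ in range(N)]  # -1 = far left, +1 = far right
friends = {i: set(random.sample([j for j in range(N) if j != i], 5))
           for i in range(N)}

def rewire_step():
    for i in range(N):
        # Unfollow the most dissimilar contact...
        worst = max(friends[i], key=lambda j: abs(opinions[i] - opinions[j]))
        friends[i].discard(worst)
        # ...and follow the most similar stranger instead.
        strangers = [j for j in range(N) if j != i and j not in friends[i]]
        best = min(strangers, key=lambda j: abs(opinions[i] - opinions[j]))
        friends[i].add(best)

def average_gap():
    gaps = [abs(opinions[i] - opinions[j]) for i in range(N) for j in friends[i]]
    return sum(gaps) / len(gaps)

print(f"average opinion gap between friends before: {average_gap():.2f}")
for _ in range(20):
    rewire_step()
print(f"average opinion gap between friends after:  {average_gap():.2f}")
```

Nobody in this toy world changes their opinion; merely choosing whom to listen to is enough to sort everyone into bubbles.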


Additionally, this brilliant essay covers several cognitive biases exploited by social media algorithms. For instance, confirmation bias “is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values.” The negativity bias means that “even when positive or neutral things of equal intensity occur, things of a more negative nature (e.g. unpleasant thoughts, emotions, or social interactions; harmful/traumatic events) have a greater effect on one's psychological state and processes than neutral or positive things.” The author even gives an example of how these biases might play out when competing with each other. Imagine someone who believes vaccines are a good thing and sees the following two headlines:

  • The Covid-19 vaccine is 90% effective against severe forms of the disease.

  • The Covid-19 vaccine can make one sterile.

It has been scientifically proven that humans are far more likely to click on the second headline, which is then often followed by content about other dangers linked to getting a vaccine, without any research on the risks associated with not getting the vaccine. The algorithm feeds on our fears by over-representing them, leading us to become more scared and click on more worrisome content (whether true or false is irrelevant to the algorithm), and we have another positive feedback loop. Such feedback loops aren't called “positive” because they are good but because they are self-reinforcing: A leads to more B and in turn B leads to more A...
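Here is a minimal sketch of that loop, again my own illustration rather than the book's. It assumes the feed ranks purely by accumulated clicks and that, per the negativity bias, scary items get clicked twice as often as reassuring ones:

```python
import random

# Toy engagement-ranked feed: the algorithm shows more of whatever gets
# clicked, and negativity bias makes scary items more clickable, so the
# two amplify each other (a self-reinforcing "positive" feedback loop).
random.seed(1)

CLICK_RATE = {"reassuring": 0.10, "scary": 0.20}  # negativity bias: 2x clicks
weights = {"reassuring": 1.0, "scary": 1.0}       # the algorithm starts neutral

for day in range(30):
    for _ in range(100):                          # 100 items shown per day
        total = weights["reassuring"] + weights["scary"]
        kind = "scary" if random.random() < weights["scary"] / total else "reassuring"
        if random.random() < CLICK_RATE[kind]:    # a click means more of the same
            weights[kind] += 1.0

share = weights["scary"] / (weights["reassuring"] + weights["scary"])
print(f"share of scary content in the feed after a month: {share:.0%}")
```

The feed starts perfectly balanced, yet every click tilts it further, and within a simulated month the scary content dominates. No one at the platform had to decide to scare anyone.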


Another graph from the book (translated by me) clearly shows that the number of fake news stories linked to the right-wing candidates (Fillon and Le Pen) is much higher. Candidates are ordered left to right on the graph according to their position on the political spectrum.


Another chapter explains the impact of this negativity bias in politics using the revelations of Frances Haugen, a former Facebook employee who made the news and was invited to testify before the US Congress in 2021 after she disclosed tens of thousands of internal Facebook documents highlighting what the company was, and still is, doing. In a CBS interview that year she said: “One of the most shocking pieces of information that I brought out of Facebook that I think is essential to this disclosure is political parties have been quoted, in Facebook's own research, saying, we know you changed how you pick out the content that goes in the home feed. And now if we don't publish angry, hateful, polarizing, divisive content, crickets. We don't get anything. And we don't like this. We know our constituents don't like this. But if we don't do these stories, we don't get distributed. And so it used to be that we did very little of it, and now we have to do a lot of it, because we have jobs to do. And if we don't get traffic and engagement, we'll lose our jobs.”


Let's be fair to Facebook here: it is not really their fault. If your business model is based on people spending time on your platform, your algorithm will eventually take this route; it's a combination of human nature and the business model of all social media. The implications are unfortunately disastrous: either governments force the networks to change their business model, or we will all end up losing liberal democracy. Haugen pretty much says so in another great quote: “My fear is that without action, divisive and extremist behaviors we see today are only the beginning. What we saw in Myanmar and are now seeing in Ethiopia are only the opening chapters of a story so terrifying no one wants to read the end of it. Congress can change the rules that Facebook plays by and stop the many harms it is now causing.”


Targeted advertising is a significant part of the predicament we find ourselves in. Advertisers, through the data gathered by internet behemoths, can now select from more than 250,000 attributes held about social media users. The usual ones like age, gender, location etc., but also more obscure ones like “is far from his/her family” or “knows a man whose birthday is coming up in the next week”. If this were only used to sell the latest pair of sneakers or a new deodorant, it would be one thing, but this data is used to manipulate political opinions and election outcomes. Nor is this limited to Twitter, Facebook, TikTok and the like; the search results Google displays and the predictive text it suggests also manipulate what you end up seeing and reading.


Unfortunately, a familiar enemy often featured in these columns understood everything I've explained, and more, several years ago, and has been using our biases and social media to manipulate a non-negligible, and likely growing, number of people: the Kremlin. It uses what it calls the “4D” strategy:

  • Discredit: if you don't like what the West is stating, insult them.

  • Disinformation: if you don't like the facts, make your own “alternative facts”.

  • Distraction: if you're accused of something, accuse someone else of the same thing.

  • Dissuasion: if you don't like what someone is preparing, scare or threaten them.

Chavalarias adds a fifth D, possibly the most important one: Doubt.


Using France as an example, the author shows that at the end of 2021, ten of the twelve most popular political Twitter accounts in France belonged to far-right politicians or pundits, and almost all of them had publicly supported Putin's politics. There's also a long passage on TikTok's Chinese ownership and the data it gathers from its users, but I'll spare you those details as this article is already quite long. These actors also want you to keep thinking that you are powerless individually, even though the direction we're headed in is determined only by the sum of our individual actions.


The author discusses two additional important concepts. The first is Brandolini's law, which states that “The amount of energy needed to refute bullsh@t is an order of magnitude bigger than that needed to produce it.” The second, maybe the most important but also the most depressing, is the Illusory Truth Effect, which “is the tendency to believe false information to be correct after repeated exposure.” So the more we hear a lie, the more likely we are to believe it.


Researchers have also proved that familiarity overpowers rationality, meaning that “repetitively hearing that a certain statement is wrong can paradoxically cause it to feel right.” In other words, given that lies are an order of magnitude easier to produce than facts, and given the power of familiarity, the only real solution is somehow not to get exposed to the lies in the first place: limiting their reach means limiting their impact.
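A back-of-the-envelope illustration of that asymmetry, with effort numbers that are mine, chosen only to match the “order of magnitude” in Brandolini's law:

```python
# Brandolini's law in arithmetic: if refuting a lie costs ten times the
# effort of producing one, equal effort on both sides leaves almost all
# lies standing -- hence the focus on limiting reach rather than rebutting.
EFFORT_PER_LIE = 1         # arbitrary effort units (assumption)
EFFORT_PER_REBUTTAL = 10   # an order of magnitude more, per the law
DAILY_EFFORT = 100         # same daily budget for liars and fact-checkers

lies_per_day = DAILY_EFFORT // EFFORT_PER_LIE
rebuttals_per_day = DAILY_EFFORT // EFFORT_PER_REBUTTAL
unanswered = lies_per_day - rebuttals_per_day

print(f"lies produced per day: {lies_per_day}")
print(f"lies rebutted per day: {rebuttals_per_day}")
print(f"lies left standing:    {unanswered} ({unanswered / lies_per_day:.0%})")
```

With matched budgets, nine lies out of ten go unanswered every single day, and the backlog only grows.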


Now that I, and Chavalarias, must have completely discouraged you, let's try to end this article the way he finishes the book: with potential ways to do something about the aforementioned issues. The author lists 18 of them at the end of the book. Some are the obvious individual (but often not-so-effective) actions:

  • Check the sources.

  • Keep your critical hat on at all times.

  • Identify users publishing falsehoods and stop following them.

  • Keep a watchful eye on your emotions: if a headline excites you too much it might be false, so read more about it before believing it, or even worse, sharing it.

  • Turn your notifications off.


Others are collective actions that could only be implemented by governments and that, unfortunately, are unlikely to happen any time soon:

  • Improve the educational system.

  • Finance more research.

  • Create a public space online, where everything is currently privatized, and don't allow a private corporation to own the data of so many users.

  • Control and regularly verify what the algorithms are doing.


I'll add one more that the author didn't dare touch but that would definitely be the most effective if we actually did it: stop using these platforms and find other ways to communicate with people. Parents managed to take their daughter to soccer practice before there was a Facebook group dedicated to it; friends managed to meet for dinner before Instagram and Twitter existed. I understand these platforms can make things a bit more convenient, but if the price of living in a democracy is merely having to text your friends a question, it really isn't much. This, disastrously, is the choice we are facing, as brilliantly demonstrated by Chavalarias in this book: keep using these social media platforms and lose liberal democracy, or accept minor inconveniences in our daily lives and hopefully keep our democracies as a result.
