• On Tuesday, Amnesty International released a report finding that TikTok's "For You" feed sometimes pushes young people towards content that is dangerous for their mental health.
  • Designed to keep users on their platforms for as long as possible, social networks serve ever more personalized content.
  • Users in a psychologically fragile state can then find themselves trapped in harmful algorithmic bubbles.

Tell me what's on your "For You" page, and I'll tell you who you are. The continuous feed of videos on this TikTok tab is built from your interests and from the data the Chinese social network collects about your personality, and even your state of mind at any given moment. To each their own bubble. Some get an endless parade of adorable canine acrobatics; others, a stream of old-furniture restorations. But for users who are struggling psychologically, the walls of extreme personalization close in quickly, and the "For You" feed turns into a trap.

"Bubbles suffered

 »

"Social media is designed to generate as much brain time as possible for advertisers. They make a living from advertising and, by pushing you the most relevant content possible, they make you stay longer," explains Paul Midy, MP for the 5th constituency of Essonne and general rapporteur of the bill to secure the Internet. Because if you love Formula 1 racing, you're much more likely to stay glued to your phone screen if TikTok offers you videos of drifts rather than ballet images. Océane Herrero, journalist and author of The TikTok System, talks about "bubbles suffered". "On TikTok, the app chooses the content for you the first time you use it. The user loses control over what they are going to see and therefore their ability to decide," she notes.

Silently, the algorithm adapts to its users' mood swings and budding passions. "Everything is tracked, right down to users' level of well-being," warns Katia Roux, Technology and Human Rights Advocacy Officer at Amnesty International. The organization published a report on Tuesday accusing the "For You" feed of pushing young people towards content that is harmful to their mental health. Michael Stora, psychologist and author of (a)Social Networks: Discover the Dark Side of Algorithms, talks about "cuddly algorithms" designed to show you "videos that correspond to you, that are supposed to make you feel good".

But when users show an interest in mental-health-related content, within the space of a single hour "many recommended videos idealize, trivialize or even encourage suicide," says Amnesty International, which ran the test for its report. "TikTok will push users to stay, even if it means offering them harmful content, to keep them scrolling," Roux said, accusing the platform of "profiting from people's emotions."
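To make the mechanism Roux and Stora describe more concrete, here is a minimal, hypothetical sketch of an engagement-weighted feed in Python. It is not TikTok's actual system: the topic names, starting weights and watch-time boost are invented for illustration. It simply shows how a loop that reinforces whatever a user lingers on can narrow a feed into a single-topic spiral.

```python
# Toy sketch, not TikTok's actual system: an engagement-weighted feed whose
# topic mix drifts toward whatever the user lingers on. Topic names, the
# initial weights and the watch-time boost are invented for illustration.
import random

topics = ["pets", "sport", "diy", "sad_content"]
weights = {t: 1.0 for t in topics}  # the recommender's current "interest" estimate

def next_video():
    """Pick the next video's topic in proportion to the current weights."""
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

def register_watch(topic, watch_time):
    """Longer watch time reinforces the topic, so the loop feeds on itself."""
    weights[topic] += watch_time

# Simulate a user who lingers on "sad_content" and skips everything else.
for _ in range(200):
    topic = next_video()
    register_watch(topic, watch_time=10.0 if topic == "sad_content" else 0.5)

share = weights["sad_content"] / sum(weights.values())
print(f"Share of feed weight now on sad_content: {share:.0%}")
```

After a couple of hundred simulated videos, almost all of the feed's weight sits on the one topic the simulated user lingered on: the "spiral" described above, reduced to a feedback loop of a dozen lines.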

From purring to self-harm

"All you have to do is take an interest in a subject for it to be boosted and for it to invade the feed for you of Tik Tok," explains Katia Roux who denounces this "spiral" which, "when it is offered to people in a difficult situation or a fragile mental state, is devastating". Because the spiral obviously does not have the same consequences when it comes to kittens or content promoting scarification. "Teens are experiencing a mental health crisis" and "the data suggests that the rise of social media has played a role" in it, said Sydney Bryn Austin, a professor of social and behavioral sciences at Harvard University who has worked with Amnesty International.

"From 2009 to 2019, teen depression doubled and suicide became one of the leading causes of death among youth aged 10 to 14," she adds. Teenagers are particularly sensitive to this content. "They identify much more in a mirror image because they are fragile at this time," says Michael Stora, who adds that for "a young person in a very fragile state", watching this type of content "can make him sink in turn". "Young people find it more difficult to take a step back and TikTok is a social network they trust," says Océane Herrero. While TikTok's algorithm obviously doesn't "purposely push depressing content to a suicidal person, that's how it works" regardless, Midy points out.

Becoming thinner than an A4 sheet

"The rise of visual social media, such as TikTok or Instagram, has exacerbated the negative impacts on young people, particularly in terms of body image, self-esteem and the risk of developing eating disorders [ED]," says Dr. Austin, who adds that ADHDs have one of the "highest mortality rates of any mental illness." However, social networks regularly give rise to harmful trends, followed by thousands of young people in search of themselves. "Social media doesn't cause anorexia nervosa, but at the time of the pro-ana [pro-anorexia] sites, we realized that some teenage girls sometimes followed these sites to the point of ruining their physical health," recalls Michael Stora, who headed Skyrock's psychological unit for seven years. On TikTok, if a young girl shows interest in content related to CAD, her "For You" page will end up offering her a myriad of them.

The "thigh gap challenge" that encouraged young girls to show the gap between their thighs already advocated thinness. After invading social media, the keyword is now banned from TikTok and, if you type it into the search bar, the app offers you emergency help numbers. "On Instagram, there was also the challenge of placing an A4 sheet of paper in front of her waist and proving that it didn't stick," recalls Océane Herrero, who adds, however, that this quest for thinness, especially for women, "is the bottom of the air" of our society. Social networks act as a mirror and sometimes distort our societal obsessions. Whether they are unhealthy or not.

Hashtags, from prohibition to reinvention

But the echo chambers that social networks have become, and on which many of us spend hours every day, bear their share of responsibility. "At the time of the Facebook Files [in 2021], several internal studies showed the platforms' responsibility in reinforcing this malaise, particularly in the self-image that young girls develop because of Instagram," recalls Océane Herrero. "In 2019, TikTok said it would now bring 'exploration' videos into the 'For You' feed," the journalist adds. Many keywords are now also blocked, such as "suicide" or "eating disorders".

"But content creators are hijacking 'ED' for 'Ed Sheeran' [the singer] in order to slip through the cracks of the algorithm," says the author, who adds that blocking hashtags doesn't stop communities from inventing new ones, on the contrary. "TikTok is committed to the safety and well-being of our teen community. We strive to nuanced the complexities of supporting the well-being of our community on our platform," the app told Amnesty International. Contacted by 20 Minutes as part of this investigation, the social network did not respond.

An anti-bubble law, and the ball in the platforms' court

Amnesty International is calling on social networks to move away from hyper-personalization and for a worldwide ban on targeted advertising, at least for young people. In the European Union, such a ban has already applied to minors since the entry into force of the Digital Services Act (DSA). In France, "we passed the digital law, which requires platforms to give us the option of stepping out of our bubble and, therefore, of being offered a feed that is not based on our preferences" as collected by the algorithm, explains Paul Midy, who specifies that this feature will take effect next year.

The MP urges platforms to comply with European laws and regulations, in particular by quickly removing illegal content, but also to "better calibrate their algorithms". "If you chat with ChatGPT and ask it for a joke, the AI will oblige, but if you ask for a racist joke, it will refuse. It's the same for TikTok's algorithm: they can put safeguards in place," Midy says. He concludes: "We have made a lot of progress on these issues. Even though there is still a lot of work to be done, the ball is now in the platforms' court."
