Instagram is the leading platform used by pedophile networks to promote and sell child sexual abuse content. That is the conclusion of a report published Wednesday by Stanford University and the Wall Street Journal (WSJ).
Networks of accounts posing as minors "openly promote the sale" of child sexual abuse material, according to researchers at Stanford University's Cyber Policy Center. These networks favor Instagram because of "features such as content recommendation algorithms" and a messaging system that makes it easy to connect, they added.
Images discreetly listed for sale
Members of these networks need little ingenuity. According to the WSJ, a simple search for hashtags such as #pedowhore or #preteensex leads to accounts that use these terms to advertise content showing the sexual abuse of minors.
These profiles often claim to be "run by the children themselves" and use overtly sexual pseudonyms. The accounts do not state outright that they sell these images, but they display menus of options, in some cases including requests for specific sex acts. "At a certain price, children are available for in-person 'meetups,'" the researchers note.
Regular accusations against Meta
The report highlights the role played by the social network's algorithms. A test account created by the WSJ was "flooded with content that sexualizes children" after clicking on just a few recommendations of this kind. Meta did not immediately respond to a request for comment from AFP.
According to the WSJ, the social media giant acknowledged problems within its safety operations and said it had created a "working group" on the matter. In March 2023, an organization filed a complaint against Meta for "turning a blind eye" to human trafficking and crimes against children on its platforms. Instagram is also regularly accused of failing to adequately protect children from the risks of bullying, addiction and self-image problems.