Instagram is apparently recommending sexual content to teens as young as 13




Image Credit: Pexels

Instagram has reportedly been suggesting explicit Reels to teenagers as young as 13, even when they are not actively searching for such content.

According to an investigative report by Northeastern University professor Laura Edelson and The Wall Street Journal, the Meta-owned social media platform has been suggesting sexually explicit Reels to teens. During tests conducted primarily between January and April this year, both parties created new accounts with the age set to 13 to examine Instagram’s behaviour.

The findings show that Instagram began suggesting moderately suggestive videos as soon as the accounts were first logged in, featuring content such as women dancing sensually or focusing on their bodies.
Accounts that engaged with these videos by watching them and skipping others soon started receiving recommendations for more explicit content.

Some of the recommended Reels included women mimicking sexual acts or offering to send nude photos in exchange for user comments. Over the course of the investigation, the researchers also encountered videos featuring nudity and, in one case, a series of videos depicting graphic and explicit sexual acts, within minutes of the account being set up.

Within just 20 minutes of initial engagement, the recommended Reels section became dominated by creators producing sexual content.

In contrast, similar tests conducted on TikTok and Snapchat did not yield recommendations for sexual content to the teenage accounts created on those platforms.

Even after actively searching for age-inappropriate content and following creators known for producing such videos, neither TikTok nor Snapchat suggested such content to the test accounts.
The Wall Street Journal notes that Meta, Instagram’s parent company, had previously identified similar issues through internal research.

Despite these findings, Meta spokesperson Andy Stone dismissed the report, labelling the tests as “artificial experiments” that do not accurately represent how teens use Instagram. He acknowledged Meta’s efforts to decrease sensitive content seen by teens, claiming significant reductions in recent months.

In January, Meta implemented substantial privacy updates aimed at safeguarding teen users, automatically placing them in the platform’s most restrictive control settings, which cannot be opted out of.
Despite these measures, The Wall Street Journal’s tests, conducted after the update, replicated the concerning results as recently as June. Meta had introduced these updates shortly after a previous experiment by the Journal.
