
Instagram's 'deliberate design choices' make it unsafe for teens despite Meta promises, report says

Darpan News Desk The Canadian Press, 26 Sep, 2025 09:50 AM

Despite years of congressional hearings, lawsuits, academic research, whistleblowers and testimony from parents and teenagers about the dangers of Instagram, Meta's wildly popular app has failed to protect children from harm, with “woefully ineffective” safety measures, according to a new report from former employee and whistleblower Arturo Bejar and four nonprofit groups.


Meta’s efforts at addressing teen safety and mental health on its platforms have long been met with criticism that the changes don’t go far enough. Now, the report's authors claim Meta has chosen not to take “real steps” to address safety concerns, “opting instead for splashy headlines about new tools for parents and Instagram Teen Accounts for underage users.” 


The report, released Thursday, came from Bejar and Cybersecurity for Democracy at New York University and Northeastern University, as well as the Molly Rose Foundation, Fairplay and ParentsSOS.
Meta said the report misrepresents its efforts on teen safety. 


The report evaluated 47 of Meta's 53 safety features for teens on Instagram, and found that the majority of them are either no longer available or ineffective. Others reduced harm, but came with some “notable limitations,” while only eight tools worked as intended with no limitations. The report's focus was on Instagram's design, not content moderation.


“This distinction is critical because social media platforms and their defenders often conflate efforts to improve platform design with censorship,” the report says. “However, assessing safety tools and calling out Meta when these tools do not work as promised, has nothing to do with free speech. Holding Meta accountable for deceiving young people and parents about how safe Instagram really is, is not a free speech issue.”


Meta called the report “misleading, dangerously speculative” and said it undermines “the important conversation about teen safety.”


“This report repeatedly misrepresents our efforts to empower parents and protect teens, misstating how our safety tools work and how millions of parents and teens are using them today. Teen Accounts lead the industry because they provide automatic safety protections and straightforward parental controls,” Meta said. “The reality is teens who were placed into these protections saw less sensitive content, experienced less unwanted contact, and spent less time on Instagram at night. Parents also have robust tools at their fingertips, from limiting usage to monitoring interactions. We’ll continue improving our tools, and we welcome constructive feedback — but this report is not that.”


Meta has not disclosed what percentage of parents use its parental control tools. Such features can be useful for families in which parents are already involved in their child’s online life and activities, but experts say that’s not the reality for many people.


New Mexico Attorney General Raúl Torrez — who has filed a lawsuit against Meta claiming it fails to protect children from predators — said it is unfortunate that Meta is “doubling down on its efforts to persuade parents and children that Meta’s platforms are safe — rather than making sure that its platforms are actually safe.” 


The authors created teen test accounts as well as malicious adult and teen accounts that would attempt to interact with these accounts in order to evaluate Instagram's safeguards. 


For instance, while Meta has sought to limit adult strangers from contacting underage users on its app, adults can still communicate with minors “through many features that are inherent in Instagram’s design,” the report says. In many cases, adult strangers were recommended to the minor account by Instagram’s features such as Reels and “people to follow.”


“Most significantly, when a minor experiences unwanted sexual advances or inappropriate contact, Meta’s own product design inexplicably does not include any effective way for the teen to let the company know of the unwanted advance,” the report says. 


Meta said the report is misleading because it rates many of the safety tools not on what they promised to do but what the authors would like them to accomplish. For instance, the report says a feature that lets users manually hide comments works as described but adds that it “places a clear onus on the recipient” to do so and doesn't allow the user to say why they hid the comment — which Meta had never promised it would do. 


Instagram also pushes its disappearing messages feature to teenagers with an animated reward as an incentive to use it. Disappearing messages can be dangerous for minors and are used for drug sales and grooming, “and leave the minor account with no recourse,” according to the report. 
Another safety feature, which is supposed to hide or filter out common offensive words and phrases in order to prevent harassment, was also found to be “largely ineffective.” 


“Grossly offensive and misogynistic phrases were among the terms that we were freely able to send from one Teen Account to another,” the report says. For example, a message that encouraged the recipient to kill themselves — and contained a vulgar term for women — was not filtered and had no warnings applied to it. 


Meta says the tool was never intended to filter all messages, only message requests. Separately, the company on Thursday expanded its teen accounts to users worldwide.


As it sought to add safeguards, Meta has also promised it wouldn't show teens inappropriate content, such as posts about self-harm, eating disorders or suicide. The report found that its teen avatars were nonetheless recommended age-inappropriate sexual content, including “graphic sexual descriptions, the use of cartoons to describe demeaning sexual acts, and brief displays of nudity.”


“We were also algorithmically recommended a range of violent and disturbing content, including Reels of people getting struck by road traffic, falling from heights to their death (with the last frame cut off so as not to see the impact), and people graphically breaking bones,” the report says. 
In addition, Instagram also recommended a “range of self-harm, self-injury, and body image content” on teen accounts that the report says “would be reasonably likely to result in adverse impacts for young people, including teenagers experiencing poor mental health, or self-harm and suicidal ideation and behaviors.”


The report also found that children under 13 — and appearing as young as six — were not only on the platform but were incentivized by Instagram’s algorithm to perform sexualized behavior such as suggestive dances. 


The authors made several recommendations for Meta to improve teen safety, including regular “red-team” testing of messaging and blocking controls, where a system is tested against one team pretending to be bad actors; providing an “easy, effective, and rewarding way” for teens to report inappropriate conduct or contacts in direct messages; and publishing data on teens' experiences on the app. They also suggest that the recommendations made to a 13-year-old's teen account should be “reasonably PG-rated,” and Meta should ask kids about their experiences of sensitive content they have been recommended, including “frequency, intensity, and severity.”


“Until we see meaningful action, Teen Accounts will remain yet another missed opportunity to protect children from harm, and Instagram will continue to be an unsafe experience for far too many of our teens,” the report says. 

Picture Courtesy: AP Photo/Michael Dwyer, File
