Instagram Still Promotes Eating Disorders And Unproven Diet Pills To Teens



In this photo, a girl measures her stomach on April 16, 2021 in Bonn, Germany. (Photo by Ute Grabowsky / Photothek via Getty Images)

Unraveling viral misinformation, explaining where it came from, the damage it is causing, and what we need to do about it.


On the same day that Facebook once again faces questions from Congress about the harm its products cause to children, new research shows that one of those products, Instagram, is full of posts promoting eating disorders, unproven dietary supplements, and skin-whitening products.

The damning research, shared with VICE News by the activist group SumOfUs ahead of publication, found that companies promoting unproven services, practices, and products can easily target young women using hashtags related to these trends.

While Instagram has banned some of the more obvious hashtags related to eating disorders, the researchers found that those looking to push their products can easily bypass these restrictions by using creative hashtags, some of which have accumulated more than 10 million posts.

Using some of the more popular hashtags, the researchers looked at 240 posts tied to eating disorder hashtags and found that more than half of the posts promoted eating disorders, while nearly 90% of them encouraged unproven appetite suppressants.

“The posts show how Instagram enables and encourages users to engage in negative discussions, self-harm and extreme dieting,” the report’s authors write.

The researchers also found that plastic surgeons were targeting young people by positively framing the results of plastic surgery and “collaborating” with influencers to promote these procedures on Instagram, although it is not clear whether this collaboration involves payment.


Relatedly, the researchers found that over 80% of the skin-whitening posts they analyzed promoted unproven products, with almost half of them directly endorsing skin whitening.

“This research confirms that, despite Facebook’s promises to limit such content, Instagram remains inundated with toxic content that poses an immediate danger to the lives of its users, and in particular to adolescents and young people,” the report’s authors wrote.

“The platform tracks and targets some of society’s most vulnerable people – young teens and people of color – with content that strongly promotes unrealistic and unattainable Western beauty standards, and offers bogus solutions like plastic surgery, skin whitening, and extreme diets that trigger body image problems and low self-esteem.”

Facebook disputed the findings of the report. “The conclusions drawn from this study are inaccurate and based on a limited sample size,” a company spokesperson told VICE News, although they did not specify exactly what they believed to be inaccurate.

The company said it “bans content that promotes or glorifies eating disorders and blocks both hashtags and content that does,” although the report shows the company is not enforcing its own rules.

The report was released hours before Facebook’s latest congressional appearance. Antigone Davis, Facebook’s global head of safety, will appear before the Senate Committee on Commerce, Science and Transportation in a hearing titled “Protecting Kids Online: Facebook, Instagram, and Mental Health Harms.”

The hearing was sparked by a Wall Street Journal article earlier this month that revealed the company is aware of Instagram’s toxicity to the mental health of young girls.

The article, based on an internal report commissioned by Instagram, found that the photo-sharing app can harm girls’ mental health. The report showed that one in three girls said it made them feel worse about their bodies, while teenagers told Facebook’s own research team, unprompted, that the app was to blame for increased anxiety and depression.

The company said the Wall Street Journal report was “simply inaccurate,” but Facebook has put development of its Instagram for Kids product, which targets children under 13, on hold.

SumOfUs hopes its own report will put even more pressure on the company.

“The results detailed here should prompt lawmakers around the world to sit up and take notice,” Emma Ruby-Sachs, executive director of SumOfUs, wrote in the report. “They provide even more evidence that Facebook and other global technology companies cannot be trusted with the safety and well-being of their billions of users. It’s time for lawmakers to act.”

Examples of posts using hashtags related to eating disorders on Instagram. Credit: SumOfUs

Body image issues were already a major problem among young people around the world, leading them to develop eating disorders and seek plastic surgery. But the pandemic has dramatically exacerbated these problems.

For example, one study found that the number of young people aged 10 to 23 with eating disorders has doubled since the start of the pandemic, while the National Eating Disorders Association has seen a 40% increase in calls to its helpline since March 2020.

As for plastic surgery, there has been a 70% increase in bookings and treatments during the pandemic.

The report highlights how individuals and businesses on Instagram are trying to leverage and monetize these real issues facing teens.

“Is it possible for a thin patient to undergo BBL and liposuction?” one plastic surgeon wrote in an Instagram post. “Can a thin patient look better with liposuction? YES! They can.”

This isn’t the first time Instagram has come under fire for its handling of eating disorder posts. Earlier this year it was forced to apologize for promoting diet pills to people with eating disorders, but little seems to have changed on the platform since then.

“Facebook’s recommendation systems and algorithms amplify the most salacious content to maximize ‘engagement,’ which it then monetizes,” Ruby-Sachs wrote. “Business leaders have little incentive to remove harmful content because doing so can reduce profits. And without any transparency, without audits of its technology and processes, and without any measure of accountability, the company has done little to truly tackle the serious threats its platforms pose to society.”
