
“Instagram helped kill my daughter”: a teenager’s suicide became the catalyst for large-scale social media reform in the UK

The authorities attributed the rise in juvenile suicides to the prevalence of potentially dangerous content on the Internet.

Since the end of January, Britain has been discussing the suicide of 14-year-old Molly Russell – her father is convinced that part of the blame for the tragedy lies with Instagram. According to him, the social network’s algorithms are designed in such a way that they kept offering the girl content related to suicide.

This view found active support in the UK government, which had long been discussing a link between the rise in adolescent suicides and the influence of social networks. Turning on Instagram, the authorities announced plans to introduce new requirements for the country’s major social networks. The officials’ stated goal is to protect minors from dangerous content on the Internet.

The Case of Molly Russell

In November 2017, 14-year-old Molly Russell was getting ready for bed in her room as usual. The youngest of three daughters in the family, she had finished her homework, tidied her room and told her parents that she planned to go to school the next day as always. She was well liked there – she did well in her studies, made friends with her classmates easily and had a good relationship with her parents.

Her behavior seemed normal to them. Yes, she had recently stopped her riding lessons, but she explained that it was too cold in the autumn and that she would return to them in the summer. A performance of the musical Hamilton, based on the life of the statesman Alexander Hamilton, whose biography fascinated the girl, was scheduled for January. The family had bought tickets in advance and planned to attend together.

That never happened. One November morning, her parents found Molly dead; she had taken her own life. She would have turned 15 in six days, and her diary contained a list of gifts she wanted. On one of its pages the family found a drawing of a small ship struggling to stay afloat in a powerful storm, and a farewell note in which the girl called herself “a problem in everyone’s life.”

Photos from the Russell family archive

According to her father, Ian, the girl was never especially keen on social networks. She deleted her Twitter account, never created a Facebook account at all, and mostly used Snapchat and Instagram. She did not publish anything on the latter, says Ian, who followed his daughter’s activity on the social network.

The tragedy in the Russell family went unnoticed by the wider world. Only more than a year later, in January 2019, did her father turn to the media. In an interview with the BBC, he said that Instagram and Pinterest were partly to blame for his daughter’s death. His main argument was the content the girl had been viewing on these platforms, namely images related to self-harm and suicide.

Ian Russell, who after the BBC interview became the subject of several more stories in major British outlets, never explained why he accused Instagram only more than a year after his daughter’s death. Nor did he say how the family learned that Molly had been browsing suicide-related content on social networks.

Molly’s father talks about the details of her death.

“This world is so cruel, and I don’t want to see it anymore,” reads the caption on one of the images Molly’s parents showed to the media. It depicts a girl lying in bed hugging a teddy bear; instead of eyes she has two black holes covered with a bandage. “She could have offered so much to this world, but now all of that is gone. And this happened not without the help of the Internet and social networks,” says Ian.

As he sees it, part of the blame for his daughter’s death lies not with Instagram itself but with the social network’s algorithms. When the system detects that a person engages with certain content, it begins to offer similar material more often through the feed and hashtags. In Molly’s case, her father believes, Instagram regularly supplied his daughter with new images on the subject of self-harm and suicide.
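The mechanism Ian describes is, at its core, engagement-based ranking: posts whose topics resemble what a user has already interacted with are scored higher and surfaced more often, so interest in a theme feeds on itself. The sketch below is a deliberately simplified, hypothetical illustration of that loop, using tag overlap as the similarity signal; it is not Instagram’s actual algorithm, and all function names and data shapes here are assumptions.

```python
from collections import Counter

# Hypothetical, heavily simplified illustration of engagement-based ranking.
# Tag overlap stands in for whatever similarity signals a real recommender uses.

def build_interest_profile(engaged_posts):
    """Count how often each hashtag appears in posts the user engaged with."""
    profile = Counter()
    for post in engaged_posts:
        profile.update(post["tags"])
    return profile

def rank_feed(candidate_posts, profile):
    """Order candidate posts by how strongly their tags match the user's profile."""
    def score(post):
        return sum(profile.get(tag, 0) for tag in post["tags"])
    return sorted(candidate_posts, key=score, reverse=True)

if __name__ == "__main__":
    engaged = [
        {"id": 1, "tags": ["drawing", "poetry"]},
        {"id": 2, "tags": ["drawing", "music"]},
    ]
    candidates = [
        {"id": 10, "tags": ["sports"]},
        {"id": 11, "tags": ["drawing", "poetry"]},
        {"id": 12, "tags": ["music"]},
    ]
    profile = build_interest_profile(engaged)
    for post in rank_feed(candidates, profile):
        print(post["id"], post["tags"])
```

The point of the sketch is the feedback loop: each interaction strengthens the profile, so the same themes keep rising to the top of the feed unless the platform deliberately filters or downranks them – which is exactly the change campaigners asked Instagram to make.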

“To us it is obvious: despite all the claims by the companies that own social networks that they are fighting potentially dangerous content, such content is still easily accessible to minors, and once a user has found it, the algorithms offer more and more of it. […] Although many may say that it is impossible to ‘control the Internet’, we need to find a better and safer way to control what young people encounter online and to provide broader support to those who look for answers [to their problems] on the Internet.”
Ian Russell
father of Molly Russell

The authorities’ reaction

According to NHS England, the number of girls under 18 treated for the effects of self-harm across the country has risen sharply over the past 20 years, from about 7,300 a year to 13,463. The number of suicides among adolescents aged 15 to 19 grew from 110 in 2010 to 177 in 2017.

In a conversation with The Telegraph, Louis Appleby, head of the government’s advisory group on suicide prevention, said these statistics show how widespread self-harm and suicidal thinking have become among the current generation of teenagers. “[They] think in a suicidal way when they face difficulties,” the specialist said.

A possible link between teenagers’ use of social networks and suicides has been discussed in the UK for years.

In August 2018, the National Society for the Prevention of Cruelty to Children stated that Instagram and Tumblr host “extremely dangerous” images on the topic of self-harm. According to the organisation, teenagers who see such pictures may begin to copy the behaviour.

The government’s reaction to Molly Russell’s story was not long in coming. A few days after Ian’s BBC interview, the UK Parliament’s Science and Technology Committee called on social networks to better monitor what content is distributed on them. The committee recommended that the government introduce a law requiring social networks to take responsibility for protecting minors from harm on their platforms.

According to The Times, in late February the British authorities will publish a document establishing new requirements for social networks to protect minors. The exact list of companies that will be affected is unknown; most likely it will include Facebook (which owns Instagram), Google, Twitter, Tumblr and Pinterest.

The expected list of the British government’s new conditions for companies that own social networks

  • Companies must “protect” their users from content related to suicide, self-harm and harassment;
  • Companies must agree to abide by a new set of government regulations requiring the rapid removal of potentially harmful content;
  • Companies must regularly report how many complaints they receive about content related to harassment, trolling, self-harm or suicide;
  • If companies refuse these conditions or implement them too slowly, government regulators may intervene in the operation of their social networks.

Parliamentary Under-Secretary of State Jackie Doyle-Price believes that if the owners of social networks cannot protect their users, especially minors, from dangerous content on their own, the authorities will intervene. She added that if the companies that own social networks do not follow the new rules, the platforms’ leadership could face criminal penalties and arrest in the UK.

Instagram Reforms

On February 5, the head of Instagram, Adam Mosseri, met with UK Health Secretary Matthew Hancock. Hancock was one of the first officials to draw attention to the case of Molly Russell, and he had criticised social networks, including Instagram, for insufficiently effective moderation of potentially dangerous content.

“I am inspired by the courage of Molly’s father, who has spoken about the role of social networks in this tragedy, and troubled that we all need to do far more to prevent similar tragedies in the future,” Hancock said.

It is not known exactly what the Health Secretary raised at the meeting with the head of Instagram. According to The Telegraph, Hancock may require the social network to introduce age verification so that children under 13 cannot register. In addition to the Department of Health, the new requirements for social networks were supported by the Department for Digital, Culture, Media and Sport.

“The tragic death of Molly Russell is the latest consequence of a social media world that behaves as if it is above the law.”
Margot James
minister at the Department for Digital, Culture, Media and Sport

Against the backdrop of the British government’s demands, the head of Instagram, Adam Mosseri (who took over after the company’s founders left in the autumn of 2018), responded in a lengthy column for The Telegraph. In it, he expressed his support for Molly Russell’s parents and promised that the company would do everything to protect its users from potentially dangerous content.

“We are not yet finding enough of these images before other people see them,” Mosseri wrote, acknowledging the shortcomings of Instagram’s filters.

Adam Mosseri. Photo by Reuters

In an interview with the BBC, Facebook vice president Nick Clegg also admitted that Instagram’s management had previously chosen not to remove content related to self-harm and suicide. This was based on the advice of specialists the social network had consulted, who felt that such posts could serve as a signal to a user’s friends or family that the person needed help.

What Instagram’s leadership has promised to do to protect users from unsafe content

  • Make it harder to search the social network for content related to self-harm and suicide;
  • Disable searching for such content through hashtags;
  • Cover images related to self-harm or suicide with sensitivity screens; to display a picture, the user will need to tap through;
  • Rework the algorithms so that they do not recommend content related to self-harm or suicide.

Some of these promises have already been implemented. For example, when opening the #selfharm hashtag, the social network offers to connect the user with a helpline and gives advice such as “Go outside”, “Get creative” or “Calm down”. By default, images under the tag are hidden; they can be revealed by tapping a “Show posts” button. After that, no further warnings or tips from the social network appear – all images, including those showing cuts on the body or blood, are available.

It is possible to alert moderators to such content: “self-injury” is among the available report categories. A correspondent reported four posts whose authors showed cuts on their arms and content related to suicide. All of the reports were reviewed, but the social network’s administration said it found no violations of the rules in the posts.

On February 7, Instagram executives announced plans to ban content related to self-harm and suicide on the social network. “We will get better and are determined to find and remove this content at scale,” said Mosseri.

Molly Russell’s parents say they are still trying to gain access to the girl’s iPhone and iPod, which are locked with a passcode. Apple declined to help the couple, explaining that because of how the devices are encrypted, access without the passcode is “almost impossible”. The coroner in Molly’s case has sent official requests to Instagram, YouTube and Pinterest, demanding that they provide access to the girl’s data, which her parents believe will help explain the causes of her death.

“It seems to me that the data on Molly’s phone should become the property of her parents. She died without a will – she was 14 – and everything else has logically come back to us as her parents,” said Ian Russell.

Ian always comes to meetings with journalists alone; his wife and two daughters have never commented on Molly’s case. Apart from condolences conveyed through the media, the Russell family has never communicated with representatives of Instagram, the product Ian considers partly responsible for Molly’s death. At the same time, despite his position on Instagram, the father regularly stresses that the blame for the girl’s death cannot be shifted entirely onto the social network. Ian admits that, as a parent, he did not do enough to help his daughter.

“The most important thing is to build a relationship of trust with your child so that you can discuss everything openly. That is what we tried to do, and it seems to me that this is where I failed. I think it is very difficult, but parents and children really do need to try to talk about the dangers of the Internet and the caution it requires.”
Ian Russell
father of Molly Russell