Ofcom finds violent online content ‘inevitable’ for UK children

Research by the media watchdog has found that violent content online is now “inevitable” for children in the UK, with many being exposed to it for the first time when they are in primary school.

Every British child interviewed for the Ofcom study had viewed violent content on the internet, from videos of local school and street fights shared in group chats, to images of explicit and extreme violence, including gang-related content.

The report concluded that children knew that more extreme material could be found in the depths of the web, but they were not seeking it out themselves.

The findings prompted the NSPCC to accuse tech platforms of standing by and “ignoring their duty of care to young users”.

Rani Govender, the charity’s senior policy officer for child safety online, said: “It is deeply concerning that children are telling us that unintentional exposure to violent content has become a normal part of their lives online.

“It’s unacceptable that algorithms continue to push out harmful content, which we know can have devastating mental and emotional consequences for young people.”

The research, carried out by the Families, Children and Young People Agency, is part of Ofcom’s preparations for its new responsibilities under the Online Safety Act, passed last year, which gives the regulator powers to clamp down on social networks that fail to protect their users, especially children.

Gill Whitehead, director of Ofcom’s online safety group, said: “Children should not assume that seriously harmful content, including material that depicts violence or promotes self-harm, is an inevitable or unavoidable part of their online lives.

“Today’s research sends a strong message to tech companies that now is the time to act, so they are ready to meet their child protection duties under new online safety laws. Later this spring we will consult on how the industry can make sure children enjoy an age-appropriate, safer online experience.”

Children and young people interviewed by Ofcom named almost all the leading technology companies, but Snapchat and Meta’s apps Instagram and WhatsApp appeared most frequently.

“Children explained that private accounts, often anonymous, existed solely to share violent content – most commonly local school and street fights,” the report said. “Nearly all of the children in the study had interacted with these accounts, which they said were most often discovered on Instagram or Snapchat.”

“There’s pressure from peers to pretend it’s fun,” said an 11-year-old girl. “You feel uncomfortable on the inside but pretend it’s funny on the outside.” Another girl, aged 12, said she felt “slightly traumatized” after watching an animal cruelty video: “Everyone made fun of it.”

Many of the older children in the study “appeared to have become desensitized to the violent content they encountered.” Professionals also expressed particular concern about violent content that normalizes offline violence, reporting that children tend to laugh and joke about serious violence.

On some social networks, exposure to violent imagery comes from the top. On Thursday, X (formerly Twitter, renamed after its acquisition by Elon Musk) removed a video that had gone viral on the network and allegedly showed sexual mutilation and cannibalism in Haiti. The clip had been retweeted by Musk himself, who posted it in reply to the news channel NBC, responding to a report that accused him and other right-wing influencers of spreading unsubstantiated claims about chaos in the country.

Social platforms offer tools that could help children avoid violent content, but they appear to do little good. Children as young as eight told researchers they knew it was possible to report content they didn’t want to see, but there was little trust that the system was effective.

In private chats, they feared that reporting would label them a “snitch”, leading to embarrassment or punishment from their peers, and they did not trust platforms to impose meaningful consequences on people who post violent content.

The rise of powerful algorithmic timelines, such as those of TikTok and Instagram, has added a further twist: children commonly believe that if they spend time on violent content – for example, while reporting it – they are more likely to be recommended similar material.

Professionals included in the study expressed concern about the impact of violent content on children’s mental health. In a separate report published on Thursday, England’s Children’s Commissioner revealed that more than 250,000 children and young people are waiting for mental health support after being referred to NHS services, meaning one in 50 children in England is on a waiting list. The average wait for children who receive support is 35 days, but last year nearly 40,000 children experienced waits of more than two years.

A spokesperson for Snapchat said: “Violent content or threatening behavior has absolutely no place on Snapchat. When we discover this type of content, we quickly remove it and take appropriate action against the offending account.

“We have easy-to-use, confidential in-app reporting tools and work with police to support their investigations. We support the aims of the Online Safety Act to help protect people from harm online, and we continue to engage constructively with Ofcom on the implementation of the act.”

Meta has been contacted for comment. X declined to comment.
