
Concerns raised about Facebook video

5 June 2015

Today the NSPCC raised concerns about a video on Facebook featuring a distressed child. Some have described the footage as an example of “baby yoga”, while others say the baby is clearly being harmed.

Facebook has said that the video does not break its terms, so it would not take the content down if it was reported, unless the behaviour in the video was being encouraged or supported. It would, however, prevent the content being viewed by users registered as under 18, and add a warning to the front of it, which would also stop the video playing automatically in news feeds.

Technical steps like adding an interstitial warning page are welcome, and it is important that we continue to work collaboratively to protect the best interests of children.

This issue is a complex one, and raises questions about how we as an online community, and how companies like Facebook, decide where the line should be drawn.

We need to continue working with all social media providers to:

  1. Provide effective reporting tools and ensure that when reports to social media providers are assessed, the best interests of children are put first – including those in the video (who may be re-victimised by the content being shared) and those who may potentially be exposed to the video and find it upsetting.
  2. Give users tools to help them manage how they share sensitive content. For example, at the moment on Facebook you could use Lists to share the content only with certain contacts, and we would welcome a feature where users could add their own warning page to video content they post. YouTube and Twitter similarly allow users to mark their content as sensitive, helping to protect other users and prevent it being shown to minors.
  3. Protect minors from inappropriate content: it’s important that social media providers continue to restrict access for minors when it comes to content that has been flagged as age-inappropriate. We would welcome further innovations in how this can be done effectively.
  4. Educate users to understand their own responsibility for what they share online, and who they share it with, as well as the role we all need to play in reporting content that we are not happy with. In some situations we might be able to reach out to friends who have posted inappropriate or upsetting content and explain to them why we would like them to take it down.

With effective technical solutions and an empowered and considerate online community, we can help protect children both online and offline, and it’s important we continue working together to achieve that.
