Outcomes
- to help parents and carers to gain an insight into young people’s exposure to harmful content online
- to provide help and advice on how to report such content and protect young people

Service Description
Children’s ability to access inappropriate and harmful content online is a growing concern. Such content can often be accessed through popular social media platforms. Research by the NSPCC suggests that 56% of 11 to 16-year-olds have seen explicit material online.
Examples of harmful content include:
- sexual images and pornographic material
- hate speech
- sexism
- pictures, videos or games which show images of violence or cruelty to other people or animals
- gambling sites
- sites that encourage:
  - vandalism
  - crime
  - terrorism
  - racism
  - eating disorders
  - suicide
Reporting harmful content:
All social media apps have a report function. It is important to report such content through the app or website directly first. Visit the Report Harmful Content website for further information and guidance on specific types of online harm and how to report them to social media platforms. You can make a report via Submit a Report of Harmful Content 48 hours after reporting to the social media platform, provided it relates to one of the following harms:
- threats
- impersonation
- bullying and harassment
- self-harm or suicide
- online abuse
- violent content
- unwanted sexual advances
- pornographic content
For further advice, see harmful online content: how to report it and where to get help.
Protecting young people from harmful online content:
- make use of the parental controls available on games and apps to limit exposure. For further guidance, see the parental controls and privacy settings guides, and the tips and tools to block inappropriate content online
- explain the importance of adhering to age restrictions applied to games and apps
- have regular conversations to help children to build coping strategies should they encounter such content online
To find out about different types of inappropriate content your child might see across the platforms they access, visit the Internet Matters website.
Statistics from Lincolnshire
According to data collected from Lincolnshire students aged 11 to 16 in April 2024, students who had viewed or been exposed to harmful content reported that it was most likely to occur on:
- Snapchat (46.1%)
- TikTok (34.4%)
- Instagram (26.3%)
- WhatsApp (17.3%)
Other findings from the 2024 survey regarding harmful content include:
- students who viewed or received harmful content were most likely to ignore it, with males (66.6%) more inclined to do so than females (46.6%)
- only a minority of students would tell the police, CEOP (1.2%) or a parent or carer (5.8%) if they viewed or had been sent harmful content
- apart from ignoring it, the most common action students took was to block and report the user who had sent the harmful content (30.6%)
- students in year 7 were more likely to tell their parents (20.5%) than other year groups
- 75% of year 7 students reported never experiencing harmful content in the last year. Children in older year groups were more likely to experience harmful content, particularly those in year 10, with 51% reporting at least one such incident in the past year
- respondents in years 9 and 10 had the highest proportion of students reporting experiencing harmful content more than five times in the last year
- students who encountered harmful content were more likely to have experienced it multiple times in the past year. This trend was consistent across all year groups except year 7