
Researching people’s experience of harmful video content
We developed a behavioural user survey that gave us insight into the potentially harmful video content young people are exposed to on social media, and whether and how they go about reporting it.
The Challenge
In recent years, concerns have grown about people - especially young people - encountering harmful content while using online video sharing platforms such as YouTube, Facebook, TikTok, Instagram and Twitch. The Department for Culture, Media and Sport wanted to better understand the potential harms encountered by young people, through the development of more rigorous methods for capturing their experiences. We took on this challenge as part of a wider programme of work with Oliver Wyman, the global management consulting firm.
The Approach
To better understand the potentially harmful content encountered by young people, we developed a ‘behavioural user survey’. The core insight at the heart of this method is not to restrict responses to a specific list of harmful content types, but rather to ask participants whether they had encountered content that they believed was ‘inappropriate, distressing or intentionally misleading’, and to describe it in their own words. The aim was to avoid a problem with more traditional surveying methods, which tend to constrain participants by asking them to select a response from a pre-defined list. This also had the benefit of ensuring that the context in which a respondent encountered a particular form of content could be taken into account. Footage of a medical procedure, for example, might not be seen as harmful in the context of an online healthcare documentary, but might be perceived as harmful if seen unexpectedly and without consent in a social media feed.
To run the behavioural user survey we recruited 2,252 UK users of video sharing platforms. We wanted to make sure we heard from young people too, so our sample included 355 young people aged 18 or under. Respondents were asked to recall and describe (in an open-ended question) one specific instance of encountering video content that they considered to be inappropriate, distressing or intentionally misleading. They were also asked how often they encountered content of this kind, what they did when they encountered it, and what happened if and when they reported it to the platform operators. In other words, we were interested in the extent to which action was taken (by them and by the platform owners) when harmful content was encountered.
The Result
Almost half (45%) of the respondents to the survey reported having encountered inappropriate, distressing or deliberately misleading content. The most commonly reported forms of content were violent or disturbing videos (19%); fake news and disinformation (18.7%); inappropriate sexual content (8.2%); and animal abuse (7.5%). A further 1% reported content containing sexual abuse, and 0.7% reported content promoting self-harm.
One of the key findings from the survey was how little action is taken upon encountering harmful content. Most people who encountered harmful content did not take any action in response (822 respondents, around 55% - see chart below). The most commonly cited reasons were a belief that taking action would not make any difference (49%) and a view that the content was not harmful enough to warrant action (29%).
Figure 1: A diagram showing the distribution of users’ reactions upon encountering harmful content

Of the participants who had reported harmful content to the hosting platform, many (42%) did not know what subsequently happened. This suggests that video sharing platforms are not always closing the loop by telling those who make reports what action was taken. We know that feedback on past actions can help encourage future engagement, so this lack of information from platforms may deter future reporting. We also found that only a small minority of people believed that other users actually report harmful content. This matters because, as other behavioural studies show, people are more likely to do something when they perceive it as the ‘normal’ course of action.
This survey has contributed to the wider work being undertaken by government and industry around online harms. We also believe that this more sophisticated approach to surveying will help to plug gaps in knowledge in many other areas of research.
“It was great working with CogCo in this important area. The behavioural survey at the heart of this project has helped to contribute to the wider work being undertaken by government and industry around online harms.”
- Lisa Quest, Partner, Head of UK and Ireland, Co-Head of the Public Sector and Policy Practice Europe (Oliver Wyman)