Guest post by Luca Teres Loytved, Niamh Herlihy and Florian Bachmann

Influencer Law Clinic series

1/2/2021 · 9 min read

Could you estimate how much time you spend on social media each day? The answer would most likely be something along the lines of 'a lot' or 'I'm always connected, so I get notifications all the time anyway', but an actual estimate is a very tough guess to make. For most people, it is indeed a great deal of time, and it cannot be denied that the content we are exposed to for so many hours a day has a considerable impact on us. While for many people the good may outweigh the bad, others can be severely harmed by the use of, and content on, their social media accounts. Teenagers' mental health, in particular, can suffer from social media use, most likely because of the vulnerability of the still-developing adolescent brain.[1] Since 95% of teenagers nowadays have a smartphone, they are almost continuously connected and kept up to date,[2] making it nearly impossible to distinguish time spent online from time spent offline. Research suggests that social media use is correlated with depression and other mental health issues in teenagers.[3] Unfortunately, just how severely social media can influence a young mind often only comes to light once it is already too late. One tragic example, which sparked considerable public debate in the United Kingdom, was the suicide of 14-year-old Molly Russell. After she took her own life, her grieving family stated publicly that they believed Instagram had played a role in the tragic turn her life took.[4] Her father advocated for changes to the content that can be viewed online and pleaded that platforms have a responsibility to protect their users in situations similar to Molly's.[5] His advocacy also led the United Kingdom's government to apply pressure on various social media companies to remove harmful content.
In this regard, Health Secretary Matt Hancock indicated that a failure by the platforms to do so could lead to stricter regulation of platforms in the United Kingdom. Instagram responded to the public outcry by removing a wide variety of content connected to self-harm and by broadening the scope of the material hidden from its users, in the hope of offering better protection.[6] It is widely accepted that minors, in particular, must be safeguarded from certain content that circulates online.

How this protection of children is achieved through the community guidelines of various social media platforms will be outlined in the following sections. Generally, what most, if not all, social media outlets have in common is that they specifically focus on the protection of minors. Here, however, we will only discuss Instagram, YouTube and TikTok, as these outlets arguably represent the essential creative content platforms used by underage users. Sadly, although it comes as no surprise, the various phenomena harming children online are not new. For this reason, social media and video-sharing platforms have, over the years, developed policies to protect children. As a first step, it is fair to say that all these platforms follow a similar structure in protecting minors through their community guidelines, while of course adapting the rules to platform-specific needs. Instagram follows the community guidelines set by its parent company Facebook and has added minor-specific requirements alongside them to offer additional protection.[7] The platform has also implemented a 'Tips for Parents' section within its Privacy and Safety Center.[8] Likewise, YouTube derives its community guidelines from its parent company Google; these largely mirror Instagram's structure, while setting a few additional requirements relating specifically to video content.[9] Similar to Instagram, YouTube has implemented a 'Child Safety Policy'. These policies and tips aim at protecting children from any content that could be emotionally or physically harmful to viewers, especially minors.[10] To prevent harmful content from being posted on their platforms, both Instagram and YouTube prohibit certain kinds of content.
YouTube, for example, prohibits the following content from being posted on its platform:

- sexualization of minors;
- content containing harmful or dangerous acts involving minors;
- content that may inflict emotional distress on minors;
- content that could be perceived as misleading family content;
- cyberbullying and harassment involving minors.

Additionally, YouTube can restrict content to a certain age group, thereby excluding minors from viewing content that might be considered harmful to them while still allowing older viewers to access it. To view such content, an individual has to sign in to their YouTube account to prove they are 18 or older.

To help discern which content may be harmful, YouTube uses an automated system that aims to detect content that may violate the platform's policies. Similarly, Instagram imposes prohibitions across several sections: child sexual exploitation and nudity; bullying and harassment; and human exploitation.[1] Turning to TikTok, although one might argue that the platform's protection measures lead to the same practical results as those of the other outlets, TikTok occupies a unique position in protecting minors.[11] This is, of course, due to the constant criticism that TikTok has to endure on this subject.[12] To provide another tragic example: Italy is currently and actively trying to regulate and restrict TikTok, following the death of a ten-year-old Italian girl, allegedly caused by her participation in a viral TikTok challenge.[13] For these reasons, TikTok pays particular attention to the protection of its underage users. But let us take a step back and look more generally at the content restrictions aimed at protecting minors on these platforms. Naturally, one of the most pressing issues, and the most explicit limitation, is that no content is allowed that in any way sexually exploits or endangers children. If such content is detected on any of these platforms and is deemed potentially criminal (e.g. child pornography), it is immediately deleted and the case is referred to the National Center for Missing and Exploited Children (NCMEC). It is worth noting that content in violation of these policies does not necessarily have to be uploaded with malicious intent.
For example, Facebook/Instagram explicitly urges parents not to upload content showing their children nude, for instance in the bathtub, as these pictures might be reused or misappropriated by third parties.[14] Understandably, the policies do not only cover nudity or sexual content involving minors, but any content in which a minor is shown in a sexual context. Such situations include, for example, photos of children in the presence of aroused adults, children in sexualised costumes, or children in a sexual fetish context. Another recent and concerning trend on YouTube is seemingly harmless videos that are edited in a certain way to pander to various fetishes (we will not refer to specific accounts or content, to avoid giving such material a stage). It is difficult for content moderators to deal with such content, as some might argue that the videos themselves do not depict anything in violation of minor protection policies.

Future developments in content moderation will show how such issues, which will probably only increase, are addressed.

A problem relating to the sexualization of minors that has proven particularly challenging for TikTok is 'grooming'. In its community guidelines, TikTok describes grooming as a situation in which "an adult builds an emotional relationship with a minor to gain the minor's trust for the purposes of future or ongoing sexual contact, sexual abuse, trafficking, or other exploitation."[15] To counteract this alarming problem, TikTok updated its guidelines this year and restricted underage users' access to specific functions such as direct messaging or hosting a live stream.[16] In addition to these measures, TikTok bans every user who has already been convicted of crimes against children.

Further content that is generally restricted on all social media outlets relates, for example, to online bullying and harassment. Precisely for this purpose, Instagram has introduced special protection measures for users between the ages of 13 and 18. Moreover, social media outlets revolving around video content, such as TikTok or YouTube, ban any content that depicts dangerous acts by minors or encourages children to do something dangerous, such as challenges or dares.

The importance of protecting minors has also generated attention at the EU level, where parental consent is required for the processing of the personal data of children below the age of digital consent. While the GDPR sets the general age of digital consent at 16, it allows Member States to lower it to anywhere between 13 and 16 years of age.[17] To that effect, children wishing to join a social media platform below the applicable age of digital consent require parental consent to do so. This does not, however, prevent social media platforms from restricting the creation of accounts by children below the age of 13.
Platforms such as YouTube, for example, require all users to be at least 13 years old,[18] since no data may be collected from a younger user. One may therefore wonder whether there is any possibility for a 10-year-old to have a YouTube account. With parental consent, yes, a 10-year-old can create a YouTube account. However, users under the age of 13 should only use YouTube Kids, which is not only an easy platform to navigate but also a place where content is heavily moderated for the viewers' safety. Beyond these legal protections, the European Commission has set up several initiatives and projects over the last decades to offer children the necessary tools to use the Internet safely and responsibly.[19] The European Strategy for a Better Internet for Children, coordinated by INHOPE and Insafe, is one such example. The main tasks of this strategy can be categorised into four pillars:

- to increase the amount of child-friendly content on the Internet;
- to raise awareness of digital literacy and online safety;
- to create a safe and protected online environment for children;
- to restrain sexual exploitation of children and other abusive content involving children.

On the one hand, INHOPE was funded by the Commission and aims at supporting "the network of hotlines in combating online Child Sexual Abuse Material (CSAM)."[20]

This emphasises the aforementioned fourth pillar: since its creation in 1999, INHOPE has developed a strong global network and today counts 47 hotlines. On the other hand, Insafe aims at promoting global awareness of safe internet use for all,[21] and wishes to empower users such as children with the necessary knowledge and skills to "stay safe online." In response to the increased use of the Internet by minors, these organisations have also launched additional initiatives and projects. Every year, they celebrate Safer Internet Day (SID), which aims at promoting "a safer and more responsible use of online technology by children and young people across the world."

To conclude, while both social media and video-sharing platforms have restrictions and policies in place to protect children, the extent of that protection can be contested. Various content involving minors on platforms such as YouTube and Instagram can, in certain settings, be perceived as sexual. One controversial example shows a mother with her legs spread, peeling a cucumber while sitting next to her toddler. One may find the video cute, as the mother is spending time with her child; however, one might also perceive the content as sexual or sensual. This is only one of countless examples found on social media platforms, which begs the question: 'have social media platforms done enough to effectively protect children from harmful content?'


Child influencers have become an increasing trend in the past decades, with YouTube channels such as "Ryan's World" approaching 30 million subscribers and getting millions of views per video. What might start as a funny little video on YouTube can be transformed into an economic empire: the more followers and viewers they get, the more money they may earn. But what about the child influencers themselves? Should we protect them from the potential harm they face on social media? Should we regulate this peculiar trend? And if so, how can this issue be addressed? This roundtable touches upon the issues of children's rights, children's development, unjust enrichment and the need (or not) to regulate this activity. The Roundtable, organised by the Influencer Law Clinic, assembles Dr Valerie Verdoot (LSE) and Dr Mark Leiser (Leiden University), who will discuss the subject of child influencers.

If you wish to learn more about child influencers and their legal framework, read Adrien Dubois & Nicole Binder’s earlier blogpost at: https://www.maastrichtuniversity.nl/blog/2020/11/regulation-child-influencers-profitable-playground

[1] Sherri Gordon, ‘5 Ways Social Media Affects Teen Mental Health’ https://www.verywellfamily.com/ways-social-media-affects-teen-mental-health-4144769

[2] Monica Anderson, Jingjing Jiang,’Teens, Social Media & Technology 2018’ https://www.pewresearch.org/internet/2018/05/31/teens-social-media-technology-2018/

[3] Ana Radovic, Theresa Gmelin, Bradley D. Stein, Elizabeth Miller, 'Depressed adolescents' positive and negative use of social media', Journal of Adolescence, Volume 55, 2017, pp. 5-15.

[4] Angus Crawford, ‘Instagram helped kill my daughter’ https://www.bbc.com/news/av/uk-46966009

[5] Zamira Rahim, ‘UK could ban social media companies over self-harm content, health secretary says’ https://www.independent.co.uk/news/uk/home-news/social-media-ban-uk-instagram-facebook-twitter-molly-russell-death-self-harm-samaritans-a8749121.html

[6] Angus Crawford, ‘Molly Russell: Instagram extends self-harm ban to drawings’ https://www.bbc.com/news/technology-50129402

[7] https://www.facebook.com/communitystandards/child_nudity_sexual_exploitation

[8] https://www.facebook.com/help/instagram/154475974694511/?helpref=hc_fnav&bc[0]=Instagram%20Help&bc[1]=Privacy%20and%20Safety%20Center

[9] https://www.YouTube.com/intl/ALL_uk/howYouTubeworks/policies/community-guidelines/

[10] https://support.google.com/youtube/answer/2801999?hl=en

[11] https://www.tiktok.com/community-guidelines?lang=en#31

[12] Stephanie Mlot, ‘TikTok Accused of Putting Children at Risk (Again)’ https://www.entrepreneur.com/article/350683

[13] Hannah Roberts, Giorgio Leali, 'TikTok is the latest target in Italy's crusade against Big Tech' https://www.politico.eu/article/tiktok-latest-target-italy-privacy-regulator-crusade-against-big-tech/

[14] https://about.fb.com/news/2018/10/fighting-child-exploitation/

[15] https://www.tiktok.com/community-guidelines?lang=en#31

[16] Alex Hern, ‘TikTok to tackle grooming with safeguards for young users’ https://www.theguardian.com/technology/2021/jan/13/toktok-to-tackle-grooming-with-curbs-for-young-users

[17] https://gdpr-info.eu/art-8-gdpr/

[18] https://blog.youtube/news-and-events/children-youtube#:~:text=While%20we%20permit%20users%20between,13%20to%20create%20an%20account.

[19] https://inhope.org/EN/articles/inhope-a-european-commission-funded-initiative-all-about-collaboration

[20] https://www.inhope.org/EN

[21] https://www.betterinternetforkids.eu/policy/insafe-inhope