A new study reveals that a significant number of Kenyan children are exposed to disturbing content on social media platforms, leading to increased anxiety and distress. This highlights urgent concerns about algorithmic recommendations and inadequate online safeguards for minors.
More than half of children who access news through social media platforms in Kenya are left feeling worried and upset by content depicting war, violence, and death. This concerning trend is driven by algorithmic recommendations that push distressing news into children's feeds, even when they are not actively seeking it, according to recent research by Internet Matters, an online safety organisation.
The study, which included a survey and focus groups with over 1,000 children aged 11 to 17, found that 61% of children who get news from social media have encountered a worrying or upsetting story in the past month. Alarmingly, 39% of those exposed to such content reported feeling very or extremely upset and worried. This exposure includes graphic videos of real-world violence, including stabbings, shootings, and war scenes.
Social media has become a primary source of news for more than two-thirds of children, with platforms like TikTok and Instagram being particularly prevalent. However, 40% of these children do not follow news-focused accounts, indicating that algorithms are largely responsible for their exposure to distressing content. This phenomenon reflects a broader shift where users spend less time viewing content from friends and more time engaging with algorithmically recommended posts.
Kenya has a notably high rate of social media usage, with citizens spending an average of 4 hours and 19 minutes daily on these platforms, significantly above the global average of 2 hours and 23 minutes. This extensive engagement, particularly in a youthful population where 75% are aged 18-30, exposes Kenyan children to both the benefits and harms of online content. A study on social media consumption in Kenya indicates that over 70% of teenagers spend more than three hours daily on social platforms, increasing their susceptibility to mental health disorders.
In April 2025, the Communications Authority of Kenya (CA) released comprehensive Industry Guidelines for Child Online Protection and Safety. These guidelines, rooted in the Kenya Information and Communications (Consumer Protection) Regulations of 2010, mandate ICT providers to implement safety tools, age-verification protocols, privacy-by-design, and robust complaint systems. The framework aims to safeguard children under 18 from online risks such as cyberbullying, child sexual exploitation, radicalisation, and data breaches.
Despite these guidelines, concerns persist regarding their effective implementation and enforcement. The CA has given licensees a six-month window to comply with the new regulations. Previously, Kenya had been criticised for lacking a clear policy on child online safety, with reports indicating poor performance in protecting minors online.
Children themselves have voiced their discomfort with the content they encounter. A 14-year-old girl shared, "On TikTok you can see stabbings and kidnappings, which are just not nice to see, especially when you are a bit younger, it makes you feel uncomfortable." Another 17-year-old girl recounted seeing "stabbing videos or gruesome videos" on Instagram and expressed a desire for trigger warnings.
Experts from organisations like Internet Matters emphasise that the shift away from traditional news channels is fundamentally altering how young people consume news, with worrying consequences for their well-being. Parents also share these concerns, with two in five believing that excessive online time negatively impacts their child's health.
The constant exposure to distressing content can have severe implications for children's mental health, contributing to anxiety, depression, and a general sense of unease. The addictive nature of social media, coupled with the pressure to maintain a 'perfect' online persona, further exacerbates these issues among Kenyan youth. Cyberbullying and online harassment also remain significant threats, with negative interactions causing emotional distress.
Furthermore, the rise of online child sexual exploitation and abuse (OCSEA) is a grave concern in Kenya, with digital platforms becoming hunting grounds for predators targeting vulnerable children. A survey by Child Fund International and Africa Child Forum indicated that up to 13% of minors online have been exploited or abused, predominantly those aged 12 to 17.
While guidelines are in place, the effectiveness of social media companies in self-regulating and implementing robust child protection measures remains a point of contention. Many children report that their efforts to flag abusive content or reset algorithms are futile. There is also an ongoing debate about the balance between protecting children and upholding their rights to access information and freedom of expression online.
The Communications Authority of Kenya has given ICT licensees until October 2025 to fully implement the new Industry Guidelines for Child Online Protection and Safety. Compliance will be monitored through routine reporting and regulatory oversight.
The coming months will be crucial in observing how effectively social media platforms and other ICT providers in Kenya adopt and enforce the new child online protection guidelines. Attention will also be on the Communications Authority of Kenya's oversight mechanisms and whether they lead to a tangible reduction in children's exposure to harmful content and an improvement in their online safety. The ongoing dialogue between parents, educators, policymakers, and tech companies will be vital in shaping a safer digital environment for Kenyan children.