A groundbreaking study reveals X's 'For You' feed can radically polarise users in just one week—a stark warning for Kenya's vibrant and often volatile digital political arena.

Your social media feed may be quietly making you angrier and more politically divided, and you likely wouldn't even notice. That is the sobering conclusion of a new study on the social media platform X, formerly known as Twitter.
A groundbreaking experiment published in the journal Science has provided the first direct causal evidence that small, algorithm-driven changes to a user's feed can dramatically increase political division. Researchers found that exposing users to more hostile political content for just one week increased negative feelings toward their political opponents by an amount that historically would have taken three years to develop.
The study, conducted with over 1,200 users during the heated 2024 U.S. presidential election, used a custom AI-powered tool to reorder participants' 'For You' feeds. One group was shown more posts with 'anti-democratic attitudes and partisan animosity' (AAPA), while another was shown fewer. The results were stark: those who saw more divisive content felt colder towards the opposing side, while those who saw less felt warmer. Crucially, most participants did not realise their feeds had been altered.
While the experiment focused on American politics, its findings are a critical wake-up call for Kenya. Social media platforms like X, Facebook, and TikTok have become central to the nation's political discourse, credited with mobilising youth-led movements like the 2024 protests against the Finance Bill. These platforms are powerful tools for civic engagement, but they can also be weaponised.
In Kenya, where political discourse is often intertwined with ethnic identity, the potential for algorithmic amplification of hate and division is particularly acute. Since the 2007 post-election violence, social media has been recognised as a double-edged sword—capable of both inciting hatred and promoting peace. This new research confirms that the very design of these platforms, aimed at maximising engagement under leaders like Elon Musk, can inadvertently deepen societal fractures.
Key findings from the study include:

- One week of increased exposure to hostile political content shifted participants' feelings toward their opponents by an amount that would historically have taken about three years to develop.
- Participants shown less divisive content felt warmer towards the opposing political side.
- Most participants did not notice that their feeds had been altered.
The 'For You' feed was introduced after Elon Musk's acquisition of Twitter for $44 billion (approx. KES 5.7 trillion). Unlike the traditional 'Following' tab, it uses an algorithm to show users content it calculates will maximise their engagement, regardless of whether they follow the source accounts. This model has been criticised for creating echo chambers and promoting sensationalist content.
The study highlights the immense power platform owners wield. As Stanford University professor and co-author Michael Bernstein noted, the research opens pathways for interventions that could not only reduce partisan animosity but also build greater social trust. However, with platforms often prioritising engagement over social cohesion, the responsibility may fall on users and regulators to demand greater transparency and control over the digital spaces where political futures are increasingly shaped.
As Kenya looks toward its next election cycle, the invisible hand of the algorithm in shaping public opinion has never been more potent. The question is no longer whether social media influences our politics, but how profoundly—and how quickly—it can reshape our society.