A global protest is raising urgent questions about algorithmic fairness after women reported dramatic boosts in visibility on the professional network simply by identifying as male.

A growing number of women on the professional networking site LinkedIn are taking a provocative stand against what they claim is algorithmic gender bias, with some even donning fake moustaches in their profile pictures. By changing their pronouns to he/him and adopting male aliases, they have ignited a crucial conversation about digital equality that resonates deeply within Kenya's professional landscape.
This isn't just a social media trend; it's a test of the systems that increasingly dictate career opportunities. The core allegation is that LinkedIn's algorithm—the complex code that decides who sees which posts—favours content from male users, effectively creating a glass ceiling in the digital world. For Kenyan women striving for visibility in business and employment, this raises a critical question: is a silent, digital barrier hindering their professional growth?
The experiment began when female users noticed their posts received significantly less engagement than similar content posted by male counterparts. London-based entrepreneur Jo Dalton, for instance, reported that changing her gender settings boosted her post's reach by a staggering 244%. Others have reported similarly dramatic increases in profile views and post impressions after making the switch.
LinkedIn has officially denied any gender bias in its systems. Sakshi Jain, who oversees AI governance at the company, stated that their algorithms do not use demographic information like gender to determine content visibility. Instead, the platform attributes differences in engagement to "hundreds of signals," including a user's network and activity.
However, critics argue the issue may be one of "proxy bias." This means the algorithm might not be penalising users for being women, but for characteristics it has learned to associate with female users, such as certain language styles or non-linear career paths that include breaks for caregiving.
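To make that mechanism concrete, here is a minimal sketch of how proxy bias can arise, using entirely synthetic data. The feature names, correlations, and model below are hypothetical illustrations, not a description of LinkedIn's actual systems; the point is only that a model trained with no gender field at all can still disadvantage one gender through a correlated feature.

```python
# A minimal, illustrative sketch of "proxy bias" on synthetic data.
# All names and numbers here are hypothetical, not LinkedIn's systems.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# 0 = male, 1 = female -- recorded for analysis, never shown to the model.
gender = rng.integers(0, 2, n)

# A hypothetical writing-style score correlated with gender:
# higher on average for women in this synthetic population.
style = rng.normal(loc=gender * 1.0, scale=1.0)

# Historical "was this post boosted?" labels that were themselves biased:
# posts with higher style scores were boosted less often in the past.
p_boost = 1 / (1 + np.exp(style - 0.5))
boosted = rng.random(n) < p_boost

# Train on the style feature only; gender is excluded from the inputs...
model = LogisticRegression().fit(style.reshape(-1, 1), boosted)
scores = model.predict_proba(style.reshape(-1, 1))[:, 1]

# ...yet predicted visibility still differs sharply by gender,
# because the style feature acts as a proxy for it.
print(f"mean predicted boost, men:   {scores[gender == 0].mean():.2f}")
print(f"mean predicted boost, women: {scores[gender == 1].mean():.2f}")
```

In this toy setup the model never sees gender, yet its predictions diverge between the two groups because the style feature stands in for it. That is precisely the pattern critics describe: no demographic field, but a demographic outcome.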
While the protests are global, the implications are profoundly local. In Kenya, where women already face significant hurdles in the workplace, a biased algorithm could amplify existing inequalities. Data shows that Kenyan women are less likely to be employed than men and face a substantial pay gap. Furthermore, traditional beliefs and cultural norms often create barriers to career progression.
On LinkedIn, these disparities are reflected in user data. In early 2024, men made up 60% of LinkedIn's advertising audience in Kenya, while women accounted for only 40%. If the platform's algorithm inadvertently favours the majority user base's communication style, it risks further marginalising the voices of the 40%.
For the thousands of Kenyan women using LinkedIn to build businesses, find jobs, and network, algorithmic suppression is not a theoretical problem—it's an economic one. It means fewer eyes on a business proposal, a missed connection with a potential employer, and a muted voice in professional conversations.
As this debate unfolds, women are calling for greater transparency from tech platforms whose algorithms hold immense power over their economic futures. The question now is whether these companies will heed the call and ensure their digital spaces offer a truly level playing field for all.