According to Fortune, women are conducting a fascinating social experiment on LinkedIn by temporarily changing their gender settings to male. Lucy Ferguson saw her content impressions jump 818% after changing her name for 24 hours, while Rosie Taylor reported a 220% increase in people reached after switching her gender to male. Cass Cooper tried the experiment but actually saw her visibility drop, which she attributed to the intersection of gender and race since her profile then registered as a Black man. LinkedIn’s head of responsible AI, Sakshi Jain, responded with a blog post insisting the platform’s algorithm doesn’t use demographic information like gender or race to determine content visibility. Instead, she said LinkedIn relies on signals including position, industry, network, and activity to determine what content appears in feeds.
The Algorithm Denial
So LinkedIn says they’re not using gender data in their algorithm. And honestly, I believe them – they’d be insane to explicitly code gender bias into their system in 2025. But here’s the thing: when they say they rely on signals like “position” and “network,” they’re basically admitting the problem exists at a systemic level, because those signals are proxies for gender. If men dominate leadership positions across industries (which they do), and if people tend to network with others who look like them (which they do), then a feed ranked on position and network will hand men an advantage without ever touching a gender field. It’s not that LinkedIn is intentionally favoring men – it’s that their algorithm reflects and amplifies the inequalities that already exist in the business world.
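To make the proxy point concrete, here’s a toy simulation – entirely my own sketch, not LinkedIn’s actual system – where the ranking score only looks at seniority and network size. The gendered imbalance in the inputs is assumed for illustration, but it’s all the scoring function needs to produce a gendered gap in the output:

```python
# Hypothetical sketch: a feed-ranking score that never sees gender can still
# favor men if the "neutral" inputs (seniority, network size) are themselves
# unevenly distributed by gender. All numbers below are assumptions.
import math
import random

random.seed(42)

def make_user(gender: str) -> dict:
    # Assumed (illustrative) imbalances: men more likely to hold senior roles,
    # men with larger networks on average.
    senior_prob = 0.40 if gender == "M" else 0.28
    base_network = 600 if gender == "M" else 450
    return {
        "gender": gender,
        "is_senior": random.random() < senior_prob,
        "network_size": max(50, int(random.gauss(base_network, 150))),
    }

def rank_score(user: dict) -> float:
    # The score uses only position and network -- no gender term anywhere.
    return (2.0 if user["is_senior"] else 1.0) + math.log10(user["network_size"])

users = [make_user(g) for g in "M" * 5000 + "F" * 5000]

for gender in ("M", "F"):
    scores = [rank_score(u) for u in users if u["gender"] == gender]
    print(f"mean rank score ({gender}): {sum(scores) / len(scores):.3f}")
# The gap in the output comes entirely from the unequal inputs.
```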
The Unconscious Bias Reality
Now, let’s talk about what might be happening that LinkedIn isn’t admitting. We know from research on AI bias in hiring that algorithms can pick up on subtle patterns that effectively recreate human biases. Maybe the algorithm notices that content from profiles with male-associated names gets more engagement early on, so it shows that content to more people. Or maybe users unconsciously engage more with content they perceive as coming from male authority figures. Either way, the quick response from LinkedIn’s head of responsible AI shows the company is taking this seriously – and it should, given how much professional visibility is at stake.
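Here’s a rough sketch of that feedback loop – again, my own toy model, not anything LinkedIn has documented. Two posts convert impressions at identical rates, but one gets a small assumed early-engagement bump; because distribution follows whoever is engaging better so far, the early edge compounds:

```python
# Toy feedback-loop model (assumptions, not LinkedIn's documented behavior):
# two posts have the same engagement rate per impression, but post A starts
# with a small early edge. The feed then gives the current leader the bigger
# audience share each round, so the edge locks in and widens.
import random

random.seed(7)

QUALITY = 0.05            # identical engagement probability per impression
AUDIENCE_PER_ROUND = 1000
ROUNDS = 10

# Assumed 15% early-engagement bump for post A (e.g. perceived authority).
engagement = {"A": 11.5, "B": 10.0}

for _ in range(ROUNDS):
    leader = "A" if engagement["A"] >= engagement["B"] else "B"
    for post in ("A", "B"):
        # Rich-get-richer step: the leader gets 70% of the round's audience.
        share = 0.7 if post == leader else 0.3
        impressions = int(AUDIENCE_PER_ROUND * share)
        engagement[post] += sum(random.random() < QUALITY for _ in range(impressions))

for post, total in engagement.items():
    print(f"final engagement, post {post}: {total:.0f}")
# Same content quality, very different outcomes -- all from the initial bump.
```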
The Bigger Picture
This LinkedIn experiment is happening against a backdrop of other gender-related business news that Fortune covered in the same newsletter. The Department of Education just made changes that exclude nursing and other female-dominated professions from favorable federal loan programs, which could worsen the nursing shortage. Japan’s first female prime minister is facing a major international crisis with China. And the NWSL might lose star player Trinity Rodman to international competition because salary cap limits prevent the league from matching outside offers. Basically, women are fighting systemic barriers everywhere – and LinkedIn is just one more battlefield.
What’s Next?
I’m curious how many people will try this experiment themselves. The fact that Fortune’s reporter included her email address and asked readers to share their experiences suggests this is just the beginning. And honestly, if enough people document similar results, LinkedIn might need to do more than issue denials. They might need to actively audit their algorithm for unintended bias – something that’s becoming increasingly common as companies realize that AI systems can perpetuate discrimination even when they’re not explicitly programmed to do so. The real question isn’t whether LinkedIn is intentionally biased – it’s whether any platform that reflects our world can avoid reflecting our world’s biases.
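For what it’s worth, an audit like that doesn’t have to be exotic. Here’s the kind of basic check I have in mind – my own illustration with made-up numbers, not any platform’s actual methodology: compare impressions per post across groups and flag gaps that fall below the classic four-fifths disparate-impact threshold borrowed from employment-discrimination analysis:

```python
# Illustrative bias-audit check (hypothetical data, not a real methodology
# from LinkedIn): compare mean impressions per post across groups and flag
# any group whose rate falls below 80% of the best-performing group's rate.
from statistics import mean

# Made-up per-post impression counts, grouped by self-reported gender.
impressions_by_group = {
    "women": [120, 95, 210, 80, 150, 60, 175],
    "men":   [310, 280, 150, 400, 220, 260, 330],
}

rates = {group: mean(counts) for group, counts in impressions_by_group.items()}
reference = max(rates.values())

for group, rate in rates.items():
    ratio = rate / reference
    flag = "  <-- below four-fifths threshold, review" if ratio < 0.8 else ""
    print(f"{group}: mean impressions {rate:.0f}, ratio vs top group {ratio:.2f}{flag}")
```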
