According to TechCrunch, former Harvard physician Jenny Shao left her medical residency during the pandemic to launch Robyn, an empathetic AI companion that just raised $5.5 million in seed funding. The app costs $19.99 a month or $199 a year and launched today in the US after months of testing. Shao worked under Nobel laureate Eric Kandel and applied human memory research to create what she calls an “emotionally intelligent partner” that’s neither a friend nor a therapist replacement. The round was led by M13, with participation from Google Maps co-founder Lars Rasmussen and other notable investors, and the startup has grown from three to ten employees this year.
The emotional AI dilemma
Here’s the thing about emotional AI companions: we’re walking into a minefield with good intentions. Shao is absolutely right that we’re facing a “massive disconnection problem” where people feel less understood despite being surrounded by technology. And the stats are staggering: 72% of US teens have used AI companion apps. But we’ve already seen how badly this can go, with lawsuits blaming similar apps for contributing to suicides.
The positioning is clever, though. Robyn isn’t claiming to be your friend or your therapist. It’s that middle ground of “someone who knows you very well.” But let’s be real: when people start sharing their deepest emotions with an AI, boundaries get blurry fast. The company says it has guardrails, provides crisis line numbers, and pushes back on certain topics. But can you really program empathy? And at $20 a month, that’s a serious commitment for something that might just be a fancy journaling app with better memory.
The safety question nobody’s solved
Look, I appreciate that they’re thinking about safety from day one. The app apparently directs users to crisis resources and emergency rooms if they mention self-harm. But we’ve been down this road before with other AI companions that started with good intentions. The fundamental problem is that humans anthropomorphize everything – we form emotional attachments to Roomba vacuums, for crying out loud.
What happens when someone becomes emotionally dependent on Robyn? The investors acknowledge this challenge, with M13’s Latif Parecha saying “there needs to be guardrails in place for escalation.” But escalation to what? If someone’s in crisis and Robyn says “call this number,” will they? Or will they just keep talking to the AI that makes them feel understood? It’s a terrifying responsibility to put on any algorithm.
Where this all leads
Basically, we’re witnessing the professionalization of AI companionship. This isn’t some random startup – it’s founded by a Harvard-trained physician with Nobel-level research credentials. The funding comes from serious investors who normally back more conventional tech. That gives Robyn credibility that earlier emotional AI apps lacked.
But I keep coming back to the same question: Are we solving loneliness or just monetizing it? At $199 a year (roughly $240 if you pay month to month), Robyn costs more than many therapy co-pays. The app’s available on iOS now, and it’ll be fascinating to see how users respond to something positioned as neither friend nor therapist, but something in between. The space between “helpful support” and “dangerous dependency” is awfully narrow, and I’m not convinced any AI company has found the right balance yet.
