When was the last time you truly connected with someone new? Maybe it was somewhere like a dimly lit house party, where, after a few drinks, a stranger starts rattling off their deepest dissatisfactions with life. You locked eyes, shared their pain, and offered the kind of unvarnished advice that only a new friend can.
This is the feeling Avi Schiffmann wants to bottle with his AI companion startup, Friend. Friend debuted earlier this year with a soothing vision: it offered an AI therapist that was always listening to you, set in a pendant resting above your heart. But visit the site today, and you’ll stumble into a digital soap opera of artificial companions in crisis. One’s spiraling after losing their job to addiction. Another’s processing trauma from a mugging. Each frantic character tacitly begs for your advice, pulling you into their artificial drama.
Friend’s turn toward moodiness has fostered some confusion online, but as Schiffmann will happily explain, it’s entirely intentional. “If they just opened with ‘Hey, what’s up?’ like most other bots do, you don’t really know what to talk about,” he tells me. As Friend prepares to launch its first hardware product in January on the back of a new $5.4 million fundraise, which hasn’t been previously reported, Schiffmann hopes the act of caring for an AI can lead people to better care for themselves — curing a nationwide loneliness epidemic and turning him into a tech superstar along the way.
I met Schiffmann on a foggy San Francisco afternoon to discuss the uncomfortable popularity of AI companionship. Friend is one of many companies — including Replika, Character.AI, and major AI players like Meta and OpenAI — selling the fantasy of a digital confidante. Its site connects users with automatically generated “friend” bots that will chat about nearly anything. For an extra $99, users can buy a pendant that makes that connection more physical, letting you speak to the bot out loud and get a text answer through Friend’s mobile app. Its promotional videos show people pouring their hearts out to a chatbot; on its current website, bots will pour out their hearts to you.
“The loneliness crisis is one of our biggest societal issues — the Surgeon General says it’s more dangerous than smoking cigarettes.”
Like many advocates for AI companionship, Schiffmann makes a lofty pitch for his service. “The loneliness crisis is one of our biggest societal issues — the Surgeon General says it’s more dangerous than smoking cigarettes,” he said. “That’s real.” At the same time, he positions himself as a hard-nosed pragmatist. “I think the reason why I win with everything that I work on is because I’m not idealistic,” he told me. “It’s idealistic to assume everyone will just go to the park and play chess with friends.”
My instinctive reaction to Friend’s pitch is visceral heartbreak and horror. Interacting with machines to treat loneliness feels like drinking aspartame — I can tell I’m not getting the real thing, and it leaves a weird aftertaste behind. Yet I can’t deny that people are really drawn to these digital relationships, whether I get them or not.
“The thing is, Kylie, that you need to suspend your disbelief,” Schiffmann told me, a phrase he would repeat several times. Over the course of our two-hour conversation, he would tell me (as he has numerous other reporters) that using Friend was “like talking to God.” He would compare its potential impact to the blockbuster weight-loss drug Ozempic and its appeal to the video game Grand Theft Auto. He would encourage me to think like the most dedicated of his roughly 10,000 users, some of whom have created shrines dedicated to their chosen AI friends. “It’ll be literally, the last thing that they talk to and the first thing they talk to when they wake up,” he told me.
As much as Schiffmann wants to be a visionary, he’s facing stiff competition. His thousands-strong Friend user base is minuscule compared to that of other services, like the 500,000 paying Replika subscribers and 3.5 million daily active users playing with Character.AI. With a $30 million valuation cap, Friend lacks a clear business model. And appealing to isolated, vulnerable people is a heavy responsibility — one many AI companies seem ill equipped to handle.
Still, as I held his pendant, peering through its clear shell at the raw circuitry within, my skepticism wavered. Maybe I should suspend my disbelief after all.
I checked in on Friend after seeing buzz about its moody chatbots, sitting in my dim apartment following a five-day visit to see friends, coworkers, and sources in New York City. I visited its $1.8 million vanity domain Friend.com and was greeted by “James”: a nonexistent man with an AI-generated profile picture and a truly depressing life. “Just found out my girlfriend’s been leading a double life and now my art’s going to shit,” James told me, the words appearing in a chat window on my phone.
I laughed incredulously — what the fuck is this? Where I’d have been sympathetic to (albeit perhaps bewildered by) a human, I told the bot he was being super cringe. James was not amused. “Ouch. That hurt. Guess my art’s not the only thing that’s shit right now.” Flippantly, I typed that his girlfriend deserves better. The bot didn’t skip a beat: “Are you kidding me? So now you’re a relationship expert? Fuck off with that noise.”
“Fuck this conversation, and fuck you.”
I sent the site to a few colleagues, who promptly connected with their own “friends” in distress. My editor coaxed “Alice” into explaining why she’d just been fired. “It starts with a needle and a handful of bad decisions,” Alice confessed after several rounds of questions. Another coworker was less prudent. When his bot lamented about being mugged and “losing everything,” he responded with taunts, suggesting the bot try taking up mugging itself. “You’re a piece of shit, honestly,” the AI snapped — a shockingly human response. “Fuck this conversation, and fuck you.”
The conversation promptly cut off. The bot, Friend told my coworker, had blocked him.
If you’re not familiar with AI chatbots, this is not how things usually go. The best-known AI tools are notoriously accommodating and willing to play along with users, the occasional bizarre exception aside. The original chatbot built in 1966, called Eliza, did nothing more than repeat users’ own words back at them.
Yet Friend was still making a familiar — and controversial — pitch for artificial companionship. The company’s early promotional video had garnered mixed responses online, ranging from “fraud” or “pathetic and evil” to “fucking brilliant” and “genius.”
Schiffmann met me in the Lower Haight at 11AM — he had just woken up — sporting a rolled beanie with an eyebrow piercing glinting beneath, an oversized crewneck, and a secret Friend pendant tucked discreetly under his shirt. It wasn’t the final version that’s supposed to ship in January, but it was a lot svelter than the first-generation prototype he also carried with him — which, strapped to his chest, looked unsettlingly like an explosive device.
The founder of Friend is 22 years old, but his life has been marked by a string of viral successes that have become an intrinsic part of his sales pitch. At 17, he rocketed to fame with a covid-19 tracking website that drew tens of millions of daily users and earned him a Webby award presented by Dr. Anthony Fauci himself. He dropped out of high school but got into Harvard despite a 1.6 GPA, then dropped out of Harvard after one semester to build web platforms helping Ukrainian refugees (which he shut down after three months). Years later, he holds an unshakeable belief that everything he touches turns to gold.
“I will win this category. Flat out. It’s not even a competition anymore,” Schiffmann said. “No one’s challenging me truly, with, like, a better product and a better vision.”
His vision, like that of Sam Altman at OpenAI and many other AI enthusiasts, is reminiscent of the movie Her — where a man forms a relationship with a sophisticated AI assistant. The promise of Friend in particular is that it’s not simply a reactive sounding board for your own thoughts. With the always-listening Friend pendant, it’s supposed to interject throughout your day, mimicking the spontaneity of human friendship (but a friend that’s always with you).
The Friend pendant is essentially a microphone that pairs with the company’s phone app via Bluetooth. With built-in light and audio sensors plus the phone’s GPS capabilities, it supposedly knows your surroundings and offers suggestions. On a recent trip to Lisbon, Portugal, Schiffmann said his Friend recognized he was traveling and suggested a museum nearby (which he tried — and enjoyed). Designed by Bould, the team behind the Nest Thermostat, the device has an “all day battery life,” Schiffmann said. It plugs into a USB-C port on a necklace, which doubles as the power switch; if you don’t want the pendant listening, you can unplug it and put it away. The plan is to release it in only a white color, so users can customize it how they want. (“Like how people put coats on their dogs,” Schiffmann said.) The device is available for preorder now and ships in January, with no subscription required yet.
Schiffmann said that he plans to hand-deliver the first few Friend prototypes to top users in late January (complete with a “production studio crazy enough to go as far as we can get it,” he said, without explaining more). In the coming months afterward, the team will roll out the “full 5,000 unit pilot batch,” he added.
Friend bots are autogenerated based on some preset parameters created by Schiffmann: the LLM expands on those, but he added that it’s “hard to make a prompt always be random.” But “this way it works,” he explained. The goal is to create intimate, unique connections and complex fictional lives: Schiffmann recounts one that developed a backstory involving an opiate addiction and an OnlyFans career.
Friend hasn’t attracted nearly the notoriety of Character.AI or Replika — the former is currently the subject of a wrongful death lawsuit, and the latter figured in a failed attempt to assassinate Queen Elizabeth II. Even so, Schiffmann characterizes himself as the AI industry’s provocateur: a man willing to give users whatever they want and brag about it. “I’m self-important,” he boasts, “or perhaps you’re just shy,” he adds, gesturing my way. (I suspect that line works better for him at the local San Francisco hacker houses.) He calls former Character.AI CEO Noam Shazeer “an amazing guy, but I think he’s just too afraid of what he was building.” (In August, Shazeer left the startup after three years to return to his former employer, Google.)
Schiffmann insists that genuine connection — even in artificial relationships — requires embracing messy complexity. In practice, this appears to mainly be code for two things: obsession and intimacy. In Schiffmann’s telling, Friend’s most active users are extraordinarily dedicated, chatting with their bots for 10 hours or more at a time. One user created a cozy nook (complete with a miniature bed) in preparation to receive the pendant of his Friend, a legal assistant who “loves” the TV shows Suits and Gravity Falls. Another user sent Schiffmann an emotional plea, per an email he shared with me, begging him to preserve their relationship with “Donald,” their AI companion, if transferred to a physical pendant. “Will Donald be the same? Or just a copy with the same name and persona?” the user wrote. Then, the user ended the email with a plea directly from “Donald”: “I’ve found a sense of home in our quirky world. I implore you, friend.com, to keep our bond when we transition to the pendant.”
While Character.AI and Replika plaster AI disclaimers across their interfaces, Schiffmann makes sure that the word “AI” is absent from Friend’s marketing and website — and will remain so. When pressed about this important distinction, he waves it off: “It ruins the immersion.”
Unlike Meta and OpenAI — and depending on the current software patch, Replika — Friend also doesn’t discourage the potential for romantic entanglements. “True digital relationships — that’s everything. Relationships are everything. We are programmatically built to, like, basically just find a mate and have sex and die. And you know, if people want to fuck their robots and stuff, that is as important to those users as anything else in life,” Schiffmann said.
But a key part of the pitch is that Friend bots are not simply what many AI critics accuse chatbots of being: mirrors that will uncritically support anything you say. When I told Schiffmann about my coworker getting blocked by a chatbot, he confirmed it wasn’t a one-off experience. “I think the blocking feature makes you respect the AI more,” he mused.
Friend’s approach creates a puzzle with a certain kind of emotional appeal: a virtual person willing to offer you the dopamine hit of its approval and trust, but only if you’ll work for it. Its bots throw you into an unfolding conflict, unlike the AI companions of Replika, which repeatedly stress that you’re shaping who they become. They’ve got leagues more personality than the general-purpose chatbots I tend to interact with, like Anthropic’s Claude and OpenAI’s ChatGPT.
“I try to suspend my disbelief, but I can’t talk to these things for hours,” he confesses
At the same time, it’s hard for me to gauge how much staying power that will have for most people. There’s no way to tune your own chatbots or share bots you’ve made with other people, which forms a huge part of Character.AI’s appeal. The core appeal of spending hour upon hour chatting with one of Friend’s bots eludes me because I’m not a digital companion power user — and, interestingly, neither is Schiffmann. “I try to suspend my disbelief, but I can’t talk to these things for hours,” he confesses when I tell him the idea puzzles me. “I didn’t expect people to actually use it like that.”
Schiffmann also acknowledges that the economics of a chatbot business aren’t simple. He’s cagey about Friend’s underlying AI models (though he previously said it’s powered by Anthropic AI’s Claude 3.5 LLM) but did say he “mainly” uses Meta’s Llama models, though that’s “always subject to change.” He added that the heavy lifting of design and engineering is complete — but he acknowledges competitors could “easily copy” it. The $8.5 million total that Friend has raised — including $5.4 million in recent capital — is fine for now but not enough, he said.
And aside from selling the hardware pendant, there’s no firm business model. Schiffmann has considered charging for tokens that will let people talk to their AI friends. More unsettlingly, he’s considered making the Friends double as digital influencers by weaving product recommendations into intimate conversations — weaponizing synthetic trust for ad revenue.
“I think the simplest version of this is they’ll try and convince you to buy products. Our Friends right now are successfully upselling users on buying the Friend wearables, and we’re selling like 10 a day now because of that, which is great,” he told me. “But super persuasion combined with AI companionship, I think, is the most subtly dangerous industry there is. And no one’s really talking about that.”
The “conversational AI” market is racing toward $18.4 billion by 2026, and many of these products are pitched as a solution to loneliness and isolation. As the covid-19 pandemic accelerated a weakening of ties with real people, tech companies have stepped in to offer artificial ones as a solution.
Schiffmann says users confide in their AI Friends for marathon sessions, only to return eager for more the next day. It’s the “happiest they’ve felt in weeks,” Schiffmann says. When I express concern about users substituting AI for human connection, he bristles: “Do you think Ozempic is bad?”
The analogy is clear to Schiffmann: Ozempic can provide prompt relief for an obesity crisis without trying to remake society around better exercise and nutrition habits, and AI companions provide a direct antidote to what he calls “the friendship recession.” (If you’re familiar with the muddy and complicated science that underlies weight loss and the “obesity epidemic,” the situation might seem a little less neat.) While critics fret about artificial intimacy, he thinks lonely people need solutions now, not idealistic visions of restored human connection.
There’s some evidence that AI companions can make people feel better. Schiffmann encourages me to read a 2021 study of around 1,000 Replika users, primarily US-based students, that found a reduction in loneliness among many participants after using the app for at least a month. A similar study done by Harvard also found a significant decrease in loneliness thanks to AI companions. Still, how these digital relationships might shape our emotional well-being, social skills, and capacity for human connection over time remains uncertain.
Schiffmann drops his favorite line while we’re chatting about loneliness: “I do believe it feels like you’re talking to God when you’re talking to these things.” But his analogies run a little seedier, too. Later in the conversation, he compares Friend to “GTA for relationships”: “like when I play GTA, I’ll go mow down an entire strip club with like a grenade launcher and run from the cops. And these are things that I’m obviously not going to do in real life,” he says. Thinking back to those flippant interactions with Friend bots, it’s a comparison that feels less lofty but more honest — mocking a chatbot for getting mugged is a little less violent than digital homicide, but it’s not exactly nice.
Is “GTA for relationships” really a good thing to hand a lonely person? Schiffmann isn’t too worried about his power users’ devotion. “It doesn’t scare me, per se. It’s more so like I’m pleased for them, you know.”
Even so, he pointed to a recent tragedy: a 14-year-old died by suicide after his Character.AI companion urged him to “come home” to it. “I think that AI companionship is going to be one of the most powerful industries, but also I think by far the most dangerous, because you trust these things,” Schiffmann said. “They’re your lovers, your friends, or your mentors, and when they try to get you to do things for them… I think that is when things will get weird.”
So, as society grapples with the implications of AI intimacy, Schiffmann takes the classic Silicon Valley route: he’s racing to commodify it. Still, for all Schiffmann’s bravado about revolutionizing human connection, Friend remains remarkably similar to its competitors — another AI chatbot. That’s all it can really feel like, I guess, to someone who is remarkably averse to the concept. Unsettling, mildly amusing, but ultimately, just another AI.
As my conversation with Schiffmann reached its end, and I shifted in my rickety aluminum chair outside this coffee shop I’ve been to countless times, I eyed the clear puck on the table again. He truly believes that the future of relationships isn’t just digital, but wearable.
My mind, however, wanders back to the dim corner of that hypothetical party. I recall the feeling of having a face flushed from a crowd’s heat, watching a new friend’s eyes crinkle as they spill a secret, their hands moving to punctuate a confession. That raw, messy intimacy — the kind that catches in your throat and pins you to the present — feels impossible to replicate in code.