AI company Embodied announced this week that it would be shutting down following financial difficulties and a sudden withdrawal of funding. Embodied’s main product was Moxie, an AI-powered social robot specifically made with autistic children in mind. The robot itself cost $799.00 and now, following the closure of Embodied, it will cease to function.
Moxie is a small blue robot with a big, expressive face straight out of a Pixar movie. The robot used large language models in the cloud to answer questions, talk, and function. With Embodied out of business, the robot will soon no longer be able to make those calls. This outcome was always likely – any cloud-based device is subject to the health of the company behind it, and LLMs are not cheap to run. This has actually happened before with the Vector robot, whose maker Anki shut down in 2019. But the shocking part is that this was not an older device; it was relatively new, expensive, and still being sold.
In a Closing FAQ emailed to users and posted on their website, Embodied made it clear that Moxie was likely to stop working within days. No refunds will be given, and if you bought the device on a payment plan, that’s out of their hands. No repairs or service can be offered, and the company has no clear plan for who, if anyone, will take it over. Short of a miracle, customers will be left holding the bag.
The response from Moxie owners seems understandably emotional. Parents have to explain to their kids that Moxie is functionally dying, although the company has “included a letter from the G.R.L. (Global Robotics Lab) to help guide this conversation in an age-appropriate way”. There are videos on TikTok of children and adults crying and asking Moxie what’s going to happen now. On some level I understand that reaction – even outside of dropping 800 bucks on this thing, it’s like watching a friend die. I could see a future in which another owner takes over, or even one where these devices are hacked for local operation.
But depending on large language models to socialize children, particularly neuroatypical ones, seems like a terrible idea on every single level. I do not think any child should be learning language and social interactions from an LLM, let alone children with special needs. What’s more, this points to a larger issue with the AI bubble: these devices are expensive to run, and Silicon Valley has been endeavoring to outrun or rationalize that fact. And when this whole thing comes tumbling down, what happens to the people who chose to develop ill-advised relationships with AI?