Would you welcome a robot into your home? Guess what: You probably already have.
That’s if you use the loose definition of “robot,” meaning any sort of programmable machine (per Merriam-Webster: “a device that automatically performs complicated tasks”). It could be the dishwasher, a robo-vacuum or a self-cleaning litter box. They’re all relatively affordable luxury products that blend in with the rest of your appliances.
Whether you’d let a robot live in your house has recently become a hot topic, not because of what these new robots do, but because of how they act and what they look like.
Companies like 1X are selling $20,000 humanoids such as its NEO that seem almost like real people, not just in appearance, but in tone. They’re not the clunky metal androids of science fiction. Xpeng’s recent demo looks so real that members of the public are often convinced it’s just a costumed human.
The marketing argument for building robots like this is relatability. They look like us, the thinking goes, so we’ll let them take on the kinds of personal tasks it feels better to have a human(ish) doing.

This is a common tactic in product design. With early voice assistants, companies leaned into feminine inflections because women are associated with service roles. Then it was ChatGPT, with a tone so friendly that users started saying “please” and “thank you.”
Now that we’re in the era of physical AI, anthropomorphism is taking off. Research says human form factors help drive trust and make robots more enjoyable to interact with.
Often, however, the robots are missing the quirks that make them actually feel human. Many offerings veer into the uncanny valley, that unsettling zone where something looks almost human but feels just “off” enough to be creepy.
So, it’s a balance. New research suggests product development teams should let the robots mess up more and go all-in on a neurotic personality. Small errors that can be fixed quickly add relatability without becoming so uncomfortable that trust breaks down.
Perhaps another shift will be dropping the “friend” act. In extreme cases, this shtick can lead people to trust poor advice, with consequences as serious as self-harm. But even on a lighter scale, some users say products like the wearable chatbot Friend just don’t get it.
You might let a humanoid live with you if it’s affordable, stays in the kitchen and puts away your dishes — but maybe not one that replaces your masseuse.
Would you get a massage from an AI robot? Reporter Holly Quinn is collecting responses; let her know at holly@technical.ly.