If you believe MIT's Dr. Cynthia Breazeal, we're on the cusp of a robot revolution.
"We've been waiting," Breazeal told delegates at this year's South by Southwest (SXSW) Festival. "When are ... robots going to come into our homes, into our daily lives? I think the time is finally upon us."
Breazeal is an Associate Professor of Media Arts and Sciences at the MIT Media Lab, where she founded and directs the Personal Robots Group. She's also founder and chief scientist of Jibo, a US-based start-up creating what it describes as a "family robot".
Jibo raised just under US$2.3 million (A$2.9 million) from crowdfunding site Indiegogo to get off the ground. It had been seeking a far more modest $100,000 (A$120,000), a target it reached in just four hours. General availability of Jibo robots is expected around mid-2016.
Jibo is effectively the culmination of her life's work in personal robotics and realises her vision for robots in people's homes.
"Certainly, for me, the most important thing about the home is actually the family," she said. "Is it crazy to think that the personal robot's path into the home in a mass consumer way will be because of its relationship with the people who live there?"
Early robots that have made it into the home - such as robot vacuum cleaners like the Roomba - serve a single purpose and do not emotionally engage with the people around them. Others, such as Sony's Aibo robotic dog, can engage the family emotionally but serve little practical purpose beyond that.
Breazeal sees "huge potential" for personal or "social" robots that can serve multiple purposes and maintain a deep emotional engagement or connection with the people around them.
But she's cognisant of the challenges that come from talking up a future that affords robots any human-like qualities.
Though Breazeal advocates the entry of "humanised" robots into home and family structures, she is careful to point out that this does not mean trying to make the robots "more human per se".
"There's a lot of assumptions that we're trying to make robots more human-identical - that's really not what this is about," Breazeal said.
"Humanised [for me] really means how can you holistically support human experience – the social, emotional, cognitive, embodied aspects of experience."
She sees social robots forming a "partnership" with the people they are around. They should "empower" people and improve the way we live, rather than replicate and co-opt the characteristics that make us human.
"The old idea of robots was [of them] replacing people, being comparable, so to speak," she said.
"That's really not what this work is about. This work is really about building robots that work in partnership and enhance our human relationships, and enhance our abilities to do things for ourselves."
She continues: "I think one of the intriguing things about robots is they're distinctly not human.
"It's how their capabilities - how this relationship is different from our known relationships, different from our human capacities - can supplement and enhance ours."
And she believes that people will ultimately be able to form authentic, empowering relationships with robots.
"We already know that we can get a lot of value from our non-human relationships," Breazeal said.
"When you think about companion animals, your dog is not human, [it] does not have human emotions, and yet you do get a lot of value out of that kind of relationship."
She was intrigued by the idea of a robot providing similar value, and has conducted a number of "interaction studies" between people and social robots that have shown people are capable of forming close relationships with robots. These robots included one that helped children learn, and another that kept people in weight loss programs on track for success.
But questions remain - murky questions that probe ethical boundaries and blur the lines between robots and humans.
One of those is whether robots should have emotions and/or be capable of emotional response - assuming they can be programmed to do so.
"I'm not going to argue today that any robot has emotions but it's an intriguing question in the bigger sense," Breazeal said.
"It's intriguing to say, 'Maybe, yes', for the same reasons that we have emotions - to improve our intelligence, our capabilities, to learn from each other, to communicate, to collaborate - it makes a lot of sense.
"But there are ethical concerns, voiced in the media by people in the field, that of course are also worthy of being considered and thought about, because how you apply technology that could have a profound impact on people's lives ... could be great or it could be not great.
"I think we need to have a dialogue around how we use these technologies to the best advantage."