"Our goal is to deploy autonomous humanoid workers to support us on a global scale," writes robotics and AI startup Figure, who recently revealed their Figure 02 'bot. "As we reach the upper limits of our production capability, humanoids will join the workforce with the ability to think, learn, and interact with their environment safely alongside us."
"Figure 02 is AI-powered and self-reliant, ready to produce an abundance of affordable, more widely available goods and services to a degree which humanity has never seen."
For them to achieve their goal, their Figure 02 'bots have to be able to interact with humans. This is a fairly stunning demonstration of where they're at right now:
Three things jumped out at me.
One, the delay while it "thinks" is a bit eerie.
Two, I was surprised at how human-sounding they've made the voice, and I'm not sure I like the hesitancy they've programmed in, where it actually says "uh"; it's one thing to make the robot human-shaped so that it can work in human spaces, but I think it's a bad idea to lull people into feeling that they're interacting with another human. I think robots should sound like robots, so that we never forget what we're interacting with. Barriers to forming subconscious emotional connections with these machines should be designed in.
Three, check out those graphics on its face. While the visual language is not yet clear, you can see the designers are trying to convey to the interactee that something is happening; the 'bot is sensing now, the 'bot is processing now. To me, those face graphics are the most interesting part, and they remind me of the early days of GUIs. Rather than render a human face with expressions, the designers are using low-resolution white-on-black shapes to provide visual feedback to the human. I'll be curious to see how this visual language develops, and how competitors will handle it.
Your thoughts? Would you be comfortable interacting with one of these?
Comments
Seems to me that for humanoid bots to interact with humans, there will need to be wholesale changes to H&S legislation too. Visit anywhere industrial robots work (oranges and lemons, I know, but a not-totally-invalid comparison) and you're kept outside a cage, covered in hi-vis and wearing a hard hat. Now, suddenly, we'll be allowed within touching distance of similarly articulated and still pretty powerful machinery, with all its gappy, finger-trappy risks, plus the software 'episodes' that will inevitably happen, as with any computer system. Reporting always focuses on the hardware and its ever-closer resemblance to the human form, but there's going to have to be a huge swath of ancillary 'soft' change to enable close-quarter interaction between hardware and wetware, surely?
You need to watch the movie Runaway. Michael Crichton has Tom Selleck chasing farm, office, and household robots run amok.
Why do designers design their humanoid robots to be so inhuman? Just look at that thing: faceless, all metal and mechanics, every sci-fi nightmare brought to life. Why? It baffles me.
Wasn't this the bartender in the Fifth Element?
Some of the face graphics you pointed out, the ones that implicitly convey a process happening behind the scenes, are actually part of the imagery displayed by ChatGPT's user interface.
Once again, general purpose anthropomorphized robots will always be inferior to specialized application units, and this is just an ad for VCs.