You've seen people with a trace of a smile on their face, or a hint of a smirk, or a tinge of doubt. Human facial expressions can be subtle, created as they are by us setting our jaws, raising our eyebrows or tightening our lips. And that subtlety has traditionally been difficult, from a materials standpoint, to design into a robot.
At least until now. Prepare to be grossed out: Remember how we learned that plants can communicate with each other through fungus in the soil? Well, in a similar example of the utility of an often-overlooked material, robotics designers from the UK have now produced a robot whose expressions are controlled by mold. Slime mold.
Slime mold can move. The stuff tends to live in dark, shady, wet areas like fallen logs and leaves, and by contracting its tube-like structure and propelling itself towards a food source, it can move at about 1 mm per hour. On top of that, slime mold is surprisingly clever. In 2000, Japanese researchers showed that slime mold can find the shortest route through a maze when searching for food. Even more amazingly, it can "remember" where it has been even though it lacks a brain.
So what the heck does this have to do with robots? It starts with that movement capability. Klaus-Peter Zauner at the University of Southampton is a pioneer in connecting mold to robots. After observing that slime mold shies away from light, Zauner grew the mold into a six-pointed star shape on a six-pointed circuit, and connected it via computer to a six-legged robot. Each robot leg corresponded to one "arm" of the star-shaped mold. If light hit one of the mold's arms, the mold would move away from the light, and in doing so would control the movement of the corresponding robot leg. In short, the robot scrambled away from light as an embodiment of the mold.
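If you're curious what that hookup might look like in software, here's a minimal, purely hypothetical sketch of the mapping described above: one mold arm paired with one robot leg, with each arm's measured activity driving its leg. The function names, sensor readings and motor commands are stand-ins invented for illustration, not anything from Zauner's actual setup.

```python
import random

NUM_LEGS = 6

def read_mold_arm_activity(arm):
    """Stand-in for a sensor reading of one mold arm's activity
    (random noise here, since there's no actual mold or hardware)."""
    return random.random()

def actuate_leg(leg, intensity):
    """Stand-in for the motor command driving one robot leg."""
    print(f"leg {leg}: drive at {intensity:.2f}")

def control_step():
    # One mold arm is paired with one robot leg. When light makes an arm
    # retreat, its activity changes and the paired leg is driven accordingly,
    # so the robot as a whole scrambles away from light, mirroring the mold.
    for leg in range(NUM_LEGS):
        actuate_leg(leg, read_mold_arm_activity(leg))

if __name__ == "__main__":
    control_step()
```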
More recently, Ella Gale, a postdoctoral researcher at the University of the West of England, created a robot whose facial expressions move in sync with the movements of slime mold. She grew the mold on a series of electrodes. The mold would then either move toward a food source or shy away from light, and these movements would produce specific electrical signals that were translated into sounds. Each sound corresponded to either a happy or a sad expression. A demo of this robot launched at the recent Living Machines conference at the Natural History Museum in London.
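Again just to make the idea concrete, here's a hedged sketch of how electrode readings might be sorted into "happy" or "sad" signals for the face. The threshold, the electrode values and the happy/sad convention are all assumptions made for illustration; this is not Gale's actual implementation.

```python
def classify_signal(voltage, threshold=0.5):
    """Map one electrode reading to a mood: readings above the (assumed)
    threshold count as food-seeking / happy, below as light-avoiding / sad."""
    return "happy" if voltage >= threshold else "sad"

def update_face(readings):
    """Tally per-electrode moods and pick the expression to display."""
    moods = [classify_signal(v) for v in readings]
    expression = "happy" if moods.count("happy") >= moods.count("sad") else "sad"
    print(f"readings={readings} -> expression: {expression}")
    return expression

if __name__ == "__main__":
    update_face([0.8, 0.3, 0.7, 0.9])  # mostly food-seeking -> happy face
    update_face([0.2, 0.1, 0.6, 0.4])  # mostly light-avoiding -> sad face
```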
Hit the jump to see the video, if you dare....
Creepy, isn't it? ("The creators obviously hate falling asleep and feeling good," one YouTube commenter chimes in.) But Uncanny Valley aside, using robots as an amplifier for mold provides some pretty creative inspiration for what other living organisms can offer in terms of physical design.
Comments
Is the slime mold moving within the robot?
Or is the mold moving elsewhere and just acting as a controller for the robot?