3D Hubs recently worked with Project Aslan, a team of students from the University of Antwerp, Belgium. Using 3D printing, the team has built a robot that translates text (and soon speech) into sign language at a cost far lower than other production technologies would allow. The global shortage of sign language interpreters remains largely unaddressed; as this low-cost robot develops, it could help more people get the support they need.
Here's a video of the project.
Project Aslan: Lars Vermeiren (student), Jorn Dox (student), me (alumnus/supervisor), Matthias Goossens (alumnus/supervisor), Stijn Huys (alumnus/supervisor) and Erwin Smet (University supervisor)
Writer: George Fisher-Wilson
A team of engineering students from the University of Antwerp is building a humanoid robot that will be able to translate speech into sign language. Sponsored by the European Institute for Otorhinolaryngology, the robot, titled Project Aslan, aims to help address the worldwide shortage of sign language interpreters.
The project uses 3D printing combined with readily available components to make the robot affordable and easy to manufacture. Through 3D Hubs, a network of 3D printing services, the Aslan robot can be produced in over 140 countries.
The project started in 2014, when three master's students (Guy Fierens, Stijn Huys, and Jasper Slaets) saw a large communication gap between the hearing and Deaf communities. They felt modern technologies could help bridge that gap, especially in situations where the Deaf community still lacks support. Stijn Huys outlines how the project began:
"I was talking to friends about the shortage of sign language interpreters in Belgium, especially in Flanders for the Flemish sign language. We wanted to do something about it. I also wanted to work on robotics for my masters, so we combined the two."
After enlisting the help of more students, a robotics teacher, and an ENT surgeon, work progressed. Now, three years later, Project Aslan is in its first iteration: a 3D printed robotic arm that can convert text into sign language, including fingerspelling and counting. ASLAN stands for "Antwerp's Sign Language Actuating Node," reflecting its role as a connection between the hearing and the hearing-impaired.
When manufacturing the Aslan robot, the team needed an affordable, scalable production method that is accessible all over the world. 3D Hubs currently puts over 1 billion people within 10 miles of a 3D printer, so partnering to use that technology and network made sense. 3D printing also makes it easy to replace parts that break after extensive use and to print updated parts as they become available.
The first prototype featured 25 3D printed parts, which took a total of 139 hours to print on a desktop 3D printer. In addition to these parts, 16 servo motors, 3 motor controllers, an Arduino Due, and a number of other components are needed to fully assemble the robot; assembling the complete arm takes around 10 hours. The Aslan robot receives messages over a local network and checks for updated sign languages from around the world. Users connected to the network can send text, which the robot signs by driving its finger, hand, and elbow joints.
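To give a sense of how text might be turned into joint movement, the listing below is a minimal sketch of that idea, not the team's actual firmware. It assumes a handful of hobby servos wired to an Arduino, uses the serial port as a stand-in for the robot's local network connection, and maps each received character to a made-up fingerspelling pose; pin numbers, joint counts, and the poses themselves are illustrative assumptions.

// Minimal illustrative sketch (not the Project Aslan firmware): maps incoming
// text characters to fingerspelling poses on a small set of hobby servos.
// Pin numbers, joint count, poses, and the serial transport are assumptions
// for this example; the real robot uses 16 servos and a network connection.
#include <Arduino.h>
#include <Servo.h>
#include <ctype.h>

const int NUM_JOINTS = 5;                             // one servo per finger here
const int SERVO_PINS[NUM_JOINTS] = {2, 3, 4, 5, 6};   // hypothetical wiring
Servo joints[NUM_JOINTS];

// A pose is one target angle (degrees) per joint.
struct Pose { int angles[NUM_JOINTS]; };

// Hypothetical fingerspelling poses for a couple of letters.
Pose poseForLetter(char c) {
  switch (c) {
    case 'a': return Pose{{10, 170, 170, 170, 170}};  // thumb out, fingers closed
    case 'b': return Pose{{170, 10, 10, 10, 10}};     // fingers extended, thumb folded
    default:  return Pose{{90, 90, 90, 90, 90}};      // neutral / rest position
  }
}

void applyPose(const Pose& p) {
  for (int i = 0; i < NUM_JOINTS; i++) {
    joints[i].write(p.angles[i]);                     // command each servo to its target angle
  }
}

void setup() {
  Serial.begin(115200);                               // stand-in for the robot's network link
  for (int i = 0; i < NUM_JOINTS; i++) {
    joints[i].attach(SERVO_PINS[i]);
  }
  applyPose(poseForLetter(' '));                      // start in the rest position
}

void loop() {
  if (Serial.available() > 0) {
    char c = tolower(Serial.read());
    applyPose(poseForLetter(c));                      // sign the received character
    delay(600);                                       // hold the pose before the next letter
  }
}

In the actual robot, a message arriving over the network would be broken into signs in much the same way, with each sign expanding into coordinated targets for the finger, hand, and elbow joints rather than a single servo angle per finger.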
The goal of Project Aslan is not to replace human sign language interpreters but to provide support when they are not available. Once the designs are optimized, the robot could be used in numerous practical applications that, beyond general assistance, address the root of the problem: sign language courses are sparse, so too few interpreters are trained. The Aslan robot could be used alongside a human teacher to help teach sign language, expanding the capacity of these classes.
Robotics Teacher Erwin Smet outlines the possible applications of Project Aslan:
"A deaf person who needs to appear in court, a deaf person following a lesson in a classroom somewhere—these are all circumstances where a deaf person needs a sign language interpreter, but where often such an interpreter is not readily available. This is where a low-cost option, like Aslan, can offer a solution."
The future of the project is set, with four new research topics to be picked up by incoming master's students. Two of them focus on optimizing the current design and creating a two-arm setup. The third covers facial expressions and adding an expressive face to the design.
The final topic investigates whether a webcam can be used to teach new gestures to the robot. Sign language is performed with the entire body: both arms, the shoulders, and the face. Using a webcam to detect the movements of the larger joints and to read facial expressions would aid in developing new gestures.
Once the mechanical design and software of the robot have reached a sufficiently advanced level, all designs will be made open-source for everyone to use.