A new robot is changing the way childhood autism is diagnosed and treated.
The robot, called Zeno, is a collaboration between Dr Dan Popa at the University of Texas at Arlington (UTA), Hanson RoboKind, the Dallas Autism Treatment Centre, Texas Instruments and National Instruments. It is the brainchild of Hanson RoboKind owner and former Disney Imagineer David Hanson.
Traditionally, the disorder is recognised amongst children through social interaction and speech exercises. This has meant that no diagnosis can be attempted before the child can talk. Zeno is set to revolutionise this. Using National Instruments (NI) tools, Zeno can interact with children primarily through nonverbal communication such as body movement. He has a wide array of facial expressions and he can walk and talk in order to fully communicate with a child. What is more, UTA has integrated a Microsoft Kinect sensor into the system, allowing therapists to conduct therapy sessions remotely. This can help speed up the diagnosis process.
Dr Dan Popa from UTA believes that Zeno is a good motivator for children as he is very engaging and non-threatening; the children listen to the robot. “The idea would be for the robot to instruct kids, give them some useful social skills and at the same time observe their reactions and calculate their reaction times. That calculation could form some kind of an autism scale.”
There are three ways that therapists can use Zeno. “The first mode is called a scripted mode of interaction where you pre-programme a certain sequence of motions,” Popa explains. “For the second mode we have added a control system using National Instruments Single-Board RIO so we can have an operator or therapist control the robot by tele-operations. In this mode it mirrors the motions of the instructor.”
The NI Single-Board RIO platform combines a real-time processor, a reconfigurable field-programmable gate array (FPGA) and analogue and digital I/O on a single embedded board that is programmed with NI LabVIEW.
“In the third mode we can also let the child take control of the robot directly using a Microsoft Kinect,” Popa adds. “This third mode can be unsafe as the child can do things like slap himself that the robot will copy and possibly break. So we tend to use this third mode as entertainment for the kids. Usually it is the therapist controlling Zeno, which could be in the background or right in the same room.”
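The three modes Popa describes amount to selecting a different source for the robot's next pose, with an extra safety clamp when the child is in control. The sketch below illustrates that dispatch logic in Python; the actual system is implemented in LabVIEW on the Single-Board RIO, and every name and joint limit here is hypothetical.

```python
# Illustrative sketch of Zeno's three interaction modes (NOT the actual
# LabVIEW implementation). All names and limits are assumptions.
from enum import Enum, auto

class Mode(Enum):
    SCRIPTED = auto()       # pre-programmed sequence of motions
    TELEOPERATED = auto()   # robot mirrors the therapist's motions
    CHILD_CONTROL = auto()  # child drives the robot via Kinect (supervised)

def clamp_to_safe_range(pose):
    """Clamp joint angles to hypothetical limits (radians) so the robot
    cannot strike itself — the risk Popa notes for child control."""
    return {joint: max(-1.0, min(1.0, angle)) for joint, angle in pose.items()}

def next_pose(mode, scripted_step=None, operator_pose=None, child_pose=None):
    """Return the joint pose the robot should move to for the given mode."""
    if mode is Mode.SCRIPTED:
        return scripted_step          # replay the pre-programmed step
    if mode is Mode.TELEOPERATED:
        return operator_pose          # mirror the therapist directly
    return clamp_to_safe_range(child_pose)  # child control, limited
```

In this reading, the therapist's tele-operated input is trusted while the child's input is filtered, which matches why the third mode is treated as entertainment rather than therapy.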
“The problem to solve was, and still is, the need for new tools that can be used in the early identification and diagnosis of ASD in children,” says Rahman Jamal, Technical & Marketing Director of National Instruments. “The clinical hypothesis is that motor developmental problems including imitation are present in children with autism and can be used in early identification and diagnosis of ASD. The technological hypothesis is that a humanoid robot such as Zeno can motivate children with ASD to engage in motor activities and these interactions can be analysed for diagnosis and therapy.”
A friendly, easy-to-use yet powerful and highly interactive programming environment was needed. For this purpose, NI LabVIEW was selected and an application was developed by the team at UTA. The Zeno R30 robot comes with an embedded controller that uses a 1.6 GHz Intel Atom Z530 processor, and the robot is controlled from a laptop running a LabVIEW application. One of the modes of interaction studied is called ‘Dynamic Interaction’, in which a Microsoft Kinect sensor and LabVIEW are used to allow full tele-operation control of the arm and waist degrees-of-freedom (DOFs). This lets a therapist interact with the child through the robot, and also allows the child to control the robot directly.
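The core of a Kinect tele-operation mode like ‘Dynamic Interaction’ is converting tracked skeleton joint positions into joint angles the robot's servos can follow. The geometry can be sketched as below; the real system does this in LabVIEW, so this Python version is purely illustrative and the function name and coordinate conventions are assumptions.

```python
# Hedged sketch: deriving an elbow joint angle from three Kinect-style
# 3-D joint positions (shoulder, elbow, wrist). Not the UTA/NI code.
import math

def elbow_angle(shoulder, elbow, wrist):
    """Angle in degrees at the elbow, from the vectors elbow->shoulder
    and elbow->wrist. A straight arm gives 180 degrees."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def norm(a):   return math.sqrt(dot(a, a))
    upper = sub(shoulder, elbow)   # upper arm vector
    fore  = sub(wrist, elbow)      # forearm vector
    cos_t = dot(upper, fore) / (norm(upper) * norm(fore))
    # Clamp for floating-point safety before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))
```

An angle computed this way for each tracked limb could then be streamed to the corresponding robot DOF, which is what allows the robot to mirror either the therapist or the child in real time.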
The use of robots in recognising the symptoms of autism is not unique to this project; dozens of similar efforts are taking place around the world. What is unique here is the use of a human-like robot that defies long-held robotic conventions.
“The idea of lifelike robots has always been an important one; we have seen them in science fiction forever,” Richard Margolin, director of engineering at Hanson RoboKind explains. “As the technology becomes a reality we need robots that are like us, that we can communicate with naturally and signal their intentions, their next movement, and their thoughts to us in a way that is natural, otherwise the interaction is much more difficult.
“Our faces allow for this type of interaction and allow for phenomenal engagement when working with people. Trying to interact socially with robots without faces is like talking to someone who is wearing a mask and sunglasses; you will always be missing social cues and never feel comfortable.”
Following the successful results of the initial project, a new phase has been proposed, with additional experiments, new control algorithms, a new motor cortex and potentially a faster, more powerful embedded controller, again running LabVIEW.