Humanoid Robot to Aid Autism Diagnosis is Goal of New Study Funded by Doris Duke Charitable Foundation Award

A multi-disciplinary team of researchers from Yale’s Child Study Center and Department of Computer Science has received a new grant from the Doris Duke Charitable Foundation to advance autism research.

They are planning to use a humanoid robot as an interactive diagnostic device with young children at risk for autism. The team includes Brian Scassellati, assistant professor of computer science and head of the department’s humanoid robotics lab; Ami Klin, Harris Associate Professor of Child Psychology and Psychiatry; Warren Jones, associate research scientist; and Fred Volkmar, M.D., Irving B. Harris Professor of Child Psychiatry, Pediatrics, and Psychology.

The Yale team is one of six to receive the 2003 Doris Duke Clinical Interfaces Award, a new grant that supports cross-disciplinary teams using innovative approaches to address challenging questions in human disease.

“We are encouraged by this important boost to our research effort as it makes possible the utilization of cutting-edge technology and developmental science,” said Volkmar. “It crosses the boundaries of previously unrelated disciplines, in the research of causes of autism and of new treatments for this condition, which is known to affect over one million individuals in the United States alone.”

The project involves detailed measurements of toddlers’ reactions to a humanoid robot designed to match the size, speed and range of motion of a one-year-old child. The robot, being developed in Scassellati’s lab, is capable of responding to human social cues such as tone of voice, direction of gaze, proximity and gestures. Because the robot responds to the child’s own social behaviors, it becomes a reflection and measurement of those behaviors. The robot’s explicit social behavior can be controlled and manipulated, the child’s social reactions to it can be assessed, and those reactions can be related to the cues a child needs in order to treat the robot like a human rather than a machine.

The researchers’ eye-tracking laboratory will allow measurement of the way a person views complex situations. The child’s visual scanning patterns will provide a better understanding of what is socially relevant to the child participants (i.e., whether they focus on eyes, mouth, expressions, or bodily movements). The team hopes to use this research to create a very early screening procedure that can identify vulnerabilities for autism within the first year, if not the first months, of life. “This is very important because the earlier intervention for autism is started, the more likely it is to have a significant impact on the child’s life,” said Volkmar.


Media Contact

Karen N. Peart: karen.peart@yale.edu, 203-432-1326