Newswise — ITHACA, N.Y. - The Yang-Tan Institute on Employment and Disability at Cornell University has joined a multi-institution team that has received a $5 million grant from the National Science Foundation (NSF) to help create better job outcomes for people with autism spectrum disorder.
The researchers will use the grant to develop artificial intelligence technology that will train and support individuals with autism spectrum disorder in the workplace. Vanderbilt University leads the team, which includes Yale University, Georgia Tech and the Vanderbilt University Medical Center.
“The focus of the overarching project is to improve employment outcomes for neurodiverse individuals, especially those with autism, who are significantly underemployed or unemployed compared to their neurotypical age peers,” said Susanne Bruyère, Yang-Tan Institute director, professor of disability studies and a project co-principal investigator.
The Yang-Tan Institute team includes Bruyère and research associates Hsiao-Ying “Vicki” Chang and Matthew Saleh. The team has interviewed employers, neurodiverse individuals, employment service providers and higher education career counselors to identify barriers neurodiverse people face in job interviews and workplace interactions.
“This information will inform the design team that will use artificial intelligence and other virtual tools to create a coaching process that will increase the likelihood of successful outcomes,” Bruyère said.
The researchers are building on a $1 million pilot program in 2019-20, also funded by the NSF through its Convergence Accelerator program, that produced prototypes through Vanderbilt’s Inclusion Engineering program and its partners.
For the next phase of the project, researchers will address three themes: individualized assessment of unique abilities and appropriate job-matching; tailored understanding and ongoing support related to social communication and interaction challenges; and tools to support job candidates, employees and employers.
The project includes further development, refinement and testing of five technologies prototyped in Phase 1. They are:
- an assessment system that integrates a wearable eye tracker, scene cameras and computer vision algorithms to produce a detailed record of a person’s performance in visuospatial cognitive tasks;
- a virtual reality-based job interview simulator that senses a user’s anxiety and attention through wearable computing devices, and provides feedback and coaching;
- a collaborative virtual reality platform to assess and develop team-building skills through peer-based and intelligent agent-based interaction;
- a social robot for use in home environments to improve resilience and tolerance of job-related interruptions; and
- a computer vision-based tool to assess nonverbal communication in real-world settings.
The five technologies can be used separately or as an integrated system, and the work has broader potential to expand employment access for other neurodiverse people. In the United States, an estimated 50 million people have autism spectrum disorder, attention deficit hyperactivity disorder, a learning disability or other neurological conditions.