Newswise — Researchers at the University of Illinois Urbana-Champaign and Duke University have developed a robotic eye examination system, and the National Institutes of Health has awarded them $1.2 million to expand and refine it.

The researchers' robotic system automatically positions examination sensors to scan human eyes. It currently relies on an optical scanning technique that operates at a safe distance from the eye, and the researchers are now working to add features that will allow it to perform most steps of a standard eye exam. Those features will require the system to operate in much closer proximity to the eye.

“Instead of having to spend time in a doctor’s office going through the manual steps of routine examinations, a robotic system can do this automatically,” said Kris Hauser, a U. of I. computer science professor and the study’s principal investigator. “This would mean faster and more widespread screening leading to better health outcomes for more people. But to achieve this, we need to develop safer and more reliable controls, and this award allows us to do just that.”

Automated medical examinations could both make routine medical services accessible to more people and allow health care workers to treat more patients. However, medical examinations present unique safety concerns compared to other automated processes. The robots must be trusted to operate reliably and safely in proximity to sensitive body parts.

The system Hauser and his collaborators previously developed uses a technique called optical coherence tomography, which scans the eye to create a three-dimensional map of its interior. That capability alone allows many conditions to be diagnosed, but the researchers want to expand the system by adding a slit lamp examiner and an aberrometer. These instruments require the robot arm to be held within two centimeters of the eye, highlighting the need for enhanced robotic safety.

“Getting the robot within two centimeters of the patient’s eye while ensuring safety is a bit of a new concern,” Hauser said. “If a patient’s moving towards the robot, it has to move away. If the patient is swaying, the arm has to match their movement.”

Hauser likened the control system to those used in autonomous vehicles. While the system cannot react to every possible human behavior, he said, it must prevent “at-fault collisions,” just as self-driving cars must.
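The reactive behavior Hauser describes can be pictured as a simple standoff controller: the robot continuously measures the gap between the sensor and the patient's eye, retreats when the patient leans in, and follows when the patient sways. The sketch below is purely illustrative; the function, parameters, and numbers are hypothetical and are not drawn from the researchers' actual control software.

```python
# Illustrative standoff-control sketch (hypothetical names and values,
# not the researchers' system). Positive commanded velocity moves the
# sensor toward the eye; negative moves it away.

TARGET_STANDOFF_M = 0.02    # desired sensor-to-eye gap (2 cm)
RETREAT_STANDOFF_M = 0.05   # fall back to a wider gap if the eye moves fast
MAX_EYE_SPEED_M_S = 0.05    # eye motion above this triggers the wider gap
GAIN = 4.0                  # proportional gain on the gap error
MAX_ARM_SPEED_M_S = 0.10    # conservative limit on the arm's speed


def standoff_velocity(gap_m: float, eye_speed_m_s: float) -> float:
    """Return a commanded arm velocity along the sensor-eye axis.

    gap_m: measured distance from the sensor to the eye.
    eye_speed_m_s: speed of the eye toward the sensor (positive = approaching).
    """
    # If the patient is moving quickly, hold a larger, safer gap.
    fast = abs(eye_speed_m_s) > MAX_EYE_SPEED_M_S
    target = RETREAT_STANDOFF_M if fast else TARGET_STANDOFF_M

    # Proportional control on the gap error, plus a feedforward term that
    # makes the arm retreat or advance in step with the patient's motion.
    command = GAIN * (gap_m - target) - eye_speed_m_s

    # Clamp to the speed limit.
    return max(-MAX_ARM_SPEED_M_S, min(MAX_ARM_SPEED_M_S, command))


if __name__ == "__main__":
    # Patient leans in: gap shrinks to 1 cm while the eye approaches.
    print(standoff_velocity(gap_m=0.01, eye_speed_m_s=0.03))  # negative: back away
    # Patient steady at the 2 cm target: command is near zero.
    print(standoff_velocity(gap_m=0.02, eye_speed_m_s=0.0))
```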

The award will enable the researchers to conduct large-scale reliability testing. An important component of these tests is ensuring that the system works for as many people as possible. To achieve this, the researchers have developed a second robot that will use mannequin heads to emulate unexpected human behaviors. Moreover, the second robot will automatically randomize the heads’ appearance with different skin tones, facial features, hair and coverings to help the researchers understand and mitigate the effects of algorithmic bias in their system.
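One way to picture that randomized testing is as repeatedly sampling appearance parameters for each trial and tallying outcomes across them so uneven failure rates stand out. The sketch below is a hypothetical illustration; the attribute names, categories, and outcomes are placeholders, not the researchers' protocol.

```python
import random

# Hypothetical appearance attributes for the mannequin heads; the actual
# categories used by the researchers are not specified in the article.
SKIN_TONES = ["I", "II", "III", "IV", "V", "VI"]   # e.g., a Fitzpatrick-style scale
FACE_SHAPES = ["narrow", "average", "wide"]
HAIR = ["none", "short", "long"]
COVERINGS = ["none", "glasses", "head covering"]


def sample_trial(rng: random.Random) -> dict:
    """Draw one randomized mannequin configuration for a reliability trial."""
    return {
        "skin_tone": rng.choice(SKIN_TONES),
        "face_shape": rng.choice(FACE_SHAPES),
        "hair": rng.choice(HAIR),
        "covering": rng.choice(COVERINGS),
    }


def success_rate_by_tone(results: list[tuple[dict, bool]]) -> dict:
    """Group trial outcomes by skin tone to surface uneven failure rates."""
    by_tone: dict[str, list[bool]] = {tone: [] for tone in SKIN_TONES}
    for config, success in results:
        by_tone[config["skin_tone"]].append(success)
    return {tone: sum(runs) / len(runs) for tone, runs in by_tone.items() if runs}


if __name__ == "__main__":
    rng = random.Random(0)
    # Placeholder outcomes: a real campaign would record whether the robot
    # safely located and scanned the mannequin's eye in each trial.
    results = [(sample_trial(rng), rng.random() > 0.1) for _ in range(500)]
    print(success_rate_by_tone(results))
```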

The system will be designed for use in clinical settings, but Hauser imagines that one day such systems could be used in retail settings much like blood pressure stations.

“Something like this could be used in an eyeglass store to scan your eyes for the prescription, or it could give a diagnostic scan in a pharmacy and forward the information to your doctor,” he said. “This is really where an automated examination system like this would be most effective: giving as many people access to basic health care services as possible.”

***

Duke University professors Joseph Izatt of biomedical engineering and Anthony Kuo of ophthalmology are co-principal investigators.

The award, cosponsored by the National Robotics Initiative, will be distributed over three years.