FOR RELEASE THURSDAY, APRIL 12, 2001

CONTACT: Doug Blank, assistant professor of computer science and computer engineering, (501) 575-2067; [email protected]

Gordon Beavers, associate professor of computer science and computer engineering, (501) 575-6040; [email protected]

Carolyne Garcia, science and research communication officer, (501) 575-5555; [email protected]

NOTE: Photo available at: http://pigtrail.uark.edu/news/

HEAR ME, SEE ME, FIND ME, FEED ME

FAYETTEVILLE, Ark. -- U of A robotics researchers Gordon Beavers and Doug Blank have designed and demonstrated systems that significantly reduce the cost of hearing and vision systems for intelligent robots.

They presented their results last week at the Midwest Artificial Intelligence and Cognitive Science Society Meeting.

"We are not solving problems that haven't been solved before," explained Beavers, associate professor of computer science and computer engineering. "The same thing has been done with very expensive and sophisticated equipment. But we did it using programming and cheap sound cards."

In the past, locating the origin of a sound required installation of sophisticated sensors at fixed points to triangulate on the sound. This method also required expensive multichannel hardware. Beavers and Blank decided to use inexpensive, off-the-shelf sound cards and intelligent robots to achieve the same results.
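The press release does not include the researchers' code, but the idea behind locating a sound with a two-channel sound card can be sketched as follows. This is an illustrative example, not the U of A team's implementation: it estimates the time difference of arrival (TDOA) between a stereo pair of microphones by cross-correlation, then converts that delay to a bearing. All names and parameters here (mic spacing, sample rate) are assumptions for the sketch.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate speed of sound at room temperature


def tdoa_bearing(left, right, sample_rate, mic_spacing):
    """Estimate the bearing of a sound source from two microphone channels.

    left, right: equal-length lists of samples from a stereo sound card.
    mic_spacing: distance between the two microphones in meters.
    Returns the source angle in radians relative to broadside
    (0 = directly ahead of the microphone pair).
    """
    n = len(left)
    # The physically possible delay is bounded by the mic spacing.
    max_lag = int(mic_spacing / SPEED_OF_SOUND * sample_rate) + 1

    # Brute-force cross-correlation: find the lag where the two
    # channels line up best.
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                score += left[i] * right[j]
        if score > best_score:
            best_lag, best_score = lag, score

    delay = best_lag / sample_rate  # seconds of inter-channel delay
    # Far-field approximation: sin(theta) = delay * c / spacing.
    ratio = max(-1.0, min(1.0, delay * SPEED_OF_SOUND / mic_spacing))
    return math.asin(ratio)
```

For example, feeding the function two copies of the same signal yields a bearing of zero (source straight ahead), while a signal that arrives a few samples later on one channel yields a nonzero angle toward the nearer microphone.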

Their solution is not only simple and inexpensive; it is also portable. Intelligent robots can be used for search-and-rescue missions in urban disasters, for example. Their programming allows them to interpret the sounds that they receive and make decisions, in much the same way that humans locate an object by its sound.

Beavers and Blank demonstrated their system this year at the national robotics competition sponsored by the American Association for Artificial Intelligence. In the Urban Search and Rescue challenge, they used three small independent autonomous robots to locate simulated humans in a "disaster area" designed and constructed by the National Institute of Standards and Technology.

In addition to sound localization, the robots used vision to locate potential "disaster victims." Vision was also integral in the second competition, in which robots provided snacks at the conference banquet. The U of A robot Elektro took second place in this contest.

Elektro was designed not only to recognize humans, but also to re-recognize them. By establishing a color tag for each person it met, Elektro could recall that it had previously recognized the individual and "converse" with that person on the basis of the previous interaction.

"Elektro creates a color fingerprint for each person," Blank explained. "It uses the full color spectrum to identify an individual and stores that information."

Once it has identified a shape as human, the robot ignores shape and uses color for recognition. That means it can recognize a person who is seated, even though the person was standing when they first "met."
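One simple way to build the kind of "color fingerprint" described here is a coarse color histogram compared by histogram intersection. The sketch below is an assumption about how such a scheme might work, not Elektro's actual code; the bin count and matching threshold are illustrative choices.

```python
def color_fingerprint(pixels, bins=4):
    """Build a coarse RGB histogram ("color fingerprint") from a list of
    (r, g, b) pixels, each channel in 0-255. Returns a normalized
    histogram with bins**3 entries."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = len(pixels) or 1  # avoid dividing by zero for an empty region
    return [h / total for h in hist]


def fingerprint_match(f1, f2):
    """Histogram intersection: 1.0 for identical color distributions,
    0.0 for completely disjoint ones. Because the fingerprint depends
    only on color, not shape, it is unaffected by posture changes."""
    return sum(min(a, b) for a, b in zip(f1, f2))
```

A person in a red shirt produces nearly the same fingerprint sitting or standing, which is the property the article attributes to Elektro's recognition step.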

"The most difficult part of this system is distinguishing the person from the background," Beavers explained. "Elekro uses a laser range finder, a vision system and a motion detector, but it takes a lot of processing."

###