EVANSTON, Ill. --- Northwestern University artificial intelligence experts are available to comment on the proposal by a group of robotics and AI specialists, including Elon Musk, for the United Nations to ban the development and use of “killer robots.” 

Kristian Hammond is a professor of electrical engineering and computer science in the McCormick School of Engineering and previously sat on a UN policy committee run by the United Nations Institute for Disarmament Research (UNIDIR). He is a co-founder of Narrative Science, a startup company that uses artificial intelligence to extract the most important information from a data source and turn it into a narrative expressed in natural language. Hammond also founded the University of Chicago’s Artificial Intelligence Laboratory.

Quote from Professor Hammond

“Whenever you ban something, it comes out in ways you cannot control. Prohibition leads to secrecy, and the thing we need most in AI research is transparency. We need to understand how intelligence works before we push intelligent machines into the world. Understanding and transparency before deployment should be the goal, and that goal is not helped by prohibition.”

He can be reached at [email protected].

Larry Birnbaum is also a professor of electrical engineering and computer science at McCormick and the other co-founder of Narrative Science. His research includes machine learning, human-computer interaction, case-based reasoning, computer vision and natural language processing. He was the program co-chair of the 1991 International Machine Learning Workshop and has been a member of the program committee for numerous other conferences and workshops.

Quote from Professor Birnbaum

"Bans only work if they are verifiable. The Nuclear Test Ban Treaty is verifiable via satellite and ground sensors. The Biological Weapons Convention lacks clarity around verification, and this remains a huge issue, even though the ban has been relatively successful – at least in terms actual deployment and use of such weapons. But development is another issue. How is a ban on the development of AI for use in weapons to be monitored?"

He can be reached at [email protected].
