Newswise — A technology consortium has launched an industry-wide competition to jump-start the development of more energy-efficient, language-based AI applications: systems that would match the capabilities of today's virtual assistants and search engines while using a fraction of the electricity or battery power.

The competition's organizers, led in part by New York University, recognize that current tools, while pioneering, rely on suboptimal architectures that are slow and consume substantial amounts of energy.

"We've made staggering progress over the last few years toward human-like AI techniques for language, but the methods and ideas that got us here relied on gigantic models that are too slow and too energy-hungry to be useful or worth using in many cases," says organizer Sam Bowman, an assistant professor in NYU's Department of Linguistics and Center for Data Science. "We're optimistic that it'll be possible to build systems that are perhaps a hundred times more efficient without losing out on the quality of understanding we're seeing."

The competition follows earlier "benchmark tasks" in the field, all aimed at measuring the performance of language-based AI.

In 2010, ImageNet, a prominent early benchmarking effort in computer vision and image processing, helped to kick off modern neural-network-based AI. In 2018, another consortium, led by Alex Wang, a doctoral candidate at NYU's Courant Institute of Mathematical Sciences, and Bowman, launched GLUE (General Language Understanding Evaluation), which helped to spur major advances in language understanding that have started to appear in systems like search engines and virtual assistants. In 2019, they sponsored SuperGLUE, which called for systems capable of grasping more complex or nuanced language.

For more information, please visit https://sites.google.com/view/sustainlp2020/shared-task. The deadline for submissions is August 28, 2020. 

# # #