Source Newsroom: University at Buffalo
Newswise — Researchers at the University at Buffalo are developing a software system that may help the U.S. military and its allied forces lift the "fog of war" in their theaters of operation.
The system is designed to fuse and share information received from multiple air and ground sensors used by the military to predict and track movements of enemy and friendly troops, artillery and aircraft, according to Tarunraj Singh, Ph.D., associate professor of mechanical and aerospace engineering in the UB School of Engineering and Applied Sciences.
"In the theater of war you have multiple, disparate sensors with different capabilities, some of which are looking at the same targets," explains Singh. "There is a need to network and fuse information from the sensors, screen out noise, get better information and reduce error in measurements.
"By combining and filtering the information, our system will give military leaders the ability to monitor the theater of war with a lens that transitions from a soda-straw view to a bird's-eye view," he adds.
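The error-reduction idea Singh describes can be illustrated with a minimal sketch: a standard way to combine readings from sensors of different quality is inverse-variance weighting, where noisier sensors count for less and the fused estimate is more certain than any single reading. This is a generic textbook technique chosen for illustration, not the UB team's published method; the sensor values and variances below are invented.

```python
def fuse(measurements):
    """Combine (value, variance) pairs into one estimate.

    Each reading is weighted by the inverse of its variance, so a
    noisy sensor contributes less than a precise one.
    """
    weight_sum = sum(1.0 / var for _, var in measurements)
    fused_value = sum(val / var for val, var in measurements) / weight_sum
    fused_variance = 1.0 / weight_sum  # never worse than the best sensor
    return fused_value, fused_variance

# Hypothetical example: a radar reports a target range of 102.0 km with
# variance 4.0, while an optical sensor reports 99.0 km with variance 1.0.
value, variance = fuse([(102.0, 4.0), (99.0, 1.0)])
print(value, variance)  # 99.6 0.8 -- closer to the more precise sensor
```

Note that the fused variance (0.8) is smaller than either sensor's alone, which is the "reduce error in measurements" effect Singh refers to.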
Singh is leading development of the system with Rakesh Nagi, Ph.D., UB associate professor of industrial engineering and co-principal investigator, along with a team of UB engineers, computer scientists and graduate students. The project is a joint effort of UB's Center for Multisource Information Fusion and the UB Center for Computational Research.
The software system can be used for real-time battle scenarios, or for strategic planning, to predict and simulate potential movements of friend or foe. The battlefield scenarios are displayed in 3-D perspectives on a computer screen or laptop, complete with accurate representations of a region's topography.
Fusion and depiction of this information will give military leaders a more accurate and comprehensive common operations picture from which they can make better deployment decisions, the researchers say.
"Just monitoring the actions of foes is not enough," explains Nagi. "When tracking a target it is important to make good judgments about the intent of the foe; you need a means to predict what he is going to do.
"With that information you can then make assessments about whether it is a threat or not, which gives you an opportunity to take counter action," he adds.
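The prediction step Nagi describes can be sketched in its simplest form: given two timestamped position fixes for a tracked target, extrapolate where it will be at a future time under a constant-velocity assumption. The UB system's actual tracking models are not described in this release; the function and numbers below are a hypothetical illustration only.

```python
def predict(p0, t0, p1, t1, t_future):
    """Extrapolate a 2-D position (x, y) assuming constant velocity.

    p0, p1 are (x, y) fixes observed at times t0 and t1; returns the
    projected position at t_future.
    """
    dt = t1 - t0
    vx = (p1[0] - p0[0]) / dt
    vy = (p1[1] - p0[1]) / dt
    lead = t_future - t1
    return (p1[0] + vx * lead, p1[1] + vy * lead)

# Hypothetical track: target seen at (0, 0) km at t = 0 s and at
# (3, 4) km at t = 10 s. Where should it be at t = 20 s?
print(predict((0.0, 0.0), 0.0, (3.0, 4.0), 10.0, 20.0))  # (6.0, 8.0)
```

Real trackers replace this linear extrapolation with filters that also model maneuvering and sensor noise, but the principle is the same: a predicted position is what lets a commander assess intent and threat before the next observation arrives.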
The software supports U.S. Department of Defense efforts to implement "network-centric warfare," which uses information technology to link sensors, soldiers and decision makers, thus improving battlespace awareness, knowledge sharing and performance.
The backbone of the system is a software architecture that scales from a laptop to a supercomputer, handling everything from routine track prediction and estimation to problems that demand enormous computing power, such as the optimal design of network-centric warfare systems.
Development of the software system is funded incrementally up to $1.59 million by an R&D grant from Rosettex Technology & Ventures Group, which works with the U.S. government's National Technology Alliance to advance and commercialize technologies that address the government's national security and defense needs.
The UB researchers expect to deliver a prototype of the air tracking/fusion component of the software to Rosettex in February. A subsequent delivery next year will add ground tracking/fusion capability.
The researchers say the system can also be adapted for non-military uses, such as disaster response and environmental monitoring.
In addition to Singh and Nagi, other principal developers of the software system include James Llinas, Ph.D., UB professor of industrial engineering; Rajan Batta, Ph.D., UB professor of industrial engineering; Ann Bisantz, Ph.D., UB associate professor of industrial engineering; Bharat Jayaraman, Ph.D., professor and chair of the UB Department of Computer Science and Engineering; Thenkurussi Kesavadas, Ph.D., UB associate professor of mechanical and aerospace engineering, and Tom Furlani, Ph.D., associate director of UB's Center for Computational Research. Independent consultant Galya Rogova, Ph.D., also is contributing to the work.