Newswise — COLUMBUS, Ohio – Researchers at The Ohio State University have developed new software to aid in the development, evaluation and demonstration of safer autonomous, or driverless, vehicles.

Called the Vehicle-in-Virtual-Environment (VVE) method, it allows the testing of driverless cars in a perfectly safe environment, said Bilin Aksun-Guvenc, co-author of the study and a professor of mechanical and aerospace engineering at Ohio State.

Imagine a driverless car placed in the middle of an empty parking lot. Although it is driving, it isn’t reacting to the real world but to input from the software, which tells the car what the road looks like and what cars, pedestrians and hazards it is encountering along the way.

“With our software, we’re able to make the vehicle think that it’s driving on actual roads while actually operating on a large open, safe test area,” said Aksun-Guvenc. “This ability saves time, money, and there is no risk of fatal traffic accidents.”

The study, published recently in the journal Sensors, found that by immersing a self-driving vehicle in a virtual environment, the technique can help it learn to avoid collisions with other cars, improve pedestrian safety, and react to rare or extreme traffic events.

Autonomous driving technologies have become a much more common sight on the road in the last few years, but given the sheer number of accidents these systems have caused, the way they are tested deserves closer scrutiny, Aksun-Guvenc said.

“Our future depends on being able to trust any and all road vehicles with our safety, so all of our research concepts pertain to working towards that goal,” said Aksun-Guvenc, who is also co-director of Ohio State’s Automated Driving Lab, a research group originally formed in 2014 to advance autonomous vehicle technologies.

Current approaches to demonstrating autonomous vehicle functions involve testing the software and technology first in simulations and then on public roads. Yet this method essentially turns other road users into involuntary participants in these driving experiments, said Aksun-Guvenc, and such risks can make the entire development process costly, inefficient, and potentially unsafe for drivers and pedestrians alike.

To overcome these limitations, researchers in this study replaced the output of a real vehicle’s high-resolution sensors with simulated data, connecting its controls to a highly realistic 3D environment, much like giving the machine a virtual reality headset. After feeding the simulated data to the autonomous driving system’s computers and syncing the car’s real motions with those of the simulation, the researchers showed that the vehicle behaves in real time as if the virtual environment were its true surroundings.
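
In code, the general idea might look like the minimal sketch below. This is not the team’s actual implementation: the VehiclePose, VirtualWorld and DrivingStack interfaces are hypothetical stand-ins for the real localization, 3D-environment and autonomous-driving software, and the update rate is assumed.

```python
import time
from dataclasses import dataclass

@dataclass
class VehiclePose:
    x: float        # meters east of the test-area origin
    y: float        # meters north of the test-area origin
    heading: float  # radians

class VirtualWorld:
    """Hypothetical 3D environment that supplies simulated sensor data."""
    def place_ego(self, pose: VehiclePose) -> None: ...
    def render_sensors(self) -> dict: ...  # e.g. camera frames, lidar points, object lists

class DrivingStack:
    """Hypothetical autonomous-driving software under test."""
    def update(self, sensor_data: dict) -> dict: ...  # returns steering/throttle commands

def vve_loop(read_real_pose, world: VirtualWorld, stack: DrivingStack, send_commands) -> None:
    """Replace real sensor output with virtual sensor output, in real time."""
    while True:
        pose = read_real_pose()           # real motion measured on the empty test area
        world.place_ego(pose)             # keep the virtual car synced with the real one
        sensors = world.render_sensors()  # virtual roads, cars, pedestrians, hazards
        commands = stack.update(sensors)  # the stack "thinks" it is on a real road
        send_commands(commands)           # the commands still drive the physical vehicle
        time.sleep(0.02)                  # an assumed ~50 Hz update rate
```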

But what makes their software especially powerful, said Levent Guvenc, co-author of the study and also co-director of the Automated Driving Lab, is how flexible their virtual environment can be. “When actual senses are replaced by virtual senses, the model can be easily changed to fit any kind of scenario,” said Guvenc.

Because the VVE method can be calibrated to maintain the properties of the real world while modeling rare events in the virtual environment, it can easily simulate everything from extreme traffic scenarios, like someone jumping in front of a vehicle, to mundane ones, like pedestrians waiting at a crosswalk, he said.
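
Because those scenarios live entirely in software, switching between extreme and mundane events can amount to changing a few parameters. The sketch below is purely illustrative; the field names are hypothetical and not the actual VVE configuration format.

```python
from dataclasses import dataclass

@dataclass
class PedestrianScenario:
    trigger_distance_m: float  # how far ahead of the car the event begins
    crossing_speed_mps: float  # speed of the virtual pedestrian
    waits_at_crosswalk: bool   # True for the mundane "standing and waiting" case

# Rare, extreme event: someone darts out right in front of the vehicle.
dart_out = PedestrianScenario(trigger_distance_m=8.0, crossing_speed_mps=3.0,
                              waits_at_crosswalk=False)

# Mundane event: a pedestrian waiting at a crosswalk.
waiting = PedestrianScenario(trigger_distance_m=30.0, crossing_speed_mps=0.0,
                             waits_at_crosswalk=True)
```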

Additionally, with the help of a communication app for vehicle-to-pedestrian connectivity, the software can use Bluetooth to link a pedestrian’s mobile phone with a phone in the test vehicle. In one demonstration, the researchers had a pedestrian dart across a road at a safe distance from the test vehicle, while the Bluetooth signal told the car that the person was darting right in front of it.
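
The sketch below illustrates that idea: the pedestrian’s real movement, measured at a safe distance, is shifted into the virtual scene so the car perceives the crossing directly ahead. The Bluetooth receive function and coordinate frames here are hypothetical stand-ins, not the researchers’ actual app.

```python
from dataclasses import dataclass

@dataclass
class Position:
    x: float  # meters
    y: float  # meters

def receive_pedestrian_position() -> Position:
    """Stand-in for reading the pedestrian phone's position over Bluetooth."""
    raise NotImplementedError  # placeholder; the real app's protocol is not described here

def to_virtual_frame(real: Position, real_origin: Position, virtual_origin: Position) -> Position:
    """Shift a crossing made at a safe real-world distance so that it appears
    directly in front of the test vehicle in the virtual environment."""
    return Position(x=virtual_origin.x + (real.x - real_origin.x),
                    y=virtual_origin.y + (real.y - real_origin.y))
```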

“The beauty of the method is that road users can share the same environment at the same time without being in the same location at all,” said Guvenc. And although generating these highly realistic environments can take time, he said that syncing different environments for use in real-time simulations is one technological challenge their team has solved.

The team has also filed a patent for the technology. In the future, Guvenc said he’d like to see it integrated into traffic guidelines from groups such as the National Highway Traffic Safety Administration.

“We could see this technology becoming a staple in the industry in the next five or 10 years,” said Guvenc. “That’s why we’re focusing on building more applications for it.”

Other Ohio State co-authors were Xincheng Cao, Haochong Chen and Sukru Yaren Gelbal.

Journal Link: Sensors