Latest News

Monday, November 5, 2018

Using Machine Learning to Optimize Traffic and Reduce Pollution

Researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have launched two research projects aiming to show that applying machine learning to self-driving cars to smooth traffic, reduce fuel consumption, and improve air quality predictions is no longer the stuff of science fiction.

In collaboration with UC Berkeley, Berkeley Lab scientists are using deep reinforcement learning, a computational tool for training controllers, to make transportation more sustainable. One project uses deep reinforcement learning to train autonomous vehicles to drive in ways to simultaneously improve traffic flow and reduce energy consumption. A second uses deep learning algorithms to analyze satellite images combined with traffic information from cell phones and data already being collected by environmental sensors to improve air quality predictions.

“Thirty percent of energy use in the U.S. is to transport people and goods, and this energy consumption contributes to air pollution, including approximately half of all nitrogen oxide emissions, a precursor to particulate matter and ozone – and black carbon (soot) emissions,” said Tom Kirchstetter, director of Berkeley Lab’s Energy Analysis and Environmental Impacts Division, an adjunct professor at UC Berkeley, and a member of the research team.

“Applying machine learning technologies to transportation and the environment is a new frontier that could pay significant dividends – for energy as well as for human health.”

Traffic smoothing with Flow

CIRCLES, or Congestion Impact Reduction via CAV-in-the-loop Lagrangian Energy Smoothing, is led by Berkeley Lab researcher Alexandre Bayen, who is also a professor of electrical engineering and computer science at UC Berkeley and director of UC Berkeley’s Institute of Transportation Studies. CIRCLES is based on a software framework called Flow, developed by Bayen’s team of students and post-doctoral researchers.

Flow is a first-of-its-kind software framework allowing researchers to discover and benchmark schemes for optimizing traffic. Using a state-of-the-art open-source microsimulator, Flow can simulate hundreds of thousands of vehicles – some driven by humans, others autonomous – driving in custom traffic scenarios.

“The potential for cities is enormous,” said Bayen. “Experiments have shown that the energy savings with just a small percentage of vehicles on the road being autonomous can be huge. And we can improve it even further with our algorithms.”

Flow was launched in 2017 and released to the public in September, and the benchmarks are being released this month. With funding from the Laboratory Directed Research and Development program, Bayen and his team will use Flow to design, test, and deploy the first connected and autonomous vehicle (CAV)-enabled system to actively reduce stop-and-go phantom traffic jams on freeways.

Reducing congestion by reinforcement learning

A simple experiment conducted by Japanese researchers 10 years ago, in which human drivers circled a ring-shaped track at a constant speed, motivated some of today’s researchers to use autonomous vehicles to smooth traffic. Even though everyone proceeds smoothly at first, traffic waves soon form, and within 30 seconds cars come to a standstill.

“You have stop-and-go oscillation within less than a minute,” Bayen said. “This experiment led to hundreds if not thousands of research papers to try to explain what is happening.”

A variation of the same experiment, by a team of Vanderbilt University researchers led by Dan Work, added a single autonomous vehicle to the ring. As soon as the automation is turned on, the oscillations are immediately smoothed out.

Why? “The automation essentially understands to not accelerate and catch up with the previous person – which would amplify the instability – but rather to behave as a flow pacifier, essentially smoothing down by restraining traffic so that it doesn’t amplify the instability,” Bayen said.
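The flow-pacifier behavior Bayen describes can be sketched with a toy car-following simulation. This is purely illustrative (it is not Flow or the researchers’ code): vehicles on a ring relax toward a headway-dependent target speed, and one controlled vehicle caps its acceleration so it smooths rather than chases.

```python
import numpy as np

# Illustrative ring-road sketch: N cars follow a simple headway-based
# speed rule; vehicle 0 acts as a "flow pacifier" by limiting how hard
# it accelerates to catch up with the car ahead.
N = 22        # vehicles on the ring
L = 230.0     # ring length in meters
DT = 0.1      # simulation time step in seconds

def headway(pos):
    """Distance from each car to the car ahead, on a circular road."""
    return (np.roll(pos, -1) - pos) % L

def desired_speed(h):
    """Toy optimal-velocity map: faster when the headway is larger."""
    return 15.0 * np.clip((h - 2.0) / 20.0, 0.0, 1.0)

def step(pos, vel, pacified):
    acc = 2.0 * (desired_speed(headway(pos)) - vel)  # relax toward target
    if pacified:
        acc[0] = np.clip(acc[0], -0.5, 0.5)  # vehicle 0 smooths, not chases
    vel = np.clip(vel + acc * DT, 0.0, 30.0)
    pos = (pos + vel * DT) % L
    return pos, vel

rng = np.random.default_rng(0)
pos = np.linspace(0.0, L, N, endpoint=False) + rng.uniform(-1.0, 1.0, N)
vel = np.full(N, 5.0)
for _ in range(2000):
    pos, vel = step(pos, vel, pacified=True)
print("speed spread across vehicles:", vel.max() - vel.min())
```

The key design point mirrors the quote: the pacifier’s clipped acceleration prevents it from amplifying small disturbances into stop-and-go waves.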

Deep reinforcement learning has been used to train computers to play chess and to teach a robot how to run an obstacle course. It trains by “taking observations of the system, and then iteratively trying out a bunch of actions, seeing if they’re good or bad, and then picking out which actions it should prioritize,” said Eugene Vinitsky, a graduate student working with Bayen and one of Flow’s developers.

In the case of traffic, Flow trains vehicles to check what the cars directly in front of and behind them are doing. “It tries out different things – it can accelerate, decelerate, or change lanes, for example,” Vinitsky explained. “You give it a reward signal, like, was traffic stopped or flowing smoothly, and it tries to correlate what it was doing to the state of the traffic.”

With the CIRCLES project, Bayen and his team plan to first run simulations to confirm that significant energy savings result from using the algorithms in autonomous vehicles. Next they will run a field test of the algorithm with human drivers responding to real-time commands.

DeepAir

Marta Gonzalez, a professor in UC Berkeley’s City & Regional Planning Department, leads a pollution-focused project called DeepAir (Deep Learning and Satellite Imagery to Estimate Air Quality Impact at Scale). In previous research, Gonzalez used cell phone data to study how people move around cities, and applied the results to recommend electric vehicle charging schemes that save energy and costs.

For this project, she will take advantage of the power of deep learning algorithms to analyze satellite images combined with traffic information from cell phones and data already being collected by environmental monitoring stations.

“The novelty here is that while the environmental models, which show the interaction of pollutants with weather – such as wind speed, pressure, precipitation, and temperature – have been developed for years, there’s a missing piece,” Gonzalez said. “In order to be reliable, those models need to have good inventories of what’s entering the environment, such as emissions from vehicles and power plants.

“We bring novel data sources such as mobile phones, integrated with satellite images. In order to process and interpret all this information, we use machine learning models applied to computer vision. The integration of information technologies to better understand complex natural system interactions at large scale is the innovative piece of DeepAir.”
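A toy version of this data fusion can be sketched as follows. It is illustrative only – DeepAir uses deep learning and computer vision, whereas here plain least squares stands in for the learned model, and all feature names and values are synthetic.

```python
import numpy as np

# Illustrative only: regress a monitored pollutant level on features
# standing in for the three data sources DeepAir combines.
rng = np.random.default_rng(42)
n = 500
traffic = rng.uniform(0, 1, n)   # cell-phone-derived traffic volume
aerosol = rng.uniform(0, 1, n)   # satellite-derived aerosol index
wind = rng.uniform(0, 1, n)      # weather covariate from monitoring stations

# Synthetic "ground truth": pollution rises with traffic and aerosol,
# falls with wind, plus sensor noise.
pm25 = 3.0 * traffic + 2.0 * aerosol - 1.5 * wind + rng.normal(0, 0.1, n)

# Fit a linear model (stand-in for the deep learning model).
X = np.column_stack([traffic, aerosol, wind, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, pm25, rcond=None)
print("fitted coefficients:", np.round(coef, 2))
```

The fitted coefficients recover the synthetic emission relationships, which is the toy analogue of inferring reliable emissions inventories from fused data sources.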

The researchers expect the resulting analysis to yield detailed information about the sources and distribution of pollutants, enabling more efficient and timely interventions. For example, traffic restrictions are voluntary on the Bay Area’s “Spare the Air” days, while other cities have plans in place to restrict traffic or industry.

While the idea of using algorithms to control cars and traffic may sound incredible at the moment, Bayen believes technology is headed in that direction. “I do believe that within 10 years the things we’re coming up with here, like flow smoothing, will be standard practice, because there will be more automated vehicles on the road,” he said.

Source: https://newscenter.lbl.gov/2018/10/28/machine-learning-to-help-optimize-traffic-and-reduce-pollution/
