Researchers at South Ural State University (SUSU) in Russia have developed an intelligent system for monitoring traffic flow using AI that does not require specific recording equipment and can work on almost any type of camera.
The Artificial Intelligence Monitoring System (AIMS) instantly processes data received in real time, unlike existing programs in which processing incurs a delay of up to 10-15 minutes, claims SUSU.
“We have proposed and implemented a modernised system for assessing traffic flows, based on the most recent advances in the detection and tracking of vehicles,” said Vladimir Shepelev, AIMS project manager and associate professor at SUSU’s automotive transport department.
“Unlike existing analogues, our system recognises and analyses in real time the direction of movement of vehicles, with a maximum relative error of less than 10%,” continued Shepelev.
“The closest analogues can determine speed and classify vehicles in only one direction, and only when the cameras are placed above the traffic flow, with an accuracy of 80-90%. The neural network can generate up to 400 traffic parameters in real time at each intersection.”
AIMS collects, interprets and transmits data on the intensity of road traffic, classifies vehicles into 10 categories, measures speed and the current load on each direction of the intersection, and determines the onward direction of vehicles. It also performs real-time object recognition at the intersection using a single full-HD CCTV camera.
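To illustrate the kind of aggregation such a system performs, the sketch below turns a list of finished vehicle tracks into per-movement counts and mean speeds. The `Track` fields, arm names and vehicle classes are illustrative assumptions, not SUSU's actual data model.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Track:
    """One finished vehicle track (hypothetical schema, for illustration only)."""
    vehicle_class: str  # e.g. "car", "bus", "truck" (AIMS distinguishes 10 classes)
    entry_arm: str      # intersection arm where the track first appeared
    exit_arm: str       # arm where the vehicle left the frame
    speed_kmh: float    # mean speed estimated over the track's lifetime


def summarise(tracks):
    """Aggregate tracks into per-movement class counts and mean speeds.

    A movement is the (entry arm, exit arm) pair, i.e. the vehicle's
    onward direction through the intersection.
    """
    counts = defaultdict(lambda: defaultdict(int))
    speeds = defaultdict(list)
    for t in tracks:
        movement = (t.entry_arm, t.exit_arm)
        counts[movement][t.vehicle_class] += 1
        speeds[movement].append(t.speed_kmh)
    mean_speed = {m: sum(v) / len(v) for m, v in speeds.items()}
    return counts, mean_speed
```

Counting per (entry, exit) pair rather than per camera view is what lets a single overview camera report load and speed separately for each direction of the intersection.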
“The results of this study can be applied by city authorities to improve the overall traffic capacity of the intersection,” said Shepelev.
“We have already tested our system at several intersections in Chelyabinsk, verifying that the proposed solution is sufficiently accurate and can serve as a basis for other high-level models.”
Furthermore, AIMS uses data-mining technology to deliver real-time data on traffic flow structures, vehicle directions and speeds, as well as to support the implementation of efficient traffic patterns, reduce congestion and improve resource management.
According to SUSU, current traffic monitoring systems frequently rely on expensive sensors for continuous data collection, or on a visual study of traffic, usually measured at certain periods over several days.
However, SUSU claims these systems do not result in transport services receiving proper and accurate information on the structure, intensity or speed of traffic flows. Nor do they track the direction of movement.
“We used neural networks to process massive amounts of video data, not only to detect and track vehicles but also to analyse the sequence of events,” said Shepelev.
“In the process of developing the technology, we used the open-source Mask R-CNN and YOLOv3 neural network architectures to detect objects in real time, as well as the SORT tracker, the code of which was modified by the team to improve the quality of object tracking.”
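The core of a SORT-style tracker is an association step that matches each existing track's box to the new frame's detections by intersection-over-union (IoU). The sketch below shows that step in isolation; real SORT additionally predicts each track forward with a Kalman filter and solves the assignment with the Hungarian algorithm, whereas this simplified version uses a greedy match to stay self-contained. It is not SUSU's modified code.

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0


def associate(tracks, detections, iou_threshold=0.3):
    """Greedily match existing track boxes to new detections by IoU.

    Returns (track_index, detection_index) pairs; unmatched detections
    would start new tracks, unmatched tracks age out after a few frames.
    """
    pairs = sorted(
        ((iou(t, d), ti, di)
         for ti, t in enumerate(tracks)
         for di, d in enumerate(detections)),
        reverse=True,
    )
    matches, used_t, used_d = [], set(), set()
    for score, ti, di in pairs:
        if score < iou_threshold:
            break
        if ti not in used_t and di not in used_d:
            matches.append((ti, di))
            used_t.add(ti)
            used_d.add(di)
    return matches
```

Because association relies only on box overlap between consecutive frames, a tracker like this runs on any camera that yields usable detections, which is consistent with the system's claim of working without specific recording equipment.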
The embedded AI-based analytic block determines the level of traffic organisation at the intersection and assigns key performance indicators to each direction of movement.
By optimising the YOLOv3 neural network algorithms, SUSU researchers were able to achieve 95% accuracy, even accounting for objects lost during tracking, while significantly reducing the cost of real-time monitoring equipment.
“Artificial intelligence with machine vision takes data collection and analysis of road traffic to a new level, making it possible to recognise vehicles with much greater reliability than ever before,” said Shepelev.
“Our deep learning networks are easy to configure, do not require specific recording equipment and can work on almost any type of camera.”
SUSU hopes its AIMS technology will become part of Chelyabinsk’s Sustainable Public Transport project in the near future.