Georges Aoude and Karl Jeanbart are co-founders of Derq, a software development company that provides cities and fleets with an AI-powered infrastructure platform for road safety and traffic management that supports the deployment of autonomous vehicles at scale.
While in-vehicle technology for autonomous vehicles gets substantial attention, service providers and municipalities are just starting to discuss the road infrastructure technology that supports AVs and provides other traffic management benefits.
With advancements in artificial intelligence and 5G network connectivity, smart-road infrastructure technologies promise to improve real-time traffic analytics and tackle the most challenging road safety and traffic management problems as they are added to roads, bridges and other transit systems across the U.S.
Two technologies at the center of this discussion are AI-enhanced cameras and lidar (light detection and ranging) sensors.
The U.S. has hundreds of thousands of traffic cameras — millions when you also count closed-circuit TV cameras — used mainly for road monitoring and basic traffic management applications, such as loop emulation. Bringing the latest AI advancements to both the cameras and their data management systems can immediately improve the performance of these basic applications and unlock more advanced software applications and use cases.
AI and machine learning deliver superior sensing performance over legacy cameras' computer vision techniques. By using algorithms that can automatically adapt to various lighting and weather conditions, they enable more robust, flexible and accurate detection, tracking and classification of all road users — distinguishing between drivers, pedestrians and cyclists on or near the road. In addition, their predictive capabilities can better model road-user movements and behaviors and improve road safety. Transportation agencies can immediately benefit from AI-enhanced cameras with applications such as road conflict detection and analysis, pedestrian crossing prediction and infrastructure sensing for AV deployments.
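To make the idea of condition-adaptive classification concrete, here is a minimal sketch of the post-detection step such a pipeline might run. The class names, thresholds and the adaptive rule are illustrative assumptions, not Derq's actual system: a vision model emits labeled detections with confidence scores, and the pipeline accepts or rejects them using a threshold tuned to the current lighting or weather condition.

```python
from dataclasses import dataclass

# Illustrative sketch only: labels, thresholds and the adaptive rule are
# assumptions for demonstration, not the article's actual pipeline.
ROAD_USER_CLASSES = {"car", "pedestrian", "cyclist"}

# A detector's confidence tends to drop in poor lighting or bad weather,
# so an adaptive pipeline can relax its acceptance threshold accordingly.
CONDITION_THRESHOLDS = {"clear": 0.60, "low_light": 0.45, "rain": 0.40}

@dataclass
class Detection:
    label: str         # class predicted by the vision model
    confidence: float  # model confidence in [0, 1]

def classify_road_users(detections, condition="clear"):
    """Keep road-user detections that clear the condition-adjusted threshold."""
    threshold = CONDITION_THRESHOLDS.get(condition, 0.60)
    return [d for d in detections
            if d.label in ROAD_USER_CLASSES and d.confidence >= threshold]
```

With this rule, a pedestrian detected at 0.50 confidence would be dropped in clear conditions but kept in rain, where the model's scores are expected to run lower across the board.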
Lidar can provide value that is complementary to, and sometimes overlapping with, cameras. But in several safety-critical edge cases, such as heavy rain and snow, or when more granular classification is needed, our experience has been that cameras still deliver superior results. Lidar works better in challenging light conditions and for providing localization data, but today's lidar technology remains expensive to deploy at scale due to its high unit price and limited field of view. For example, it would take multiple lidar sensors deployed in a single intersection, at a hefty investment, to provide the equivalent information of just one 360-degree AI-enhanced camera, which is a more cost-effective solution.
For many budget-focused communities, AI-enhanced cameras remain the technology of choice. As the cost of lidar comes down, however, it will become a strong and viable addition to today's AI-enhanced cameras, and communities should consider augmenting their infrastructure with lidar sensors. Ultimately, the go-to approach for smart infrastructure solutions will be sensor fusion — the ability to combine data from both cameras and lidar in one data management system, as is happening now in autonomous vehicles — to maximize the benefits of both, improve overall traffic flow and eliminate road crashes and fatalities.
Performance of cameras vs. lidar today
| Feature | Legacy camera | AI-powered camera* | Lidar | AI-powered camera and lidar fusion |
| --- | --- | --- | --- | --- |
| Challenging lighting (low light, glare) | Low | Medium | High | High |
| Adverse weather conditions (snow, rain, fog) | Low | High | Medium | High |
*Assumes presence of IR or good low-light sensor
**Expected to improve with time
Contributed pieces do not reflect an editorial position by Smart Cities Dive.