
Quadsight® Vision System in Adverse Weather: Maximizing the benefits of visible and thermal cameras


Abstract:

Autonomous vehicles are currently one of the most popular research topics in computer vision. The United Nations Economic Commission for Europe recently proposed a regulation for SAE level 3 automated driving systems. The current Operational Design Domains (ODD) are highway, slow speed (i.e. traffic jam), and clear weather conditions. Research is steadily moving towards a focus on harsh weather conditions. There are now two major issues to investigate: (1) knowing how to characterize the ODD and (2) extending the ODD to include ‘new’ conditions. This investigation is being carried out within the framework of the AWARD project at Cerema's PAVIN platform. Foresight Automotive's QuadSight® vision system was tested under a range of artificially reproduced weather conditions. The novelty of this work is to present the results of a 3D object detection ODD characterization: (a) on a commercially ready system, (b) using visible and thermal wavelengths, and (c) in controlled fog and rain conditions. The use of dual visible and long-wave infrared thermal sensors in stereo is essential to the all-weather detection of pedestrians and vehicles. The thermal sensor is essential in challenging conditions such as nighttime or adverse weather. Rain and low lighting conditions pose no problem for the QuadSight system. The system also performs well in foggy conditions, the only exception being compromised performance in very dense fog.
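
To make the dual-sensor idea concrete, here is a minimal Python sketch of one possible late-fusion scheme; it is not taken from the paper, and the detection record, the IoU threshold, and the priority given to thermal detections are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Detection:
    box: tuple    # (x1, y1, x2, y2) in a common image frame (hypothetical format)
    label: str    # e.g. "pedestrian" or "vehicle"
    score: float  # detector confidence in [0, 1]
    source: str   # "visible" or "thermal"

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def fuse(visible, thermal, iou_thr=0.5):
    """Keep every thermal detection (robust at night and in adverse weather) and
    add visible detections that do not overlap one already kept."""
    fused = list(thermal)
    for det in visible:
        if all(iou(det.box, kept.box) < iou_thr for kept in fused):
            fused.append(det)
    return fused

# A pedestrian seen by both sensors is reported once; the vehicle seen only by
# the thermal camera (e.g. in fog) is still reported.
vis = [Detection((100, 60, 140, 180), "pedestrian", 0.55, "visible")]
thr = [Detection((102, 58, 141, 182), "pedestrian", 0.80, "thermal"),
       Detection((300, 90, 420, 200), "vehicle", 0.70, "thermal")]
print([f"{d.label}/{d.source}" for d in fuse(vis, thr)])

In this toy scheme the thermal stream is treated as the more reliable one in darkness or fog, and the visible stream only adds detections the thermal camera did not already report; the actual QuadSight fusion logic is proprietary and may differ.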
Date of Conference: 07-10 June 2022
Date Added to IEEE Xplore: 19 August 2022
Conference Location: Saint-Etienne, France

I. Introduction

Vision sensors are now commonly used to perceive the environment, detect surrounding objects, and make driving decisions. The United Nations Economic Commission for Europe (UNECE) [1] recently proposed a regulation for SAE level 3 automated driving systems [2]. The regulation defines the Operational Design Domains (ODD), which correspond to the validated use cases in which the vehicle is able to drive in autonomous mode. Current research is focused on unusual or difficult environmental conditions, including harsh weather, dense traffic scenarios, urban areas, poorly surfaced or damaged roads, abnormal behavior of other road users, and edge cases. Two major issues remain concerning the use of vehicle perception systems (sensors and associated algorithms): (i) knowing how to characterize and verify the ODD, and (ii) knowing how to extend it to include new conditions.
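
As an illustration of what characterizing and extending an ODD can mean in practice, the short Python sketch below encodes a toy ODD as a set of constraints and checks whether the current conditions fall inside it; the field names, thresholds, and categories are hypothetical and are not taken from the UNECE regulation or the AWARD project.

# Toy ODD described as a set of constraints (hypothetical field names and values).
ODD = {
    "road_types": {"highway"},
    "max_speed_kmh": 60,          # slow-speed use case, e.g. traffic jam
    "weather": {"clear"},         # current regulation covers clear weather only
    "min_visibility_m": 1000,
}

def within_odd(road_type, speed_kmh, weather, visibility_m, odd=ODD):
    """Return True when the current driving conditions fall inside the ODD."""
    return (road_type in odd["road_types"]
            and speed_kmh <= odd["max_speed_kmh"]
            and weather in odd["weather"]
            and visibility_m >= odd["min_visibility_m"])

print(within_odd("highway", 50, "clear", 2000))  # True: inside the current ODD
print(within_odd("highway", 50, "fog", 150))     # False: outside until validated

Extending the ODD then amounts to relaxing one of these constraints, for example adding fog to the allowed weather set, once the perception system has been characterized and validated in that condition; the tests reported in this paper are intended to provide exactly that kind of evidence.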

References
1. Proposal for a New UN Regulation on Uniform Provisions Concerning the Approval of Vehicles with Regards to Automated Lane Keeping System, 2020.
2. J3016 - Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems, pp. 12, 2014.
3. AWARD Project, 2021.
4. P. Sun, H. Kretzschmar, X. Dotiwalla, A. Chouard, V. Patnaik, P. Tsui, J. Guo, Y. Zhou, Y. Chai, B. Caine, W. Han et al., Scalability in Perception for Autonomous Driving: Waymo Open Dataset, 2020.
5. H. Caesar, V. Bankiti, A. H. Lang, S. Vora, V. E. Liong, Q. Xu, et al., nuScenes: A multimodal dataset for autonomous driving, 2020.
6. A. Geiger, P. Lenz and R. Urtasun, "Are we ready for autonomous driving? The KITTI vision benchmark suite", in 2012 IEEE Conference on Computer Vision and Pattern Recognition, 2012.
7. F. Yu, W. Xian, Y. Chen, F. Liu, M. Liao, V. Madhavan, et al., BDD100K: A diverse driving video database with scalable annotation tooling, vol. 2, pp. 6, 2018.
8. K. Dahmane, P. Duthon, F. Bernardin, M. Colomb, N. E. B. Amara and F. Chausse, The Cerema pedestrian database: A specific database in adverse weather conditions to evaluate computer vision pedestrian detectors, 2016.
9. C. Sakaridis, D. Dai and L. Van Gool, "Semantic foggy scene understanding with synthetic data", International Journal of Computer Vision, vol. 126, pp. 973-992, 2018.
10. K. Dahmane, P. Duthon, F. Bernardin, M. Colomb, F. Chausse and C. Blanc, "WeatherEye - Proposal of an Algorithm Able to Classify Weather Conditions from Traffic Camera Images", Atmosphere, vol. 12, 2021.
11. K. Dahmane, P. Duthon, F. Bernardin and M. Colomb, Weather Classification with traffic surveillance cameras, 2018.
12. C. Ancuti, C. O. Ancuti and R. Timofte, "NTIRE 2018 challenge on image dehazing: Methods and results", in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 2018.
13. Y. Zhang, Y. Tian, Y. Kong, B. Zhong and Y. Fu, "Residual dense network for image super-resolution", in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018.
14. S. Ki, H. Sim, J.-S. Choi, S. Kim and M. Kim, "Fully end-to-end learning based conditional boundary equilibrium GAN with receptive field sizes enlarged for single ultra-high resolution image dehazing", in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 2018.
15. Y. Lei, T. Emaru, A. A. Ravankar, Y. Kobayashi and S. Wang, "Semantic Image Segmentation on Snow Driving Scenarios", 2020 IEEE International Conference on Mechatronics and Automation (ICMA), pp. 1094-1100, 2020.
16. H. Sim, S. Ki, J.-S. Choi, S. Seo, S. Kim and M. Kim, "High-resolution image dehazing with respect to training losses and receptive field sizes", in IEEE Conference on Computer Vision and Pattern Recognition, 2018.
17. O. Kupyn, V. Budzan, M. Mykhailych, D. Mishkin and J. Matas, "DeblurGAN: Blind motion deblurring using conditional adversarial networks", in IEEE Conference on Computer Vision and Pattern Recognition, 2018.
18. N. A. M. Mai, P. Duthon, L. Khoudour, A. Crouzil and S. A. Velastin, "3D Object Detection with SLS-Fusion Network in Foggy Weather Conditions", Sensors, vol. 21, 2021.
19. M. Bijelic, F. Mannan, T. Gruber, W. Ritter, K. Dietmayer and F. Heide, "Seeing Through Fog Without Seeing Fog: Deep Sensor Fusion in the Absence of Labeled Training Data", CoRR, vol. abs/1902.0, 2019.
20. A. Pfeuffer and K. Dietmayer, "Robust semantic segmentation in adverse weather conditions by means of sensor data fusion", in 2019 22nd International Conference on Information Fusion (FUSION), 2019.
21. Y. Li, P. Duthon, M. Colomb and J. Ibanez-Guzman, "What happens for a ToF LiDAR in fog? (accepted under review)", Transactions on Intelligent Transportation Systems, 2020.
22. R. Heinzler, F. Piewak, P. Schindler and W. Stork, "CNN-based LiDAR point cloud de-noising in adverse weather", IEEE Robotics and Automation Letters, vol. 5, pp. 2514-2521, 2020.
23. M. Kutila, P. Pyykönen, W. Ritter, O. Sawade and B. Schäufele, "Automotive LIDAR sensor development scenarios for harsh weather conditions", in 2016 IEEE 19th ITSC, 2016.
24. M.-g. Cho, "A Study on the Obstacle Recognition for Autonomous Driving RC Car Using LiDAR and Thermal Infrared Camera", in 2019 Eleventh International Conference on Ubiquitous and Future Networks (ICUFN), 2019.
25. V. John and S. Mita, "Deep Feature-Level Sensor Fusion Using Skip Connections for Real-Time Object Detection in Autonomous Driving", Electronics, vol. 10, 2021.
26. Guide to Meteorological Instruments and Methods of Observation (2014 edition updated in 2017; WMO-No. 8), pp. 1163, 2014.
