mmWave-YOLO: A mmWave Imaging Radar-Based Real-Time Multiclass Object Recognition System for ADAS Applications


Abstract:

This article presents a millimeter wave (mmWave) imaging radar-based real-time multiclass object recognition system for advanced driver-assistance system (ADAS) applications related to construction machinery. While mmWave radar has the advantage of detecting objects even in the optically harsh, dark, and dusty environments in which construction machinery is often used, its resolution is two orders of magnitude lower than that of cameras. Moreover, as the distance from the radar increases, object features vary, making it difficult to classify multiple objects and detect their locations. To address this issue, an mmWave-you only look once (YOLO) architecture is proposed that enables highly accurate object classification and location recognition by applying a different detector to the data from each distance range. To provide precise labels for the radar data semiautomatically, a camera-radar cooperative data annotator is also developed. Using the radar alone, real-time (46.6 ms) classification and location detection of six object classes is achieved. The accuracy of our system is 84% [mean average precision (mAP)], slightly higher than that of a red, green, blue (RGB)-camera-based system (78%). In addition, since the radar acquires similar data of target objects against any background, mmWave-YOLO can detect objects in a variety of scenes with only small variations in the training dataset, resulting in a lower training cost. Experiments confirmed that it can detect objects in outdoor scenes even when trained only with indoor scene data.
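To make the distance-dependent detection idea concrete, the following is a minimal sketch, not the authors' implementation, of routing radar frames to per-distance-band detectors. The band edges, frame shape, and the RangeBandDetector placeholder are illustrative assumptions.

import numpy as np

# Minimal sketch of the distance-binned detection idea: each radar
# range-angle frame is dispatched to a detector associated with its
# distance band. Band edges, frame shape, and the detector placeholder
# are illustrative assumptions, not the authors' implementation.

RANGE_BANDS_M = [(0.0, 5.0), (5.0, 10.0), (10.0, 20.0)]  # assumed near/mid/far bands


class RangeBandDetector:
    """Stand-in for a YOLO-style detector trained on one distance band."""

    def __init__(self, band):
        self.band = band

    def detect(self, frame):
        # A real detector would return class labels and bounding boxes;
        # here an empty list is returned as a placeholder.
        return []


def detect_by_range(frame, target_range_m, detectors):
    """Dispatch a radar frame to the detector whose band contains the target range."""
    for (near, far), det in zip(RANGE_BANDS_M, detectors):
        if near <= target_range_m < far:
            return det.detect(frame)
    return []  # target outside all configured bands


detectors = [RangeBandDetector(b) for b in RANGE_BANDS_M]
frame = np.zeros((64, 64), dtype=np.float32)  # dummy range-angle map
print(detect_by_range(frame, target_range_m=7.5, detectors=detectors))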
Article Sequence Number: 2509810
Date of Publication: 18 May 2022


I. Introduction

In order to reduce the number of fatal accidents in the construction field, the installation of advanced driver-assistance systems (ADASs) in construction machinery, such as excavators, is strongly recommended. Accidents occurring at construction sites account for 33% of all accidents across all industries, the largest proportion of any sector [1]. In particular, accidents involving contact with construction machinery account for a large share of these. To reduce fatal accidents, ADASs that can detect people and objects are increasingly being used in construction machinery [2].

References

[1] Industrial Accidents Statistics in Japan, 2017. [Online]. Available: https://www.jisha.or.jp/english/statistics/accidents_in_detail_2017.html
[2] "Object detection system for ZAXIS-6 hydraulic excavator", Hitachi Rev., vol. 69, no. 3, pp. 90-91, Mar. 2020.
[3] Earth-Moving Machinery—Object Detection Systems and Visibility Aids—Performance Requirements and Tests, Nov. 2017.
[4] J. Peng et al., "Multi-task ADAS system on FPGA", Proc. IEEE Int. Conf. Artif. Intell. Circuits Syst. (AICAS), pp. 171-174, Mar. 2019.
[5] K. Patel, K. Rambach, T. Visentin, D. Rusev, M. Pfeiffer and B. Yang, "Deep learning-based object classification on automotive radar spectra", Proc. IEEE Radar Conf. (RadarConf), pp. 133-136, Oct. 2019.
[6] J. Redmon and A. Farhadi, "YOLOv3: An incremental improvement", arXiv:1804.02767, Apr. 2018.
[7] Q. Zhao et al., "M2Det: A single-shot object detector based on multi-level feature pyramid network", Proc. AAAI Conf. Artif. Intell., pp. 9259-9266, Jan. 2019.
[8] V. Crescitelli, A. Kosuge and T. Oshima, "POISON: Human pose estimation in insufficient lighting conditions using sensor fusion", IEEE Trans. Instrum. Meas., vol. 70, pp. 1-8, 2021.
[9] T. Arai et al., "A 77-GHz 8RX3TX transceiver for 250-m long-range automotive radar in 40-nm CMOS technology", IEEE J. Solid-State Circuits, vol. 56, pp. 1332-1344, 2021.
[10] A. Arbabian et al., "A 94 GHz mm-wave-to-baseband pulsed-radar transceiver with applications in imaging and gesture recognition", IEEE J. Solid-State Circuits, vol. 48, no. 4, pp. 1055-1071, Apr. 2013.
[11] M. T. Ghasr, M. J. Horst, M. R. Dvorsky and R. Zoughi, "Wideband microwave camera for real-time 3-D imaging", IEEE Trans. Antennas Propag., vol. 65, no. 1, pp. 258-268, Jan. 2017.
[12] S. Shahramian, M. J. Holyoak, A. Singh and Y. Baeyens, "A fully integrated 384-element 16-tile W-band phased array with self-alignment and self-test", IEEE J. Solid-State Circuits, vol. 54, no. 9, pp. 2419-2434, Sep. 2019.
[13] S. Zihir, O. D. Gurbuz, A. Kar-Roy, S. Raman and G. M. Rebeiz, "60-GHz 64- and 256-elements wafer-scale phased-array transmitters using full-reticle and subreticle stitching techniques", IEEE Trans. Microw. Theory Techn., vol. 64, no. 12, pp. 4701-4719, Dec. 2016.
[14] Airport Passenger Screening Using Millimeter Wave Machines: Compliance With Guidelines, Washington, DC, USA, 2017.
[15] R. Z. Syeda, T. G. Savelyev, M. C. van Beurden and A. B. Smolders, "Sparse MIMO array for improved 3D mm-wave imaging radar", Proc. 17th Eur. Radar Conf. (EuRAD), pp. 342-345, Jan. 2021.
[16] J. W. Odendaal, E. Barnard and C. W. I. Pistorius, "Two-dimensional superresolution radar imaging using the MUSIC algorithm", IEEE Trans. Antennas Propag., vol. 42, no. 10, pp. 1386-1391, Oct. 1994.
[17] M. Zhao et al., "Through-wall human pose estimation using radio signals", Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 7356-7365, Jun. 2018.
[18] M. Zhao et al., "Through-wall human mesh recovery using radio signals", Proc. IEEE/CVF Int. Conf. Comput. Vis. (ICCV), pp. 10112-10121, Oct. 2019.
[19] F. Adib, C.-Y. Hsu, H. Mao, D. Katabi and F. Durand, "Capturing the human figure through a wall", ACM Trans. Graph., vol. 34, no. 6, pp. 1-13, Nov. 2015.
[20] J. Guan et al., "Through fog high-resolution imaging using millimeter wave radar", Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 1147-11461, Jun. 2020.
[21] X. Shuai, Y. Shen, Y. Tang, S. Shi, L. Ji and G. Xing, "MilliEye: A lightweight mmWave radar and camera fusion system for robust object detection", Proc. Int. Conf. Internet Things Design Implement., pp. 145-157, May 2021.
[22] S. Gupta, P. K. Rai, A. Kumar, P. K. Yalavarthy and L. R. Cenkeramaddi, "Target classification by mmWave FMCW radars using machine learning on range-angle images", IEEE Sensors J., vol. 21, no. 18, pp. 19993-20001, Sep. 2021.
[23] Imaging Radar RCS11, Jan. 2022. [Online]. Available: http://www.keycom.co.jp/eproducts/rcs/rcs11/page.html
[24] S. Sugimoto, H. Tateda, H. Takahashi and M. Okutomi, "Obstacle detection using millimeter-wave radar and its visualization on image sequence", Proc. 17th Int. Conf. Pattern Recognit. (ICPR), pp. 342-345, Aug. 2004.
[25] F. Zhuang et al., "A comprehensive survey on transfer learning", Proc. IEEE, vol. 109, no. 1, pp. 43-76, Jan. 2021.
[26] Jetson Modules, Jan. 2022. [Online]. Available: https://developer.nvidia.com/embedded/jetson-modules
[27] GeForce RTX 3090, Jan. 2022. [Online]. Available: https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/
[28] J. Redmon, S. Divvala, R. Girshick and A. Farhadi, "You only look once: Unified real-time object detection", arXiv:1506.02640, 2015.
[29] S. Lee, S. Kang, S.-C. Kim and J.-E. Lee, "Radar cross section measurement with 77 GHz automotive FMCW radar", Proc. IEEE 27th Annu. Int. Symp. Pers. Indoor Mobile Radio Commun. (PIMRC), pp. 1-6, Sep. 2016.
[30] B. Xu, N. Wang, T. Chen and M. Li, "Empirical evaluation of rectified activations in convolutional network", arXiv:1505.00853, Nov. 2015.