
Pose Estimation-Based Visual Perception System for Analyzing Fish Swimming



Abstract:

Advances in modern deep learning-based computer vision perception techniques have revolutionized animal movement research methods and opened up new avenues for studying fish swimming. To that end, we have developed a visual perception system based on pose estimation to analyze fish swimming. Our system quantifies fish motion through 3-D fish pose estimation and dynamically visualizes the motion data of marked keypoints. Experimental results show that our system accurately extracts the motion characteristics of fish swimming, enabling analysis of how fish bodies and fins work together during different swimming states. This research provides an innovative approach to studying fish swimming that can be valuable in designing, developing, and optimizing modern underwater robots, especially multifin codriven bionic robotic fish. The code and dataset are available at https://github.com/wux024/AdamPosePlug.
Published in: IEEE Sensors Journal ( Volume: 24, Issue: 8, 15 April 2024)
Page(s): 13293 - 13303
Date of Publication: 12 March 2024


I. Introduction

Advanced deep learning-based computer vision perception techniques have facilitated the rapid development of animal behavior analysis tools, greatly enhancing the ability of neuroscientists and zoologists to conduct relevant research [1], [2], [3]. Among the behaviors of many species, we are particularly interested in fish swimming, because studying it may reveal inspiring knowledge valuable for developing new types of robotic fish [4].
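The central 3-D step the abstract describes, recovering keypoint positions in space from 2-D detections, is commonly done by linear (DLT) triangulation across calibrated camera views. The sketch below is illustrative, not the paper's implementation: the function name `triangulate_point` and the assumption of exactly two calibrated cameras with known projection matrices are mine.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one keypoint from two calibrated views.

    P1, P2 : (3, 4) camera projection matrices.
    x1, x2 : (2,) pixel coordinates of the same keypoint in each view.
    Returns the (3,) estimated point in world coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous point X:
    # x * (P[2] @ X) = P[0] @ X  and  y * (P[2] @ X) = P[1] @ X.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution of A X = 0 is the right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize
```

With more than two cameras, the same construction simply stacks two rows per view into `A`; the SVD then yields the least-squares intersection of all viewing rays, which is how multiview pose-estimation pipelines typically handle redundant views.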

References
1.
A. Mathis, S. Schneider, J. Lauer and M. W. Mathis, "A primer on motion capture with deep learning: Principles, pitfalls and perspectives", Neuron, vol. 108, no. 1, pp. 44-65, Oct. 2020.
2.
S. B. Hausmann, A. M. Vargas, A. Mathis and M. W. Mathis, "Measuring and modeling the motor system with machine learning", Current Opinion Neurobiol., vol. 70, pp. 11-23, Oct. 2021.
3.
X. Yang, R. Bist, S. Subedi, Z. Wu, T. Liu and L. Chai, "An automatic classifier for monitoring applied behaviors of cage-free laying hens with deep learning", Eng. Appl. Artif. Intell., vol. 123, Aug. 2023.
4.
R. Bogue, "Underwater robots: A review of technologies and applications", Ind. Robot., vol. 42, no. 3, pp. 186-191, 2015.
5.
P. R. Bandyopadhyay, "Trends in biorobotic autonomous undersea vehicles", IEEE J. Ocean. Eng., vol. 30, no. 1, pp. 109-139, Jan. 2005.
6.
J. F. V. Vincent, "Applications—Influence of biology on engineering", J. Bionic Eng., vol. 3, no. 3, pp. 161-177, Sep. 2006.
7.
J. Yu, M. Wang, H. Dong, Y. Zhang and Z. Wu, "Motion control and motion coordination of bionic robotic fish: A review", J. Bionic Eng., vol. 15, no. 4, pp. 579-598, Jul. 2018.
8.
J. Jing, X. Yin and X. Lu, "Hydrodynamic analysis of C-start in crucian carp", J. Bionic Eng., vol. 1, no. 3, pp. 102-107, Sep. 2004.
9.
F. Li, T.-J. Hu, G.-M. Wang and L.-C. Shen, "Locomotion of Gymnarchus niloticus: Experiment and kinematics", J. Bionic Eng., vol. 2, no. 3, pp. 115-121, Sep. 2005.
10.
H. Yan, Y.-M. Su and L. Yang, "Experimentation of fish swimming based on tracking locomotion locus", J. Bionic Eng., vol. 5, no. 3, pp. 258-263, Sep. 2008.
11.
G. Wu, "Measuring the three-dimensional kinematics of a free-swimming koi carp by video tracking method", J. Bionic Eng., vol. 7, no. 1, pp. 49-55, Mar. 2010.
12.
Y. Zhang, J. He and G. Zhang, "Measurement on morphology and kinematics of crucian vertebral joints", J. Bionic Eng., vol. 8, no. 1, pp. 10-17, Mar. 2011.
13.
L. Wang, M. Xu, B. Liu, K. H. Low, J. Yang and S. Zhang, "A three-dimensional kinematics analysis of a koi carp pectoral fin by digital image processing", J. Bionic Eng., vol. 10, no. 2, pp. 210-221, Jun. 2013.
14.
C. J. Voesenek, R. P. M. Pieters and J. L. van Leeuwen, "Automated reconstruction of three-dimensional fish motion, forces and torques", PLoS ONE, vol. 11, no. 1, Jan. 2016.
15.
J. Mao, G. Xiao, W. Sheng, Z. Qu and Y. Liu, "Research on realizing the 3D occlusion tracking location method of fish’s school target", Neurocomputing, vol. 214, pp. 61-79, Nov. 2016.
16.
M. M. Saberioon and P. Cisar, "Automated multiple fish tracking in three-dimension using a structured light sensor", Comput. Electron. Agricult., vol. 121, pp. 215-221, Feb. 2016.
17.
Z.-M. Qian and Y. Q. Chen, "Feature point based 3D tracking of multiple fish from multi-view images", PLoS ONE, vol. 12, no. 6, Jun. 2017.
18.
X. E. Cheng, S. S. Du, H. Y. Li, J. F. Hu and M. L. Chen, "Obtaining three-dimensional trajectory of multiple fish in water tank via video tracking", Multimedia Tools Appl., vol. 77, no. 18, pp. 24499-24519, Sep. 2018.
19.
X. Liu, Y. Yue, M. Shi and Z. Qian, "3-D video tracking of multiple fish in a water tank", IEEE Access, vol. 7, pp. 145049-145059, 2019.
20.
P. Afsar, P. Cortez and H. Santos, "Automatic visual detection of human behavior: A review from 2000 to 2014", Expert Syst. Appl., vol. 42, no. 20, pp. 6935-6956, Nov. 2015.
21.
A. Sarkar, A. Banerjee, P. K. Singh and R. Sarkar, "3D human action recognition: Through the eyes of researchers", Expert Syst. Appl., vol. 193, May 2022.
22.
S. Zhang, Y. Yang, C. Chen, X. Zhang, Q. Leng and X. Zhao, "Deep learning-based multimodal emotion recognition from audio, visual and text modalities: A systematic review of recent advancements and future prospects", Expert Syst. Appl., vol. 237, Mar. 2024.
23.
A. Mathis et al., "DeepLabCut: Markerless pose estimation of user-defined body parts with deep learning", Nature Neurosci., vol. 21, no. 9, pp. 1281-1289, Sep. 2018.
24.
T. D. Pereira et al., "Fast animal pose estimation using deep neural networks", Nature Methods, vol. 16, no. 1, pp. 117-125, Jan. 2019.
25.
J. Lauer et al., "Multi-animal pose estimation, identification and tracking with DeepLabCut", Nature Methods, vol. 19, no. 4, pp. 496-504, Apr. 2022.
26.
T. D. Pereira et al., "SLEAP: A deep learning system for multi-animal pose tracking", Nature Methods, vol. 19, no. 4, pp. 486-495, Apr. 2022.
27.
Z. Chen et al., "AlphaTracker: A multi-animal tracking and behavioral analysis tool", Frontiers Behav. Neurosci., vol. 17, 2023.
28.
X. Wu, Y. Wang, L. Chen, L. Zhang and L. Wang, "Motion parameters measurement of user-defined key points using 3D pose estimation", Eng. Appl. Artif. Intell., vol. 110, Apr. 2022.
29.
K. He, X. Zhang, S. Ren and J. Sun, "Deep residual learning for image recognition", Proc. IEEE Conf. Comput. Vis. Pattern Recognit., pp. 770-778, 2016.
30.
F. Chollet, "Xception: Deep learning with depthwise separable convolutions", Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 1800-1807, Jul. 2017.
