TY - GEN
T1 - Behavior Recognition in Mice Using RGB-D Videos Captured from Below
AU - Oikawa, H.
AU - Tsuruda, Y.
AU - Sano, Y.
AU - Teiichi, T.
AU - Yamamoto, M.
AU - Takemura, H.
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
AB - Changes in mouse behavior are useful for both basic and applied research. However, visual inspection by humans is subjective and time-consuming. With the advancement of deep learning, systems have been developed that can automatically and quantitatively classify mouse behavior from videos. Because cameras are typically positioned above or to the side of the animal, keypoints related to limb movement cannot always be captured consistently. In this study, a mouse was placed on a transparent acrylic plate and its movements were recorded from below using an RGB-D camera, capturing its limbs in 3D at all times. The 3D coordinates of the mouse's keypoints were then obtained using DeepLabCut. By training a deep learning model on the time series of these keypoint coordinates and the corresponding behavioral labels, we created a model that classifies mouse behaviors from videos. This method achieved a total accuracy of 96.7% and a walking classification accuracy of 94.5%, higher than those reported in previous studies.
UR - http://www.scopus.com/inward/record.url?scp=85217862505&partnerID=8YFLogxK
DO - 10.1109/SMC54092.2024.10831155
M3 - Conference contribution
AN - SCOPUS:85217862505
T3 - Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics
SP - 4797
EP - 4800
BT - 2024 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2024 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2024
Y2 - 6 October 2024 through 10 October 2024
ER -