TY - GEN
T1 - A Real-time Multi-Person 3D Pose Estimation System from Multiple RGB-D Views for Live Streaming of 3D Animation
AU - Hwang, Taemin
AU - Kim, Jieun
AU - Kim, Myoungjin
AU - Kim, Minjoon
N1 - Publisher Copyright:
© 2023 Owner/Author.
PY - 2023/3/27
Y1 - 2023/3/27
N2 - Real-time multi-person 3D pose estimation is a challenging problem that is essential for various virtual reality (VR) applications. In this paper, we propose a real-time multi-person 3D pose estimation system for live streaming of 3D animation using multiple RGB-D cameras. The proposed system comprises several edge devices connected to a central server via a network. 2D pose detection and depth sensing are conducted locally on each edge device, which transmits the results to the central server. 3D pose reconstruction is performed on the central server: it first aligns the multiple camera coordinate systems to a common world plane, then matches the 2D pose results across the multiple cameras to each person based on distance, and finally reconstructs the 3D poses via multi-view triangulation. The proposed system is capable of real-time processing. To demonstrate that the proposed system estimates multi-person 3D poses in real-time, we implement prototypes and show that the system can be applied to live streaming of 3D animation on platforms such as PC and Web.
AB - Real-time multi-person 3D pose estimation is a challenging problem that is essential for various virtual reality (VR) applications. In this paper, we propose a real-time multi-person 3D pose estimation system for live streaming of 3D animation using multiple RGB-D cameras. The proposed system comprises several edge devices connected to a central server via a network. 2D pose detection and depth sensing are conducted locally on each edge device, which transmits the results to the central server. 3D pose reconstruction is performed on the central server: it first aligns the multiple camera coordinate systems to a common world plane, then matches the 2D pose results across the multiple cameras to each person based on distance, and finally reconstructs the 3D poses via multi-view triangulation. The proposed system is capable of real-time processing. To demonstrate that the proposed system estimates multi-person 3D poses in real-time, we implement prototypes and show that the system can be applied to live streaming of 3D animation on platforms such as PC and Web.
KW - 3D pose estimation
KW - Body tracking
KW - Multiview triangulation
UR - https://www.scopus.com/pages/publications/85152019432
U2 - 10.1145/3581754.3584144
DO - 10.1145/3581754.3584144
M3 - Conference contribution
AN - SCOPUS:85152019432
T3 - International Conference on Intelligent User Interfaces, Proceedings IUI
SP - 105
EP - 107
BT - IUI 2023 - Companion Proceedings of the 28th International Conference on Intelligent User Interfaces
PB - Association for Computing Machinery
T2 - 28th International Conference on Intelligent User Interfaces, IUI 2023
Y2 - 27 March 2023 through 31 March 2023
ER -