Research output: Contribution to journal › Article › peer-review
Out of sight: a toolkit for tracking occluded human joint positions. / Wu, Chi-Jui; Quigley, Aaron John; Harris-Birtill, David Cameron Christopher.
In: Personal and Ubiquitous Computing, Vol. 21, No. 1, 02.2017, p. 125-135.
TY - JOUR
T1 - Out of sight
T2 - a toolkit for tracking occluded human joint positions
AU - Wu, Chi-Jui
AU - Quigley, Aaron John
AU - Harris-Birtill, David Cameron Christopher
PY - 2017/2
Y1 - 2017/2
N2 - Real-time identification and tracking of the joint positions of people can be achieved with off-the-shelf sensing technologies such as the Microsoft Kinect, or other camera-based systems with computer vision. However, tracking is constrained by the sensor's field of view: when a person is occluded from the camera view, their position can no longer be followed. Out of Sight addresses the occlusion problem in depth-sensing tracking systems. Our new tracking infrastructure provides human skeleton joint positions during occlusion by combining the fields of view of multiple Kinects using geometric calibration and affine transformation. We verified the technique's accuracy through a system evaluation with 20 participants, both stationary and in motion, with two Kinects positioned parallel, 45°, and 90° apart. Results show that our skeleton matching is accurate to within 16.1 cm (s.d. = 5.8 cm), which is within a person's personal space. In a realistic scenario study, where groups of two people quickly occlude each other, occlusion is resolved for 85% of the participants. A RESTful API was developed to allow distributed access to occlusion-free skeleton joint positions. As a further contribution, we provide the system as open source.
AB - Real-time identification and tracking of the joint positions of people can be achieved with off-the-shelf sensing technologies such as the Microsoft Kinect, or other camera-based systems with computer vision. However, tracking is constrained by the sensor's field of view: when a person is occluded from the camera view, their position can no longer be followed. Out of Sight addresses the occlusion problem in depth-sensing tracking systems. Our new tracking infrastructure provides human skeleton joint positions during occlusion by combining the fields of view of multiple Kinects using geometric calibration and affine transformation. We verified the technique's accuracy through a system evaluation with 20 participants, both stationary and in motion, with two Kinects positioned parallel, 45°, and 90° apart. Results show that our skeleton matching is accurate to within 16.1 cm (s.d. = 5.8 cm), which is within a person's personal space. In a realistic scenario study, where groups of two people quickly occlude each other, occlusion is resolved for 85% of the participants. A RESTful API was developed to allow distributed access to occlusion-free skeleton joint positions. As a further contribution, we provide the system as open source.
KW - Kinect
KW - Occlusion
KW - Toolkit
U2 - 10.1007/s00779-016-0997-6
DO - 10.1007/s00779-016-0997-6
M3 - Article
VL - 21
SP - 125
EP - 135
JO - Personal and Ubiquitous Computing
JF - Personal and Ubiquitous Computing
SN - 1617-4909
IS - 1
ER -
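The abstract describes merging the fields of view of multiple Kinects via geometric calibration and an affine transformation, then substituting joints from a second sensor when the first loses sight of a person. The sketch below is a minimal illustration of that idea, not the toolkit's actual code or API; all function names are hypothetical, and it assumes calibration correspondences are 3-D joint positions observed simultaneously by both sensors.

```python
# Hypothetical sketch of affine-transform skeleton merging between two Kinects.
import numpy as np

def fit_affine(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Least-squares affine transform mapping src -> dst.

    src, dst: (N, 3) corresponding 3-D joint positions captured by the
    secondary and primary Kinect during calibration.
    Returns a 4x4 homogeneous transform.
    """
    n = src.shape[0]
    # Homogeneous coordinates: [x, y, z, 1] per calibration point.
    src_h = np.hstack([src, np.ones((n, 1))])
    # Solve src_h @ A ~= dst for the (4, 3) affine matrix A.
    A, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
    T = np.eye(4)
    T[:3, :] = A.T
    return T

def transform_points(T: np.ndarray, pts: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to (N, 3) points."""
    pts_h = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return (pts_h @ T.T)[:, :3]

def merge_skeleton(primary: np.ndarray, secondary: np.ndarray,
                   T: np.ndarray, tracked: np.ndarray) -> np.ndarray:
    """Fill occluded joints of the primary sensor's skeleton with joints
    from the secondary sensor, mapped into the primary frame.

    primary, secondary: (J, 3) joint positions; tracked: (J,) bool mask
    of joints the primary Kinect still tracks with confidence.
    """
    mapped = transform_points(T, secondary)
    return np.where(tracked[:, None], primary, mapped)

# Example: calibrate from jointly visible points, then resolve an occlusion.
src = np.random.rand(20, 3)                    # joints seen by Kinect B
T_true = np.eye(4); T_true[:3, 3] = [1.0, 0.0, 0.5]
dst = transform_points(T_true, src)            # same joints seen by Kinect A
T = fit_affine(src, dst)
skeleton = merge_skeleton(dst, src, T, tracked=np.zeros(20, dtype=bool))
```

How the calibration correspondences are collected, and how the resulting occlusion-free joints are served over the RESTful API mentioned in the abstract, are details of the released toolkit and are not reproduced here.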
ID: 248040132