In this research, we focus on how to track a target region that lies next to similar regions (e.g. a forearm next to an upper arm) in zoom-in images. Many previous tracking methods express the target region (i.e. a part of a human body) with a single model such as an ellipse, a rectangle, or a deformable closed region. With a single model, however, it is difficult to track the target region in zoom-in images without confusing it with its similar neighbors (e.g. a forearm and an upper arm, or a small region in a torso and its surroundings), because the regions may share the same texture patterns and have no detectable border between them. In our method, a group of feature points extracted in the target region is tracked as the model of the target. Small differences between the neighboring regions can be verified by focusing only on these feature points. In addition, (1) the stability of tracking is improved using particle filtering, and (2) robustness to occlusions is achieved by removing unreliable points through random sampling. Experimental results demonstrate the effectiveness of our method even when occlusions occur.
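To give a rough flavor of the random-sampling idea mentioned in the abstract (this is an illustrative sketch, not the paper's actual algorithm), the snippet below estimates the 2-D translation of a tracked point set with a RANSAC-style loop: motion hypotheses are fit from randomly sampled correspondences, and points that disagree with the consensus motion (e.g. occluded or mistracked ones) are discarded as unreliable. All variable names, the synthetic data, and the pure-translation motion model are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic target: a set of feature points on the tracked part (assumed data).
points = rng.uniform(0, 100, size=(30, 2))
true_shift = np.array([4.0, -2.0])
observed = points + true_shift

# Simulate occlusion: some points become unreliable (dragged onto another region).
occluded = rng.choice(30, size=8, replace=False)
observed[occluded] += rng.uniform(-30, 30, size=(8, 2))

def robust_shift(src, dst, iters=200, tol=1.5, rng=None):
    """Estimate a 2-D translation by random sampling (RANSAC-style):
    repeatedly fit from a minimal sample and keep the hypothesis with
    the largest inlier set, discarding unreliable correspondences."""
    if rng is None:
        rng = np.random.default_rng()
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        i = rng.integers(len(src))              # minimal sample: one pair
        shift = dst[i] - src[i]                 # translation hypothesis
        err = np.linalg.norm(dst - (src + shift), axis=1)
        inliers = err < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on all surviving (reliable) points for the final estimate.
    return (dst[best_inliers] - src[best_inliers]).mean(axis=0), best_inliers

shift, inliers = robust_shift(points, observed, rng=rng)
print(shift)          # roughly [4, -2]: occluded points no longer bias the fit
print(inliers.sum())  # most of the 22 unoccluded points are kept
```

A least-squares fit over all 30 points would be pulled toward the occluded outliers; sampling minimal hypotheses and scoring them by inlier count is what lets the unreliable points be identified and removed.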
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Norimichi UKITA, Akira MAKINO, Masatsugu KIDODE, "Real-Time Uncharacteristic-Part Tracking with a Point Set" in IEICE TRANSACTIONS on Information,
vol. E93-D, no. 7, pp. 1682-1689, July 2010, doi: 10.1587/transinf.E93.D.1682.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E93.D.1682/_p
@ARTICLE{e93-d_7_1682,
author={Norimichi UKITA and Akira MAKINO and Masatsugu KIDODE},
journal={IEICE TRANSACTIONS on Information},
title={Real-Time Uncharacteristic-Part Tracking with a Point Set},
year={2010},
volume={E93-D},
number={7},
pages={1682-1689},
abstract={In this research, we focus on how to track a target region that lies next to similar regions (e.g. a forearm and an upper arm) in zoom-in images. Many previous tracking methods express the target region (i.e. a part in a human body) with a single model such as an ellipse, a rectangle, and a deformable closed region. With the single model, however, it is difficult to track the target region in zoom-in images without confusing it and its neighboring similar regions (e.g. "a forearm and an upper arm" and "a small region in a torso and its neighboring regions") because they might have the same texture patterns and do not have the detectable border between them. In our method, a group of feature points in a target region is extracted and tracked as the model of the target. Small differences between the neighboring regions can be verified by focusing only on the feature points. In addition, (1) the stability of tracking is improved using particle filtering and (2) tracking robust to occlusions is realized by removing unreliable points using random sampling. Experimental results demonstrate the effectiveness of our method even when occlusions occur.},
doi={10.1587/transinf.E93.D.1682},
ISSN={1745-1361},
month={July},
}
TY - JOUR
TI - Real-Time Uncharacteristic-Part Tracking with a Point Set
T2 - IEICE TRANSACTIONS on Information
SP - 1682
EP - 1689
AU - Norimichi UKITA
AU - Akira MAKINO
AU - Masatsugu KIDODE
PY - 2010
DO - 10.1587/transinf.E93.D.1682
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E93-D
IS - 7
JA - IEICE TRANSACTIONS on Information
Y1 - July 2010
AB - In this research, we focus on how to track a target region that lies next to similar regions (e.g. a forearm and an upper arm) in zoom-in images. Many previous tracking methods express the target region (i.e. a part in a human body) with a single model such as an ellipse, a rectangle, and a deformable closed region. With the single model, however, it is difficult to track the target region in zoom-in images without confusing it and its neighboring similar regions (e.g. "a forearm and an upper arm" and "a small region in a torso and its neighboring regions") because they might have the same texture patterns and do not have the detectable border between them. In our method, a group of feature points in a target region is extracted and tracked as the model of the target. Small differences between the neighboring regions can be verified by focusing only on the feature points. In addition, (1) the stability of tracking is improved using particle filtering and (2) tracking robust to occlusions is realized by removing unreliable points using random sampling. Experimental results demonstrate the effectiveness of our method even when occlusions occur.
ER -