TY - JOUR
T1 - Robust Likelihood Model for Illumination Invariance in Particle Filtering
AU - Al Delail, Buti
AU - Bhaskar, Harish
AU - Zemerly, Mohamed Jamal
AU - Al-Mualla, Mohammed
N1 - Publisher Copyright:
© 1991-2012 IEEE.
PY - 2018/10
Y1 - 2018/10
N2 - Tracking visual targets in an unconstrained environment is challenging due to variations in, for example, illumination, scale, occlusion, and motion blur. Many video applications that utilize particle-filter-based visual target trackers require tracking of visual targets under varying illumination. Similarity measures and likelihood estimation strongly influence the performance of particle filters. In this paper, we propose a novel likelihood estimator that has been combined with other state-of-the-art particle-filter-based tracking techniques to accommodate varying illumination by predicting changes in illumination intensity and direction. Moreover, an enhanced update strategy for the template dictionary is used along with a sparse representation model to solve the problem of drift due to appearance changes during tracking. The proposed algorithm has been evaluated using various particle-filter-based tracking algorithms on scenes from public data sets and on our gesture data set, which includes variations in illumination. Using the proposed model, the algorithms perform up to 20% better on sequences in which variations in illumination are dominant. We carried out systematic experiments to evaluate the robustness of the proposed algorithm on video sequences with illumination variations, as well as other variations. Furthermore, in sequences that include variations in illumination, our likelihood model usually performs better than the default tracker likelihood model.
KW - gesture recognition
KW - illumination variation
KW - particle filter
KW - sparse representation
KW - target tracking
UR - https://www.scopus.com/pages/publications/85023624863
U2 - 10.1109/TCSVT.2017.2725322
DO - 10.1109/TCSVT.2017.2725322
M3 - Article
AN - SCOPUS:85023624863
SN - 1051-8215
VL - 28
SP - 2836
EP - 2848
JO - IEEE Transactions on Circuits and Systems for Video Technology
JF - IEEE Transactions on Circuits and Systems for Video Technology
IS - 10
M1 - 7973188
ER -