
LONG-TERM OBJECT TRACKING WITH A PARTICLE FILTER USING STRUCTURAL FEATURES

Year 2017, Volume: 5, Issue: 1, pp. 107–118, 20.03.2017

Abstract

Although long-term object tracking is a long-standing research problem, it still attracts active interest and remains one of the most widely studied topics in the field. In this work, object tracking is performed with a particle filter, a predictive method that models the dynamics relevant to tracking by means of state-space variables. A new measurement model is proposed for computing the particle weights: it combines the SSIM similarity index, which exploits the structural features of the object, with adaptive histogram equalization and a weighting scheme that emphasizes the object's central region. Experimental results show that the proposed tracking method improves classical tracking performance by at least 18.59%.
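The core idea of an SSIM-based measurement model can be sketched as follows: a single-window SSIM score between the tracked template and each particle's candidate patch is mapped to a normalized particle weight. This is a minimal illustration, not the authors' exact model — it omits the adaptive histogram equalization and center-region weighting described in the abstract, and the names `ssim_global` and `particle_weights` are hypothetical.

```python
import numpy as np

def ssim_global(a, b, L=255.0):
    # Single-window SSIM between two equally sized grayscale patches
    # (Wang et al. 2004 formula evaluated over the whole patch).
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a ** 2 + mu_b ** 2 + c1) * (va + vb + c2)
    )

def particle_weights(template, patches):
    # Score each candidate patch against the template, then map the
    # SSIM scores to positive, normalized particle weights.
    scores = np.array([ssim_global(template, p) for p in patches])
    w = np.exp(scores)  # monotone map keeps ordering, guarantees w > 0
    return w / w.sum()
```

Because the exponential map is monotone, a patch that is structurally closer to the template always receives a larger weight; an identical patch scores SSIM = 1 and dominates the resampling step.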

References

  • M. Islam, C. Oh, and C. Lee, “Video Based Moving Object Tracking by Particle Filter,” Int. J. Signal Process. Image Process. Pattern, vol. 2, pp. 119–132, 2009.
  • M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking,” IEEE Trans. Signal Process., vol. 50, no. 2, pp. 174–188, 2002.
  • G. M. Rao and C. Satyanarayana, “Visual Object Target Tracking Using Particle Filter: A Survey,” Int. J. Image, Graph. Signal Process., vol. 5, no. 6, pp. 57–71, 2013.
  • L. Mihaylova, P. Brasnett, N. Canagarajah, and D. Bull, “Object Tracking by Particle Filtering Techniques in Video Sequences,” 2007.
  • M. Isard and A. Blake, “Condensation - conditional density propagation for visual tracking,” Int. J. Comput. Vis., vol. 29, no. 1, pp. 5–28, 1998.
  • C. Hue, J. Vermaak, and M. Gangnet, “Color-Based Probabilistic Tracking,” pp. 661–675, 2002.
  • X. Jia and H. Lu, “Visual Tracking via Adaptive Structural Local Sparse Appearance Model,” in IEEE Conf. Comput. Vis. Pattern Recognit., pp. 1822–1829, 2012.
  • C. Chen, W. Tarng, and K. Lo, “An Improved Particle Filter Tracking System Based on Colour and Moving Edge Information,” Int. J. Comput. Sci. Inf. Technol., vol. 6, no. 4, pp. 97–117, 2014.
  • K. Ng and E. Delp, “New models for real-time tracking using particle filtering,” IS&T/SPIE Electron. Imaging, paper 72570B, 2009.
  • J. Chun and G. Shin, “Realtime Facial Expression Recognition from Video Sequences Using Optical Flow and Expression HMM,” in Journal of Korean Society for Internet Information, pp. 55–70, 2014.
  • M. Lucena and J. M. Fuertes, “Optical flow-based observation models for particle filter tracking,” pp. 135–143, 2015.
  • S. Belgacem, A. Ben-hamadou, and T. Paquet, “Hand Tracking Using Optical-Flow Embedded Particle Filter in Sign Language Scenes,” pp. 1–8, 2012.
  • H. Dilmen and M. F. Talu, “Tek Boyutlu Durum Uzay Değişkenlerinin Parçacık Filtresi Yöntemi ile Takibi (Tracking One-Dimensional State-Space Variables with the Particle Filter Method),” in Proc. 23rd Signal Processing and Communications Applications Conference (SIU), Malatya, 2015, pp. 1513–1516.
  • S. J. Julier and J. K. Uhlmann, “A new extension of the Kalman filter to nonlinear systems,” Int. Symp. Aerospace/Defense Sensing, Simulation and Controls, vol. 3, pp. 182–193, 1997.
  • B. D. Lucas and T. Kanade, “An iterative image registration technique with an application to stereo vision,” in Proc. 7th Int. Joint Conf. Artif. Intell., 1981, pp. 121–130.
  • X. Mei and H. Ling, “Robust Visual Tracking using L1 Minimization,” in IEEE International Conference on Computer Vision (ICCV), pp. 1436–1443, 2009.
  • D. A. Ross, J. Lim, R.-S. Lin, and M.-H. Yang, “Incremental Learning for Robust Visual Tracking,” Int. J. Comput. Vis., vol. 77, no. 1–3, pp. 125–141, 2007.
  • Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, “Image quality assessment: From error visibility to structural similarity,” IEEE Trans. Image Process., vol. 13, no. 4, pp. 600–612, 2004.
  • A. Łoza, L. Mihaylova, D. Bull, and N. Canagarajah, “Structural similarity-based object tracking in multimodality surveillance videos,” Mach. Vis. Appl., vol. 20, no. 2, pp. 71–83, 2009.
  • G. Yadav, S. Maheshwari, and A. Agarwal, “Contrast limited adaptive histogram equalization based enhancement for real time video system,” Proc. 2014 Int. Conf. Adv. Comput. Commun. Informatics, ICACCI 2014, pp. 2392–2397, 2014.
  • Y. Wu, J. Lim, and M.-H. Yang, “Online Object Tracking: A Benchmark,” 2013 IEEE Conf. Comput. Vis. Pattern Recognit., pp. 2411–2418, 2013.
There are 21 references in total.

Details

Section: Articles
Authors

Haluk Dilmen

Muhammet Fatih Talu

Publication Date: 20 March 2017
Submission Date: 19 April 2016
Published Issue: Year 2017, Volume: 5, Issue: 1

Cite

APA Dilmen, H., & Talu, M. F. (2017). YAPISAL ÖZELLİKLERİ KULLANAN PARÇACIK FİLTRESİ İLE UZUN SÜRELİ NESNE TAKİBİ. Gazi University Journal of Science Part C: Design and Technology, 5(1), 107-118.



    e-ISSN: 2147-9526