Fast and Robust Algorithm of Tracking
Multiple Moving Objects for Intelligent
Video Surveillance Systems
Jong Sun Kim, Dong Hae Yeom, and Young Hoon Joo
IEEE Transactions on Consumer Electronics, Vol. 57, No. 3, August 2011
Chairman: Dr. Hung-Chi Yang
Presenter: Fong-Ren Sie
Advisor: Dr. Yen-Ting Chen
Date: 2013.10.16
1
Outline
• Introduction
• Methodology
• Results
• Conclusions
• References

2
Introduction
• The traditional video surveillance system
◦ Closed-circuit televisions (CCTV)
◦ Digital video recorders (DVR)
• Disadvantages
◦ Someone must monitor the video and search it manually
• Real-time intelligent video surveillance systems
◦ High cost and low efficiency
3
Introduction
• The intelligent video surveillance system is a convergence technology
◦ Detecting and tracking objects
◦ Analyzing their movements
◦ Responding
4
Introduction
• Tracking multiple moving objects for intelligent video surveillance systems
◦ One of the basic technologies of intelligent video surveillance systems
◦ Detects and tracks specific moving objects
◦ Eliminates environmental disturbances
5
Introduction
• Eliminating environmental disturbances
◦ Bayesian methods such as the particle filter (PF) or the extended Kalman filter (EKF)
◦ Background modeling (BM) or the Gaussian mixture model (GMM)
6
Introduction
• RGB BM with a new sensitivity parameter to extract moving regions
• Morphology schemes to eliminate noise and labeling to group the moving objects
7
Methodology
• Detecting moving objects
◦ Extraction of moving objects
- The gray-scale BM involves a loss of image information compared with the color BM using the RGB and HSI color space models
◦ Gray-scale BM
- Image information is excessively attenuated
◦ RGB color model
- Very sensitive to even small changes caused by light scattering or reflection
8
Methodology
• Gray-scale BM
9
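The formula originally shown on this slide is not captured in the transcript. As a rough illustration only (not the paper's exact formulation), gray-scale BM can be sketched as a running-average background with a fixed decision threshold; the update rate alpha and threshold T below are assumed values.

# Minimal sketch of gray-scale background modeling (illustrative only);
# alpha (background update rate) and T (threshold) are assumed values.
import numpy as np

def grayscale_bm(frames, alpha=0.05, T=30):
    """Yield a binary foreground mask for each gray-scale frame."""
    background = frames[0].astype(np.float32)
    for frame in frames[1:]:
        frame = frame.astype(np.float32)
        # A pixel is foreground if it deviates from the background by more than T.
        mask = np.abs(frame - background) > T
        # Running-average update keeps the background model current.
        background = alpha * frame + (1.0 - alpha) * background
        yield mask

# Example with two synthetic 4x4 gray-scale frames.
frames = [np.zeros((4, 4), dtype=np.uint8), np.full((4, 4), 80, dtype=np.uint8)]
for mask in grayscale_bm(frames):
    print(mask.astype(int))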
Methodology
• RGB color model
◦ Prevents excessive attenuation
◦ Shorter execution time
10
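The slide's RGB BM equations and the new sensitivity parameter are not reproduced in the transcript. The sketch below only conveys the general idea of per-channel background differencing whose decision threshold is scaled by a sensitivity parameter s; the function name, threshold, and default values are assumptions, not the paper's formulation.

# Illustrative sketch of RGB background differencing with a sensitivity
# parameter s; not the paper's exact method.
import numpy as np

def rgb_bm(frame, background, s=1.0, T=60):
    """Return a binary mask of moving pixels for an RGB frame.

    Smaller s lowers the decision threshold, i.e. more sensitive detection.
    """
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    # Sum the per-channel (R, G, B) differences and compare with s * T.
    return diff.sum(axis=2) > s * T

# Example with a synthetic 2x2 RGB frame and an all-black background.
bg = np.zeros((2, 2, 3), dtype=np.uint8)
fr = np.array([[[200, 10, 10], [0, 0, 0]],
               [[5, 5, 5], [90, 90, 90]]], dtype=np.uint8)
print(rgb_bm(fr, bg, s=1.0).astype(int))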
Methodology
• Binary image
11
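Slide 7 mentions morphology schemes for eliminating noise from this binary image. As a rough illustration (the paper's specific operators and structuring element are not given in the transcript), a morphological opening with a 3x3 cross removes isolated noise pixels while keeping larger moving regions.

# Rough illustration of morphological noise removal on the binary mask;
# the operator and structuring element are assumptions.
import numpy as np
from scipy import ndimage

mask = np.zeros((10, 10), dtype=bool)
mask[2:8, 2:8] = True        # a genuine moving region
mask[0, 9] = True            # an isolated noise pixel

selem = ndimage.generate_binary_structure(2, 1)   # 3x3 cross (4-connectivity)
cleaned = ndimage.binary_opening(mask, structure=selem)
print(cleaned.astype(int))   # the 1-pixel noise blob is removed, the large region survives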
Methodology
• Group tracking
◦ Prevents the problems of individual tracking
◦ A grouping scheme is required to classify moving objects into several groups
◦ 4-directional blob labeling is employed to group the moving objects
12
Methodology
• 4-directional blob labeling
13
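A minimal sketch of 4-directional (4-connected) blob labeling using a breadth-first flood fill; this is a generic illustration, not the paper's exact implementation.

# Minimal 4-connected blob labeling via breadth-first flood fill
# (generic illustration, not the paper's implementation).
from collections import deque
import numpy as np

def label_blobs_4(mask):
    """Assign a label (1, 2, ...) to each 4-connected group of foreground pixels."""
    labels = np.zeros(mask.shape, dtype=np.int32)
    rows, cols = mask.shape
    current = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and labels[r, c] == 0:
                current += 1
                labels[r, c] = current
                queue = deque([(r, c)])
                while queue:
                    y, x = queue.popleft()
                    # Visit only the 4 direct neighbours (up, down, left, right).
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            queue.append((ny, nx))
    return labels, current

# Example: two separate 4-connected groups.
mask = np.array([[1, 1, 0, 0],
                 [0, 1, 0, 1],
                 [0, 0, 0, 1]], dtype=bool)
labels, n = label_blobs_4(mask)
print(n)        # 2
print(labels)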
Methodology
• Tracking moving objects
◦ Predicting the position of each group
◦ Recognizing the homogeneity of each group in the sequential frames
◦ Identifying the newly appearing and disappearing groups
14
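The paper's prediction model and homogeneity test are not detailed in the transcript. The sketch below only illustrates the general tracking loop implied by this slide: predict each group's centroid with a constant-velocity assumption, match predictions to the blobs detected in the next frame by nearest centroid, and create or drop tracks for groups that appear or disappear. The Track structure and the max_dist threshold are assumptions.

# Generic group-tracking sketch: constant-velocity prediction plus
# nearest-centroid matching. The data structure and threshold are assumptions.
import math
from dataclasses import dataclass, field
from itertools import count

_ids = count(1)

@dataclass
class Track:
    cx: float
    cy: float
    vx: float = 0.0
    vy: float = 0.0
    id: int = field(default_factory=lambda: next(_ids))

def step(tracks, detections, max_dist=30.0):
    """Match detected group centroids to existing tracks; spawn or drop tracks."""
    unmatched = list(detections)
    survivors = []
    for t in tracks:
        # Predict the group's next centroid (constant velocity).
        px, py = t.cx + t.vx, t.cy + t.vy
        if unmatched:
            best = min(unmatched, key=lambda d: math.hypot(d[0] - px, d[1] - py))
            if math.hypot(best[0] - px, best[1] - py) <= max_dist:
                t.vx, t.vy = best[0] - t.cx, best[1] - t.cy
                t.cx, t.cy = best
                unmatched.remove(best)
                survivors.append(t)
                continue
        # No nearby detection: the group is treated as disappeared.
    # Any detection left unmatched is a newly appearing group.
    survivors.extend(Track(cx=x, cy=y) for x, y in unmatched)
    return survivors

tracks = step([], [(10.0, 10.0)])        # one new group appears
tracks = step(tracks, [(14.0, 12.0)])    # the same group moves
print([(t.id, t.cx, t.cy) for t in tracks])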
Methodology
15
Results
• Tracking result frames (panel (d): the 169th frame)
16
Results
• The error of the predicted position of each group
17
Results
• The processing time of the proposed method
18
Conclusions
• Detecting and tracking multiple moving objects
◦ Can be applied to consumer electronics
◦ Main strengths: robustness and speed
◦ Robustness against environmental influences
◦ High-speed image processing
◦ The method is intended for a fixed camera
19
References
[1] C. Chang, R. Ansari, and A. Khokhar, “Multiple Object Tracking with Kernel Particle Filter,” Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Vol. 1, pp. 566-573, May 2005.
[2] F. Chang, C. J. Chen, and C. J. Lu, “A Linear-time Component Labeling Algorithm Using Contour Tracing Technique,” Computer Vision and Image Understanding, Vol. 93, No. 2, pp. 206-220, 2004.
[3] A. Hampapur, L. Brown, J. Connell, A. Ekin, N. Haas, M. Lu, H. Merkl, S. Pankanti, A. Senior, C. Shu, and Y. L. Tian, “Smart Video Surveillance,” IEEE Signal Processing Magazine, Vol. 22, No. 2, pp. 38-51, Mar. 2005.
[4] R. M. Haralick, S. R. Sternberg, and X. Zhuang, “Image Analysis Using Mathematical Morphology,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. PAMI-9, No. 4, pp. 532-550, 1987.
[5] I. Haritaoglu, D. Harwood, and L. S. Davis, “W4: Real-time Surveillance of People and Their Activities,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 8, pp. 809-830, Aug. 2000.
20
References
[6] M. Haseyama and Y. Kaga, “Two-phased Region Integration Approach for Effective Pedestrian Detection in Low Contrast Images,” IEEE International Conference on Consumer Electronics, pp. 1-2, Jan. 2008.
[7] O. Javed and M. Shah, “Tracking and Object Classification for Automated Surveillance,” 7th European Conference on Computer Vision, Lecture Notes in Computer Science 2353, pp. 343-357, 2002.
[8] S. Kang, J. Paik, A. Koschan, B. Abidi, and A. Abidi, “Real-time Video Tracking Using PTZ Cameras,” Proceedings of SPIE 6th International Conference on Quality Control by Artificial Vision, Vol. 5132, pp. 103-111, 2003.
[9] W. Lao, J. Han, and P. H. N. de With, “Automatic Video-based Human Motion Analyzer for Consumer Surveillance System,” IEEE Transactions on Consumer Electronics, Vol. 55, No. 2, pp. 591-598, May 2009.
[10] D. Makris and T. Ellis, “Automatic Learning of an Activity-based Semantic Scene Model,” Proceedings of IEEE Conference on Advanced Video and Signal Based Surveillance, pp. 183-188, Jul. 2003.
21
References
[11] M. H. Sedky, M. Moniri, and C. C. Chibelushi, “Classification of Smart Video Surveillance Systems for Commercial Applications,” IEEE Conference on Advanced Video and Signal Based Surveillance, pp. 638-643, Sep. 2005.
[12] C. Stauffer and W. Grimson, “Learning Patterns of Activity Using Real Time Tracking,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 8, pp. 747-767, Aug. 2000.
[13] M. Valera and S. A. Velastin, “A Review of the State-of-art in Distributed Surveillance Systems,” IEE Intelligent Distributed Video Surveillance Systems, pp. 1-30, 2006.
[14] Y. Zhai, M. B. Yeary, S. Cheng, and N. Kehtarnavaz, “An Object-Tracking Algorithm Based on Multiple-Model Particle Filtering with State Partitioning,” IEEE Transactions on Instrumentation and Measurement, Vol. 58, No. 5, pp. 1797-1809, May 2009.
[15] R. Zhang, S. Zhang, and S. Yu, “Moving Objects Detection Method Based on Brightness Distortion and Chromaticity Distortion,” IEEE Transactions on Consumer Electronics, Vol. 53, No. 3, pp. 1177-1185, Aug. 2007.
22
Thank you for your attention
23