US007868912B2
(12) United States Patent
Venetianer et al.

(10) Patent No.: US 7,868,912 B2
(45) Date of Patent: Jan. 11, 2011
(54) VIDEO SURVEILLANCE SYSTEM EMPLOYING VIDEO PRIMITIVES
(75) Inventors: Peter L. Venetianer, McLean, VA (US); Alan J. Lipton, Herndon, VA (US); Andrew J. Chosak, Arlington, VA (US); Matthew F. Frazier, Arlington, VA (US); Niels Haering, Reston, VA (US); Gary W. Myers, Ashburn, VA (US); Weihong Yin, Herndon, VA (US); Zhong Zhang, Herndon, VA (US)

(73) Assignee: ObjectVideo, Inc., Reston, VA (US)
(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 1612 days.
(21) Appl. No.: 11/098,385

(22) Filed: Apr. 5, 2005

(65) Prior Publication Data
US 2005/0169367 A1    Aug. 4, 2005
Related U.S. Application Data

(63) Continuation-in-part of application No. 11/057,154, filed on Feb. 15, 2005, which is a continuation-in-part of application No. 09/987,707, filed on Nov. 15, 2001, now abandoned, which is a continuation-in-part of application No. 09/694,712, filed on Oct. 24, 2000, now Pat. No. 6,954,498.
(51) Int. Cl.
H04N 7/18    (2006.01)
(58) Field of Classification Search ......... 348/143, 348/148, 150, 149, 166, 169, 170; 382/103, 382/115; 375/240.02, 240.08; H04N 7/18
See application file for complete search history.
(56)    References Cited

U.S. PATENT DOCUMENTS
3,812,287 A    5/1974  Lemelson
4,249,207 A    2/1981  Harman et al.
4,257,063 A    3/1981  Loughry et al.
4,737,847 A    4/1988  Araki et al.
4,908,704 A    3/1990  Fujioka et al.
5,448,315 A    9/1995  Soohoo
5,491,511 A    2/1996  Odle
5,515,453 A    5/1996  Hennessey et al.
5,610,653 A    3/1997  Abecassis
5,623,249 A    4/1997  Camire
5,696,503 A   12/1997  Nasburg
(Continued)
FOREIGN PATENT DOCUMENTS

EP  0293189 B1    7/1994

(Continued)

OTHER PUBLICATIONS

International Search Report for International Application No. PCT/US08/09073, dated Nov. 3, 2008.

(Continued)

Primary Examiner: Tung Vo
(74) Attorney, Agent, or Firm: Muir Patent Consulting, PLLC
(57)    ABSTRACT

A video surveillance system extracts video primitives and extracts event occurrences from the video primitives using event discriminators. The system can undertake a response, such as an alarm, based on extracted event occurrences.
(52) U.S. Cl. ........................................ 348/143

22 Claims, 19 Drawing Sheets
[Front-page drawing, FIG. 1: a computer system (11) containing a computer-readable medium, connected to I/O devices, video sensors, video recorders, and other sensors (reference numerals 11, 15, 16, 17).]
`Canon Ex. 1034 Page 1 of 39
U.S. PATENT DOCUMENTS

5,801,943 A     9/1998  Nasburg
5,802,361 A     9/1998  Wang et al.
5,850,352 A *  12/1998  Moezzi et al. .... 345/419
5,860,086 A     1/1999  Crump et al.
5,872,865 A     2/1999  Normile et al.
5,886,701 A     3/1999  Chauvin et al.
5,912,980 A *   6/1999  Hunke .... 382/103
5,926,210 A     7/1999  Hackett et al.
5,956,081 A     9/1999  Katz et al.
5,959,690 A     9/1999  Toebes VIII et al.
5,963,202 A    10/1999  Polish
5,963,203 A    10/1999  Goldberg et al.
5,983,147 A    11/1999  Krumm
5,987,211 A    11/1999  Abecassis
5,999,189 A    12/1999  Kajiya et al.
6,014,461 A     1/2000  Hennessey et al.
6,025,877 A *   2/2000  Chang et al. .... 375/240.01
6,031,573 A     2/2000  MacCormack et al.
6,069,653 A     5/2000  Hudson et al.
6,075,560 A     6/2000  Katz
6,088,484 A     7/2000  Mead
6,091,771 A     7/2000  Seeley et al.
6,097,429 A *   8/2000  Seeley et al. .... 348/154
6,123,123 A     9/2000  Carder et al.
6,144,375 A    11/2000  Jain et al.
6,151,413 A    11/2000  Jang
6,166,744 A    12/2000  Jaszlics et al.
6,177,886 B1    1/2001  Billington et al.
6,201,473 B1    3/2001  Schaffer
6,211,907 B1    4/2001  Scaman et al.
6,226,388 B1    5/2001  Qian et al.
6,297,844 B1   10/2001  Schatz et al.
6,307,885 B1   10/2001  Moon et al.
6,310,916 B1   10/2001  Han
6,326,964 B1   12/2001  Snyder et al.
6,351,265 B1    2/2002  Bulman
6,351,492 B1    2/2002  Kim
6,360,234 B2    3/2002  Jain et al. .... 715/201
6,404,455 B1    6/2002  Ito et al.
6,411,724 B1    6/2002  Vaithilingam et al.
6,424,370 B1    7/2002  Courtney
6,504,479 B1    1/2003  Lemons et al.
6,525,658 B2    2/2003  Streetman et al.
6,542,840 B2    4/2003  Okamoto et al.
6,552,826 B2    4/2003  Adler et al.
6,570,608 B1    5/2003  Tserng
6,573,907 B1    6/2003  Madrane et al.
6,597,800 B1    7/2003  Murray et al.
6,628,835 B1    9/2003  Brill et al.
6,646,676 B1   11/2003  DaGraca et al.
6,696,945 B1    2/2004  Venetianer et al.
6,707,852 B1    3/2004  Wang
6,721,454 B1 *  4/2004  Qian et al. .... 382/224
6,724,915 B1    4/2004  Toklu et al.
6,727,938 B1    4/2004  Randall
6,738,424 B1    5/2004  Allmen et al.
6,741,977 B1    5/2004  Nagaya
6,801,662 B1   10/2004  Owechko et al.
6,816,184 B1   11/2004  Brill et al.
6,829,371 B1   12/2004  Nichani et al.
6,844,818 B2    1/2005  Grech-Cini
6,865,580 B1    3/2005  Bush
6,924,801 B1    8/2005  Dorbie
6,954,498 B1   10/2005  Lipton
6,987,528 B1    1/2006  Nagahisa et al.
6,987,883 B2    1/2006  Lipton et al.
7,023,469 B1    4/2006  Olson
7,167,519 B2    1/2007  Comaniciu et al.
7,197,072 B1 *  3/2007  Hsu et al. .... 375/240.02
7,227,893 B1 *  6/2007  Srinivasa et al. .... 375/240.08
7,356,830 B1 *  4/2008  Dimitrova .... 725/51
7,436,887 B2   10/2008  Yeredor et al.
7,447,331 B2 * 11/2008  Brown et al. .... 382/103
7,660,439 B1    2/2010  Lu et al. .... 382/107
2001/0019357 A1     9/2001  Ito et al.
2001/0033330 A1    10/2001  Garoutte
2001/0035907 A1    11/2001  Broemmelsiek
2002/0008758 A1     1/2002  Broemmelsiek et al.
2002/0024446 A1     2/2002  Grech-Cini
2002/0051058 A1     5/2002  Ito et al.
2002/0082769 A1     6/2002  Church et al.
2002/0095490 A1     7/2002  Barker et al.
2002/0135483 A1     9/2002  Merheim et al.
2002/0163521 A1    11/2002  Ellenby et al.
2002/0191851 A1    12/2002  Keinan
2003/0043160 A1     3/2003  Elfving et al.
2003/0051255 A1     3/2003  Bulman et al.
2003/0053659 A1     3/2003  Pavlidis et al.
2003/0085992 A1     5/2003  Arpa et al.
2003/0231769 A1 *  12/2003  Bolle et al. .... 380/210
2004/0113933 A1     6/2004  Guler
2004/0161133 A1 *   8/2004  Elazar et al. .... 382/115
2004/0240542 A1    12/2004  Yeredor et al.
2005/0146605 A1     7/2005  Lipton et al.
2005/0157169 A1     7/2005  Brodsky et al.
2005/0162515 A1     7/2005  Venetianer et al.
2005/0168574 A1     8/2005  Lipton et al.
2006/0232673 A1    10/2006  Lipton et al.
2006/0279630 A1 *  12/2006  Aggarwal et al. .... 348/143
2007/0002141 A1     1/2007  Lipton et al.
2007/0013776 A1     1/2007  Venetianer et al.
2007/0052803 A1     3/2007  Chosak et al.
2007/0127774 A1     6/2007  Zhang et al.
2008/0100704 A1     5/2008  Venetianer et al.
FOREIGN PATENT DOCUMENTS

EP  0893823 A1      1/1999
EP  0893923 A1      1/1999
EP  0967584 A2     12/1999
EP  1024666 A2      8/2000
EP  1120746 A2      8/2001
EP  1333682 A1      8/2003
JP  09-247654 A     9/1997
JP  10-048008       2/1998
JP  10290449 A     10/1998
JP  2000-175174     6/2000
JP  2000-339923     8/2000
JP  2000-224542    11/2000
JP  2001-175868     6/2001
JP  2001-285681    10/2001
WO  WO 94/03014 A1      2/1994
WO  WO 01/62005         8/2001
WO  WO 03/044727 A1     5/2003
WO  WO 2004/006184 A2   1/2004
OTHER PUBLICATIONS

Written Opinion for International Patent Application No. PCT/US08/09073, dated Nov. 3, 2008.
International Search Report issued for PCT Application No. PCT/US06/25196, mailed on Jan. 16, 2008.
Written Opinion issued for PCT Application No. PCT/US06/25196, mailed on Jan. 16, 2008.
Shio et al., "Segmentation of People in Motion", IEEE 1991, pp. 325-332.
International Search Report issued in PCT Application No. PCT/US2006/012556, mailed on Feb. 12, 2008.
Written Opinion issued in PCT Application No. PCT/US2006/012556, mailed on Feb. 12, 2008.
Notification for IL App. No. 161777 issued Feb. 21, 2008 and English translation thereof.
CN Office Action for CN 02822772.7 on Oct. 14, 2005 in English.
International Search Report issued for PCT Application No. PCT/US06/45625, mailed on Sep. 24, 2007.
International Search Report issued for PCT Application No. PCT/US01/32614 on May 6, 2002.
International Search Report issued for PCT Application No. PCT/US02/22688 on Dec. 11, 2002.
Written Opinion of the International Searching Authority issued for PCT Application No. PCT/US06/45625, mailed on Sep. 24, 2007.
H. Fujiyoshi and A. J. Lipton, "Real-time Human Motion Analysis by Image Skeletonization," Proceedings of IEEE WACV '98, Princeton, NJ, 1998, pp. 15-21.
A. J. Lipton, H. Fujiyoshi and R. S. Patil, "Moving Target Classification and Tracking from Real-time Video," Proceedings of IEEE WACV '98, Princeton, NJ, 1998, pp. 8-14.
A. J. Lipton, "Local Application of Optic Flow to Analyse Rigid Versus Non-Rigid Motion," International Conference on Computer Vision, Corfu, Greece, Sep. 1999.
R. T. Collins, Y. Tsin, J. R. Miller, and A. J. Lipton, "Using a DEM to Determine Geospatial Object Trajectories," CMU-RI-TR-98-19, 1998.
A. Selinger and L. Wixson, "Classifying Moving Objects as Rigid or Non-Rigid Without Correspondences," Proceedings of DARPA Image Understanding Workshop, Nov. 1, 1998, pp. 341-347.
Jemez Technology Corp., Variant iD Web-Site, www.variantid.com, printed Aug. 25, 2003.
Alan J. Lipton, "Virtual Postman: An Illustrative Example of Virtual Video," International Journal of Robotics and Automation, vol. 15, No. 1, Jan. 2000, pp. 9-16.
Alan J. Lipton, "Virtual Postman: Real-Time, Interactive Virtual Video," IASTED Conference on Computer Graphics and Imaging (CGIM '99), Palm Springs, Oct. 25-27, 1999.
Robert T. Collins et al., "A System for Video Surveillance and Monitoring," Technical Report CMU-RI-TR-00-12, Robotics Institute, Carnegie Mellon University, May 2000.
L. Wixson et al., "Detecting Salient Motion by Accumulating Directionally-Consistent Flow," IEEE, 1999.
W.E.L. Grimson et al., "Using Adaptive Tracking to Classify and Monitor Activities in a Site," CVPR, pp. 22-29, Jun. 1998.
A.J. Lipton et al., "Moving Target Classification and Tracking from Real-time Video," IUW, pp. 129-136, 1998.
T.J. Olson et al., "Moving Object Detection and Event Recognition Algorithm for Smart Cameras," IUW, pp. 159-175, May 1997.
A. J. Lipton, "Local Application of Optical Flow to Analyse Rigid Versus Non-Rigid Motion," International Conference on Computer Vision Frame Rate Workshop, Corfu, Greece, Sep. 1999.
F. Bartolini et al., "Counting people getting in and out of a bus by real-time image-sequence processing," IVC, 12(1):36-41, Jan. 1994.
M. Rossi et al., "Tracking and counting moving people," ICIP94, pp. 212-216, 1994.
C.R. Wren et al., "Pfinder: Real-time tracking of the human body," Vismod, 1995.
L. Khoudour et al., "Real-Time Pedestrian Counting by Active Linear Cameras," JEI, 5(4):452-459, Oct. 1996.
S. Ioffe et al., "Probabilistic Methods for Finding People," IJCV, 43(1):45-68, Jun. 2001.
M. Isard et al., "BraMBLe: A Bayesian Multiple-Blob Tracker," ICCV, 2001.
D.M. Gavrila, "The Visual Analysis of Human Movement: A Survey," CVIU, 73(1):82-98, Jan. 1999.
N. Haering et al., "Visual Event Detection," Video Computing Series, Editor Mubarak Shah, 2001.
Collins et al., "A System for Video Surveillance and Monitoring: VSAM Final Report," Technical Report CMU-RI-TR-00-12, Robotics Institute, Carnegie Mellon University, May 2000.
J.P. Deparis et al., "A Device for Counting Passengers Making Use of Two Active Linear Cameras: Comparison of Algorithms," IEEE, pp. 1629-1634, 1996.
C.R. Wren et al., "Pfinder: Real-Time Tracking of the Human Body," PAMI, vol. 19, pp. 780-784, 1997.
M. Allmen et al., "Long Range Spatiotemporal Motion Understanding Using Spatiotemporal Flow Curves," Proc. IEEE CVPR, Lahaina, Maui, Hawaii, pp. 303-309, 1991.
L. Wixson, "Detecting Salient Motion by Accumulating Directionally Consistent Flow", IEEE Trans. Pattern Anal. Mach. Intell., vol. 22, pp. 774-781, Aug. 2000.
International Search Report and Written Opinion in PCT/US06/02700, Apr. 13, 2007.
JP Office Action issued in PCT/US02/22688, along with an English Translation, Oct. 9, 2007.
* cited by examiner
[Drawing Sheets 1 through 19 (FIGS. 1-25), U.S. Patent, Jan. 11, 2011, US 7,868,912 B2: the OCR of these drawing pages is not recoverable. The figures are described in the Brief Description of the Drawings in the specification.]
VIDEO SURVEILLANCE SYSTEM EMPLOYING VIDEO PRIMITIVES

CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation-in-part of U.S. patent application Ser. No. 11/057,154, filed on Feb. 15, 2005, which is a continuation-in-part of U.S. patent application Ser. No. 09/987,707, filed on Nov. 15, 2001, which claims the priority of U.S. patent application Ser. No. 09/694,712, filed on Oct. 24, 2000, all of which are incorporated herein by reference.
BACKGROUND OF THE INVENTION

Field of the Invention

The invention relates to a system for automatic video surveillance employing video primitives.
REFERENCES

For the convenience of the reader, the references referred to herein are listed below. In the specification, the numerals within brackets refer to respective references. The listed references are incorporated herein by reference.
The following references describe moving target detection:
{1} A. Lipton, H. Fujiyoshi and R. S. Patil, "Moving Target Detection and Classification from Real-Time Video," Proceedings of IEEE WACV '98, Princeton, N.J., 1998, pp. 8-14.
{2} W. E. L. Grimson, et al., "Using Adaptive Tracking to Classify and Monitor Activities in a Site", CVPR, pp. 22-29, June 1998.
{3} A. J. Lipton, H. Fujiyoshi, R. S. Patil, "Moving Target Classification and Tracking from Real-time Video," IUW, pp. 129-136, 1998.
{4} T. J. Olson and F. Z. Brill, "Moving Object Detection and Event Recognition Algorithm for Smart Cameras," IUW, pp. 159-175, May 1997.
The following references describe detecting and tracking humans:
{5} A. J. Lipton, "Local Application of Optical Flow to Analyse Rigid Versus Non-Rigid Motion," International Conference on Computer Vision, Corfu, Greece, September 1999.
{6} F. Bartolini, V. Cappellini, and A. Mecocci, "Counting people getting in and out of a bus by real-time image-sequence processing," IVC, 12(1):36-41, January 1994.
{7} M. Rossi and A. Bozzoli, "Tracking and counting moving people," ICIP94, pp. 212-216, 1994.
{8} C. R. Wren, A. Azarbayejani, T. Darrell, and A. Pentland, "Pfinder: Real-time tracking of the human body," Vismod, 1995.
{9} L. Khoudour, L. Duvieubourg, J. P. Deparis, "Real-Time Pedestrian Counting by Active Linear Cameras," JEI, 5(4):452-459, October 1996.
{10} S. Ioffe, D. A. Forsyth, "Probabilistic Methods for Finding People," IJCV, 43(1):45-68, June 2001.
{11} M. Isard and J. MacCormick, "BraMBLe: A Bayesian Multiple-Blob Tracker," ICCV, 2001.
The following references describe blob analysis:
{12} D. M. Gavrila, "The Visual Analysis of Human Movement: A Survey," CVIU, 73(1):82-98, January 1999.
{13} Niels Haering and Niels da Vitoria Lobo, "Visual Event Detection," Video Computing Series, Editor Mubarak Shah, 2001.
The following references describe blob analysis for trucks, cars, and people:
{14} Collins, Lipton, Kanade, Fujiyoshi, Duggins, Tsin, Tolliver, Enomoto, and Hasegawa, "A System for Video Surveillance and Monitoring: VSAM Final Report," Technical Report CMU-RI-TR-00-12, Robotics Institute, Carnegie Mellon University, May 2000.
{15} Lipton, Fujiyoshi, and Patil, "Moving Target Classification and Tracking from Real-time Video," 1998 DARPA IUW, Nov. 20-23, 1998.
The following reference describes analyzing a single-person blob and its contours:
{16} C. R. Wren, A. Azarbayejani, T. Darrell, and A. P. Pentland, "Pfinder: Real-Time Tracking of the Human Body," PAMI, vol. 19, pp. 780-784, 1997.
The following references describe internal motion of blobs, including any motion-based segmentation:
{17} M. Allmen and C. Dyer, "Long Range Spatiotemporal Motion Understanding Using Spatiotemporal Flow Curves," Proc. IEEE CVPR, Lahaina, Maui, Hawaii, pp. 303-309, 1991.
{18} L. Wixson, "Detecting Salient Motion by Accumulating Directionally Consistent Flow", IEEE Trans. Pattern Anal. Mach. Intell., vol. 22, pp. 774-781, August 2000.
BACKGROUND OF THE INVENTION

Video surveillance of public spaces has become extremely widespread and accepted by the general public. Unfortunately, conventional video surveillance systems produce such prodigious volumes of data that an intractable problem results in the analysis of video surveillance data.
A need exists to reduce the amount of video surveillance data so analysis of the video surveillance data can be conducted.
A need exists to filter video surveillance data to identify desired portions of the video surveillance data.
SUMMARY OF THE INVENTION

An object of the invention is to reduce the amount of video surveillance data so analysis of the video surveillance data can be conducted.
An object of the invention is to filter video surveillance data to identify desired portions of the video surveillance data.
An object of the invention is to produce a real time alarm based on an automatic detection of an event from video surveillance data.
An object of the invention is to integrate data from surveillance sensors other than video for improved searching capabilities.
An object of the invention is to integrate data from surveillance sensors other than video for improved event detection capabilities.
The invention includes an article of manufacture, a method, a system, and an apparatus for video surveillance.
The article of manufacture of the invention includes a computer-readable medium comprising software for a video surveillance system, comprising code segments for operating the video surveillance system based on video primitives.
The article of manufacture of the invention includes a computer-readable medium comprising software for a video surveillance system, comprising code segments for accessing archived video primitives, and code segments for extracting event occurrences from accessed archived video primitives.
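As a purely illustrative sketch, the code segments described in the preceding paragraph, which access archived video primitives and extract event occurrences from them, might behave as follows. The field names, records, and discriminator below are invented for this example and are not taken from the patent's implementation:

```python
# Illustrative sketch only: field names and the discriminator are invented
# for this example, not taken from the patent's implementation.

ARCHIVE = [  # primitives previously extracted from video and stored
    {"time": 10, "class": "person", "action": "enter", "location": "store"},
    {"time": 55, "class": "person", "action": "exit", "location": "store"},
    {"time": 60, "class": "vehicle", "action": "stop", "location": "lot"},
]

def extract_events(archive, discriminator):
    """Apply an event discriminator to archived primitives.

    The original video is never re-read; only the compact primitive
    records are scanned, which is what makes after-the-fact queries cheap.
    """
    return [p for p in archive if discriminator(p)]

# A question posed after the video was recorded: which people exited?
exits = extract_events(
    ARCHIVE,
    lambda p: p["class"] == "person" and p["action"] == "exit",
)
print(len(exits))  # 1
```

A new discriminator can be applied to the same archive at any time, which mirrors the point that event detection can be re-run without re-running video analysis.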
The system of the invention includes a computer system including a computer-readable medium having software to operate a computer in accordance with the invention.
The apparatus of the invention includes a computer including a computer-readable medium having software to operate the computer in accordance with the invention.
The article of manufacture of the invention includes a computer-readable medium having software to operate a computer in accordance with the invention.
Moreover, the above objects and advantages of the invention are illustrative, and not exhaustive, of those that can be achieved by the invention. Thus, these and other objects and advantages of the invention will be apparent from the description herein, both as embodied herein and as modified in view of any variations which will be apparent to those skilled in the art.
DEFINITIONS

A "video" refers to motion pictures represented in analog and/or digital form. Examples of video include: television, movies, image sequences from a video camera or other observer, and computer-generated image sequences.
A "frame" refers to a particular image or other discrete unit within a video.
An "object" refers to an item of interest in a video. Examples of an object include: a person, a vehicle, an animal, and a physical subject.
An "activity" refers to one or more actions and/or one or more composites of actions of one or more objects. Examples of an activity include: entering; exiting; stopping; moving; raising; lowering; growing; and shrinking.
A "location" refers to a space where an activity may occur. A location can be, for example, scene-based or image-based. Examples of a scene-based location include: a public space; a store; a retail space; an office; a warehouse; a hotel room; a hotel lobby; a lobby of a building; a casino; a bus station; a train station; an airport; a port; a bus; a train; an airplane; and a ship. Examples of an image-based location include: a video image; a line in a video image; an area in a video image; a rectangular section of a video image; and a polygonal section of a video image.
An "event" refers to one or more objects engaged in an activity. The event may be referenced with respect to a location and/or a time.
A "computer" refers to any apparatus that is capable of accepting a structured input, processing the structured input according to prescribed rules, and producing results of the processing as output. Examples of a computer include: a computer; a general purpose computer; a supercomputer; a mainframe; a super mini-computer; a mini-computer; a workstation; a micro-computer; a server; an interactive television; a hybrid combination of a computer and an interactive television; and application-specific hardware to emulate a computer and/or software. A computer can have a single processor or multiple processors, which can operate in parallel and/or not in parallel. A computer also refers to two or more computers connected together via a network for transmitting or receiving information between the computers. An example of such a computer includes a distributed computer system for processing information via computers linked by a network.
A "computer-readable medium" refers to any storage device used for storing data accessible by a computer. Examples of a computer-readable medium include: a magnetic hard disk; a floppy disk; an optical disk, such as a CD-ROM and a DVD; a magnetic tape; a memory chip; and a carrier wave used to carry computer-readable electronic data, such as those used in transmitting and receiving e-mail or in accessing a network.
"Software" refers to prescribed rules to operate a computer. Examples of software include: software; code segments; instructions; computer programs; and programmed logic.
A "computer system" refers to a system having a computer, where the computer comprises a computer-readable medium embodying software to operate the computer.
A "network" refers to a number of computers and associated devices that are connected by communication facilities. A network involves permanent connections such as cables or temporary connections such as those made through telephone or other communication links. Examples of a network include: an internet, such as the Internet; an intranet; a local area network (LAN); a wide area network (WAN); and a combination of networks, such as an internet and an intranet.

BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the invention are explained in greater detail by way of the drawings, where the same reference numerals refer to the same features.
FIG. 1 illustrates a plan view of the video surveillance system of the invention.
FIG. 2 illustrates a flow diagram for the video surveillance system of the invention.
FIG. 3 illustrates a flow diagram for tasking the video surveillance system.
FIG. 4 illustrates a flow diagram for operating the video surveillance system.
FIG. 5 illustrates a flow diagram for extracting video primitives for the video surveillance system.
FIG. 6 illustrates a flow diagram for taking action with the video surveillance system.
FIG. 7 illustrates a flow diagram for semi-automatic calibration of the video surveillance system.
FIG. 8 illustrates a flow diagram for automatic calibration of the video surveillance system.
FIG. 9 illustrates an additional flow diagram for the video surveillance system of the invention.
FIGS. 10-15 illustrate examples of the video surveillance system of the invention applied to monitoring a grocery store.
FIG. 16a shows a flow diagram of a video analysis subsystem according to an embodiment of the invention.
FIG. 16b shows the flow diagram of the event occurrence detection and response subsystem according to an embodiment of the invention.
FIG. 17 shows exemplary database queries.
FIG. 18 shows three exemplary activity detectors according to various embodiments of the invention: detecting tripwire crossings (FIG. 18a), loitering (FIG. 18b) and theft (FIG. 18c).
FIG. 19 shows an activity detector query according to an embodiment of the invention.
FIG. 20 shows an exemplary query using activity detectors and Boolean operators with modifiers, according to an embodiment of the invention.
FIGS. 21a and 21b show an exemplary query using multiple levels of combinators, activity detectors, and property queries.
FIG. 22 shows an exemplary configuration of a video surveillance system according to an embodiment of the invention.
FIG. 23 shows another exemplary configuration of a video surveillance system according to an embodiment of the invention.
FIG. 24 shows another exemplary configuration of a video surveillance system according to an embodiment of the invention.
FIG. 25 shows a network that may be used in exemplary configurations of embodiments of the invention.
DETAILED DESCRIPTION OF THE INVENTION

rate) when activity is detected and at lower quality at other times. In another exemplary embodiment, the video storage and database may be handled separately, e.g., by a digital video recorder (DVR), and the video processing subsystem may just control whether data is stored and with what quality.
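As one way to picture the storage control just described, the processing subsystem could hand the recorder a per-interval quality choice. The quality settings and names below are assumptions for illustration, not values from the patent:

```python
# Hypothetical sketch: the video processing subsystem decides, per time
# slice, whether the separate recorder (e.g., a DVR) stores video at high
# or low quality, based on whether activity was detected in that slice.

HIGH_QUALITY = {"bitrate_kbps": 4000, "fps": 30}  # assumed settings
LOW_QUALITY = {"bitrate_kbps": 250, "fps": 5}

def storage_settings(activity_detected):
    """Pick recording quality for one time slice."""
    return HIGH_QUALITY if activity_detected else LOW_QUALITY

def recording_plan(activity_flags):
    """activity_flags: one boolean per time slice from the analysis stage."""
    return [storage_settings(flag) for flag in activity_flags]

plan = recording_plan([False, False, True, False])
print([p["fps"] for p in plan])  # [5, 5, 30, 5]
```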
In another embodiment, the video surveillance system (or components thereof) may be on a processing device (such as a general purpose processor, DSP, microcontroller, ASIC, FPGA, or other device) on board a video management device such as a digital video camera, network video server, DVR, or Network Video Recorder (NVR), and the bandwidth of video streamed from the device can be modulated by the system. High quality video (high bit-rate or frame-rate) need be transmitted through an IP video network only when activities of interest are detected. In this embodiment, primitives from intelligence-enabled devices can be broadcast via a network to multiple activity inference applications at physically different locations to enable a single camera network to provide multi-purpose applications through decentralized processing.
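The decentralized arrangement above, in which one intelligence-enabled device broadcasts primitives to several inference applications, might be sketched as follows. The JSON-over-UDP transport and every name here are assumptions for illustration, not details from the patent:

```python
import json
import socket

# Hypothetical sketch: primitives are small records, so they can be sent to
# many subscribers far more cheaply than streaming high-quality video.

def serialize(primitive):
    """Encode one primitive record for transmission."""
    return json.dumps(primitive).encode("utf-8")

def broadcast(primitive, subscribers, sock):
    """Send one primitive to every subscribed inference application."""
    payload = serialize(primitive)
    for host, port in subscribers:
        sock.sendto(payload, (host, port))

# Assumed endpoints; each could run a different activity inference app
# (e.g., one doing tripwire detection, another doing loitering detection).
subscribers = [("127.0.0.1", 9001), ("127.0.0.1", 9002)]
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
broadcast({"time": 42, "class": "person", "location": "zone_a"},
          subscribers, sender)
sender.close()
```

Because each subscriber receives the same primitive stream, new inference applications can be added without changing anything on the camera side.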
FIG. 22 shows one configuration of an implementation of the video surveillance system. Block 221 represents a raw (uncompressed) digital video input. This can be obtained, for example, through analog-to-digital capture of an analog video signal or decoding of a digital video signal. Block 222 represents a hardware platform housing the main components of the video surveillance system (video content analysis block 225 and activity inference block 226). The hardware platform may contain other components such as an operating system (block 223); a video encoder (block 224) that compresses raw digital video for video streaming or storage using any available compression scheme (JPEG, MJPEG, MPEG1, MPEG2, MPEG4, H.263, H.264, Wavelet, or any other); a storage mech
