United States Patent [19]
Tsuchikawa et al.

[54] METHOD AND APPARATUS FOR MOVING
     OBJECT EXTRACTION BASED ON
     BACKGROUND SUBTRACTION

[75] Inventors: Tsuchikawa et al., Kanagawa-ken, Japan

[73] Assignee: Nippon Telegraph and Telephone
     Corporation, Tokyo, Japan

[21] Appl. No.: 401,972

[22] Filed: Mar. 9, 1995
`.
`.
`.
`.
`.
`tl
[30] Foreign Application Priority Data

Mar. 9, 1994   [JP]  Japan .................................. 6-037438
Feb. 17, 1995  [JP]  Japan .................................. 7-029220
`
[51] Int. Cl.6 .............................. G06K 9/00; G06K 9/46
[52] U.S. Cl. .............................. 382/131; 382/170
[58] Field of Search ...................... 382/168, 169, 190, 199, 309
`
`.
[56] References Cited

U.S. PATENT DOCUMENTS

4,807,163   2/1989
4,847,677   7/1989
5,021,413   6/1991
5,150,432   9/1992  Ueno et al. ........................... 382/250

FOREIGN PATENT DOCUMENTS

63-194477   8/1988  Japan ........................... H04N 5/262
5225341     9/1993  Japan ........................... G06F 15/70
622318      1/1994  Japan ........................... H04N 7/18
9203801     3/1992  WIPO ........................... G06F 15/70
`
US005748775A

[11] Patent Number: 5,748,775
[45] Date of Patent: May 5, 1998
`
`OTHER PUBLICATIONS
`M. Kaneta et al., Image Processing Method for Intruder
`Detection Around Power Line Towers, IEICE Transactions
`on Information and Systems, Oct. 1993, pp. 1153-1161.
`X. Yuan et al., A Computer Vision System for Measurement
`of Pedestrian Volume, Proceedings of the Region Ten Con
`ference (TENCON), Oct. 19-21, 1993, pp. 1046-1049.
R.D. Horton, A Target Cueing and Tracking System (TCATS)
for Smart Video Processing, Proceedings of the Institute of
Electrical and Electronics Engineers 1990 International Car-
nahan Conference on Security Technology: Crime Counter-
measures, Oct. 10-12, 1990, pp. 68-72.
Primary Examiner—Leo Boudreau
Assistant Examiner—Wenpeng Chen
Attorney, Agent, or Firm—Banner & Witcoff, Ltd.
`
`[57]
`
`ABSTRACT
`
A moving object extraction based on background subtraction
`capable of stably extracting the moving object under various
`environments. Temporal changes of image feature parameter
`values for sub-regions subdividing a frame of each input
`image are stored, and the background image is reconstructed
`by statistically processing a temporal change of the image
feature parameter values for each sub-region within a pre-
scribed target region of the frame over a prescribed period
`of time to obtain the statistical quantity characterizing that
`temporal change, judging whether that temporal change is
`due to an illumination change or not according to the
`obtained statistical quantity and a prescribed illumination
`change judging condition, and updating a background image
`value for each sub-region by a new background image value
`according to the image feature parameter values for each
`sub-region during the prescribed period of time. Then, a
subtraction processing is applied to one of the input images
`and the reconstructed background image, and a binarization
`processing is applied to the obtained subtraction image so as
`to extract the moving object region from the input images.
`
`40 Claims, 21 Drawing Sheets
`
[Representative drawing: an image sequence is fed to image feature parameter value temporal change storage means; intensity change statistical processing means and value update means within the background image sub-region update means feed the background image region reconstruction means 300; subtraction and binarization in the moving object extraction means yield the moving object output.]
`Page 1 of 37
`
`SAMSUNG EXHIBIT 1017
`Samsung v. Image Processing Techs.
`
`

`

U.S. Patent          May 5, 1998          Sheet 1 of 21          5,748,775

FIG. 1 (PRIOR ART): binarization of the difference data between an input image Xi and a fixed background image Y yields the moving object image Xi-Y.

FIG. 2 (PRIOR ART): flow of image input processing 811, background change judgment processing 812, background image correction processing 813, subtraction processing Xi-Yi 814, binarization processing 815, and moving object output 816.
`
`

`

[Sheet 2 of 21 — FIG. 3: camera 001 supplies the image sequence to the image feature parameter value temporal change storage means 100; intensity change statistical processing means 210 feeds the illumination change judging condition 220 (update / no update), value update means 230, and background image sub-region update means 200 within the background image region reconstruction means 300; the input image and the reconstructed background image go to subtraction means 400 and binarization means 510 of the moving object extraction means 500, producing the moving object output 520.]
`
`

`

[Sheet 3 of 21 — FIG. 4: detailed configuration of FIG. 3, with camera 001, image feature parameter value temporal change storage means 100 (background region 110, moving object region 120), intensity change statistical processing means 211, 212 for sub-regions a1, a2 (peak and variance statistics during t0), judgements sigma < sigma0 selecting update or no update, value update means 231, 232, background image sub-region update means 200, background image region reconstruction means 300 with new background image 310, subtraction means 400, binarization means 510, moving object extraction means 500, and moving object output 520.]
`
`

`

[Sheet 4 of 21 — FIG. 5A: image sequence 710 with a slit 720 at the sampling position; FIG. 5B: space-time image 730; FIG. 5C: temporal change of input value 742 and background value 741; FIG. 5D: space-time image 750 with extracted moving object 751.]
`
`

`

[Sheet 5 of 21 — FIG. 6: a color camera supplies an image sequence for n feature parameters (R, G, B component images) to the image feature parameter value temporal change storage means; n-dimensional vector generation means 240 and vector value change statistical processing means feed the background sub-region update means and background image region reconstruction means 330, which outputs the new background image 331; subtraction means 401-403 and binarization means 511-513 form the moving object extraction means for the moving object region 120.]
`
`

`

[Sheet 6 of 21 — FIG. 7: detail of the background image region reconstruction means 300; n-dimensional vector generation means 240 forms 3-D feature vectors (R, G, B components V1, V2, V3); vector sets during t0 at a1 and a2; feature value |w| statistical processing means 261, 262 compute a variance 264; judgements sigma < sigma0 select update or no update 271, 272; value update means 281, 282 and background image sub-region update means 204 produce the new background image 321.]
`
`

`

[Sheet 7 of 21 — FIG. 8: n-dimensional vector generation means 240; characteristic curves L1, L2 through the current background value b with distances d1, d2, d3; feature value |d| statistical processing (frequency and variance) 265-268; judgements sigma < sigma0 select update or no update 273, 274; value update means 283, 284 and background image sub-region update means 205 produce the new background image 322.]
`
`

`

[Sheet 8 of 21 — FIGS. 9A and 9B: graphs of the three-dimensional feature vector space.]
`
`

`

[Sheet 9 of 21 — FIG. 10: intensity change statistical processing means 215; the illumination change judging condition selects value update means I 233, value update means II 235, or no update within the background image sub-region update means 201.]
`
`

`

[Sheet 10 of 21 — FIG. 11: detail of the background image sub-region update means 223; peak and variance statistics and intensity temporal differential statistical processing during t0 at a1, a2, a3; illumination change judging conditions (e.g. m1 < m0, sigma5 > sigma0, m5 < m0, m2 > m0) select value update means I 233 (for abrupt illumination change), value update means II 235 (for gradual illumination change), or no update.]
`
`

`

[Sheet 11 of 21 — FIG. 12: camera 001; image feature parameter value temporal change storage means 103; intensity change statistical processing means 210 with illumination change judging condition 220; value update means 239 in the background image sub-region update means; background image region reconstruction means 300; subtraction means 401, binarization means 510, moving object extraction means 501, and moving object output 520.]
`
`

`

[Sheet 12 of 21 — FIGS. 13A and 13B: input image sequences illustrating a difference between the first and seventh embodiments.]
`
`

`

[Sheet 13 of 21 — FIGS. 14A and 14B: graphs of the temporal change of intensity value (curves 602-606) over the period t0 up to the current camera input.]
`
`

`

[Sheet 14 of 21 — FIG. 15: camera 001; image feature parameter value temporal change storage means; intensity change statistical processing means 210 with illumination change judging condition 220; value update means 230; background image sub-region update means and background image region reconstruction means; threshold setting means 700 fed by the updated background image and the input image; subtraction means 400, binarization means 512, moving object extraction means 502, and moving object output 520.]
`
`

`

[Sheet 15 of 21 — FIGS. 16A, 16B, and 16C: threshold setting curves 701, 702, 703 plotting absolute value of difference against intensity of the background image.]
`
`

`

[Sheet 16 of 21 — FIG. 17: image input unit 171; image storage unit 172; background update units 173a-173d, each with a pixel value statistical processing unit 1731, statistical value judgment unit 1732, most frequent value calculation unit 1733, mean value calculation unit 1735, and value update units 1734, 1736; image subtraction calculation unit 174; image binarization calculation unit 175; moving object output unit 176.]
`
`

`

[Sheet 17 of 21 — FIG. 18: as FIG. 17, with a slit image acquisition unit 181 between the image input unit 171 and the image storage unit 172.]
`
`

`

[Sheet 18 of 21 — FIG. 19: image input unit 171; image storage unit 191; background update units 173a-173d with value update units; image subtraction calculation unit 174; image binarization calculation unit 175; moving object output unit 176.]
`
`

`

[Sheet 19 of 21 — FIG. 20: as FIG. 17, with a value setting unit 177 feeding the image binarization calculation unit 175.]
`
`

`

[Sheet 20 of 21 — FIG. 21: image input unit 171; image storage unit 172; background update units 173a-173c with pixel value statistical processing unit, statistical value judgment unit, and value update unit; background image storage unit; image subtraction calculation unit 174; image binarization calculation unit 175; moving object output unit 176.]
`
`

`

[Sheet 21 of 21 — FIG. 22: image input unit 171; image storage unit 172; background update unit with pixel value statistical processing unit, statistical value judgment unit, most frequent value calculation unit, mean value calculation unit, and value update unit; background image storage unit 178; image subtraction calculation unit 174; image binarization calculation unit 175; moving object output 176.]
`
`

`

`METHOD AND APPARATUS FOR MOVING
`OBJECT EXTRACTION BASED ON
`BACKGROUND SUBTRACTION
`
`BACKGROUND OF THE INVENTION
`
`1. Field of the Invention
`
`The present invention relates to a method and an appa-
`ratus for extracting a moving object in the image sequence
`by using a subtraction between an input
`image and a
`background image, which can stably extract the moving
`object region even under an environment which incorporates
`illumination changes.
`2. Description of the Background Art
`Conventionally known methods for extracting a moving
object based on image processing include: (1) a method for
storing a reference background image, extracting difference
`data by a subtraction between the input image and the
`background image, and obtaining the moving object by
`means of the binarization of the difference data using a
`threshold; (2) a method for obtaining data on difference
`between frames by a subtraction between the input image
`and an immediately previous frame image as a reference
`image, and obtaining the moving object by means of the
`binarization of the obtained data; (3) a method for obtaining
`correspondences between changing points in the reference
`image and the input image by calculating quantities such as
`motion vectors, and obtaining the moving object as a set of
`moved points; (4) a method for obtaining a change between
`the reference image and the input image according to a
`correlation within a target region, and obtaining the moving
object as a changed region; and (5) a method for carrying out
a (shape) recognition and a tracking of a moving target.
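For illustration, method (1) above — storing a reference background image, taking the per-pixel difference with the input image, and binarizing with a threshold — can be sketched as follows. This is a minimal sketch only; the grayscale pixel layout, image size, and threshold value are assumptions for the example, not details from the patent:

```python
# Minimal sketch of method (1): background subtraction followed by
# thresholded binarization. Grayscale images are 2-D lists of intensities.
THRESHOLD = 30  # assumed binarization threshold


def extract_moving_object(input_image, background_image, threshold=THRESHOLD):
    """Return a binary mask: 1 where |input - background| >= threshold."""
    return [[1 if abs(p - b) >= threshold else 0
             for p, b in zip(row_in, row_bg)]
            for row_in, row_bg in zip(input_image, background_image)]


background = [[100, 100], [100, 100]]
frame = [[100, 180], [105, 100]]  # one bright moving pixel, slight noise
mask = extract_moving_object(frame, background)
```

Only the pixel whose difference reaches the threshold survives the binarization; the small five-level fluctuation is suppressed.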
`Among these conventionally known methods, the meth-
`ods based on subtraction have an advantage that the moving
`object can be extracted at high speed by means of a
relatively simple processing, and are widely used in various
`fields such as the industrial product
`inspection and
`measurement, the automobile measurement, and the moni-
`toring system.
FIG. 1 shows an outline of such a conventional method
for extracting the moving object based on background
subtraction, where the moving object is extracted by obtain-
ing a difference between a reference image Y representing a
fixed background image and a latest input image Xi, and
judging a region at which the obtained difference is greater
than or equal to a prescribed threshold as the moving object
in motion. In this method, the moving object can be
extracted easily under a circumstance in which the back-
ground image does not change, but when there is an illu-
mination change, the reference background image also
changes accordingly such that the difference obtained in the
above procedure can be significantly large for the back-
ground portion as well, and it becomes impossible to extract
the moving object stably.
For this reason, it is indispensable for the moving object
extraction based on background subtraction to incorporate
the appropriate updating of the background image in corre-
spondence to the change of the background values. Namely,
it is necessary to sequentially carry out the moving object
extraction with respect to the input image Xi along with the
judgement of the change in the background values and the
updating of the background image to an appropriate new
background image Yi+1 for the moving object extraction for
the next input image Xi+1 whenever the background has
changed.
`
`FIG. 2 shows a flow chart for the operation in such a
`moving object extraction based on background subtraction
`incorporating the background updating. First, a target image
`input processing 811 enters the frame images sequentially.
Then, a background change judgement processing 812
checks whether there is a change in the background values
for the input image Xi, and whenever there is a change, a
`background image correction processing 813 updates the
`background image accordingly. Then, a background sub-
`traction processing 814 obtains the difference data between
`the input image Xi and the updated background image, and
`a binarization processing 815 binarizes the obtained differ-
`ence data by using a prescribed threshold, so as to output a
`moving object output 816 representing a region in the input
`image Xi specified by the result of the binarization process-
`ing 815 as the moving object.
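The flow of FIG. 2 can be sketched as a loop over the entered frames. This is a hedged illustration: the change judgement (812) and the correction rule (813) below are simple placeholders standing in for whatever scheme is actually used, and the numeric tolerances are assumed values:

```python
def background_changed(frame, background, tol=40):
    # Placeholder for judgement processing 812: mean absolute difference.
    diffs = [abs(p - b) for rf, rb in zip(frame, background)
             for p, b in zip(rf, rb)]
    return sum(diffs) / len(diffs) > tol


def process_sequence(frames, background, threshold=30):
    """Steps 811-816: judge background change, correct the background if
    needed, subtract, binarize, and emit a mask per frame."""
    masks = []
    for frame in frames:                                      # 811
        if background_changed(frame, background):             # 812
            background = [row[:] for row in frame]            # 813 (naive)
        masks.append([[1 if abs(p - b) >= threshold else 0    # 814 + 815
                       for p, b in zip(rf, rb)]
                      for rf, rb in zip(frame, background)])  # 816
    return masks, background


masks, final_bg = process_sequence([[[100, 100]], [[100, 160]]], [[100, 100]])
```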
`In the above procedure, the conventionally used schemes
for updating the background image at the background image
`correction processing 813 include a scheme for using a
`weighted sum of the input image values and the stored
`background image values, and a scheme for using a straight-
`forward mean of the frame image values for immediately
`previous several frames. However, in these schemes, the
change in the background values is judged without distin-
guishing a change due to a passing of the moving object and
`a change due to the illumination change, so that there has
`been a problem that the background image can be updated
`erroneously when many moving objects pass through the
`input image.
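The weighted-sum updating scheme mentioned above amounts to a per-pixel running average of the stored background and the current input. A minimal sketch, where the blending weight alpha is an assumed value:

```python
ALPHA = 0.1  # assumed blending weight given to the new input frame


def update_background(background, frame, alpha=ALPHA):
    """Weighted sum of stored background and input, per pixel:
    B_new = (1 - alpha) * B + alpha * X."""
    return [[(1 - alpha) * b + alpha * p
             for b, p in zip(rb, rf)]
            for rb, rf in zip(background, frame)]


bg = update_background([[100.0, 100.0]], [[100.0, 200.0]])
```

With a small alpha the background tracks gradual change slowly; as the text observes, a passing object still leaks into the background because this update does not distinguish object motion from illumination change.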
There is also a scheme for extracting the moving object by
`analyzing image features such as shapes of objects resulting
`from the background subtraction, but for the input image
`containing a moving object with a changing shape, it is
`impossible for this scheme to judge whether the object
`extraction result reflects the actual moving object or the
`change in the background values, so that it has been
`extremely difficult to stably extract the moving object with
`a changing shape such as a human being.
`Thus, a technique for properly updating the background
`image has not been known conventionally, and the moving
`object extraction based on background subtraction has not
`been realized under an environment which incorporates
large illumination changes such as an outdoor site.
`SUMMARY OF THE INVENTION
`
`It is therefore an object of the present invention to provide
`a method and an apparatus for moving object extraction
`based on background subtraction capable of stably extract-
`ing the moving object such as a human being or an
`automobile, equally under an environment which incorpo-
`rates large illumination changes such as an outdoor site as
`well as under an environment which incorporates a gradual
`background change.
`According to one aspect of the present invention there is
`provided a method of moving object extraction based on
`background subtraction, comprising the steps of: (a) sequen-
`tially entering input images containing a moving object
`region to be extracted; (b) storing temporal changes of
`image feature parameter values for sub-regions subdividing
`a frame of each input image entered at the step (a); (c)
`statistically processing a temporal change of the image
`feature parameter values for each sub-region within a pre-
`scribed target region of the frame stored at the step (b) over
a prescribed period of time t0 to obtain at least one statistical
`quantity characterizing said temporal change, judging
`whether said temporal change is due to an illumination
`
`
`

`

`change or not according to said statistical quantity and a
`prescribed illumination change judging condition, and
updating a background image value for said each sub-region
`by a new background image value according to the image
`feature parameter values for said each sub-region during the
prescribed period of time t0 so as to obtain a reconstructed
`background image; (d) applying a subtraction processing to
`one of the input images entered at the step (a) and the
`reconstructed background image obtained at the step (c) to
`obtain a subtraction image; and (e) applying a binarization
`processing to the subtraction image obtained at the step (d)
`to extract the moving object region from the input images
`entered at the step (a).
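One way to read steps (b) and (c): for each sub-region, collect the feature values over the period t0, compute a statistical quantity characterizing the temporal change (here, its variance), and replace the background value only when the illumination change judging condition is met. The low-variance test and the use of the window mean as the new value are illustrative assumptions, not the patent's specific criteria:

```python
VAR_THRESHOLD = 25.0  # assumed illumination change judging condition


def mean(xs):
    return sum(xs) / len(xs)


def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)


def update_sub_region(history, current_bg, var_threshold=VAR_THRESHOLD):
    """Steps (b)-(c) for one sub-region: a smooth temporal change (low
    variance) is judged an illumination change and triggers an update;
    otherwise the old background value is kept."""
    if variance(history) < var_threshold:
        return mean(history)  # new background value for this sub-region
    return current_bg


# Smooth brightening over the window: judged an illumination change.
bg1 = update_sub_region([100, 102, 104, 106], current_bg=90)
# A passing object spikes the values: not judged an illumination change.
bg2 = update_sub_region([100, 200, 210, 100], current_bg=90)
```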
`According to another aspect of the present invention there
`is provided an apparatus for moving object extraction based
`on background subtraction, comprising:
`input means for
`sequentially entering input images containing a moving
`object region to be extracted; storage means for storing
`temporal changes of image feature parameter values for
sub-regions subdividing a frame of each input image entered
by the input means; background update means for statisti-
`cally processing a temporal change of the image feature
`parameter values for each sub-region within a prescribed
`target region of the frame stored by the storage means over
a prescribed period of time t0 to obtain at least one statistical
`quantity characterizing said temporal change,
`judging
`whether said temporal change is due to an illumination
`change or not according to said statistical quantity and a
`prescribed illumination change judging condition, and
`updating a background image value for said each sub-region
`by a new background image value according to the image
`feature parameter values for said each sub-region during the
prescribed period of time t0 so as to obtain a reconstructed
`background image; subtraction means for applying a sub-
`traction processing to one of the input images entered by the
`input means and the reconstructed background image
`obtained by the background update means to obtain a
`subtraction image; and binarization means for applying a
`binarization processing to the subtraction image obtained by
`the subtraction means to extract the moving object region
`from the input images entered by the input means.
`Other features and advantages of the present invention
`will become apparent from the following description taken
`in conjunction with the accompanying drawings.
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`FIG. 1 is a schematic diagram indicating an outline of a
`conventional method of moving object extraction based on
`background subtraction.
`FIG. 2 is a flow chart for the operation in a conventional
`method of moving object extraction based on background
`subtraction.
`
FIG. 3 is a block diagram showing a system configuration
`of a moving object extraction system in the first embodiment
`of the present invention.
FIG. 4 is a block diagram showing a detailed functional
`configuration of the moving object extraction system of FIG.
`3.
`
`FIG. 5A is an illustration of an exemplary input image
`with a slit used in a moving object extraction system in the
second embodiment of the present invention.
`FIG. 5B is an illustration of an exemplary space-time
`image obtained from the input image of FIG. 5A.
FIG. 5C is an illustration of an exemplary graph indicat-
`ing temporal change of input value and background value
`obtained from the space-time image of FIG. 5B.
`
`4
`FIG. 5D is an illustration of an exemplary space-time
`image indicating the extraction result obtained from the
`space-time image of FIG. 5B.
`FIG.6 is a block diagram showing a system configuration
`of a moving object extraction system in the fourth embodi-
`ment of the present invention.
`FIG. 7 is a block diagram showing a detailed functional
`configuration of a background image region reconstruction
`means in the moving object extraction system of FIG. 6.
`FIG. 8 is a block diagram showing a detailed functional
`configuration of a background image region reconstruction
`means in a moving object extraction system in the fifth
`embodiment of the present invention.
FIGS. 9A and 9B are graphs of a three-dimensional feature
`vector space for explaining the operation in a moving object
`extraction system in the fifth embodiment of the present
`invention.
`
`FIG. 10 is a block diagram showing a schematic configu-
`ration of a background image sub-region update means in a
`moving object extraction system in the sixth embodiment of
`the present invention.
`FIG. 11 is a block diagram showing a detailed functional
`configuration of the background image sub-region update
means of FIG. 10.
`
`FIG. 12 is a block diagram showing a system configura-
`tion of a moving object extraction system in the seventh
`embodiment of the present invention.
FIGS. 13A and 13B are diagrams of input image
sequences for explaining a difference between the first and
seventh embodiments of the present invention.
`FIGS. 14A and 14B are graphs of temporal change of
intensity value for explaining a difference between the first
`and seventh embodiments of the present invention.
FIG. 15 is a block diagram showing a system configura-
`tion of a moving object extraction system in the eighth
`embodiment of the present invention.
`FIGS. 16A, 16B, and 16C are graphs showing exemplary
`threshold settings used in the moving object extraction
`system of FIG. 15.
`FIG. 17 is a block diagram of an exemplary physical
`configuration for an apparatus corresponding to the first
`embodiment of the present invention.
`FIG. 18 is a block diagram of an exemplary physical
`configuration for an apparatus corresponding to the second
`embodiment of the present invention.
`FIG. 19 is a block diagram of an exemplary physical
`configuration for an apparatus corresponding to the seventh
`embodiment of the present invention.
`FIG. 20 is a block diagram of an exemplary physical
`configuration for an apparatus corresponding to the eighth
embodiment of the present invention.
`FIG. 21 is a block diagram of a modified physical
`configuration for an apparatus corresponding to the first
`embodiment of the present invention.
`FIG. 22 is a block diagram of a further modified physical
`configuration for an apparatus corresponding to the first
`embodiment of the present invention.
`
`DETAILED DESCRIPTION OF THE
`PREFERRED EMBODIMENTS
`
`
Referring now to FIGS. 3 and 4, the first embodiment of
`the moving object extraction based on background subtrac-
`tion according to the present invention will be described in
`detail.
`
`
`

`

`5,748,775
`
`5
FIG. 3 shows a system configuration of a moving object extraction system in this first embodiment, while FIG. 4 shows a detailed functional configuration of the moving object extraction system of FIG. 3.

In FIGS. 3 and 4, the system generally comprises: a camera 601 for entering an image sequence of the input images; an image feature parameter value temporal change storage means 100 including a plurality of frame image memories 101, 102, etc. for storing image feature parameter values for the sequentially entered input images; a background image region reconstruction means 300 for reconstructing the background image according to the temporal change of the stored image feature parameter values; and a moving object extraction means 500 for obtaining a moving object output 520 representing the moving object from the entered input image and the reconstructed background image.
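The overall flow just described, in which a sequence of frames is stored, a background is reconstructed from the temporal statistics of each pixel, and the current frame is then compared against that background, can be sketched as follows. This is an illustrative sketch only: the per-pixel temporal median as the reconstruction statistic, the threshold value of 30, and the function names are assumptions for the example, not details taken from the patent.

```python
import numpy as np

def reconstruct_background(frames):
    # Per-pixel temporal median of the stored frames; one simple
    # statistical choice for the background reconstruction step
    # (the patent leaves the exact statistic to the update means).
    return np.median(np.stack(frames), axis=0)

def extract_moving_object(frame, background, threshold=30):
    # Pixels whose intensity differs from the reconstructed
    # background by more than the threshold are taken to belong
    # to the moving object region.
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    return diff > threshold
```

Because the object occupies any given pixel in only a minority of the stored frames, a robust statistic such as the median recovers the background value even where the object passed through.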
In further detail, each frame image memory in the image feature parameter value temporal change storage means 100 stores the image feature parameter values for each input image containing a background region 110 and a moving object region 120, where each frame is divided into a plurality of sub-regions a_i, such as pixels located at coordinate positions (x, y) within the frame. In this first embodiment, an intensity at each pixel is used as an exemplary image feature parameter at each sub-region a_i.
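One way to picture the role of the frame image memories is as a fixed-length intensity history kept independently for each sub-region a_i (here, each pixel). The class below is purely illustrative; its name, its methods, and the default history length are assumptions made for this sketch, not elements of the patent.

```python
from collections import deque

class SubRegionHistory:
    # Fixed-length temporal record of one sub-region's intensity,
    # analogous to that pixel's value being held in each of the
    # frame image memories 101, 102, etc. (class and method names
    # are illustrative only).
    def __init__(self, length=5):
        self.values = deque(maxlen=length)

    def observe(self, intensity):
        # Once `length` frames are stored, the oldest value is
        # discarded automatically, keeping a sliding window.
        self.values.append(intensity)

    def history(self):
        return list(self.values)
```

A later statistical processing stage can then operate on `history()` for each sub-region to decide its background value.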
The background image region reconstruction means 300 further comprises a plurality of background image sub-region update means 200 provided in correspondence to the plurality of sub-regions a_i, for updating the image feature parameter value of each sub-region a_i, and each background image sub-region update means 200 further includes an intensity change statistical processing means 210 (211, 212 in FIG. 4 for sub-regions a_1, a_2) for statistically processing the temporal change

This document is available on Docket Alarm but you must sign up to view it.


Or .

Accessing this document will incur an additional charge of $.

After purchase, you can access this document again without charge.

Accept $ Charge
throbber

Still Working On It

This document is taking longer than usual to download. This can happen if we need to contact the court directly to obtain the document and their servers are running slowly.

Give it another minute or two to complete, and then try the refresh button.

throbber

A few More Minutes ... Still Working

It can take up to 5 minutes for us to download a document if the court servers are running slowly.

Thank you for your continued patience.

This document could not be displayed.

We could not find this document within its docket. Please go back to the docket page and check the link. If that does not work, go back to the docket and refresh it to pull the newest information.

Your account does not support viewing this document.

You need a Paid Account to view this document. Click here to change your account type.

Your account does not support viewing this document.

Set your membership status to view this document.

With a Docket Alarm membership, you'll get a whole lot more, including:

  • Up-to-date information for this case.
  • Email alerts whenever there is an update.
  • Full text search for other cases.
  • Get email alerts whenever a new case matches your search.

Become a Member

One Moment Please

The filing “” is large (MB) and is being downloaded.

Please refresh this page in a few minutes to see if the filing has been downloaded. The filing will also be emailed to you when the download completes.

Your document is on its way!

If you do not receive the document in five minutes, contact support at support@docketalarm.com.

Sealed Document

We are unable to display this document, it may be under a court ordered seal.

If you have proper credentials to access the file, you may proceed directly to the court's system using your government issued username and password.


Access Government Site

We are redirecting you
to a mobile optimized page.





Document Unreadable or Corrupt

Refresh this Document
Go to the Docket

We are unable to display this document.

Refresh this Document
Go to the Docket