United States Patent [19]
Ando

[11] Patent Number: 5,008,946
[45] Date of Patent: Apr. 16, 1991

[54] SYSTEM FOR RECOGNIZING IMAGE

[75] Inventor: Mitsuhiro Ando, Tokyo, Japan
[73] Assignee: Aisin Seiki K.K., Tokyo, Japan
[21] Appl. No.: 242,441
[22] Filed: Sep. 9, 1988
[30] Foreign Application Priority Data
    Sep. 9, 1987 [JP] Japan .................. 62-225862

[51] Int. Cl.5 .............................. G06K 9/00
[52] U.S. Cl. .................... 382/2; 382/23; 382/24; 434/43; 180/271; 180/167
[58] Field of Search ............ 382/2, 1, 23, 24; 340/825.03; 358/103; 434/62, 43, 44; 244/222, 76 R, 194, 195, 3.11, 3.14; 414/901; 180/167, 272, 271

[56] References Cited

U.S. PATENT DOCUMENTS

3,638,188   1/1972  Pincoffs et al. .............. 382/23
4,281,734   8/1981  Johnston ..................... 180/167
4,479,784  10/1984  Mallinson et al. ............. 434/43
4,625,329  11/1986  Ishikawa et al. .............. 180/271

OTHER PUBLICATIONS

Nakasi Honda et al., "Multivariate Data Representation and Analysis by Face Pattern Using Facial Expression Characteristics," 6-1985, pp. 85-93.

Primary Examiner—Michael Razavi
Attorney, Agent, or Firm—Sughrue, Mion, Zinn, Macpeak & Seas

[57] ABSTRACT

There is disclosed a system which permits an automobile driver to control electrical devices installed on an automobile by moving his or her pupils and mouth intentionally. The system includes a TV camera, a light, a first microprocessor which controls the electrical devices according to the changes in the shape of the driver's mouth, a second microprocessor for performing various arithmetic operations, and memories. Reference values have been previously assigned to various elements of the driver's face and stored in one of the memories. The second microprocessor normalizes the distances between the elements of the face with the distance between the pupils of the eyes and compares the normalized distances with the reference values to calculate the degrees of similarity of the elements.

1 Claim, 22 Drawing Sheets
[Drawing Sheet 1 of 22]
[Drawing Sheet 2 of 22: FIG. 4]
[Drawing Sheet 3 of 22: FIG. 2]
[Drawing Sheet 4 of 22: FIGS. 5a, 5b]
[Drawing Sheet 5 of 22: FIG. 5c]
[Drawing Sheet 6 of 22: FIGS. 6, 7a, 7b]
[Drawing Sheet 7 of 22: FIG. 7c]
[Drawing Sheet 8 of 22: FIGS. 8a, 8b]
[Drawing Sheet 9 of 22: FIGS. 8c, 8d]
[Drawing Sheet 10 of 22: FIG. 9a]
[Drawing Sheet 11 of 22]
[Drawing Sheet 12 of 22]
[Drawing Sheet 13 of 22: FIGS. 10a, 10b]
[Drawing Sheet 14 of 22: FIG. 11a]
[Drawing Sheet 15 of 22: FIG. 11b]
[Drawing Sheet 16 of 22: FIG. 12a]
[Drawing Sheet 17 of 22: FIG. 12b]
[Drawing Sheet 18 of 22: FIGS. 13a-13d]
[Drawing Sheet 19 of 22: FIGS. 13e-13j]
[Drawing Sheet 20 of 22: FIG. 14]
[Drawing Sheet 21 of 22: FIGS. 15a-15f]
[Drawing Sheet 22 of 22: FIGS. 15g, 16a]
SYSTEM FOR RECOGNIZING IMAGE

FIELD OF THE INVENTION

The present invention relates to a system which detects an image and also the elements of the image and, more particularly, to a system which recognizes an image, such as an object or person, to turn on or off desired electrical devices, to increase or decrease the output power of the devices, or to otherwise control the devices by responding to the motion or operation of the image in a noncontact manner, it being noted that the present invention is not limited to these applications.

BACKGROUND OF THE INVENTION
The prior art techniques of this kind are used in automatic doors employing photosensors, footboards, etc., warning devices for informing a person of entry or intrusion, and metal sensors. Any of these devices makes use of a noncontact sensor, such as a photosensor, microswitch, electrostatic field-type proximity switch, or electromagnetic sensor, or a mechanical switch, and detects opening or closure of an electrical contact, making or breaking of an electromagnetic wave path, a change in an electric field, or a change in a magnetic field which is caused when an object or a person makes contact with, approaches, or passes through the device, to turn on or off a desired electrical device, such as a buzzer, meter, automatic door, relay, monitor television, or an electrically controlled machine.
This electrical device cannot be controlled, e.g., turned on and off, unless an object or person is close to the device and makes a relatively large movement. Since a change in the state of a minute portion of an object or human body cannot be detected by a sensor, an input device consisting principally of keyswitches has heretofore been most frequently used to energize various electrical devices. As an example, various electrical devices are installed on an automobile, and various keyswitches, volume controls, etc. are disposed corresponding to those electrical devices. However, if the driver stretches his or her arm or twists around to operate a switch or volume control, then the driving is endangered. Also, it is not easy to meticulously operate a switch or volume control, because the driver cannot keep his or her eyes off the front view for a relatively long time to watch a device. Accordingly, it may be contemplated to install a speech recognition apparatus which recognizes the driver's speech and controls various electrical devices. Unfortunately, a large amount of noise takes place inside the automobile, and so the recognition involves noticeable error.
In order to automatically control or energize various electrical devices according to the change in the state of a small portion within a broad region, and to permit the driver to control various electrical devices relatively precisely in a noncontact manner without requiring great care or large motion, the present inventor has developed an apparatus that turns on and off devices installed on a vehicle in response to the motion of the driver's eyes and mouth, as disclosed in Japanese Patent application No. 272793/1985.
This apparatus makes use of image pattern recognition techniques. In particular, this apparatus uses a camera means for converting an image, or information in the form of light, into an electrical signal and a position-detecting means that detects the position of certain portions of the image. In operation, the apparatus takes a picture of an object or person, such as an automobile driver, and detects the positions of the certain portions of the picture, such as the driver's eyes and mouth. Since the brightness inside the automobile varies, an illuminating means for illuminating the driver, a brightness-setting means for setting the brightness of the illuminating means, and a brightness control means are provided. The brightness control means detects the brightness on the driver's face and adjusts the setting of the brightness-setting means to change the brightness. Thus, the brightness on the driver's face is maintained constant to prevent the image processing from producing error due to variations in the brightness.
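By way of illustration only, this brightness control amounts to a small feedback loop that steps the lamp duty cycle up or down until the measured brightness on the face region sits near a target level. The sketch below assumes that reading; the names and constants (measure_face_brightness, TARGET, STEP) are assumptions for illustration and are not taken from the patent.

```python
# Illustrative sketch of the brightness-holding loop; all names and constants
# (TARGET, STEP, measure_face_brightness) are assumptions, not the disclosure.

TARGET = 128            # desired mean gray level on the face (0-255 scale)
DEADBAND = 10           # tolerated deviation before the duty cycle is stepped
MIN_DUTY, MAX_DUTY = 0.1, 1.0
STEP = 0.05             # one brightness step of the light controller

def adjust_light(duty: float, measure_face_brightness) -> float:
    """Step the lamp duty cycle so the measured face brightness stays near TARGET."""
    brightness = measure_face_brightness()     # mean gray level in the face region
    if brightness < TARGET - DEADBAND:
        duty = min(MAX_DUTY, duty + STEP)      # face too dark: raise duty cycle one step
    elif brightness > TARGET + DEADBAND:
        duty = max(MIN_DUTY, duty - STEP)      # face too bright: lower duty cycle one step
    return duty                                # new duty cycle handed to the light controller
```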
The position of the driver's face may be changed by vibration of the automobile body or may vary because of his or her unintentional minute motion or a change in the posture. Also, the eyes and mouth may be intentionally moved to control electrical devices in a noncontact manner as described later. To precisely extract information about the eyes and mouth from image information in response to the changes in the positions of the face, eyes, and mouth, the apparatus further includes a storage means for storing the detected positions, a window setting means for setting a region narrower than the image produced by the camera means according to the stored positions, a means for setting the region covered by the position-detecting means to the narrower region after a certain period of time elapses since the detected positions are stored in the storage means, and an updating means for updating the positions of the aforementioned certain portions within the narrower region which are stored in the storage means. Once the positions of the certain portions, i.e., the eyes and mouth, are detected, the scan made to detect the eyes and mouth is limited to the narrower region, and so they can be detected quickly. Further, the accuracy with which the detection is made is enhanced. Consequently, the apparatus follows the eyes and mouth quickly and precisely.
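As a rough sketch of this window-following idea (not the patent's routine), the code below searches the full frame once, then restricts each later search to a window re-centered on the last stored position. The Window tuple, the window half-sizes, and detect_in_region are illustrative assumptions.

```python
# Rough sketch of window tracking; Window, the half-sizes, and detect_in_region
# are illustrative assumptions, not the patent's routine.
from typing import Callable, NamedTuple, Optional, Tuple

class Window(NamedTuple):
    x0: int
    y0: int
    x1: int
    y1: int

def window_around(cx: int, cy: int, half_w: int, half_h: int) -> Window:
    """Build a search window centered on the last stored position (cx, cy)."""
    return Window(cx - half_w, cy - half_h, cx + half_w, cy + half_h)

def track(image, last_pos: Optional[Tuple[int, int]],
          detect_in_region: Callable) -> Optional[Tuple[int, int]]:
    """Search the whole frame on the first pass, then only a narrower window."""
    if last_pos is None:
        region = Window(0, 0, image.shape[1], image.shape[0])    # full frame
    else:
        region = window_around(*last_pos, half_w=40, half_h=30)  # narrower region
    pos = detect_in_region(image, region)
    return pos if pos is not None else last_pos                  # keep old position on a miss
```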
This apparatus is further equipped with a state change-detecting means for detecting the states of the eyes and mouth at successive instants of time to detect the changes in the states. Also, the apparatus includes an output-setting means which supplies a control signal or electric power to an electrical device according to the changes in the states. Specifically, when the states of the monitored eyes and mouth are found to change in a predetermined manner, i.e., it is ready to activate the electrical device, electric power is supplied to the device according to the change.
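A minimal sketch of such state-change detection, assuming the mouth state has already been classified for each frame, might look as follows; the trigger pattern and the device hook are hypothetical and only illustrate acting on a predetermined change of state.

```python
# Illustrative only: fire an output when the mouth state follows a predetermined
# open -> closed -> open sequence. The pattern and device hook are assumptions.
from collections import deque
from typing import Callable

PATTERN = ("open", "closed", "open")    # assumed trigger sequence of states

class StateChangeDetector:
    def __init__(self, toggle_device: Callable[[], None]):
        self.history = deque(maxlen=len(PATTERN))
        self.toggle_device = toggle_device

    def update(self, mouth_state: str) -> None:
        """Record the state at this instant and act when the pattern is completed."""
        if not self.history or self.history[-1] != mouth_state:
            self.history.append(mouth_state)      # keep only changes of state
        if tuple(self.history) == PATTERN:
            self.toggle_device()                  # supply the control signal / power
            self.history.clear()
```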
The apparatus enables the driver to control the electrical device by moving his or her eyes or mouth while assuming a posture adequate to drive the automobile. Therefore, the electrical device installed on the automobile can be quite easily operated. This contributes to a comfortable and safe drive. As an example, when the driver utters a word to indicate something, the electrical device is controlled according to the shape of the mouth. If the driver utters no word but moves the mouth intentionally as if to utter a word, then the electrical device is controlled according to the shape of the mouth. Since the operation of the device is not affected by utterance, the detection involves no error in spite of noise produced inside the passenger's compartment. Also, if the radio set is played, or if a passenger is speaking loudly, it is unlikely that the electrical device is caused to malfunction.
The concept of the aforementioned apparatus can be similarly applied to the case where a person other than an automobile driver is monitored. For example, a similar apparatus allows a patient with an advanced disease to operate, stop, or control the surrounding medical instruments or assisting instruments with his or her eyes and mouth.
The apparatus can also monitor a machine to detect abnormality and protect the machine. A certain part or portion of the machine is checked for trouble. If this part or portion operates abnormally, the operation of the machine is stopped, or a warning device is operated. In this way, the above-described apparatus can also be employed with similar utility to monitor an object other than a person.
Further, the invention can be utilized to monitor a broad region such as a natural sight, especially to monitor animals or vehicles moving in the region. For instance, a gate in a safari park can be opened and closed according to the movement of a vehicle or fierce animals. In a manufacturing plant, a belt conveyor line can be monitored to check the parts or products on the conveyor. When they move in a given direction, a safety device is operated, or equipment for the next manufacturing step is run. In this way, the aforementioned apparatus can be used with similar utility in the same manner as the foregoing.
The apparatus described above can detect the driver's head, face, and pupils with high accuracy and yield the foregoing advantages when the monitored object, such as the face of the automobile driver, has a relatively uniform brightness, typically encountered when no car is running in the opposite direction at night and substantially only the interior light illuminates the face, thus permitting the monitoring. However, when the driver's face or head is illuminated with intense light emanating from the headlamps either on a car running in the opposite direction or on a succeeding car even at night, or when the sunlight is intense in the daytime, the external light stronger than the light emitted from the interior light is reflected or intercepted by the driver's face or head. In this situation the brightness on the face frequently becomes nonuniform. That is, intense light is reflected from only a portion of the face; the remaining portion is in shadow and darker. As an example, when the automobile is running in fine weather under the sun located to the right of the automobile, the surroundings of the right eye are very bright, while the surroundings of the left eye are quite dark. In this nonuniform illumination, the accuracy with which the driver's pupils are detected deteriorates, because the apparatus uses only one threshold value in digitizing the whole obtained image. Also, the shape of the driver's mouth is detected with decreased accuracy.
Accordingly, the present inventor has developed an improvement over the aforementioned known apparatus to detect elements, such as the pupils or the mouth or both, of a monitored object, such as the driver's face, with increased accuracy, as disclosed in Japanese Patent application No. 169325/1987. The improved apparatus arithmetically obtains a first gradation histogram for each of small neighboring regions, for example the right half and the left half, within a desired portion such as a human face included in the monitored image. Then, a threshold value for each region is determined, based on the histogram. Information about the gradation of the image is digitized, and a characteristic index (HTY) which indicates the boundary between the hair and the forehead, for example, is determined. This boundary extends through the neighboring regions on the monitored face. Opposite sides of the boundary differ in gray level. A second gradation histogram is created from information about the gradation of an image of a set region Sd based on the determined characteristic index (HTY). The set region Sd contains the eyes. Then, a threshold value (THe) is determined according to this histogram to digitize the gradation of the image of the region (Sd). Thus, the positions of a certain small portion or portions, such as pupils, within the region (Sd) are detected. The certain small portion can be a mouth instead of pupils.
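To make the region-by-region digitization concrete, the sketch below derives a separate threshold for the left and right halves of a grayscale face image from each half's gray-level histogram and binarizes each half with its own threshold. Otsu's rule is used here only as an illustrative stand-in: the patent determines its thresholds from gradation histograms but does not prescribe this particular rule, and the NumPy-based code is an assumption, not the disclosed implementation.

```python
# Sketch of per-region thresholding, assuming an 8-bit grayscale NumPy image.
# Otsu's rule stands in for the patent's histogram-based threshold selection.
import numpy as np

def otsu_threshold(region: np.ndarray) -> int:
    """Pick the gray level that maximizes between-class variance of the histogram."""
    hist = np.bincount(region.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def binarize_halves(face: np.ndarray) -> np.ndarray:
    """Digitize the left and right halves of the face with separate thresholds."""
    h, w = face.shape
    out = np.zeros_like(face)
    for sl in (np.s_[:, : w // 2], np.s_[:, w // 2 :]):   # left half, right half
        out[sl] = (face[sl] > otsu_threshold(face[sl])).astype(face.dtype)
    return out
```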
Determination of a threshold value from a gradation histogram and digitization of an analog signal are known in the field of object recognition image processing. These techniques are adequate to separate an object located in front of the background from the background of the image when the concentration of the image varies. Accordingly, this improved apparatus can precisely detect the characteristic index which indicates the upper end of the forehead. This digitization is adequate to detect a characteristic index (HTY) indicating the boundary between the background, or hair, and the main portion, or forehead, in each divided region even if the monitored object is not uniformly illuminated or the brightness of the light source itself varies. Hence, the index (HTY) can be detected with accuracy. The index (HTY) represents a reference position on the detected object, or face.
The region (Sd) surrounding the eyes is set according to the characteristic index (HTY). A threshold value is set according to a gradation histogram obtained from this region (Sd). Then, an analog signal is transformed into binary codes, using the threshold value. These techniques are adequate to define the given region (Sd) containing the certain small regions, or pupils, of the detected object, and to separate the pupils, whose gray levels suddenly change in the region (Sd), from the background, or the surroundings of the pupils, if the object is illuminated asymmetrically or the brightness of the light source itself varies. Consequently, the certain small portions, or the pupils, can be detected accurately. Also, the small portions can be the mouth or lips.

In this manner, the improved apparatus is capable of detecting given portions of an object accurately if the object is illuminated asymmetrically or the brightness of the light source itself varies.
If the driver sitting on the driver's seat of an automobile shifts the seat forward or rearward to adjust the posture for driving, the distance between the camera means and the subject, or face, changes. At this time, an automatic focusing device prevents the obtained image from getting blurred. However, the possibility that elements of the image are incorrectly detected, e.g., the nostrils are regarded as the mouth, increases.
SUMMARY OF THE INVENTION

It is an object of the present invention to provide a system capable of detecting elements of an image with increased accuracy.
The above object is achieved in accordance with the invention by a system comprising: a camera which converts optical information obtained from the image into an electrical signal; a position-detecting circuit for detecting three or more elements of the image and their positions according to the electrical signal; a distance-detecting circuit for detecting the distances between the detected elements; a normalizing circuit for normalizing data about the detected distances with the distance between a given two of the detected elements; a storage circuit which holds reference values previously assigned to the elements of the image; a similarity degree-calculating circuit which compares the normalized data about the distances with the reference values and produces data about the degrees of similarity to the elements of a reference image; and a determining circuit which determines whether the image has been successfully detected, from the data about the degrees of similarity of the detected elements.
The position-detecting circuit detects the positions of three or more elements, such as the right pupil, the left pupil, the nostrils, and the mouth, of an image such as a human face. The distance-detecting circuit detects the distances between the elements. The distances are normalized with the distance between certain elements. Therefore, the normalized data indicating the distances between the elements are substantially independent of the distance between the camera means and the image. The similarity degree-detecting circuit compares the normalized data with reference values which are stored in the storage circuit and have been previously assigned to the elements of the image, to produce data about the degrees of similarity of the detected elements to the elements of the reference image. The degrees of similarity indicate the degrees to which the positions of the elements of the optical image formed by the camera bear resemblance to the positions of the elements of the reference image, or the normalized data about the distances between the elements. As the degrees of similarity of the elements increase, the optical image formed by the camera means approaches the reference image. The determining circuit determines whether the elements have been detected successfully, based on the data about the degrees of similarity. That is, if a high degree of similarity is obtained, then it is found that the image formed by the camera approximates the reference image. Conversely, if a low degree of similarity is obtained, then the image formed by the camera is judged to be different from the reference image.
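The normalization and comparison can be pictured with the short sketch below: distances between detected elements are divided by the inter-pupil distance and scored against stored reference ratios. The reference values, the exponential scoring formula, and the acceptance threshold are illustrative assumptions; the patent computes its degrees of similarity in the routine of FIG. 9a rather than with this particular formula.

```python
# Illustrative sketch, not the patent's actual routine: normalize inter-element
# distances by the pupil-to-pupil distance and score them against references.
import math
from typing import Dict, Tuple

Point = Tuple[float, float]

def dist(a: Point, b: Point) -> float:
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Hypothetical reference ratios (distance / inter-pupil distance) for a face.
REFERENCE = {"pupil_to_mouth": 1.1, "pupil_to_nostrils": 0.7}

def similarities(right_pupil: Point, left_pupil: Point,
                 nostrils: Point, mouth: Point) -> Dict[str, float]:
    """Return a similarity score in (0, 1] for each normalized distance."""
    ew = dist(right_pupil, left_pupil)            # inter-pupil distance (normalizer)
    mid = ((right_pupil[0] + left_pupil[0]) / 2,
           (right_pupil[1] + left_pupil[1]) / 2)
    measured = {
        "pupil_to_mouth": dist(mid, mouth) / ew,
        "pupil_to_nostrils": dist(mid, nostrils) / ew,
    }
    return {k: math.exp(-abs(v - REFERENCE[k]) / REFERENCE[k])
            for k, v in measured.items()}

def face_detected(scores: Dict[str, float], threshold: float = 0.8) -> bool:
    """Judge the face successfully detected if every element is similar enough."""
    return all(s >= threshold for s in scores.values())
```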
Accordingly, where the elements such as the pupils and the mouth of an automobile driver's face, for example, are detected to turn on and off or otherwise control electrical devices installed on an automobile according to the shapes of the elements and the pattern of change in the shapes, the decision to determine whether the image formed by the camera is the face or not can be made precisely. Therefore, the electrical devices can be controlled with reduced error. Especially, where the distance between the camera and the driver's face changes, as encountered when the driver moves the seat forward or rearward, the pupils and the mouth can be detected correctly. The electrical devices can be controlled precisely according to the positions of the elements and the changes in the shapes.

Other objects and features of the invention will appear in the course of description thereof which follows.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1a is a block diagram of a system according to the invention;
FIG. 1b is a perspective view of the dashboard of an automobile, for showing the arrangement of the camera 3 and the light 4 shown in FIG. 1a;
FIG. 2 is a flowchart schematically illustrating a sequence of operations performed by the microprocessor 6 shown in FIG. 1a;
FIGS. 3, 4, 5a, 5b, 5c, 6, 7a, 7b, 7c, 8a, 8b, 8c, 8d, 9a, 9b, 9c, 9d, 10a, 10b, 11a, 11b, 12a, 12b, and 12c are flowcharts particularly illustrating operations performed by the microprocessors 6 and 8 shown in FIG. 1a;
FIGS. 13a, 13b, 13c, 13d, 13e, 13f, 13g, 13h, 13i, and 13j are plan views of all or some of images taken by the camera 3 shown in FIGS. 1a and 1b;
FIG. 14 is a diagram showing the relations between the degrees of similarity F1-F14 and the degrees of certitude F21-F32 calculated in the routine (FAD) illustrated in FIG. 9a for checking detection of a face;
FIGS. 15a, 15b, 15c, 15d, 15e, 15f, and 15g are plan views of window regions Wc formed for searching for pupils and detected pupils;
FIG. 16a is a plan view of a window region Wm formed for searching for a mouth and a detected mouth; and
FIG. 16b is a plan view of a mouth, for showing various shapes of the mouth taken to pronounce vowels.
DETAILED DESCRIPTION OF THE INVENTION

Referring to FIG. 1a, there is shown a system embodying the concept of the present invention. This system is installed on an automobile and acts to turn on or off, increase or decrease the power, or otherwise control electrical devices installed on the automobile, according to intentional movement of the pupils and the mouth of the driver's face.

The system includes a TV camera 3 and a light 4 that illuminates at least the driver's face. The camera 3 and the light 4 are combined into a unit and mounted on the instrument panel 2 so as to be movable vertically and horizontally, as shown in FIG. 1b. Indicated by numeral 1 in FIG. 1b is the steering wheel turned by the driver to steer the vehicle.
Referring again to FIG. 1a, the light 4 is turned on and powered by a light controller 19, which also controls the brightness of the light. The light 4 consists of an incandescent lamp using a filament as a light source. The controller 19 rapidly switches on and off a direct voltage, using a thyristor chopper, and applies it to the light 4. When an OFF signal is applied to the controller 19 to turn off the light, the controller turns off the thyristor chopper. When an ON signal is applied to the controller 19, it switches on and off the chopper at a normal duty cycle. When a signal indicating an increase in the brightness arrives at the controller, it increases the duty cycle by one step. When a signal indicating a decrease in the brightness is fed to the controller, it reduces the duty cycle by one step. If the brightness is increased to its maximum level or reduced to its minimum level, the brightness is no longer changed. A micropr
