Pirim et al.

(10) Patent No.: US 6,717,518 B1
(45) Date of Patent: Apr. 6, 2004

(54) METHOD AND APPARATUS FOR DETECTION OF DROWSINESS

(75) Inventors: Patrick Pirim, Paris (FR); Thomas Binford, Cupertino, CA (US)

(73) Assignee: Holding B.E.V. S.A., Luxembourg (LU)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 09/600,390

(22) PCT Filed: Jan. 15, 1999

(86) PCT No.: PCT/EP99/00300
     § 371 (c)(1), (2), (4) Date: Feb. 9, 2001

(87) PCT Pub. No.: WO99/36893
     PCT Pub. Date: Jul. 22, 1999

(30) Foreign Application Priority Data
     Jan. 15, 1998 (FR) .............................. 98 00378
     Aug. 25, 1998 (WO) .................... PCT/EP98/05383

(51) Int. Cl.7 ...................................... G08B 23/00
(52) U.S. Cl. ..................... 340/576; 348/143; 382/117
(58) Field of Search ................... 340/575, 576; 382/100, 103, 115, 117; 348/143, 148, 672

(56) References Cited

U.S. PATENT DOCUMENTS

4,259,665 A     3/1981  Manning ................. 340/575
4,485,375 A    11/1984  Hershberger ............. 340/576
4,555,697 A    11/1985  Thackrey ................ 340/575
4,928,090 A     5/1990  Yoshimi et al. .......... 340/575
5,195,606 A     3/1993  Martyniuk ............... 180/272
5,218,387 A *   6/1993  Ueno et al. ............. 351/210
5,353,013 A    10/1994  Estrada ................. 340/575
5,402,109 A     3/1995  Mannik .................. 340/575
5,469,143 A    11/1995  Cooper .................. 340/575
5,481,622 A *   1/1996  Gerhardt et al. ......... 382/103
5,682,144 A    10/1997  Mannik .................. 340/575
5,684,461 A    11/1997  Jones ................... 340/575
5,689,241 A    11/1997  Clarke, Sr. et al. ...... 340/575
5,786,765 A *   7/1998  Kumakura et al. ......... 340/576
5,813,993 A     9/1998  Kaplan et al. ........... 600/544
5,878,156 A *   3/1999  Okumura ................. 382/118
6,304,187 B1 * 10/2001  Pirim ................... 340/576

FOREIGN PATENT DOCUMENTS

WO    WO 98/05002 A1    2/1998  ............ G06T/7/20
`
`OTHER PUBLICATIONS
`
`Ueno, H. et al.: “Development of Drowsiness Detection
`System” 1994 Vehicle Navigation & Information Systems
`Conference Proceedings, pp. 15—20, XP 000641294, Aug.
`1994.
`
`* cited by examiner
`
`Primary Examiner—Thomas Mullen
(74) Attorney, Agent, or Firm—Townsend and Townsend
and Crew LLP
`
`(57)
`ABSTRACT
`In a process of detecting a person falling asleep, an image of
`hf fh
`'
`~dP~ l fh'
`h '
`t e aceo t eperson 1s acquire . 1xe so t eimage av1ng
`Characteristics corresponding to an eye of the person are
`Selected and a histogram is formed of the Selected Pixels
`The histogram is analyZed over time to identify each open
`ing and closing of the eye, and characteristics indicative of
`the person falling asleep are determined. A sub-area of the
`image including the eye may be determined by identifying
`the head or a facial characteristic of the person, and then
`identifying the sub-area using an anthropomorphic model.
`To determine Openings and Closings 0f the eyes, histograms
`of shadoWed pixels of the eye are analyZed to determine the
`Width and height of the shadoWing, or histograms of move
`ment corresponding to blinking are analyZed. An apparatus
`for detecting a person falling asleep includes a sensor for
`acquiring an image of the face of the person, a controller,
`and a histogram formation unit for forming a histogram on
`ixels havin selected characteristics Also disclosed is a
`p
`.
`.g
`.
`.'
`rear'vlew mm“ assembly mcorporatmg the apparatus'
`
`39 Claims, 20 Drawing Sheets
`
`Page 1 of 40
`
`SAMSUNG EXHIBIT 1001
`Samsung v. Image Processing Techs.
`
`
`
[Drawing sheet 1 of 20: FIG. 1]
`
`
`
[Drawing sheet 2 of 20]
`
`
`
[Drawing sheet 3 of 20: FIGS. 5 and 6]
`
`
`
[Drawing sheet 4 of 20]
`
`
`
[Drawing sheet 5 of 20: FIG. 11]
`
`
`
[Drawing sheet 6 of 20: FIG. 12]
`
`
`
[Drawing sheet 7 of 20: FIGS. 13 and 16]
`
`
`
[Drawing sheet 8 of 20: FIG. 14]
`
`
`
[Drawing sheet 9 of 20: FIGS. 15A and 15B]
`
`
`
[Drawing sheet 10 of 20: FIG. 17]
`
`
`
[Drawing sheet 11 of 20]
`
`
`
[Drawing sheet 12 of 20]
`
`
`
[Drawing sheet 13 of 20]
`
`
`
[Drawing sheet 14 of 20]
`
`
`
[Drawing sheet 15 of 20: FIG. 28]
`
`
`
[Drawing sheet 16 of 20: FIG. 30]
`
`
`
[Drawing sheet 17 of 20: FIGS. 31 and 32]
`
`
`
[Drawing sheet 18 of 20: FIGS. 33 and 34]
`
`
`
[Drawing sheet 19 of 20: FIG. 35]
`
`
`
[Drawing sheet 20 of 20]
`
`
`
`METHOD AND APPARATUS FOR
`DETECTION OF DROWSINESS
`
`BACKGROUND OF THE INVENTION
`
`1. Field of the Invention
`
`The present invention relates generally to an image pro-
`cessing system, and more particularly to the use of a generic
`image processing system to detect drowsiness.
`2. Description of the Related Art
`It is well known that a significant number of highway
`accidents result from drivers becoming drowsy or falling
`asleep, which results in many deaths and injuries. Drowsi-
`ness is also a problem in other fields, such as for airline
`pilots and power plant operators, in which great damage may
result from failure to stay alert.
`A number of different physical criteria may be used to
`establish when a person is drowsy, including a change in the
`duration and interval of eye blinking. Normally, the duration
of blinking is about 100 to 200 ms when awake and about 500
`to 800 ms when drowsy. The time interval between succes-
`sive blinks is generally constant while awake, but varies
`within a relatively broad range when drowsy.
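By way of illustration only (this sketch is not part of the patent disclosure), the blink criteria above can be turned into a simple classifier. The 500 ms duration threshold follows the figures quoted above, but the interval-variability cut-off of 0.25 is an assumed value chosen for the example.

```python
def is_drowsy(durations_ms, intervals_ms,
              long_blink_ms=500.0, interval_cv_limit=0.25):
    """Illustrative drowsiness heuristic from blink statistics.

    durations_ms -- durations of recent blinks (awake: ~100-200 ms,
                    drowsy: ~500-800 ms)
    intervals_ms -- gaps between successive blinks (roughly constant
                    while awake, widely varying when drowsy)
    """
    mean_dur = sum(durations_ms) / len(durations_ms)
    mean_int = sum(intervals_ms) / len(intervals_ms)
    var_int = sum((i - mean_int) ** 2 for i in intervals_ms) / len(intervals_ms)
    cv_int = (var_int ** 0.5) / mean_int  # coefficient of variation
    # drowsy if blinks are long, or if intervals vary widely
    return mean_dur >= long_blink_ms or cv_int > interval_cv_limit
```

A window of long blinks, or of erratically spaced blinks, would flag drowsiness under this sketch.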
`Numerous devices have been proposed to detect drowsi-
`ness of drivers. Such devices are shown, for example, in
U.S. Pat. Nos. 5,841,354; 5,813,993; 5,689,241; 5,684,461;
5,682,144; 5,469,143; 5,402,109; 5,353,013; 5,195,606;
4,928,090; 4,555,697; 4,485,375; and 4,259,665. In general,
these devices fall into three categories: i) devices that detect
movement of the head of the driver, e.g., tilting; ii) devices
that detect a physiological change in the driver, e.g., altered
heartbeat or breathing; and iii) devices that detect a physical
result of the driver falling asleep, e.g., a reduced grip of the
steering wheel. None of these devices is believed to have met
with commercial success.
`
`The German patent application DE 19715519 and the
`corresponding French patent application FR-2.747.346 dis-
`close an apparatus and process of evaluation of the drowsi-
`ness level of a driver using a video camera placed near the
`feet of the driver and a processing unit for processing the
`camera image with software detecting the blinks of the eyes
`to determine the time gap between the beginning and the end
of the blinks. More particularly, a unit of the processor
realizes:
a memorization of the video image and its treatment, so
as to determine an area comprising the driver’s eyes;
the detection of the time gap between the closing of the
driver’s eyelids and their full opening; and
a treatment in a memory and a processor in combination
with the unit to calculate a ratio of slow blink apparition.
`
The object of the international patent application pub-
lished as WO-97/01246 is a security system comprising a
video camera placed within the rear-view mirror of a car and
a video screen remotely disposed for the analysis of what is
happening in and around the car, as well as of what has
happened, owing to the recording of the output video signal
of the camera. This is in fact a concealed camera (within the
rear-view mirror), imperceptible to vandals and thieves,
which observes a large scope including the inside of the car
and its surroundings, the recording allowing one to know
later what has happened in this scope (page 6, lines 13 to
19); it is not a detector whose effective angle is strictly
limited to the face of the car driver in order to detect possible
drowsiness and wake the driver.
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`2
`Commonly-owned PCT Application Ser. Nos. PCT/FR97/
`01354 and PCT/EP98/05383 disclose a generic image pro-
`cessing system that operates to localize objects in relative
`movement in an image and to determine the speed and
`direction of the objects in real-time. Each pixel of an image
`is smoothed using its own time constant. A binary value
`corresponding to the existence of a significant variation in
`the amplitude of the smoothed pixel from the prior frame,
`and the amplitude of the variation, are determined, and the
`time constant for the pixel is updated. For each particular
`pixel, two matrices are formed that include a subset of the
`pixels spatially related to the particular pixel. The first
`matrix contains the binary values of the subset of pixels. The
`second matrix contains the amplitude of the variation of the
`subset of pixels. In the first matrix, it is determined whether
`the pixels along an oriented direction relative to the particu-
`lar pixel have binary values representative of significant
`variation, and, for such pixels, it is determined in the second
`matrix whether the amplitude of these pixels varies in a
`known manner indicating movement in the oriented direc-
tion. In domains that include luminance, hue, saturation,
speed, oriented direction, time constant, and x and y
position, a histogram is formed of the values in the first and
second matrices falling in user-selected combinations of
such domains. Using the histograms, it is determined
whether there is an area having the characteristics of the
selected combinations of domains.
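The per-pixel temporal stage described above can be pictured, purely as an illustrative sketch and not as the implementation of the PCT applications, as follows. The variation threshold and the time-constant bounds are assumed values for the example.

```python
def update_pixel(prev_smooth, new_value, time_const,
                 threshold=16, tc_min=1, tc_max=8):
    """One temporal-processing step for a single pixel.

    Returns (smoothed, significant, amplitude, new_time_const):
    `significant` is the binary flag marking a significant variation
    from the prior (smoothed) value, `amplitude` is its magnitude,
    and the per-pixel time constant is updated -- shrinking where the
    image is active so the pixel reacts faster, growing where it is
    quiet.
    """
    diff = new_value - prev_smooth
    amplitude = abs(diff)
    significant = 1 if amplitude > threshold else 0
    # exponential smoothing with the pixel's own time constant
    smoothed = prev_smooth + diff / time_const
    if significant:
        new_tc = max(tc_min, time_const - 1)
    else:
        new_tc = min(tc_max, time_const + 1)
    return smoothed, significant, amplitude, new_tc
```

Running this over every pixel of successive frames yields the binary-value and amplitude matrices from which the histograms are formed.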
`
`It would be desirable to apply such a generic image
`processing system to detect the drowsiness of a person.
`SUMMARY OF THE INVENTION
`
`The present invention is a process of detecting a driver
`falling asleep in which an image of the face of the driver is
`acquired. Pixels of the image having characteristics corre-
`sponding to characteristics of at least one eye of the driver
`are selected and a histogram is formed of the selected pixels.
`The histogram is analyzed over time to identify each open-
`ing and closing of the eye, and from the eye opening and
`closing information, characteristics indicative of a driver
`falling asleep are determined.
`In one embodiment, a sub-area of the image comprising
`the eye is determined prior to the step of selecting pixels of
`the image having characteristics corresponding to charac-
`teristics of an eye. In this embodiment, the step of selecting
`pixels of the image having characteristics of an eye involves
`selecting pixels within the sub-area of the image. The step of
`identifying a sub-area of the image preferably involves
`identifying the head of the driver, or a facial characteristic of
`the driver, such as the driver’s nostrils, and then identifying
`the sub-area of the image using an anthropomorphic model.
`The head of the driver may be identified by selecting pixels
`of the image having characteristics corresponding to edges
`of the head of the driver. Histograms of the selected pixels
`of the edges of the driver’s head are projected onto orthogo-
`nal axes. These histograms are then analyzed to identify the
`edges of the driver’s head.
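As a schematic stand-in for the hardware histogram formation units (not the patented implementation itself), the projection-and-edge-finding step can be sketched as follows; the 20 % of-peak cut-off is an arbitrary assumption for the example.

```python
def find_span(mask, axis, frac=0.2):
    """Project a binary mask of selected pixels (e.g. head-edge
    pixels) onto one axis and return the first and last bins whose
    count exceeds a fraction of the histogram peak -- a crude way to
    locate the head edges from the projected histograms."""
    if axis == 0:                      # project onto x: sum each column
        hist = [sum(col) for col in zip(*mask)]
    else:                              # project onto y: sum each row
        hist = [sum(row) for row in mask]
    cutoff = frac * max(hist)
    idx = [i for i, v in enumerate(hist) if v > cutoff]
    return (idx[0], idx[-1]) if idx else None
```

The two spans (one per axis) bound the head, from which the eye sub-area can then be taken via the anthropomorphic model.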
`The facial characteristic of the driver may be identified by
`selecting pixels of the image having characteristics corre-
`sponding to the facial characteristic. Histograms of the
`selected pixels of the facial characteristic are projected onto
`orthogonal axes. These histograms are then analyzed to
`identify the facial characteristic. If desired,
`the step of
`identifying the facial characteristic in the image involves
`searching sub-images of the image until the facial charac-
`teristic is found. In the case in which the facial characteristic
`
`is the nostrils of the driver, a histogram is formed of pixels
having low luminance levels to detect the nostrils. To
confirm detection of the nostrils, the histograms of the
nostril pixels may be analyzed to determine whether the
`spacing between the nostrils is within a desired range and
`whether the dimensions of the nostrils fall within a desired
`range. In order to confirm the identification of the facial
`characteristic, an anthropomorphic model and the location
`of the facial characteristic are used to select a sub-area of the
`image containing a second facial characteristic. Pixels of the
`image having characteristics corresponding to the second
`facial characteristic are selected and histograms of the
`selected pixels of the second facial characteristic are ana-
`lyzed to confirm the identification of the first facial charac-
`teristic.
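The nostril sanity check described above might look like the following sketch; it is illustrative only, and the numeric spacing and width ranges are assumed pixel values, not figures from the patent.

```python
def nostrils_plausible(x_spans, spacing_range=(8, 40), width_range=(2, 12)):
    """Check two candidate nostril blobs, given as (min_x, max_x)
    spans from the x-axis histogram: both must have a plausible
    width, and the gap between them must fall in a plausible range."""
    if len(x_spans) != 2:
        return False
    (l0, r0), (l1, r1) = sorted(x_spans)
    widths_ok = all(width_range[0] <= r - l <= width_range[1]
                    for l, r in ((l0, r0), (l1, r1)))
    spacing = l1 - r0          # gap between the two blobs
    return widths_ok and spacing_range[0] <= spacing <= spacing_range[1]
```

Candidates that fail the check would be rejected, and the search would continue in the next sub-image.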
`
`In order to determine openings and closings of the eyes of
`the driver, the step of selecting pixels of the image having
`characteristics corresponding to characteristics of an eye of
`the driver involves selecting pixels having low luminance
`levels corresponding to shadowing of the eye.
`In this
`embodiment, the step of analyzing the histogram over time
`to identify each opening and closing of the eye involves
`analyzing the shape of the eye shadowing to determine
`openings and closings of the eye. The histograms of shad-
`owed pixels are preferably projected onto orthogonal axes,
`and the step of analyzing the shape of the eye shadowing
`involves analyzing the width and height of the shadowing.
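A minimal sketch of this shape test, assuming a single aspect-ratio cut-off (the 0.35 value is an assumption for the example, not a figure from the patent):

```python
def eye_state(shadow_width, shadow_height, open_ratio=0.35):
    """Classify the eye as open or closed from the width and height
    of the shadowed region, as read off the x and y projection
    histograms.  An open eye casts a taller shadow relative to its
    width than a closed one."""
    if shadow_width == 0:
        return "closed"
    return "open" if shadow_height / shadow_width >= open_ratio else "closed"
```

Tracking this state frame by frame yields the opening and closing events from which blink durations are measured.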
`An alternative method of determining openings and clos-
`ings of the eyes of the driver involves selecting pixels of the
`image having characteristics of movement corresponding to
`blinking. In this embodiment,
`the step of analyzing the
`histogram over time to identify each opening and closing of
`the eye involves analyzing the number of pixels in move-
`ment corresponding to blinking over time. The characteris-
`tics of a blinking eye are preferably selected from the group
`consisting of i) DP=1, ii) CO indicative of a blinking eyelid,
`iii) velocity indicative of a blinking eyelid, and iv) up and
`down movement indicative of a blinking eyelid.
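The movement-based alternative reduces to counting, per frame, the pixels whose movement matches a blinking eyelid and segmenting the resulting time series into blinks. The sketch below is illustrative; the 50-pixel threshold is an assumed value.

```python
def blink_frames(moving_counts, threshold=50):
    """Given, per frame, the number of pixels in movement
    corresponding to blinking, return (start, end) frame pairs for
    each blink: runs where the count stays at or above a threshold."""
    blinks, start = [], None
    for i, n in enumerate(moving_counts):
        if n >= threshold and start is None:
            start = i                      # blink begins
        elif n < threshold and start is not None:
            blinks.append((start, i - 1))  # blink ends
            start = None
    if start is not None:                  # blink still open at end
        blinks.append((start, len(moving_counts) - 1))
    return blinks
```

Blink durations and inter-blink intervals then follow directly from the frame pairs and the frame rate.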
`An apparatus for detecting a driver falling asleep includes
`a sensor for acquiring an image of the face of the driver, a
`controller, and a histogram formation unit for forming a
`histogram on pixels having selected characteristics. The
`controller controls the histogram formation unit to select
`pixels of the image having characteristics corresponding to
`characteristics of at least one eye of the driver and to form
`a histogram of the selected pixels. The controller analyzes
the histogram over time to identify each opening and
closing of the eye, and determines, from the opening and
closing information on the eye, characteristics indicative of
the driver falling asleep.
In one embodiment, the controller interacts with the
histogram formation unit to identify a sub-area of the image
`comprising the eye, and the controller controls the histogram
`formation unit to select pixels of the image having charac-
`teristics corresponding to characteristics of the eye only
`within the sub-area of the image. In order to select the
`sub-area of the image,
`the controller interacts with the
`histogram formation unit to identify the head of the driver in
`the image, or a facial characteristic of the driver, such as the
`driver’s nostrils. The controller then identifies the sub-area
`
`of the image using an anthropomorphic model. To identify
`the head of the driver, the histogram formation unit selects
`pixels of the image having characteristics corresponding to
`edges of the head of the driver and forms histograms of the
`selected pixels projected onto orthogonal axes. To identify a
`facial characteristic of the driver, the histogram formation
`unit selects pixels of the image having characteristics cor-
`responding to the facial characteristic and forms histograms
`of the selected pixels projected onto orthogonal axes. The
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`4
`controller then analyzes the histograms of the selected pixels
`to identify the edges of the head of the driver or the facial
`characteristic, as the case may be. If the facial characteristic
`is the nostrils of the driver, the histogram formation unit
`selects pixels of the image having low luminance levels
`corresponding to the luminance level of the nostrils. The
`controller may also analyze the histograms of the nostril
`pixels to determine whether the spacing between the nostrils
`is within a desired range and whether dimensions of the
`nostrils fall within a desired range. If desired, the controller
`may interact with the histogram formation unit to search
`sub-images of the image to identify the facial characteristic.
`In order to verify identification of the facial characteristic,
`the controller uses an anthropomorphic model and the
`location of the facial characteristic to cause the histogram
`formation unit to select a sub-area of the image containing
`a second facial characteristic. The histogram formation unit
`selects pixels of the image in the sub-area having charac-
`teristics corresponding to the second facial characteristic and
`forms a histogram of such pixels. The controller then
`analyzes the histogram of the selected pixels corresponding
`to the second facial characteristic to identify the second
`facial characteristic and to thereby confirm the identification
`of the first facial characteristic.
`
`In one embodiment, the histogram formation unit selects
`pixels of the image having low luminance levels correspond-
`ing to shadowing of the eyes, and the controller then
`analyzes the shape of the eye shadowing to identify shapes
`corresponding to openings and closings of the eye. The
`histogram formation unit preferably forms histograms of the
`shadowed pixels of the eye projected onto orthogonal axes,
`and the controller analyzes the width and height of the
`shadowing to determine openings and closings of the eye.
`In an alternative embodiment, the histogram formation
`unit selects pixels of the image in movement corresponding
`to blinking and the controller analyzes the number of pixels
`in movement over time to determine openings and closings
`of the eye. The characteristics of movement corresponding
to blinking are preferably selected from the group consisting
of i) DP=1, ii) CO indicative of a blinking eyelid, iii)
velocity indicative of a blinking eyelid, and iv) up and down
movement indicative of a blinking eyelid.
`If desired, the sensor may be integrally constructed with
`the controller and the histogram formation unit. The appa-
`ratus may comprise an alarm, which the controller operates
`upon detection of the driver falling asleep, and may com-
`prise an illumination source, such as a source of IR radiation,
`with the sensor being adapted to view the driver when
`illuminated by the illumination source.
`A rear-view mirror assembly comprises a rear-view mir-
`ror and the described apparatus for detecting driver drowsi-
`ness mounted to the rear-view mirror. In one embodiment, a
`bracket attaches the apparatus to the rear-view mirror. In an
`alternative embodiment, the rear-view mirror comprises a
`housing having an open side and an interior. The rear-view
`mirror is mounted to the open side of the housing, and is
`see-through from the interior of the housing to the exterior
`of the housing. The drowsiness detection apparatus is
`mounted interior to the housing with the sensor directed
`toward the rear-view mirror. If desired, a joint attaches the
`apparatus to the rear-view mirror assembly, with the joint
`being adapted to maintain the apparatus in a position facing
`the driver during adjustment of the mirror assembly by the
`driver. The rear-view mirror assembly may include a source
`of illumination directed toward the driver, with the sensor
`adapted to view the driver when illuminated by the source of
`
`
`
illumination. The rear-view mirror assembly may also
include an alarm, with the controller operating the alarm
upon detection of the driver falling asleep. Also disclosed is
a vehicle comprising the drowsiness detection device.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagrammatic illustration of the system accord-
ing to the invention.
FIG. 2 is a block diagram of the temporal and spatial
processing units of the invention.
FIG. 3 is a block diagram of the temporal processing unit
of the invention.
FIG. 4 is a block diagram of the spatial processing unit of
the invention.
FIG. 5 is a diagram showing the processing of pixels in
accordance with the invention.
FIG. 6 illustrates the numerical values of the Freeman
code used to determine movement direction in accordance
with the invention.
FIG. 7 illustrates nested matrices as processed by the
temporal processing unit.
FIG. 8 illustrates hexagonal matrices as processed by the
temporal processing unit.
FIG. 9 illustrates reverse-L matrices as processed by the
temporal processing unit.
FIG. 10 illustrates angular sector shaped matrices as
processed by the temporal processing unit.
FIG. 11 is a block diagram showing the relationship
between the temporal and spatial processing units, and the
histogram formation units.
FIG. 12 is a block diagram showing the interrelationship
between the various histogram formation units.
FIG. 13 shows the formation of a two-dimensional his-
togram of a moving area from two one-dimensional histo-
grams.
FIG. 14 is a block diagram of an individual histogram
formation unit.
FIGS. 15A and 15B illustrate the use of a histogram
formation unit to find the orientation of a line relative to an
analysis axis.
FIG. 16 illustrates a one-dimensional histogram.
FIG. 17 illustrates the use of semi-graphic sub-matrices to
select desired areas of an image.
FIG. 18 is a side view illustrating a rear view mirror in
combination with the drowsiness detection system of the
invention.
FIG. 19 is a top view illustrating operation of a rear view
mirror.
FIG. 20 is a schematic illustrating operation of a rear view
mirror.
FIG. 21 is a cross-sectional top view illustrating a rear
view mirror assembly incorporating the drowsiness detec-
tion system of the invention.
FIG. 22 is a partial cross-sectional top view illustrating a
joint supporting the drowsiness detection system of the
invention in the mirror assembly of FIG. 21.
FIG. 23 is a top view illustrating the relationship between
the rear view mirror assembly of FIG. 21 and a driver.
FIG. 24 illustrates detection of the edges of the head of a
person using the system of the invention.
FIG. 25 illustrates masking outside of the edges of the
head of a person.
FIG. 26 illustrates masking outside of the eyes of a
person.
FIG. 27 illustrates detection of the eyes of a person using
the system of the invention.
FIG. 28 illustrates successive blinks in a three-
dimensional orthogonal coordinate system.
FIGS. 29A and 29B illustrate conversion of peaks and
valleys of eye movement histograms to information indica-
tive of blinking.
FIG. 30 is a flow diagram illustrating the use of the system
of the invention to detect drowsiness.
FIG. 31 illustrates the use of sub-images to search a
complete image.
FIG. 32 illustrates the use of the system of the invention
to detect nostrils and to track eye movement.
FIG. 33 illustrates the use of the system of the invention
to detect an open eye.
FIG. 34 illustrates the use of the system of the invention
to detect a closed eye.
FIG. 35 is a flow diagram of an alternative method of
detecting drowsiness.
FIG. 36 illustrates use of the system to detect a pupil.
`
`DETAILED DESCRIPTION OF THE
`INVENTION
`
The present invention discloses an application of the
generic image processing system disclosed in commonly-
owned PCT Application Serial Nos. PCT/FR97/01354 and
PCT/EP98/05383, the contents of which are incorporated
herein by reference, for detection of various criteria associ-
ated with the human eye, and especially to detection that a
driver is falling asleep while driving a vehicle.
`The apparatus of the invention is similar to that described
`in the aforementioned PCT application Ser. Nos. PCT/FR97/
`01354 and PCT/EP98/05383, which will be described herein
`for purposes of clarity. Referring to FIGS. 1 and 11, the
`generic image processing system 22 includes a spatial and
`temporal processing unit 11 in combination with a histogram
`formation unit 22a. Spatial and temporal processing unit 11
`includes an input 12 that receives a digital video signal S
`originating from a video camera or other imaging device 13
`which monitors a scene 13a. Imaging device 13 is preferably
`a conventional CMOS-type CCD camera, which for pur-
`poses of the presently-described invention is mounted on a
`vehicle facing the driver. It will be appreciated that when
`used in non-vehicular applications,
`the camera may be
`mounted in any desired fashion to detect the specific criteria
`of interest. It is also foreseen that any other appropriate
`sensor, e.g., ultrasound, IR, Radar, etc., may be used as the
`imaging device. Imaging device 13 may have a direct digital
`output, or an analog output that is converted by an A/D
`converter into digital signal S. Imaging device 13 may also
`be integral with generic image processing system 22, if
`desired, for example as represented by element 13A.
While signal S may be a progressive signal, it is prefer-
ably composed of a succession of pairs of interlaced frames,
TR1 and TR'1 and TR2 and TR'2, each consisting of a
succession of horizontal scanned lines, e.g., l1,1, l1,2, . . . ,
l1,17 in TR1, and l2,1 in TR2. Each line consists of a
succession of pixels or image points PI, e.g., a1,1, a1,2 and
a1,3 for line l1,1; a17,1 and a17,2 for line l1,17; a1,1 and
a1,2 for line l2,1. Signal S(PI) represents S composed of
pixels PI.
`S(PI) includes a frame synchronization signal (ST) at the
`beginning of each frame, a line synchronization signal (SL)
`
`
`at the beginning of each line, and a blanking signal (BL).
`Thus, S(PI) includes a succession of frames, which are
`representative of the time domain, and within each frame, a
`series of lines and pixels, which are representative of the
`spatial domain.
`In the time domain, “successive frames” shall refer to
successive frames of the same type (i.e., odd frames such as
TR1 or even frames such as TR'1), and “successive pixels in
the same position” shall denote successive values of the
pixels (PI) in the same location in successive frames of the
same type, e.g., a1,1 of l1,1 in frame TR1 and a1,1 of l1,1 in
the next corresponding frame TR2.
`
`Spatial and temporal processing unit 11 generates outputs
`ZH and SR 14 to a data bus 23 (FIG. 12), which are
`preferably digital signals. Complex signal ZH comprises a
`number of output signals generated by the system, prefer-
`ably including signals indicating the existence and localiza-
`tion of an area or object in motion, and the speed V and the
`oriented direction of displacement DI of each pixel of the
`image. Also preferably output from the system is input
`digital video signal S, which is delayed (SR) to make it
`synchronous with the output ZH for the frame, taking into
`account the calculation time for the data in composite signal
`ZH (one frame). The delayed signal SR is used to display the
`image received by camera 13 on a monitor or television
`screen 10, which may also be used to display the information
`contained in composite signal ZH. Composite signal ZH
`may also be transmitted to a separate processing assemb