`INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT)
`WO 99/36893
`
`WORLD INTELLECTUAL PROPERTY ORGANIZATION
`International Bureau
`
(51) International Patent Classification 6: G08B 21/00
`
(11) International Publication Number: WO 99/36893 A1
(43) International Publication Date: 22 July 1999 (22.07.99)
`
(21) International Application Number: PCT/EP99/00300

(22) International Filing Date: 15 January 1999 (15.01.99)
`
(30) Priority Data:
98/00378          15 January 1998 (15.01.98)    FR
PCT/EP98/05383    25 August 1998 (25.08.98)     EP
`
(63) Related by Continuation (CON) or Continuation-in-Part (CIP) to Earlier Application:
US PCT/EP98/05383 (CIP), filed on 25 August 1998 (25.08.98)
`
`(71) Applicant (for all designated States except US): HOLDING
`B.E.V. S.A. [LU/LU]; 69, route d'Esch, L-Luxembourg
`(LU).
`
(71)(72) Applicants and Inventors: PIRIM, Patrick [FR/FR]; 56, rue Patay, F-75013 Paris
(FR). BINFORD, Thomas [US/US]; 16012 Flintlock Road, Cupertino, CA 95014
(US).
`
`(74) Agent: PHELIP, Bruno; Cabinet Harle & Phelip, 7, rue de
`Madrid, F-75008 Paris (FR).
`
`(81) Designated States: AL, AM, AT, AU, AZ, BA, BB, BG, BR,
BY, CA, CH, CN, CU, CZ, DE, DK, EE, ES, FI, GB, GD,
`GE, GH, GM, HR, HU, ID, IL, IN, IS, JP, KE, KG, KP,
`KR, KZ, LC, LK, LR, LS, LT, LU, LV, MD, MG, MK,
`MN, MW, MX, NO, NZ, PL, PT, RO, RU, SD, SE, SG,
`SI, SK, SL, TJ, TM, TR, TT, UA, UG, US, UZ, VN, YU,
`ZW, ARIPO patent (GH, GM, KE, LS, MW, SD, SZ, UG,
`ZW), Eurasian patent (AM, AZ, BY, KG, KZ, MD, RU, TJ,
`TM), European patent (AT, BE, CH, CY, DE, DK, ES, FI,
`FR, GB, GR, IE, IT, LU, MC, NL, PT, SE), OAPI patent
`(BF, BJ, CF, CG, CI, CM, GA, GN, GW, ML, MR, NE,
`SN, TD, TG).
`
`Published
`With international search report.
`Before the expiration of the time limit for amending the
`claims and to be republished in the event of the receipt of
`amendments.
`
`(54) Title: METHOD AND APPARATUS FOR DETECTION OF DROWSINESS
`
`(57) Abstract
`
`In a process of detecting a person falling asleep, an image of the face of the person is acquired. Pixels of the image having
`characteristics corresponding to an eye of the person are selected and a histogram is formed of the selected pixels. The histogram is
`analyzed over time to identify each opening and closing of the eye, and characteristics indicative of the person falling asleep are determined.
`A sub-area of the image including the eye may be determined by identifying the head or a facial characteristic of the person, and then
`identifying the sub-area using an anthropomorphic model. To determine openings and closings of the eyes, histograms of shadowed pixels
`of the eye are analyzed to determine the width and height of the shadowing, or histograms of movement corresponding to blinking are
`analyzed. An apparatus for detecting a person falling asleep includes a sensor for acquiring an image of the face of the person, a controller,
`and a histogram formation unit for forming a histogram on pixels having selected characteristics. Also disclosed is a rear-view mirror
`assembly incorporating the apparatus.
`
`Petitioner LG Ex-1018, 0001
`
`
`
`FOR THE PURPOSES OF INFORMATION ONLY
`
`Codes used to identify States party to the PCT on the front pages of pamphlets publishing international applications under the PCT.
`
AL  Albania
AM  Armenia
AT  Austria
AU  Australia
AZ  Azerbaijan
BA  Bosnia and Herzegovina
BB  Barbados
BE  Belgium
BF  Burkina Faso
BG  Bulgaria
BJ  Benin
BR  Brazil
BY  Belarus
CA  Canada
CF  Central African Republic
CG  Congo
CH  Switzerland
CI  Côte d'Ivoire
CM  Cameroon
CN  China
CU  Cuba
CZ  Czech Republic
DE  Germany
DK  Denmark
EE  Estonia
ES  Spain
FI  Finland
FR  France
GA  Gabon
GB  United Kingdom
GE  Georgia
GH  Ghana
GN  Guinea
GR  Greece
HU  Hungary
IE  Ireland
IL  Israel
IS  Iceland
IT  Italy
JP  Japan
KE  Kenya
KG  Kyrgyzstan
KP  Democratic People's Republic of Korea
KR  Republic of Korea
KZ  Kazakstan
LC  Saint Lucia
LI  Liechtenstein
LK  Sri Lanka
LR  Liberia
LS  Lesotho
LT  Lithuania
LU  Luxembourg
LV  Latvia
MC  Monaco
MD  Republic of Moldova
MG  Madagascar
MK  The former Yugoslav Republic of Macedonia
ML  Mali
MN  Mongolia
MR  Mauritania
MW  Malawi
MX  Mexico
NE  Niger
NL  Netherlands
NO  Norway
NZ  New Zealand
PL  Poland
PT  Portugal
RO  Romania
RU  Russian Federation
SD  Sudan
SE  Sweden
SG  Singapore
SI  Slovenia
SK  Slovakia
SN  Senegal
SZ  Swaziland
TD  Chad
TG  Togo
TJ  Tajikistan
TM  Turkmenistan
TR  Turkey
TT  Trinidad and Tobago
UA  Ukraine
UG  Uganda
US  United States of America
UZ  Uzbekistan
VN  Viet Nam
YU  Yugoslavia
ZW  Zimbabwe
`
`
`METHOD AND APPARATUS FOR DETECTION OF DROWSINESS
`
`Binford
`
`BACKGROUND OF THE INVENTION
`
`1.
`
`Field of the Invention.
`
`The present invention relates generally to an image processing system, and
`
`more particularly to the use of a generic image processing system to detect drowsiness.
`
`2.
`
`Description of the Related Art.
`
`It is well known that a significant number of highway accidents result from
`
`drivers becoming drowsy or falling asleep, which results in many deaths and injuries.
`
`Drowsiness is also a problem in other fields, such as for airline pilots and power plant
`
`operators, in which great damage may result from failure to stay alert.
`
`A number of different physical criteria may be used to establish when a person
`
`is drowsy, including a change in the duration and interval of eye blinking. Normally, the
`
`duration of blinking is about 100 to 200 ms when awake and about 500 to 800 ms when
`
`drowsy. The time interval between successive blinks is generally constant while awake, but
`
`varies within a relatively broad range when drowsy.
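The blink statistics above suggest a simple numeric test. The sketch below is purely illustrative Python (the function name and both thresholds are assumptions, not taken from this application): it flags drowsiness when the mean blink duration drifts toward the 500 to 800 ms range, or when the inter-blink interval becomes irregular.

```python
def is_drowsy(blink_durations_ms, blink_intervals_ms,
              duration_threshold_ms=400.0, interval_cv_threshold=0.5):
    """Flag drowsiness from blink durations and inter-blink intervals.

    Awake blinks last roughly 100-200 ms with near-constant spacing;
    drowsy blinks last roughly 500-800 ms with irregular spacing.
    Both cutoff values here are illustrative assumptions.
    """
    if not blink_durations_ms or len(blink_intervals_ms) < 2:
        return False
    mean_duration = sum(blink_durations_ms) / len(blink_durations_ms)
    mean_interval = sum(blink_intervals_ms) / len(blink_intervals_ms)
    variance = sum((x - mean_interval) ** 2
                   for x in blink_intervals_ms) / len(blink_intervals_ms)
    cv = (variance ** 0.5) / mean_interval  # coefficient of variation
    return mean_duration > duration_threshold_ms or cv > interval_cv_threshold
```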
`
`Numerous devices have been proposed to detect drowsiness of drivers. Such
`
`devices are shown, for example, in U.S. Patent Nos. 5,841,354; 5,813,99;
`
`5,689,241;5,684,461; 5,682,144; 5,469,143; 5,402,109; 5,353,013; 5,195,606; 4,928,090;
`
`4,555,697; 4,485,375; and 4,259,665. In general, these devices fall into three categories: i)
`
`devices that detect movement of the head of the driver, e.g., tilting; ii) devices that detect a
`
`physiological change in the driver, e.g., altered heartbeat or breathing, and iii) devices that
`
`
`detect a physical result of the driver falling asleep, e.g., a reduced grip on the steering wheel.
`
`None of these devices is believed to have met with commercial success.
`
`Commonly-owned PCT Application Serial Nos. PCT/FR97/01354 and
`
`PCT/EP98/05383 disclose a generic image processing system that operates to localize objects
`
`in relative movement in an image and to determine the speed and direction of the objects in
`
`real-time. Each pixel of an image is smoothed using its own time constant. A binary value
`
`corresponding to the existence of a significant variation in the amplitude of the smoothed
`
`pixel from the prior frame, and the amplitude of the variation, are determined, and the time
`
`constant for the pixel is updated. For each particular pixel, two matrices are formed that
`
`include a subset of the pixels spatially related to the particular pixel. The first matrix contains
`
`the binary values of the subset of pixels. The second matrix contains the amplitude of the
`
`variation of the subset of pixels. In the first matrix, it is determined whether the pixels along
`
`an oriented direction relative to the particular pixel have binary values representative of
`
`significant variation, and, for such pixels, it is determined in the second matrix whether the
`
`amplitude of these pixels varies in a known manner indicating movement in the oriented
`
direction. In domains that include luminance, hue, saturation, speed, oriented direction, time
`
`constant, and x and y position, a histogram is formed of the values in the first and second
`
`matrices falling in user selected combinations of such domains. Using the histograms, it is
`
`determined whether there is an area having the characteristics of the selected combinations of
`
`domains.
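As a rough illustration of the histogram step just described, the following Python sketch (the data layout and names are hypothetical, not the application's actual hardware interface) selects the pixels that satisfy a user-chosen combination of domain criteria and accumulates a histogram over one domain:

```python
def domain_histogram(pixels, selection, domain, bins):
    """Accumulate a histogram of `domain` values over the pixels that
    satisfy every predicate in `selection` (e.g. DP == 1 together with a
    speed range), mirroring the user-selected combination of domains.

    pixels: iterable of dicts mapping domain name -> integer value,
    e.g. {'DP': 1, 'speed': 3, 'direction': 2, 'x': 10, 'y': 4}.
    """
    hist = [0] * bins
    for p in pixels:
        if all(pred(p[name]) for name, pred in selection.items()):
            v = p[domain]
            if 0 <= v < bins:  # ignore out-of-range values
                hist[v] += 1
    return hist
```

Peaks in such a histogram indicate that an area with the selected characteristics exists in the image.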
`
It would be desirable to apply such a generic image processing system to detect the drowsiness of a person.
`
`2
`
`Petitioner LG Ex-1018, 0004
`
`
`
`WO99/36893
`
`PCT /EP99/00300
`
`SUMMARY OF THE INVENTION
`
`The present invention is a process of detecting a driver falling asleep in which
`
`an image of the face of the driver is acquired. Pixels of the image having characteristics
`
corresponding to characteristics of at least one eye of the driver are selected and a histogram
`
`is formed of the selected pixels. The histogram is analyzed over time to identify each
`
`opening and closing of the eye, and from the eye opening and closing information,
`
`characteristics indicative of a driver falling asleep are determined.
`
`In one embodiment, a sub-area of the image comprising the eye is determined
`
`prior to the step of selecting pixels of the image having characteristics corresponding to
`
`characteristics of an eye. In this embodiment, the step of selecting pixels of the image having
`
`characteristics of an eye involves selecting pixels within the sub-area of the image. The step
`
`of identifying a sub-area of the image preferably involves identifying the head of the driver,
`
`or a facial characteristic of the driver, such as the driver's nostrils, and then identifying the
`
`sub-area of the image using an anthropomorphic model. The head of the driver may be
`
`identified by selecting pixels of the image having characteristics corresponding to edges of
`
`the head of the driver. Histograms of the selected pixels of the edges of the driver's head are
`
`projected onto orthogonal axes. These histograms are then analyzed to identify the edges of
`
`the driver's head.
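The head-edge procedure just described (select edge pixels, project them as histograms onto orthogonal axes, read off the head's limits) can be sketched as follows; the binary-mask data layout and helper names are assumptions for the example:

```python
def edge_projections(edge_mask):
    """edge_mask: 2-D list of 0/1 flags for pixels classified as edges.
    Returns (column_histogram, row_histogram): the counts of edge pixels
    projected onto the x and y axes respectively."""
    rows, cols = len(edge_mask), len(edge_mask[0])
    col_hist = [sum(edge_mask[r][c] for r in range(rows)) for c in range(cols)]
    row_hist = [sum(edge_mask[r][c] for c in range(cols)) for r in range(rows)]
    return col_hist, row_hist

def head_bounds(hist, min_count=1):
    """The head's extent along one axis: first and last histogram bins
    holding at least min_count edge pixels (None if no edges found)."""
    idx = [i for i, v in enumerate(hist) if v >= min_count]
    return (idx[0], idx[-1]) if idx else None
```

Applying `head_bounds` to each projection yields the left/right and top/bottom limits of the head, from which an anthropomorphic model can locate the eye sub-area.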
`
`The facial characteristic of the driver may be identified by selecting pixels of
`
`the image having characteristics corresponding to the facial characteristic. Histograms of the
`
`selected pixels of the facial characteristic are projected onto orthogonal axes. These
`
`histograms are then analyzed to identify the facial characteristic. If desired, the step of
`
`identifying the facial characteristic in the image involves searching sub-images of the image
`
`until the facial characteristic is found. In the case in which the facial characteristic is the
`
`3
`
`Petitioner LG Ex-1018, 0005
`
`
`
`WO 99/36893
`
`PCT /EP99/00300
`
`nostrils of the driver, a histogram is formed of pixels having low luminance levels to detect
`
`the nostrils. To confirm detection of the nostrils, the histograms of the nostril pixels may be
`
`analyzed to determine whether the spacing between the nostrils is within a desired range and
`
`whether the dimensions of the nostrils fall within a desired range. In order to confirm the
`
`identification of the facial characteristic, an anthropomorphic model and the location of the
`
`facial characteristic are used to select a sub-area of the image containing a second facial
`
`characteristic. Pixels of the image having characteristics corresponding to the second facial
`
characteristic are selected, and histograms of the selected pixels of the second facial
`
`characteristic are analyzed to confirm the identification of the first facial characteristic.
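As an illustration of the nostril test described above (a histogram of low-luminance pixels, followed by spacing and dimension checks), here is a hedged Python sketch; the luminance threshold and the acceptance ranges are invented for the example:

```python
def find_nostrils(luminance, dark_threshold=40,
                  spacing_range=(4, 40), size_range=(1, 10)):
    """Project dark (low-luminance) pixels onto the x axis and look for
    exactly two blobs whose center spacing and widths fall within
    plausible ranges. All numeric parameters are illustrative.

    luminance: 2-D list of pixel luminance values.
    Returns [(start_col, width), (start_col, width)] or None.
    """
    rows, cols = len(luminance), len(luminance[0])
    col_hist = [sum(1 for r in range(rows) if luminance[r][c] < dark_threshold)
                for c in range(cols)]
    # Group consecutive non-empty columns into blobs of (start, width).
    blobs, start = [], None
    for c, v in enumerate(col_hist + [0]):  # sentinel 0 closes a final blob
        if v and start is None:
            start = c
        elif not v and start is not None:
            blobs.append((start, c - start))
            start = None
    if len(blobs) != 2:
        return None
    (x1, w1), (x2, w2) = blobs
    spacing = (x2 + w2 / 2) - (x1 + w1 / 2)  # center-to-center distance
    if spacing_range[0] <= spacing <= spacing_range[1] and \
       all(size_range[0] <= w <= size_range[1] for w in (w1, w2)):
        return blobs
    return None
```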
`
`In order to determine openings and closings of the eyes of the driver, the step
`
`of selecting pixels of the image having characteristics corresponding to characteristics of an
`
`eye of the driver involves selecting pixels having low luminance levels corresponding to
`
shadowing of the eye. In this embodiment, the step of analyzing the histogram over time to
`
`identify each opening and closing of the eye involves analyzing the shape of the eye
`
`shadowing to determine openings and closings of the eye. The histograms of shadowed
`
`pixels are preferably projected onto orthogonal axes, and the step of analyzing the shape of
`
the eye shadowing involves analyzing the width and height of the shadowing.
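A minimal sketch of this shadow-shape analysis, assuming a binary mask of shadowed eye pixels and an invented height-to-width cutoff (the application does not specify numeric values):

```python
def eye_state(shadow_mask, open_aspect=0.4):
    """Classify an eye as open or closed from its shadow's shape.

    The shadowed pixels are projected onto orthogonal axes; an open eye
    casts a shadow that is tall relative to its width, while a closed
    eye's shadow collapses to a thin band. The 0.4 cutoff is illustrative.
    """
    rows, cols = len(shadow_mask), len(shadow_mask[0])
    col_hist = [sum(shadow_mask[r][c] for r in range(rows)) for c in range(cols)]
    row_hist = [sum(shadow_mask[r][c] for c in range(cols)) for r in range(rows)]
    xs = [i for i, v in enumerate(col_hist) if v]
    ys = [i for i, v in enumerate(row_hist) if v]
    if not xs or not ys:
        return "no eye"
    width = xs[-1] - xs[0] + 1
    height = ys[-1] - ys[0] + 1
    return "open" if height / width >= open_aspect else "closed"
```

Tracking this classification frame by frame yields the sequence of openings and closings from which blink duration and interval are measured.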
`
`An alternative method of determining openings and closings of the eyes of the
`
`driver involves selecting pixels of the image having characteristics of movement
`
corresponding to blinking. In this embodiment, the step of analyzing the histogram over time to
`
`identify each opening and closing of the eye involves analyzing the number of pixels in
`
`movement corresponding to blinking over time. The characteristics of a blinking eye are
`
preferably selected from the group consisting of i) DP = 1, ii) CO indicative of a blinking
`
`4
`
`Petitioner LG Ex-1018, 0006
`
`
`
`WO99/36893
`
`PCT /EP99/00300
`
`eyelid, iii) velocity indicative of a blinking eyelid, and iv) up and down movement indicative
`
`of a blinking eyelid.
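This movement-based alternative can be illustrated by counting, per frame, the pixels flagged as blinking motion and treating each rising crossing of a threshold as one blink; the threshold value is an assumption for the example:

```python
def blink_events(movement_counts, threshold=20):
    """movement_counts: per-frame number of pixels whose movement
    characteristics match a blinking eyelid (e.g. DP = 1 plus velocity
    and up/down movement criteria). Each rising crossing of `threshold`
    is counted as one blink.

    Returns (blink_start_frames, blink_durations_in_frames)."""
    events, durations = [], []
    above, start = False, 0
    for t, n in enumerate(movement_counts):
        if n >= threshold and not above:      # blink begins
            above, start = True, t
            events.append(t)
        elif n < threshold and above:         # blink ends
            above = False
            durations.append(t - start)
    if above:  # blink still in progress at end of sequence
        durations.append(len(movement_counts) - start)
    return events, durations
```

From the start frames one obtains inter-blink intervals, and from the durations the blink lengths used in the drowsiness criteria.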
`
`An apparatus for detecting a driver falling asleep includes a sensor for
`
acquiring an image of the face of the driver, a controller, and a histogram formation unit for
`
`forming a histogram on pixels having selected characteristics. The controller controls the
`
`histogram formation unit to select pixels of the image having characteristics corresponding to
`
`characteristics of at least one eye of the driver and to form a histogram of the selected pixels.
`
`The controller analyzes the histogram over time to identify each opening and closing of the
`
`eye, and determines from the opening and closing information on the eye, characteristics
`
`indicative of the driver falling asleep.
`
`In one embodiment, the controller interacts with the histogram formation unit
`
`to identify a sub-area of the image comprising the eye, and the controller controls the
`
`histogram formation unit to select pixels of the image having characteristics corresponding to
`
`characteristics of the eye only within the sub-area of the image. In order to select the sub-area
`
`of the image, the controller interacts with the histogram formation unit to identify the head of
`
`the driver in the image, or a facial characteristic of the driver, such as the driver's nostrils.
`
`The controller then identifies the sub-area of the image using an anthropomorphic model. To
`
`identify the head of the driver, the histogram formation unit selects pixels of the image having
`
`characteristics corresponding to edges of the head of the driver and forms histograms of the
`
`selected pixels projected onto orthogonal axes. To identify a facial characteristic of the
`
`driver, the histogram formation unit selects pixels of the image having characteristics
`
`corresponding to the facial characteristic and forms histograms of the selected pixels
`
`projected onto orthogonal axes. The controller then analyzes the histograms of the selected
`
`pixels to identify the edges of the head of the driver or the facial characteristic, as the case
`
`5
`
`Petitioner LG Ex-1018, 0007
`
`
`
`WO 99/36893
`
`PCT /EP99/00300
`
`may be. If the facial characteristic is the nostrils of the driver, the histogram formation unit
`
`selects pixels of the image having low luminance levels corresponding to the luminance level
`
`of the nostrils. The controller may also analyze the histograms of the nostril pixels to
`
`determine whether the spacing between the nostrils is within a desired range and whether
`
`dimensions of the nostrils fall within a desired range. If desired, the controller may interact
`
`with the histogram formation unit to search sub-images of the image to identify the facial
`
`characteristic.
`
`In order to verify identification of the facial characteristic, the controller uses
`
`an anthropomorphic model and the location of the facial characteristic to cause the histogram
`
`formation unit to select a sub-area of the image containing a second facial characteristic. The
`
`histogram formation unit selects pixels of the image in the sub-area having characteristics
`
`corresponding to the second facial characteristic and forms a histogram of such pixels. The
`
`controller then analyzes the histogram of the selected pixels corresponding to the second
`
`facial characteristic to identify the second facial characteristic and to thereby confirm the
`
`identification of the first facial characteristic.
`
`In one embodiment, the histogram formation unit selects pixels of the image
`
`having low luminance levels corresponding to shadowing of the eyes, and the controller then
`
`analyzes the shape of the eye shadowing to identify shapes corresponding to openings and
`
`closings of the eye. The histogram formation unit preferably forms histograms of the
`
`shadowed pixels of the eye projected onto orthogonal axes, and the controller analyzes the
`
`width and height of the shadowing to determine openings and closings of the eye.
`
`In an alternative embodiment, the histogram formation unit selects pixels of
`
`the image in movement corresponding to blinking and the controller analyzes the number of
`
`pixels in movement over time to determine openings and closings of the eye. The
`
`6
`
`Petitioner LG Ex-1018, 0008
`
`
`
`WO 99/36893
`
`PCT /EP99/00300
`
`characteristics of movement corresponding to blinking are preferably selected from the group
`
consisting of i) DP = 1, ii) CO indicative of a blinking eyelid, iii) velocity indicative of a
`
`blinking eyelid, and iv) up and down movement indicative of a blinking eyelid.
`
`If desired, the sensor may be integrally constructed with the controller and the
`
`histogram formation unit. The apparatus may comprise an alarm, which the controller
`
`operates upon detection of the driver falling asleep, and may comprise an illumination source,
`
`such as a source of IR radiation, with the sensor being adapted to view the driver when
`
`illuminated by the illumination source.
`
A rear-view mirror assembly comprises a rear-view mirror and the described
`
`apparatus for detecting driver drowsiness mounted to the rear-view mirror. In one
`
`embodiment, a bracket attaches the apparatus to the rear-view mirror. In an alternative
`
`embodiment, the rear-view mirror comprises a housing having an open side and an interior.
`
`The rear-view mirror is mounted to the open side of the housing, and is see-through from the
`
`interior of the housing to the exterior of the housing. The drowsiness detection apparatus is
`
`mounted interior to the housing with the sensor directed toward the rear-view mirror. If
`
`desired, a joint attaches the apparatus to the rear-view mirror assembly, with the joint being
`
`adapted to maintain the apparatus in a position facing the driver during adjustment of the
`
`mirror assembly by the driver. The rear-view mirror assembly may include a source of
`
`illumination directed toward the driver, with the sensor adapted to view the driver when
`
`illuminated by the source of illumination. The rear-view mirror assembly may also include
`
`an alarm, with the controller operating the alarm upon detection of the driver falling asleep.
`
`Also disclosed is a vehicle comprising the drowsiness detection device.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`Fig. 1 is a diagrammatic illustration of the system according to the invention.
`
`7
`
`Petitioner LG Ex-1018, 0009
`
`
`
`WO 99/36893
`
`PCT /EP99/00300
`
Fig. 2 is a block diagram of the temporal and spatial processing units of the invention.

Fig. 3 is a block diagram of the temporal processing unit of the invention.

Fig. 4 is a block diagram of the spatial processing unit of the invention.

Fig. 5 is a diagram showing the processing of pixels in accordance with the invention.
`
`Fig. 6 illustrates the numerical values of the Freeman code used to determine
`
`movement direction in accordance with the invention.
`
`Fig. 7 illustrates nested matrices as processed by the temporal processing unit.
`
Fig. 8 illustrates hexagonal matrices as processed by the temporal processing unit.

Fig. 9 illustrates reverse-L matrices as processed by the temporal processing unit.
`
`Fig. 10 illustrates angular sector shaped matrices as processed by the temporal
`
`processing unit.
`
`Fig. 11 is a block diagram showing the relationship between the temporal and
`
`spatial processing units, and the histogram formation units.
`
`Fig. 12 is a block diagram showing the interrelationship between the various
`
`histogram formation units.
`
`Fig. 13 shows the formation of a two-dimensional histogram of a moving area
`
`from two one-dimensional histograms.
`
`Fig. 14 is a block diagram of an individual histogram formation unit.
`
`Figs. 15A and 15B illustrate the use of a histogram formation unit to find the
`
`orientation of a line relative to an analysis axis.
`
`8
`
`Petitioner LG Ex-1018, 0010
`
`
`
`WO 99/36893
`
`PCT /EP99/00300
`
`Fig. 16 illustrates a one-dimensional histogram.
`
Fig. 17 illustrates the use of semi-graphic sub-matrices to select desired
`
`areas of an image.
`
Fig. 18 is a side view illustrating a rear view mirror in combination with the
`
`drowsiness detection system of the invention.
`
`Fig. 19 is a top view illustrating operation of a rear view mirror.
`
`Fig. 20 is a schematic illustrating operation of a rear view mirror.
`
`Fig. 21 is a cross-sectional top view illustrating a rear view mirror assembly
`
`incorporating the drowsiness detection system of the invention.
`
`Fig. 22 is a partial cross-sectional top view illustrating a joint supporting the
`
`drowsiness detection system of the invention in the mirror assembly of Fig. 21.
`
`Fig. 23 is a top view illustrating the relationship between the rear view mirror
`
`assembly of Fig. 21 and a driver.
`
`Fig. 24 illustrates detection of the edges of the head of a person using the
`
`system of the invention.
`
`Fig. 25 illustrates masking outside of the edges of the head of a person.
`
`Fig. 26 illustrates masking outside of the eyes of a person.
`
`Fig. 27 illustrates detection of the eyes of a person using the system of the
`
`invention.
`
`Fig. 28 illustrates successive blinks in a three-dimensional orthogonal
`
`coordinate system.
`
`Figs. 29A and 29B illustrate conversion of peaks and valleys of eye movement
`
`histograms to information indicative of blinking.
`
`9
`
`Petitioner LG Ex-1018, 0011
`
`
`
`WO 99/36893
`
`PCT /EP99/00300
`
`Fig. 30 is a flow diagram illustrating the use of the system of the invention to
`
`detect drowsiness.
`
`Fig. 31 illustrates the use of sub-images to search a complete image.
`
Fig. 32 illustrates the use of the system of the invention to detect nostrils and
`
`to track eye movement.
`
`Fig. 33 illustrates the use of the system of the invention to detect an open eye.
`
`Fig. 34 illustrates the use of the system of the invention to detect a closed eye.
`
`Fig. 35 is a flow diagram of an alternative method of detecting drowsiness.
`
`Fig. 36 illustrates use of the system to detect a pupil.
`
`DETAILED DESCRIPTION OF THE INVENTION
`
`The present invention discloses an application of the generic image processing
`
`system disclosed in commonly-owned PCT Application Serial Nos. PCT/FR97/01354 and
`
PCT/EP98/05383, the contents of which are incorporated herein by reference, for detection of
`
`various criteria associated with the human eye, and especially to detection that a driver is
`
falling asleep while driving a vehicle.
`
`The apparatus of the invention is similar to that described in the
`
`aforementioned PCT Application Serial Nos. PCT/FR97/01354 and PCT/EP98/05383, which
`
`will be described herein for purposes of clarity. Referring to Figs. 1 and 10, the generic
`
`image processing system 22 includes a spatial and temporal processing unit 11 in
`
`combination with a histogram formation unit 22a. Spatial and temporal processing unit 11
`
`includes an input 12 that receives a digital video signal S originating from a video camera or
`
other imaging device 13 which monitors a scene 13a. Imaging device 13 is preferably a
`
`conventional CMOS-type CCD camera, which for purposes of the presently-described
`
`invention is mounted on a vehicle facing the driver. It will be appreciated that when used in
`
`10
`
`Petitioner LG Ex-1018, 0012
`
`
`
`WO 99/36893
`
`PCT /EP99/00300
`
non-vehicular applications, the camera may be mounted in any desired fashion to detect the
`
`specific criteria of interest. It is also foreseen that any other appropriate sensor, e.g.,
`
`ultrasound, IR, Radar, etc., may be used as the imaging device. Imaging device 13 may have
`
a direct digital output, or an analog output that is converted by an A/D converter into digital
`
`signal S. Imaging device 13 may also be integral with generic image processing system 22, if
`
`desired.
`
While signal S may be a progressive signal, it is preferably composed of a
succession of pairs of interlaced frames, TR1 and TR'1 and TR2 and TR'2, each consisting of a
succession of horizontal scanned lines, e.g., l1.1, l1.2, ..., l1.17 in TR1, and l2.1 in TR2. Each line
consists of a succession of pixels or image-points PI, e.g., a1.1, a1.2 and a1.3 for line l1.1; a17.1
and a17.22 for line l1.17; a1.1 and a1.2 for line l2.1. Signal S(PI) represents signal S composed of
pixels PI.
`
`S(PI) includes a frame synchronization signal (ST) at the beginning of each
`
`frame, a line synchronization signal (SL) at the beginning of each line, and a blanking signal
`
(BL). Thus, S(PI) includes a succession of frames, which are representative of the time
`
`and within each frame, a series of lines and pixels, which are representative of the spatial
`
`domain.
`
`In the time domain, "successive frames" shall refer to successive frames of the
`
same type (i.e., odd frames such as TR1 or even frames such as TR'1), and "successive pixels
`
`in the same position" shall denote successive values of the pixels (PI) in the same location in
`
successive frames of the same type, e.g., a1.1 of l1.1 in frame TR1 and a1.1 of l1.1 in the next

corresponding frame TR2.
`
`Spatial and temporal processing unit 11 generates outputs ZH and SR 14 to a
`
data bus 23 (Fig. 11), which are preferably digital signals. Complex signal ZH comprises a
`
`11
`
`Petitioner LG Ex-1018, 0013
`
`
`
`WO 99/36893
`
`PCT /EP99/00300
`
`number of output signals generated by the system, preferably including signals indicating the
`
`existence and localization of an area or object in motion, and the speed V and the oriented
`
`direction of displacement DI of each pixel of the image. Also preferably output from the
`
`system is input digital video signal S, which is delayed (SR) to make it synchronous with the
`
`output ZH for the frame, taking into account the calculation time for the data in composite
`
`signal ZH (one frame). The delayed signal SR is used to display the image received by
`
camera 13 on a monitor or television screen 10, which may also be used to display the
`
`information contained in composite signal ZH. Composite signal ZH may also be transmitted
`
to a separate processing assembly 10a in which further processing of the signal may be
`
`accomplished.
`
`Referring to Fig. 2, spatial and temporal processing unit 11 includes a first
`
`assembly 11 a, which consists of a temporal processing unit 15 having an associated memory
`
`16, a spatial processing unit 17 having a delay unit 18 and sequencing unit 19, and a pixel
`
`clock 20, which generates a clock signal HP, and which serves as a clock for temporal
`
`processing unit 15 and sequencing unit 19. Clock pulses HP are generated by clock 20 at the
`
pixel rate of the image, which is preferably 13.5 MHz.
`
`Fig. 3 shows the operation of temporal processing unit 15, the function of
`
`which is to smooth the video signal and generate a number of outputs that are utilized by
`
`spatial processing unit 17. During processing, temporal processing unit 15 retrieves from
`
`memory 16 the smoothed pixel values LI of the digital video signal from the immediately
`
`prior frame, and the values of a smoothing time constant CI for each pixel. As used herein,
`
`LO and CO shall be used to denote the pixel values (L) and time constants (C) stored in
`
`memory 16 from temporal processing unit 15, and LI and CI shall denote the pixel values (L)
`
`and time constants (C) respectively for such values retrieved from memory 16 for use by
`
`12
`
`Petitioner LG Ex-1018, 0014
`
`
`
`WO99/36893
`
`PCT /EP99/00300
`
`temporal processing unit 15. Temporal processing unit 15 generates a binary output signal
`
`DP for each pixel, which identifies whether the pixel has undergone significant variation, and
`
`a digital signal CO, which represents the updated calculated value of time constant C.
`
Referring to Fig. 3, temporal processing unit 15 includes a first block 15a
`
`which receives the pixels PI of input video signal S. For each pixel PI, the temporal
`
`processing unit retrieves from memory 16 a smoothed value LI of this pixel from the
`
`immediately preceding corresponding frame, which was calculated by temporal processing
`
`unit 15 during processing of the immediately prior frame and stored in memory 16 as LO.
`
Temporal processing unit 15 calculates the absolute value AB of the difference between each

pixel value PI and LI for the same pixel position (for example a1.1 of l1.1 in TR1 and of l1.1 in

TR2):

AB = |PI - LI|
`
`Temporal processing unit 15 is controlled by clock signal HP from clock 20 in
`
`order to maintain synchronization with the incoming pixel stream. Test block 15b of
`
`temporal processing unit 15 receives signal AB and a threshold value SE. Threshold SE may
`
`be constant, but preferably varies based upon the pixel value PI, and more preferably varies
`
`with the pixel value so as to form a gamma correction. Known means of varying SE to form
`
`a gamma correction is represented by the optional block 15e shown in dashed lines. Test
`
`block 15b compares, on a pixel-by-pixel basis, digital signals AB and SE in order to
`
`determine a binary signal DP. If AB exceeds threshold SE, which indicates that pixel value
`
`PI has undergone significant variation as compared to the smoothed value LI of the same
`
`pixel in the prior frame, DP is set to "1" for the pixel under consideration. Otherwise, DP is
`
set to "0" for such pixel.
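The test performed by blocks 15a and 15b can be sketched as follows; since the exact mapping of SE to PI in optional block 15e is not given here, the gamma-like ramp below is an assumption for the example:

```python
def threshold_SE(PI, base=8):
    """Illustrative pixel-dependent threshold: block 15e is said to vary
    SE with the pixel value PI so as to form a gamma correction, but the
    exact mapping is not specified, so this linear ramp is an assumption."""
    return base + PI // 32

def dp_bit(PI, LI):
    """Blocks 15a/15b: compute AB = |PI - LI|, then set DP = 1 when AB
    exceeds threshold SE (significant variation), else DP = 0."""
    AB = abs(PI - LI)
    return 1 if AB > threshold_SE(PI) else 0
```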
`
`13
`
`Petitioner LG Ex-1018, 0015
`
`
`
`WO99/36893
`
`PCT /EP99/00300
`
`When DP = 1, the difference between the pixel value PI and smoothed value
`
`LI of the same pixel in the prior frame is considered too great, and temporal processing unit
`
`15 attempts to reduce this difference in subsequent frames by reducing the smoothing time
`
constant C for that pixel. Conversely, if DP = 0, temporal processing unit 15 attempts to
`
`increase this difference in subsequent frames by increasing the smoothing time constant C for
`
`that pixel. These adjustments to time constant C as a function of the value of DP are made by
`
`block 15c. If DP = 1, block 15c reduces the time constant by a unit value U so that the new
`
value of the time constant CO equals the old value of the constant CI minus unit value U:

CO = CI - U
`
`If DP = 0, block 15c increases the time constant by a unit value U so that the
`
new value of the time constant CO equals the old value of the constant CI plus unit value U:

CO = CI + U
`
`Thus, for each pixel, block 15c receives the binary signal DP from test unit
`
`15b and time constant CI from memory 16, adjusts CI up or down by unit value U, and
`
generates a new time constant CO which is stored in memory 16 to replace time constant CI.
`
In a preferred embodiment, time constant C is in the form 2^p, where p is

incremented or decremented by unit value U, which preferably equals 1, in block 15c. Thus,

if DP = 1, block 15c subtracts one (for the case where U = 1) from p in the time constant 2^p,

which becomes 2^(p-1). If DP = 0, block 15c adds one to p in time constant 2^p, which

becomes 2^(p+1). The choice of a time constant of the form 2^p facilitates calculations and thus

simplifies the structure of block 15c.
`
`Block 15c includes several tests to ensure proper operation of the system.
`
First, CO must remain within defined limits. In a preferred embodiment, CO must not

become negative (CO ≥ 0) and it must not exceed a limit N (CO ≤ N), which is preferably
`
`14
`
`Petitioner LG Ex-1018, 0016
`
`
`
`WO 99/36893
`
`PCT /EP99/00300
`
seven. In the instance in which CI and CO are in the form 2^p, the upper limit N is the
`
`maximum value for p.
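With C stored as the exponent p of 2^p, the update and clamping performed by block 15c reduce to integer arithmetic, as this illustrative sketch shows (the smoothing formula that consumes CO appears elsewhere in the application and is omitted here):

```python
N_LIMIT = 7  # preferred upper limit N on the exponent p

def update_time_constant(CI_p, DP, U=1, N=N_LIMIT):
    """Block 15c: with the time constant stored as the exponent p of
    2**p, DP = 1 decrements p (shrink the time constant to track fast
    changes) and DP = 0 increments p (grow it to smooth more), with the
    result clamped to the range [0, N]."""
    CO_p = CI_p - U if DP == 1 else CI_p + U
    return max(0, min(N, CO_p))
```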
`
`The upper limit N may be constant, but is preferably variable. An optional
`
input unit 15f includes a register or memory that enables the user, or controller 42, to vary N.
`
`The consequence of increasing N is to increase the sensitivity of the system to detecting
`
`displacement of pixels, whereas reducing N improves detection of high speeds. N may be
`
`made to depend on PI (N may vary on a pixel-by-pixel basis, if desired) in order to regulate
`
the variation of LO as a function of the level of PI, i.e., N_ij = f(PI_ij), the calculation of which
`
`is done in block 15f, which in this case would receive the va