`
`SAMSUNG EXHIBIT 1005
`Samsung v. Image Processing Techs.
`
`
`
`
`FOR THE PURPOSES OF INFORMATION ONLY
`
`Codes used to identify States party to the PCT on the front pages of pamphlets publishing international applications under the PCT.
`
`
`
`
AL Albania                    AM Armenia
AT Austria                    AU Australia
AZ Azerbaijan                 BA Bosnia and Herzegovina
BB Barbados                   BE Belgium
BF Burkina Faso               BG Bulgaria
BJ Benin                      BR Brazil
BY Belarus                    CA Canada
CF Central African Republic   CG Congo
CH Switzerland                CI Côte d'Ivoire
CM Cameroon                   CN China
CU Cuba                       CZ Czech Republic
DE Germany                    DK Denmark
EE Estonia                    ES Spain
FI Finland                    FR France
GA Gabon                      GB United Kingdom
GE Georgia                    GH Ghana
GN Guinea                     GR Greece
HU Hungary                    IE Ireland
IL Israel                     IS Iceland
IT Italy                      JP Japan
KE Kenya                      KG Kyrgyzstan
KP Democratic People's Republic of Korea
KR Republic of Korea          KZ Kazakstan
LC Saint Lucia                LI Liechtenstein
LK Sri Lanka                  LR Liberia
LS Lesotho                    LT Lithuania
LU Luxembourg                 LV Latvia
MC Monaco                     MD Republic of Moldova
MG Madagascar                 MK The former Yugoslav Republic of Macedonia
ML Mali                       MN Mongolia
MR Mauritania                 MW Malawi
MX Mexico                     NE Niger
NL Netherlands                NO Norway
NZ New Zealand                PL Poland
PT Portugal                   RO Romania
RU Russian Federation         SD Sudan
SE Sweden                     SG Singapore
SI Slovenia                   SK Slovakia
SN Senegal                    SZ Swaziland
TD Chad                       TG Togo
TJ Tajikistan                 TM Turkmenistan
TR Turkey                     TT Trinidad and Tobago
UA Ukraine                    UG Uganda
US United States of America   UZ Uzbekistan
VN Viet Nam                   YU Yugoslavia
ZW Zimbabwe
`
`
`
`
`WO 99/36893
`
`PCT/EP99/00300
`
`METHOD AND APPARATUS FOR DETECTION OF DROWSINESS
`
`
BACKGROUND OF THE INVENTION
`
`1.
`
`Field of the Invention.
`
`The present invention relates generally to an image processing system, and
`
`more particularly to the use of a generic image processing system to detect drowsiness.
`
`
`
2. Description of the Related Art.
`
It is well known that a significant number of highway accidents result from drivers becoming drowsy or falling asleep, which results in many deaths and injuries. Drowsiness is also a problem in other fields, such as for airline pilots and power plant operators, in which great damage may result from failure to stay alert.
`
A number of different physical criteria may be used to establish when a person is drowsy, including a change in the duration and interval of eye blinking. Normally, the duration of blinking is about 100 to 200 ms when awake and about 500 to 800 ms when drowsy. The time interval between successive blinks is generally constant while awake, but varies within a relatively broad range when drowsy.
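The blink criterion described above can be sketched in a few lines of code. The duration thresholds come from the text; the decision logic, the two-long-blinks rule, and the irregularity cutoff are assumptions for illustration only.

```python
# Sketch of the blink-duration/interval criterion (thresholds from the text;
# the decision logic itself is an assumed, illustrative reading).
from statistics import pstdev

AWAKE_MAX_MS = 200    # a waking blink lasts about 100-200 ms
DROWSY_MIN_MS = 500   # a drowsy blink lasts about 500-800 ms

def looks_drowsy(durations_ms, intervals_ms, interval_cv=0.5):
    """durations_ms: recent blink durations; intervals_ms: gaps between blinks."""
    long_blinks = sum(d >= DROWSY_MIN_MS for d in durations_ms)
    mean_iv = sum(intervals_ms) / len(intervals_ms)
    # While awake the inter-blink interval is roughly constant; a large
    # spread relative to the mean suggests drowsiness.
    irregular = pstdev(intervals_ms) / mean_iv > interval_cv
    return long_blinks >= 2 or irregular
```

Either signal alone (long blinks, or irregular spacing) could be used; combining them as above is simply one plausible policy.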
`
Numerous devices have been proposed to detect drowsiness of drivers. Such devices are shown, for example, in U.S. Patent Nos. 5,841,354; 5,813,99; 5,689,241; 5,684,461; 5,682,144; 5,469,143; 5,402,109; 5,353,013; 5,195,606; 4,928,090; 4,555,697; 4,485,375; and 4,259,665.
`
In general, these devices fall into three categories: i) devices that detect movement of the head of the driver, e.g., tilting; ii) devices that detect a physiological change in the driver, e.g., altered heartbeat or breathing; and iii) devices that
`
`
detect a physical result of the driver falling asleep, e.g., a reduced grip on the steering wheel. None of these devices is believed to have met with commercial success.
`
Commonly-owned PCT Application Serial Nos. PCT/FR97/01354 and PCT/EP98/05383 disclose a generic image processing system that operates to localize objects in relative movement in an image and to determine the speed and direction of the objects in real-time. Each pixel of an image is smoothed using its own time constant. A binary value corresponding to the existence of a significant variation in the amplitude of the smoothed pixel from the prior frame, and the amplitude of the variation, are determined, and the time constant for the pixel is updated. For each particular pixel, two matrices are formed that include a subset of the pixels spatially related to the particular pixel. The first matrix contains the binary values of the subset of pixels. The second matrix contains the amplitude of the variation of the subset of pixels. In the first matrix, it is determined whether the pixels along an oriented direction relative to the particular pixel have binary values representative of significant variation, and, for such pixels, it is determined in the second matrix whether the amplitude of these pixels varies in a known manner indicating movement in the oriented direction.
`
In domains that include luminance, hue, saturation, speed, oriented direction, time constant, and x and y position, a histogram is formed of the values in the first and second matrices falling in user-selected combinations of such domains. Using the histograms, it is determined whether there is an area having the characteristics of the selected combinations of domains.
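The domain-selection idea above can be illustrated with a small sketch: a histogram over one domain, restricted to pixels whose values in the other selected domains fall within user-chosen ranges. The domain names and the dict-per-pixel record layout are assumptions for the example, not the system's actual data format.

```python
# Illustrative domain-restricted histogram (record layout is assumed).
def domain_histogram(pixels, key, selection):
    """pixels: dicts with keys such as 'luminance', 'speed', 'direction', 'x', 'y'.
    key: the domain to histogram; selection: {domain: (lo, hi)} ranges that a
    pixel must satisfy in every selected domain to be counted."""
    hist = {}
    for p in pixels:
        if all(lo <= p[d] <= hi for d, (lo, hi) in selection.items()):
            hist[p[key]] = hist.get(p[key], 0) + 1
    return hist
```

Restricting one histogram by ranges validated in other domains is what lets the system ask compound questions such as "dark pixels moving upward in this sub-area."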
`
It would be desirable to apply such a generic image processing system to detect the drowsiness of a person.
`
`
SUMMARY OF THE INVENTION
`
The present invention is a process of detecting a driver falling asleep in which an image of the face of the driver is acquired. Pixels of the image having characteristics corresponding to characteristics of at least one eye of the driver are selected and a histogram is formed of the selected pixels. The histogram is analyzed over time to identify each opening and closing of the eye, and from the eye opening and closing information, characteristics indicative of a driver falling asleep are determined.
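The process just summarized can be sketched as a minimal pipeline. The helper functions are hypothetical stand-ins for machinery developed later in the description (pixel selection, histogram formation, and temporal analysis), and the pixel-count "histogram" is the simplest possible placeholder.

```python
# Hypothetical outline of the summarized process; each callable stands in
# for a unit described later in the specification.
def detect_drowsiness(frames, select_eye_pixels, classify):
    """frames: iterable of images. Builds a per-frame histogram signal from
    eye-like pixels and hands the time series to a classifier."""
    blink_signal = []
    for frame in frames:
        eye_pixels = select_eye_pixels(frame)  # pixels with eye-like characteristics
        histogram = len(eye_pixels)            # simplest histogram: a pixel count
        blink_signal.append(histogram)
    # Openings and closings appear as peaks and valleys of this signal over time.
    return classify(blink_signal)
```

The real system forms richer histograms (projections onto orthogonal axes) and analyzes their shape, but the acquire → select → histogram → analyze-over-time structure is the same.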
`
In one embodiment, a sub-area of the image comprising the eye is determined prior to the step of selecting pixels of the image having characteristics corresponding to characteristics of an eye. In this embodiment, the step of selecting pixels of the image having characteristics of an eye involves selecting pixels within the sub-area of the image. The step of identifying a sub-area of the image preferably involves identifying the head of the driver, or a facial characteristic of the driver, such as the driver's nostrils, and then identifying the sub-area of the image using an anthropomorphic model. The head of the driver may be identified by selecting pixels of the image having characteristics corresponding to edges of the head of the driver. Histograms of the selected pixels of the edges of the driver's head are projected onto orthogonal axes. These histograms are then analyzed to identify the edges of the driver's head.
`
The facial characteristic of the driver may be identified by selecting pixels of the image having characteristics corresponding to the facial characteristic. Histograms of the selected pixels of the facial characteristic are projected onto orthogonal axes. These histograms are then analyzed to identify the facial characteristic. If desired, the step of identifying the facial characteristic in the image involves searching sub-images of the image until the facial characteristic is found. In the case in which the facial characteristic is the
`
`
nostrils of the driver, a histogram is formed of pixels having low luminance levels to detect the nostrils. To confirm detection of the nostrils, the histograms of the nostril pixels may be analyzed to determine whether the spacing between the nostrils is within a desired range and whether the dimensions of the nostrils fall within a desired range. In order to confirm the identification of the facial characteristic, an anthropomorphic model and the location of the facial characteristic are used to select a sub-area of the image containing a second facial characteristic. Pixels of the image having characteristics corresponding to the second facial characteristic are selected, and histograms of the selected pixels of the second facial characteristic are analyzed to confirm the identification of the first facial characteristic.
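A toy version of the nostril search can make the projection-histogram idea concrete: project low-luminance pixels onto the x axis, cluster the dark columns, and accept exactly two clusters whose spacing falls in range. The luminance threshold and spacing range are invented for the example; the specification leaves them open.

```python
# Illustrative nostril search via a column-projection histogram.
# The threshold (dark=40) and separation range are assumed values.
def find_nostrils(gray, dark=40, min_sep=3, max_sep=20):
    """gray: 2-D list of luminance values. Returns the x-centers of the two
    dark clusters on the column histogram, or None if the test fails."""
    h, w = len(gray), len(gray[0])
    # Column histogram: count of low-luminance (shadowed) pixels per column.
    cols = [sum(1 for y in range(h) if gray[y][x] < dark) for x in range(w)]
    # Group adjacent non-empty columns into clusters.
    clusters, run = [], []
    for x, c in enumerate(cols):
        if c:
            run.append(x)
        elif run:
            clusters.append(sum(run) / len(run)); run = []
    if run:
        clusters.append(sum(run) / len(run))
    if len(clusters) != 2:
        return None
    sep = clusters[1] - clusters[0]
    return tuple(clusters) if min_sep <= sep <= max_sep else None
```

The same projection would be run along the y axis to check nostril dimensions, mirroring the range tests the text describes.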
`
In order to determine openings and closings of the eyes of the driver, the step of selecting pixels of the image having characteristics corresponding to characteristics of an eye of the driver involves selecting pixels having low luminance levels corresponding to shadowing of the eye. In this embodiment, the step of analyzing the histogram over time to identify each opening and closing of the eye involves analyzing the shape of the eye shadowing to determine openings and closings of the eye. The histograms of shadowed pixels are preferably projected onto orthogonal axes, and the step of analyzing the shape of the eye shadowing involves analyzing the width and height of the shadowing.
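The width/height test on eye shadowing reduces to a bounding-box measurement on the shadow mask. The open/closed ratio threshold below is an assumed value; the specification only says that width and height of the shadowing are analyzed.

```python
# Sketch of the width/height test on eye shadowing (ratio threshold assumed).
def eye_is_open(shadow_mask):
    """shadow_mask: 2-D list of 0/1 flags for low-luminance (shadowed) pixels.
    An open eye casts a taller shadow region than a closed one."""
    rows = [y for y, row in enumerate(shadow_mask) if any(row)]
    cols = [x for x in range(len(shadow_mask[0]))
            if any(row[x] for row in shadow_mask)]
    if not rows or not cols:
        return False
    height = rows[-1] - rows[0] + 1
    width = cols[-1] - cols[0] + 1
    return height / width > 0.4   # assumed open/closed ratio
```

In the real system the extents come from the projection histograms themselves rather than from an explicit mask scan, but the decision is the same shape test.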
`
An alternative method of determining openings and closings of the eyes of the driver involves selecting pixels of the image having characteristics of movement corresponding to blinking. In this embodiment, the step of analyzing the histogram over time to identify each opening and closing of the eye involves analyzing the number of pixels in movement corresponding to blinking over time. The characteristics of a blinking eye are preferably selected from the group consisting of i) DP=1, ii) CO indicative of a blinking
`
`
eyelid, iii) velocity indicative of a blinking eyelid, and iv) up and down movement indicative of a blinking eyelid.
`
An apparatus for detecting a driver falling asleep includes a sensor for acquiring an image of the face of the driver, a controller, and a histogram formation unit for forming a histogram on pixels having selected characteristics. The controller controls the histogram formation unit to select pixels of the image having characteristics corresponding to characteristics of at least one eye of the driver and to form a histogram of the selected pixels. The controller analyzes the histogram over time to identify each opening and closing of the eye, and determines from the opening and closing information on the eye, characteristics indicative of the driver falling asleep.
`
In one embodiment, the controller interacts with the histogram formation unit to identify a sub-area of the image comprising the eye, and the controller controls the histogram formation unit to select pixels of the image having characteristics corresponding to characteristics of the eye only within the sub-area of the image. In order to select the sub-area of the image, the controller interacts with the histogram formation unit to identify the head of the driver in the image, or a facial characteristic of the driver, such as the driver's nostrils. The controller then identifies the sub-area of the image using an anthropomorphic model. To identify the head of the driver, the histogram formation unit selects pixels of the image having characteristics corresponding to edges of the head of the driver and forms histograms of the selected pixels projected onto orthogonal axes. To identify a facial characteristic of the driver, the histogram formation unit selects pixels of the image having characteristics corresponding to the facial characteristic and forms histograms of the selected pixels projected onto orthogonal axes. The controller then analyzes the histograms of the selected pixels to identify the edges of the head of the driver or the facial characteristic, as the case
`
`
may be. If the facial characteristic is the nostrils of the driver, the histogram formation unit selects pixels of the image having low luminance levels corresponding to the luminance level of the nostrils. The controller may also analyze the histograms of the nostril pixels to determine whether the spacing between the nostrils is within a desired range and whether dimensions of the nostrils fall within a desired range. If desired, the controller may interact with the histogram formation unit to search sub-images of the image to identify the facial characteristic.
`
In order to verify identification of the facial characteristic, the controller uses an anthropomorphic model and the location of the facial characteristic to cause the histogram formation unit to select a sub-area of the image containing a second facial characteristic. The histogram formation unit selects pixels of the image in the sub-area having characteristics corresponding to the second facial characteristic and forms a histogram of such pixels. The controller then analyzes the histogram of the selected pixels corresponding to the second facial characteristic to identify the second facial characteristic and to thereby confirm the identification of the first facial characteristic.
`
In one embodiment, the histogram formation unit selects pixels of the image having low luminance levels corresponding to shadowing of the eyes, and the controller then analyzes the shape of the eye shadowing to identify shapes corresponding to openings and closings of the eye. The histogram formation unit preferably forms histograms of the shadowed pixels of the eye projected onto orthogonal axes, and the controller analyzes the width and height of the shadowing to determine openings and closings of the eye.

In an alternative embodiment, the histogram formation unit selects pixels of the image in movement corresponding to blinking and the controller analyzes the number of pixels in movement over time to determine openings and closings of the eye. The
`
`6
`
`SAMSUNG EXHIBIT 1005
`Page8 of 91
`
`SAMSUNG EXHIBIT 1005
`Page 8 of 91
`
`
`
`WO 99/36893
`
`PCT/EP99/00300
`
characteristics of movement corresponding to blinking are preferably selected from the group consisting of i) DP=1, ii) CO indicative of a blinking eyelid, iii) velocity indicative of a blinking eyelid, and iv) up and down movement indicative of a blinking eyelid.
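The movement-based alternative amounts to counting DP=1 pixels in the eye sub-area per frame and flagging short bursts of movement as blinks. The burst threshold below is assumed; the specification only says the number of pixels in movement is analyzed over time.

```python
# Count-of-moving-pixels blink detector (illustrative; threshold assumed).
def blink_frames(dp_counts, burst=50):
    """dp_counts: per-frame counts of DP=1 pixels in the eye sub-area.
    Returns indices of frames showing a blink-like burst of movement."""
    return [t for t, n in enumerate(dp_counts) if n >= burst]
```

A fuller implementation would also check the other listed characteristics (CO, velocity, up-and-down direction) before accepting a burst as a blink.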
`
If desired, the sensor may be integrally constructed with the controller and the histogram formation unit. The apparatus may comprise an alarm, which the controller operates upon detection of the driver falling asleep, and may comprise an illumination source, such as a source of IR radiation, with the sensor being adapted to view the driver when illuminated by the illumination source.
`
A rear-view mirror assembly comprises a rear-view mirror and the described apparatus for detecting driver drowsiness mounted to the rear-view mirror. In one embodiment, a bracket attaches the apparatus to the rear-view mirror. In an alternative embodiment, the rear-view mirror comprises a housing having an open side and an interior. The rear-view mirror is mounted to the open side of the housing, and is see-through from the interior of the housing to the exterior of the housing. The drowsiness detection apparatus is mounted interior to the housing with the sensor directed toward the rear-view mirror. If desired, a joint attaches the apparatus to the rear-view mirror assembly, with the joint being adapted to maintain the apparatus in a position facing the driver during adjustment of the mirror assembly by the driver. The rear-view mirror assembly may include a source of illumination directed toward the driver, with the sensor adapted to view the driver when illuminated by the source of illumination. The rear-view mirror assembly may also include an alarm, with the controller operating the alarm upon detection of the driver falling asleep.

Also disclosed is a vehicle comprising the drowsiness detection device.
`
`
`
BRIEF DESCRIPTION OF THE DRAWINGS
`
`Fig. 1 is a diagrammatic illustration of the system according to the invention.
`
`
Fig. 2 is a block diagram of the temporal and spatial processing units of the invention.

Fig. 3 is a block diagram of the temporal processing unit of the invention.

Fig. 4 is a block diagram of the spatial processing unit of the invention.

Fig. 5 is a diagram showing the processing of pixels in accordance with the invention.

Fig. 6 illustrates the numerical values of the Freeman code used to determine movement direction in accordance with the invention.

Fig. 7 illustrates nested matrices as processed by the temporal processing unit.

Fig. 8 illustrates hexagonal matrices as processed by the temporal processing unit.

Fig. 9 illustrates reverse-L matrices as processed by the temporal processing unit.

Fig. 10 illustrates angular sector shaped matrices as processed by the temporal processing unit.
`
Fig. 11 is a block diagram showing the relationship between the temporal and spatial processing units, and the histogram formation units.

Fig. 12 is a block diagram showing the interrelationship between the various histogram formation units.

Fig. 13 shows the formation of a two-dimensional histogram of a moving area from two one-dimensional histograms.

Fig. 14 is a block diagram of an individual histogram formation unit.

Figs. 15A and 15B illustrate the use of a histogram formation unit to find the orientation of a line relative to an analysis axis.
`
`
Fig. 16 illustrates a one-dimensional histogram.

Fig. 17 illustrates the use of semi-graphic sub-matrices to select desired areas of an image.

Fig. 18 is a side view illustrating a rear view mirror in combination with the drowsiness detection system of the invention.

Fig. 19 is a top view illustrating operation of a rear view mirror.

Fig. 20 is a schematic illustrating operation of a rear view mirror.

Fig. 21 is a cross-sectional top view illustrating a rear view mirror assembly incorporating the drowsiness detection system of the invention.

Fig. 22 is a partial cross-sectional top view illustrating a joint supporting the drowsiness detection system of the invention in the mirror assembly of Fig. 21.

Fig. 23 is a top view illustrating the relationship between the rear view mirror assembly of Fig. 21 and a driver.

Fig. 24 illustrates detection of the edges of the head of a person using the system of the invention.

Fig. 25 illustrates masking outside of the edges of the head of a person.

Fig. 26 illustrates masking outside of the eyes of a person.

Fig. 27 illustrates detection of the eyes of a person using the system of the invention.

Fig. 28 illustrates successive blinks in a three-dimensional orthogonal coordinate system.

Figs. 29A and 29B illustrate conversion of peaks and valleys of eye movement histograms to information indicative of blinking.
`
`
Fig. 30 is a flow diagram illustrating the use of the system of the invention to detect drowsiness.

Fig. 31 illustrates the use of sub-images to search a complete image.

Fig. 32 illustrates the use of the system of the invention to detect nostrils and to track eye movement.

Fig. 33 illustrates the use of the system of the invention to detect an open eye.

Fig. 34 illustrates the use of the system of the invention to detect a closed eye.

Fig. 35 is a flow diagram of an alternative method of detecting drowsiness.

Fig. 36 illustrates use of the system to detect a pupil.
`
`DETAILED DESCRIPTION OF THE INVENTION
`
The present invention discloses an application of the generic image processing system disclosed in commonly-owned PCT Application Serial Nos. PCT/FR97/01354 and PCT/EP98/05383, the contents of which are incorporated herein by reference, for detection of various criteria associated with the human eye, and especially to detection that a driver is falling asleep while driving a vehicle.
`
The apparatus of the invention is similar to that described in the aforementioned PCT Application Serial Nos. PCT/FR97/01354 and PCT/EP98/05383, which will be described herein for purposes of clarity. Referring to Figs. 1 and 10, the generic image processing system 22 includes a spatial and temporal processing unit 11 in combination with a histogram formation unit 22a. Spatial and temporal processing unit 11 includes an input 12 that receives a digital video signal S originating from a video camera or other imaging device 13 which monitors a scene 13a. Imaging device 13 is preferably a conventional CMOS-type CCD camera, which for purposes of the presently-described invention is mounted on a vehicle facing the driver. It will be appreciated that when used in
`
`10
`
`SAMSUNG EXHIBIT 1005
`Page 12 of 91
`
`SAMSUNG EXHIBIT 1005
`Page 12 of 91
`
`
`
`WO 99/36893
`
`PCT/EP99/00300
`
non-vehicular applications, the camera may be mounted in any desired fashion to detect the specific criteria of interest. It is also foreseen that any other appropriate sensor, e.g., ultrasound, IR, radar, etc., may be used as the imaging device. Imaging device 13 may have a direct digital output, or an analog output that is converted by an A/D convertor into digital signal S. Imaging device 13 may also be integral with generic image processing system 22, if desired.
`
While signal S may be a progressive signal, it is preferably composed of a succession of pairs of interlaced frames, TR1 and TR'1, and TR2 and TR'2, each consisting of a succession of horizontal scanned lines, e.g., l1.1, l1.2, ..., l1.17 in TR1, and l2.1 in TR2. Each line consists of a succession of pixels or image points PI, e.g., a1.1, a1.2, and a1.3 for line l1.1; a17.1 and a17.22 for line l17.1; a1.1 and a2.2 for line l2.1. Signal S(PI) represents signal S composed of pixels PI.
`
S(PI) includes a frame synchronization signal (ST) at the beginning of each frame, a line synchronization signal (SL) at the beginning of each line, and a blanking signal (BL). Thus, S(PI) includes a succession of frames, which are representative of the time domain, and within each frame, a series of lines and pixels, which are representative of the spatial domain.
`
In the time domain, "successive frames" shall refer to successive frames of the same type (i.e., odd frames such as TR1 or even frames such as TR'1), and "successive pixels in the same position" shall denote successive values of the pixels (PI) in the same location in successive frames of the same type, e.g., a1.1 of l1.1 in frame TR1 and a1.1 of l1.1 in the next corresponding frame TR2.
`
Spatial and temporal processing unit 11 generates outputs ZH and SR 14 to a data bus 23 (Fig. 11), which are preferably digital signals. Complex signal ZH comprises a
`
`11
`
`SAMSUNG EXHIBIT 1005
`Page 13 of 91
`
`SAMSUNG EXHIBIT 1005
`Page 13 of 91
`
`
`
`WO 99/36893
`
`PCT/EP99/00300
`
number of output signals generated by the system, preferably including signals indicating the existence and localization of an area or object in motion, and the speed V and the oriented direction of displacement DI of each pixel of the image. Also preferably output from the system is input digital video signal S, which is delayed (SR) to make it synchronous with the output ZH for the frame, taking into account the calculation time for the data in composite signal ZH (one frame). The delayed signal SR is used to display the image received by camera 13 on a monitor or television screen 10, which may also be used to display the information contained in composite signal ZH. Composite signal ZH may also be transmitted to a separate processing assembly 10a in which further processing of the signal may be accomplished.
`
Referring to Fig. 2, spatial and temporal processing unit 11 includes a first assembly 11a, which consists of a temporal processing unit 15 having an associated memory 16, a spatial processing unit 17 having a delay unit 18 and sequencing unit 19, and a pixel clock 20, which generates a clock signal HP, and which serves as a clock for temporal processing unit 15 and sequencing unit 19. Clock pulses HP are generated by clock 20 at the pixel rate of the image, which is preferably 13.5 MHz.
`
Fig. 3 shows the operation of temporal processing unit 15, the function of which is to smooth the video signal and generate a number of outputs that are utilized by spatial processing unit 17. During processing, temporal processing unit 15 retrieves from memory 16 the smoothed pixel values LI of the digital video signal from the immediately prior frame, and the values of a smoothing time constant CI for each pixel. As used herein, LO and CO shall be used to denote the pixel values (L) and time constants (C) stored in memory 16 from temporal processing unit 15, and LI and CI shall denote the pixel values (L) and time constants (C) respectively for such values retrieved from memory 16 for use by
`
`12
`
`SAMSUNG EXHIBIT 1005
`Page 14 of 91
`
`SAMSUNG EXHIBIT 1005
`Page 14 of 91
`
`
`
`WO 99/36893
`
`PCT/EP99/00300
`
temporal processing unit 15. Temporal processing unit 15 generates a binary output signal DP for each pixel, which identifies whether the pixel has undergone significant variation, and a digital signal CO, which represents the updated calculated value of time constant C.
Referring to Fig. 3, temporal processing unit 15 includes a first block 15a which receives the pixels PI of input video signal S. For each pixel PI, the temporal processing unit retrieves from memory 16 a smoothed value LI of this pixel from the immediately preceding corresponding frame, which was calculated by temporal processing unit 15 during processing of the immediately prior frame and stored in memory 16 as LO. Temporal processing unit 15 calculates the absolute value AB of the difference between each pixel value PI and LI for the same pixel position (for example a1.1 of l1.1 in TR1 and in TR2):

AB = |PI - LI|
`
Temporal processing unit 15 is controlled by clock signal HP from clock 20 in order to maintain synchronization with the incoming pixel stream. Test block 15b of temporal processing unit 15 receives signal AB and a threshold value SE. Threshold SE may be constant, but preferably varies based upon the pixel value PI, and more preferably varies with the pixel value so as to form a gamma correction. A known means of varying SE to form a gamma correction is represented by the optional block 15e shown in dashed lines. Test block 15b compares, on a pixel-by-pixel basis, digital signals AB and SE in order to determine a binary signal DP. If AB exceeds threshold SE, which indicates that pixel value PI has undergone significant variation as compared to the smoothed value LI of the same pixel in the prior frame, DP is set to "1" for the pixel under consideration. Otherwise, DP is set to "0" for such pixel.
`
`13
`
`SAMSUNG EXHIBIT 1005
`Page 15 of 91
`
`SAMSUNG EXHIBIT 1005
`Page 15 of 91
`
`
`
`WO 99/36893
`
`PCT/EP99/00300
`
When DP = 1, the difference between the pixel value PI and smoothed value LI of the same pixel in the prior frame is considered too great, and temporal processing unit 15 attempts to reduce this difference in subsequent frames by reducing the smoothing time constant C for that pixel. Conversely, if DP = 0, temporal processing unit 15 attempts to increase this difference in subsequent frames by increasing the smoothing time constant C for that pixel. These adjustments to time constant C as a function of the value of DP are made by block 15c. If DP = 1, block 15c reduces the time constant by a unit value U so that the new value of the time constant CO equals the old value of the constant CI minus unit value U:

CO = CI - U
`
If DP = 0, block 15c increases the time constant by a unit value U so that the new value of the time constant CO equals the old value of the constant CI plus unit value U:

CO = CI + U
`
Thus, for each pixel, block 15c receives the binary signal DP from test unit 15b and time constant CI from memory 16, adjusts CI up or down by unit value U, and generates a new time constant CO which is stored in memory 16 to replace time constant CI.

In a preferred embodiment, time constant C is in the form 2^p, where p is incremented or decremented by unit value U, which preferably equals 1, in block 15c. Thus, if DP = 1, block 15c subtracts one (for the case where U = 1) from p in the time constant 2^p, which becomes 2^(p-1). If DP = 0, block 15c adds one to p in time constant 2^p, which becomes 2^(p+1). The choice of a time constant of the form 2^p facilitates calculations and thus simplifies the structure of block 15c.
`
Block 15c includes several tests to ensure proper operation of the system. First, CO must remain within defined limits. In a preferred embodiment, CO must not become negative (CO ≥ 0) and it must not exceed a limit N (CO ≤ N), which is preferably
`
`14
`
`SAMSUNG EXHIBIT 1005
`Page 16 of 91
`
`SAMSUNG EXHIBIT 1005
`Page 16 of 91
`
`
`
`WO 99/36893
`
`PCT/EP99/00300
`
seven. In the instance in which CI and CO are in the form 2^p, the upper limit N is the maximum value for p.
`
The upper limit N may be constant, but is preferably variable. An optional input unit 15f includes a register or memory that enables the user, or controller 42, to vary N. The consequence of increasing N is to increase the sensitivity of the system to detecting displacement of pixels, whereas reducing N improves detection of high speeds. N may be made to depend on PI (N may vary on a pixel-by-pixel basis, if desired) in order to regulate the variation of LO as a function of the level of PI, i.e., N(i,j) = f(PI(i,j)), the calculation of which is done in block 15f, which in this case would receive the value of PI from video camera 13.
`
Finally, a calculation block 15d receives, for each pixel, the new time constant CO generated in block 15c, the pixel values PI of the incoming video signal S, and the smoothed pixel value LI of the pixel in the previous frame from memory 16. Calculation block 15d then calculates a new smoothed pixel value LO for the pixel as follows:

LO = LI + (PI - LI)/CO

If CO = 2^p, then

LO = LI + (PI - LI)/2^p

where p is the new value calculated in unit 15c, which replaces the previous value of p in memory 16.
`
The purpose of the smoothing operation is to normalize variations in the value of each pixel PI of the incoming video signal for reducing the variation differences. For each pixel of the frame, temporal processing unit 15 retrieves LI and CI from memory 16, and generates new values LO (new smoothed pixel value) and CO (new time constant) that are stored in memory 16 to replace LI and CI respectively. As shown in Fig. 2, temporal
`
`
processing unit 15 transmits the CO and DP values for each pixel to spatial processing unit 17 through the delay unit 18.
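Putting the pieces of temporal processing unit 15 together, one per-pixel step can be sketched in software as follows. The unit value U = 1 and exponent limit N = 7 follow the preferred embodiment described above; the fixed threshold SE is a simplification (the text prefers a gamma-corrected, pixel-dependent SE), and the whole function is an illustrative reading of blocks 15a-15d, not the hardware itself.

```python
# Software sketch of one per-pixel step of temporal processing unit 15.
# SE is held constant here for simplicity; the preferred embodiment varies it.
def temporal_update(PI, LI, p, SE=10, N=7):
    """PI: incoming pixel value; LI: smoothed value from the prior frame;
    p: exponent of the time constant CI = 2**p. Returns (DP, LO, new_p)."""
    AB = abs(PI - LI)                  # block 15a: frame-to-frame variation
    DP = 1 if AB > SE else 0           # block 15b: significant-variation test
    # Block 15c: shorten the time constant on variation, lengthen it otherwise,
    # clamping the exponent to 0 <= p <= N.
    new_p = max(0, p - 1) if DP else min(N, p + 1)
    CO = 2 ** new_p
    LO = LI + (PI - LI) / CO           # block 15d: updated smoothed value
    return DP, LO, new_p
```

Iterating this per pixel, with LO and new_p written back in place of LI and p, reproduces the retrieve/update/store cycle against memory 16 described above.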
`
The capacity of memory 16, assuming that there are R pixels in a frame, and therefore 2R pixels per complete image, must be at least 2R(e+f) bits, where e is the number of bits required to store a single pixel value LI (preferably eight bits), and f is the number of bits required to store a single time constant CI (preferably 3 bits). If each video image is composed of a single frame (progressive image), it is sufficient to use R(e+f) bits rather than 2R(e+f) bits.
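For a concrete sense of scale, the 2R(e+f) sizing works out as below. The 360 × 288 frame size is an assumption chosen for illustration; only e = 8, f = 3, and the formula come from the text.

```python
# Memory sizing per the 2R(e+f) formula; the frame size is assumed.
R = 360 * 288          # pixels per frame (assumption, not from the text)
e, f = 8, 3            # bits per smoothed pixel value and per time constant
bits = 2 * R * (e + f) # interlaced video: two frames per complete image
print(bits // 8, "bytes")   # 285120 bytes
```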
`
Spatial processing unit 17 is used to identify an area in relative movement in the images from camera 13 and to determine the speed and oriented direction of the movement. Spatial processing unit 17, in conjunction with delay unit 18, cooperates with a control unit 19 that is controlled by clock 20, which generates clock pulse HP at the pixel frequency. Spatial processing unit 17 receives signals DP(i,j) and CO(i,j) (where i and j correspond to the x and y coordinates of the pixel) from temporal processing unit 15 and processes these signals as discussed below. Whereas temporal processing unit 15 processes pixels within each frame, spatial processing unit 17 processes groupings of pixels within the frames.
`
Fig. 5 diagrammatically shows the temporal processing of successive corresponding frame sequences TR1, TR2, TR3 and the spatial processing in these frames of a pixel PI with coordinates x, y, at times t1, t2, and t3. A plane in Fig. 5 corresponds to the spatial processing of a frame, whereas the superposition of frames corresponds to the temporal processing of successive frames.

Signals DP(i,j) and CO(i,j) from temporal processing unit 15 are distributed by spatial processing unit 17 into a first matrix 21 containing a number of rows and columns much smaller than the number of lines L of the frame and the number of pixels M per line.
`
`16
`
`SAMSUNG EXHIBIT 1005
`Page 18 of 91
`
`SAMSUNG EXHIBIT 1005
`Page 18 of 91
`
`
`
`WO 99/36893
`
`PCT/EP99/00300
`
Matrix 21 preferably includes 2l+1 lines along the y axis and 2m+1 columns along the x axis (in Cartesian coordinates), where l and m are small integer numbers. Advantageously, l and m are chosen to