(12) United States Patent
Lee et al.

(10) Patent No.: US 7,298,871 B2
(45) Date of Patent: Nov. 20, 2007

(54) SYSTEM AND METHOD FOR ADAPTING THE AMBIENCE OF A LOCAL ENVIRONMENT ACCORDING TO THE LOCATION AND PERSONAL PREFERENCES OF PEOPLE IN THE LOCAL ENVIRONMENT

(75) Inventors: Mi-Suen Lee, Ossining, NY (US); Hugo Strubbe, Yorktown Heights, NY (US)

(73) Assignee: Koninklijke Philips Electronics N.V., Eindhoven (NL)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b).

(21) Appl. No.: 10/165,286

(22) Filed: Jun. 7, 2002

(65) Prior Publication Data: US 2003/0227439 A1, Dec. 11, 2003

(51) Int. Cl.: G06K 9/00 (2006.01)
(52) U.S. Cl.: 382/115
(58) Field of Classification Search: 382/115, 118, 155; 700/47, 48
    See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

5,745,126 A     4/1998  Jain et al.          345/952
5,835,616 A    11/1998  Lobo et al.          382/118
6,223,992 B1    5/2001  Yasui et al.
6,400,835 B1*   6/2002  Lemelson et al.      382/118
6,548,967 B1*   4/2003  Dowling et al.       315/318
6,625,503 B1*   9/2003  Smith                700/83
6,934,917 B2*   8/2005  Lin                  715/811
2002/0007510 A1 1/2002  Mann                 4/300

FOREIGN PATENT DOCUMENTS

EP  1102500     5/2001
WO  WO9747066  12/1997
WO  WO0152478
WO  WO0159622   8/2001
WO  WO0179952  10/2001

OTHER PUBLICATIONS

McKenna, Stephen et al., "Tracking Faces," Proceedings of the Second Int'l Conference on Automatic Face and Gesture Recognition, Killington, VT, Oct. 14-16, 1996, pp. 271-276.

Primary Examiner: Matthew C. Bella
Assistant Examiner: Tom Y. Lu
(74) Attorney, Agent, or Firm: Yan Glickberg

(57) ABSTRACT

A system and method for automatically controlling systems and devices in a local environment, such as a home. The system comprises a control unit that receives images associated with one or more regions of the local environment. The one or more regions are each serviced by one or more servicing components. The control unit processes the images to identify, from a group of known persons associated with the local environment, any one or more known persons located in the regions. For the regions in which one or more known persons are identified, the control unit automatically generates a control signal for at least one of the servicing components associated with the region, the control signal reflecting a preference of at least one of the known persons located in the respective region.

24 Claims, 3 Drawing Sheets
OTHER PUBLICATIONS

Gutta, S., et al., "Hand Gesture Recognition Using Ensembles of Radial Basis Function (RBF) Networks and Decision Trees," Int'l J. of Pattern Recognition and Artificial Intelligence, vol. 11, no. 6, pp. 845-874 (1997).

Gutta, S., et al., "Mixture of Experts for Classification of Gender, Ethnic Origin and Pose of Human Faces," IEEE Transactions on Neural Networks, vol. 11, no. 4, pp. 948-960 (Jul. 2000).

* cited by examiner
[Sheet 1 of 3: FIG. 1 (drawing not reproduced in this extraction)]
[Sheet 2 of 3: FIG. 2; FIG. 2a, showing control unit 20 with processor 22 and memory 24 (drawings not reproduced in this extraction)]
[Sheet 3 of 3: FIG. 3, flow chart of the method:
  100: Capture images from the region
  110: Known person identified in the region?
  120: Known person previously identified in the region?
  130: Retrieve preference(s) of known person
  140: Control servicing component(s) in the region using preference(s) of the known person]
SYSTEM AND METHOD FOR ADAPTING THE AMBIENCE OF A LOCAL ENVIRONMENT ACCORDING TO THE LOCATION AND PERSONAL PREFERENCES OF PEOPLE IN THE LOCAL ENVIRONMENT

FIELD OF THE INVENTION

The invention relates to adjusting the ambience, such as the lighting, temperature, noise level, etc., in a home or like interior environment.

BACKGROUND OF THE INVENTION

Certain home automation systems and techniques are known. Many known home automation systems and techniques may generally be classified as reactive to a real-time physical input. A well-known example is lights having attendant IR sensors (or like motion sensors), which will turn on when a person walks by, such as into a room. Such lights often have an attendant daylight sensor (another real-time input), which will prevent the light from turning on when there is ambient daylight.

Other known home automation systems and techniques may generally be classified as pre-programmed to carry out certain functions when certain criteria are met. Many such systems are controlled by timers. For example, heating systems can be initiated automatically at a certain time of day, such as in the morning. Similarly, coffee makers can be automatically initiated at a specified time, so that a person has a cup of brewed coffee ready when he or she walks into the kitchen in the morning.

An example of a more complex home automation system is described in European Patent Application EP 1 102 500 A2 of Richton. The position of a wireless mobile unit (such as a wireless phone) carried by a person is used to determine the distance of the person to the home. Messages or instructions to perform certain actions based on the distance between the person and the home are generated and sent to a controller within the home. The controller causes the instruction to be enacted. For example, when the user is within a certain distance of the home, the home heating system may be instructed to turn on. Richton thus has features that are analogous to both a reactive system (i.e., a feature is engaged based upon proximity) and a pre-programmed system (i.e., engagement of a feature when certain pre-stored criteria are met).

Another example of a more elaborate pre-programmed type home automation system is described in PCT WO 01/52478 A2 of Sharood et al. In the Sharood system, existing home appliances and systems are connected to a control server. The user may control a selected appliance or system via a user interface that interacts with the server and can present graphic representations of the actual control inputs for the selected appliance or system. The user may therefore access the server and control appliances or systems remotely, for example, through an internet connection. In addition, the control server may be programmed so that certain appliances or systems are initiated and run under certain circumstances. For example, when a "vacation mode" is engaged, the lights are turned on at certain times for security purposes, and the heat is run at a lower temperature.

There are numerous deficiencies associated with the known home automation techniques and systems. For example, known reactive-type systems simply provide a fixed response when an input is received. Thus, for example, a motion sensor will switch on a light even if a person would not otherwise want it on. Even a reactive system such as Richton, where certain reactions may be programmed, suffers from such a disadvantage. For example, a mobile phone that initiates certain functions in the home at certain distances that reflect a wife's preferences may create conditions that are not agreeable to a husband who is carrying his wife's phone.

Similarly, known pre-programmed type home automation systems have numerous deficiencies. For example, a timer that automatically turns on an appliance or system will do so unless it is turned off, thus creating situations that are undesirable or possibly unsafe. For example, if a person forgets to turn the timer of a coffee maker off on the day he or she has an early business meeting, a potential hazard may occur when the coffee maker is turned on later in the morning and remains on for the entire day. Likewise, for example, if the "vacation mode" is selected in Sharood and a son or daughter who is unfamiliar with the system controls unexpectedly returns home from college for a weekend while the rest of the family is away, he or she may not be able to operate the lights, heating, etc. to their liking.

Other disadvantages of known home automation systems and techniques include an inability to identify a particular person and tailor a setting or response in the house to the preferences of the identified person. In addition, known systems and techniques do not respond with the preferred settings or responses based on the location of a particular person in the home, nor with the preferred settings or responses of a number of persons based upon where they are located in the house.

SUMMARY OF THE INVENTION

It is thus an objective of the invention to provide automatic setting of conditions or ambiance in a local environment, such as a home. It is also an objective to provide automatic detection of the location of a particular person in the local environment and automatic setting of conditions or ambiance in the region of the local environment in which the person is detected, based on the preferences of that particular person. It is also an objective to provide automatic detection of the location of a particular user in the local environment using image recognition.

Accordingly, the invention provides a system comprising a control unit that receives images associated with one or more regions of a local environment. The local environment may be, for example, a home, and the one or more regions may be the rooms of the home, a wing or floor of the home, etc. The one or more regions are each serviced by one or more controllable devices or systems. For example, the controllable devices or systems may be the lights in a room, the heat level for a sector of the home, etc. The control unit processes the images to identify, from a group of known persons associated with the local environment, any known persons located in one or more of the regions. For a known person so identified in a respective region, the control unit retrieves from a database an indicium of the identified person's preference for at least one of the one or more controllable devices or systems that service the respective region in which the known person is located. The control unit generates control signals so that the one or more controllable devices or systems that service the respective region in which the identified person is located are adjusted to reflect the known person's preference.
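By way of illustration only, the following Python sketch models the control-unit cycle just described. The Camera capture, person identification and preference lookup interfaces are hypothetical stand-ins introduced for this sketch; they are not structures disclosed by the patent.

    from dataclasses import dataclass, field

    @dataclass
    class Component:
        """A servicing component (e.g., a light, heating unit, or speaker)."""
        kind: str

        def apply(self, setting) -> None:
            # Stand-in for generating a control signal on the component's line.
            print(f"control signal -> {self.kind}: {setting}")

    @dataclass
    class Region:
        """A definable region of the local environment (e.g., room R1 or R2)."""
        name: str
        components: list = field(default_factory=list)

    def control_cycle(regions, capture, identify, preferences) -> None:
        """One pass of the control unit: per region, identify known persons
        in the captured image and adjust that region's servicing components."""
        for region in regions:
            image = capture(region)              # images from the region's camera
            for person in identify(image):       # subset of the known group
                setting = None
                for component in region.components:
                    setting = preferences.get((person, component.kind))
                    if setting is not None:
                        component.apply(setting)

    # Example: person "X" identified in room R2 adjusts light, heat and music.
    r2 = Region("R2", [Component("light"), Component("heat"), Component("audio")])
    control_cycle([r2],
                  capture=lambda region: "frame",      # stub camera
                  identify=lambda image: ["X"],        # stub recognizer
                  preferences={("X", "light"): "60%",
                               ("X", "heat"): "21 C",
                               ("X", "audio"): "volume 35"})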
Also, the invention provides a method for adjusting the conditions or ambiance of regions comprising a local environment. The method comprises capturing images associated with each of a number of regions of a local environment. From a group of known persons associated with the local environment, any known persons located in one or more of the regions are identified from the captured images. One or more preferences of an identified person are retrieved. The one or more preferences for the identified person are used to control one or more devices or systems associated with the region in which the identified person is located.
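This method maps onto the flow of FIG. 3 (steps 100-140). A minimal sketch, reusing the Region/Component stubs above and adding a hypothetical presence state so that preferences are applied only when a known person is newly identified in a region (the step-120 check); the state dictionary is an assumption for illustration:

    def service_region(region, capture, identify, preferences, present) -> None:
        """One pass of FIG. 3 for a single region.
        present: dict mapping region name -> set of persons seen last pass."""
        image = capture(region)                        # step 100: capture images
        persons = set(identify(image))                 # step 110: known person(s)?
        for person in persons - present.get(region.name, set()):  # step 120 guard
            prefs = preferences.get(person, {})        # step 130: retrieve prefs
            for component in region.components:
                if component.kind in prefs:
                    component.apply(prefs[component.kind])  # step 140: control
        present[region.name] = persons                 # remember who is present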
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a representative view of an embodiment of the invention;
FIG. 2 is a more detailed representative view of the embodiment of the invention shown in FIG. 1;
FIG. 2a depicts further details of a component of FIG. 2; and
FIG. 3 is a flow chart of an embodiment of a method in accordance with the invention.
DETAILED DESCRIPTION

Referring to FIG. 1, a local environment comprising a home 10 is represented that supports an embodiment of the invention. Although a home is focused on in the ensuing description, the local environment may be any setting to which the invention may be applied, such as an office, store, hospital, hotel, camper, etc. The invention may be easily adapted to such other settings by one skilled in the art.

The home 10 is shown to be comprised of rooms R1, R2. Although R1, R2 are represented and referred to as rooms, they are generally intended to represent definable regions in the home, not just traditional rooms. For example, any of the regions may alternatively be a kitchen, hallway, stairway, garage, basement, storage space, etc. In addition, rooms R1, R2 in FIG. 1 are representative of other rooms in the home that have at least one controllable device or system that services the respective room. Any regions in the home that do not include systems or devices controlled in accordance with the invention are not represented in FIG. 1, but it is understood that such regions may exist. For example, the systems and/or devices that are controlled in accordance with the invention may be found in certain regions of the home that are used more frequently, such as the bedrooms, kitchen, den and living room, and may be absent from regions of the home that are used less frequently, such as the hallways, stairways, basement and garage.

Each room R1, R2 in FIG. 1 is shown as having a camera C1, C2, respectively, or like image capturing device, that captures images within the room and, in particular, images of persons in the room. More than one camera may be used to cover a region, but for ease of description only one camera is represented as covering each region in FIG. 1. Thus, for example, camera C2 will capture images of person X when located as shown in room R2. Each room R1, R2 is also shown as having a respective light L1, L2 that illuminates the room, as well as a respective heating unit H1, H2 that heats the room. Room R2 is serviced by an audio system that provides music through speaker S2. Light L1 and heating unit H1 are the systems and components of room R1 that are controlled in accordance with the invention. Similarly, light L2, heating unit H2 and speaker S2 are the systems and components of room R2 that are controlled in accordance with the invention. As will be described in more detail below, according to the invention, images from cameras C1, C2 are used to identify a known person in a respective room R1, R2. Once identified, the devices and/or systems that service the room in which the identified person is located are automatically adjusted or controlled in accordance with the individual preference of the identified person. For example, if person X is recognized in images from camera C2, the light L2, heating unit H2 and volume of music from speaker S2 in room R2 are automatically adjusted to the preferences of X.

Before proceeding, it is also noted that the particular devices and/or systems shown servicing each room R1, R2, and which are controlled in accordance with the invention, are chosen for convenience to aid in describing the embodiment of the present invention. However, each room may include more or fewer and/or different devices or systems that service the room and are controlled according to the invention. One skilled in the art may readily adapt the description applied to the representative devices and systems described below to different, additional or fewer devices or systems found in any individual room.

In addition, it is also noted that each device and system is ascribed in FIG. 1 as servicing a particular room. However, any one device or system may service two or more rooms. For such servicing devices or systems, the servicing area of the device or system defines the room or local region, and thus which cameras are used in controlling the device or system. For example, in FIG. 1, heating unit H2 may be absent and heating unit H1 may service both rooms R1, R2. Thus, for the purposes of heating unit H1, the local region is rooms R1 and R2, and H1 is adjusted according to the preference of a person identified by either C1 or C2. Similarly, speaker S2 may provide music to both rooms R1 and R2, so its volume will be adjusted to the preference of an identified person located in either R1 or R2 (i.e., a person identified in an image captured by either C1 or C2).

Referring to FIG. 2, a more detailed (and somewhat more generalized) representation of the embodiment introduced in FIG. 1 is shown. Rooms R1, R2 are shown schematically, along with respective cameras C1, C2, respective lights L1, L2, and respective heating units H1, H2. For room R2, speaker S2 is also shown. For clarity, devices and/or systems that service a room and which are controlled according to the invention (such as L1, L2, H1, H2 and S2) may alternatively be referred to as a "servicing component".

FIG. 2 shows additional components of the embodiment of the invention. Control unit 20 provides the central processing and initiation of control signals for the servicing components of rooms R1, R2. Control unit 20 may comprise any digital controller, processor, microprocessor, computer, server or the like, and any needed ancillary components (such as a memory, database, etc.) which can carry out the control processing and signal generation of the invention. Control unit 20 may comprise, for example, a processor 22 and memory 24, as shown further in FIG. 2a, and run software for determining and outputting appropriate control signals to the servicing components, as described in further detail below. Cameras C1, C2 are connected to control unit 20 over data lines l(C1), l(C2), respectively. Data lines l(C1), l(C2) and like lines described below may comprise standard communication wires, optical fibers and like hardwired data lines; they may also represent wireless communication. Each camera C1, C2 thus provides images of the respective room R1, R2 in which it is located to the processor 22 of control unit 20. Thus, camera C1 provides images of room R1 to control unit 20, and camera C2 provides images of room R2 to control unit 20.
In addition, processor 22 of control unit 20 provides appropriate control signals to lights L1, L2 over lines l(L1), l(L2), respectively, for controlling the intensity of the respective lights L1, L2. For convenience, lines l(L1), l(L2) are shown in FIG. 2 as directly connected to lights L1, L2, respectively, but it is understood that lines l(L1), l(L2) actually provide control signals to dimming circuitry attached to each respective light L1, L2. Alternatively, lines l(L1), l(L2) may be input to a separate lighting controller that provides the appropriate dimming control signals to L1 and/or L2 based on the input received from control unit 20.
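In the simplest case, the signal on line l(L1) might reduce to a single intensity value consumed by the dimming circuitry. A sketch under assumed conventions; the 0-255 duty-cycle byte is illustrative, not a format given in the patent:

    def light_control_signal(intensity_percent: float) -> bytes:
        """Map a person's preferred intensity (0-100%) to one byte for a
        hypothetical dimmer on line l(L1) or l(L2)."""
        level = max(0, min(255, round(intensity_percent * 255 / 100)))
        return bytes([level])

    assert light_control_signal(0) == b"\x00"
    assert light_control_signal(100.0) == b"\xff"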
Processor 22 of control unit 20 also provides control signals over lines l(H1), l(H2) for controlling the temperature provided by heating units H1, H2 to rooms R1, R2, respectively. The control signals from control unit 20 over lines l(H1), l(H2) may comprise an appropriate temperature control signal for heating unit H1, H2, respectively. In a particular example of the heating system of FIG. 2, heating units H1, H2 are electric heaters that each have associated thermostats that receive control signals (in the form of a temperature setting) from control unit 20 over lines l(H1), l(H2), respectively.
For other common types of heating systems known in the art, the control signal provided by control unit 20 to heating elements H1, H2 shown in FIG. 2 is a more abstract representation of the actual underlying system. For example, for heat provided by a centralized source (such as a gas-fired hot water furnace), control unit 20 may provide a temperature setting over line l(H1) for a thermostat (not shown) in room R1. The thermostat consequently turns on a particular circulator attached to the furnace that provides hot water to baseboard heating elements comprising heating unit H1. In addition, lines l(H1), l(H2) may be input to a separate heating controller that provides the appropriate heating control signals to H1 and/or H2 based on the input received from control unit 20. Whatever the underlying heating system, however, the control of the embodiment described with respect to FIG. 2 may be readily adapted by one skilled in the art.
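As a sketch of the electric-heater case above, where the control signal is a temperature setting consumed by a thermostat; the hysteresis band is an illustrative assumption, not part of the patent:

    def thermostat_command(current_c: float, setpoint_c: float,
                           band_c: float = 0.5) -> str:
        """Drive heating unit H1/H2 toward the identified person's preferred
        temperature; the small band avoids rapid on/off cycling."""
        if current_c < setpoint_c - band_c:
            return "heat on"
        if current_c > setpoint_c + band_c:
            return "heat off"
        return "hold"

    assert thermostat_command(19.0, 21.0) == "heat on"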
Control unit 20 also provides control signals over line l(S) to audio system 40. Audio system 40 provides music to speaker S2 in room R2 over line l(S2) in accordance with the control signals received from control unit 20. The control unit 20 may provide signals to the audio system that set the volume level of speaker S2, the type of music selected for play (for example, particular CDs, a radio station or webcast), and so on. Audio system 40 may be located in room R2, such as a stereo, but may also be a centralized audio system that provides music to other rooms in the home. Audio system 40 may include an internal processor that receives the control signals from control unit 20 and processes those signals to select the music to play, the volume of speaker S2 output over line l(S2), etc.
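The control signal on line l(S) could be as simple as a volume-and-selection message. The JSON framing and field names below are assumptions for illustration only; the patent does not specify a message format:

    import json

    def audio_control_message(volume_percent: int, selection: str) -> str:
        """Hypothetical message from control unit 20 to audio system 40."""
        return json.dumps({
            "speaker": "S2",
            "volume": max(0, min(100, volume_percent)),
            "selection": selection,   # e.g., a CD, radio station, or webcast
        })

    print(audio_control_message(35, "radio"))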
Control unit 20 further comprises image recognition software that is stored in memory 24 and run by processor 22. The image recognition software processes the incoming images of each room R1, R2 received from cameras C1, C2, respectively. For convenience, the ensuing description will focus on the images received from a single camera, selected to be C1 of room R1, shown in FIG. 2. The description is also applicable to images received by control unit 20 from camera C2 located in room R2.
As noted, camera C1 captures images of room R1 and transmits the image data to control unit 20. The images are typically comprised of pixel data, for example, from a CCD array in a typical digital camera. The pixel data of the images is assumed to be pre-processed into a known digital format that may be further processed using the image recognition software in control unit 20. Such pre-processing of the images may take place in a processor of the camera C1. Such processing of images by digital cameras (which provides the pre-processed image data to the control unit 20 for further processing by the image recognition software) is well known in the art and, for convenience, its description will be omitted except to the extent necessary to describe the invention. While such pre-processing of the images of camera C1 may take place in the camera C1, it may alternatively take place in the processor 22 of control unit 20 itself.
Processor 22 includes known image recognition software loaded therein that analyzes the image data received from camera C1 via data line l(C1). If a person is located in room R1, he or she will thus be depicted in the image data. The image recognition software may be used, for example, to recognize the contours of a human body in the image, thus recognizing the person in the image. Once the person's body is located, the image recognition software may be used to locate the person's face in the received image and to identify the person.
For example, if control unit 20 receives a series of images from camera C1, control unit 20 may detect and track a person that moves into the room R1 covered by camera C1 and, in particular, may detect and track the approximate location of the person's head. Such a detection and tracking technique is described in more detail in "Tracking Faces" by McKenna and Gong, Proceedings of the Second International Conference on Automatic Face and Gesture Recognition, Killington, Vt., Oct. 14-16, 1996, pp. 271-276, the contents of which are hereby incorporated by reference. (Section 2 of the aforementioned paper describes tracking of multiple motions.)
When the person is stationary in region R1, for example, when he or she sits in a chair, the body (and the head) will remain relatively stationary. Where the software of the control unit 20 has previously tracked the person's movement in the image, it may then initiate a separate or supplementary technique of face detection that focuses on the portion of the subsequent images received from the camera C1 where the person's head is located. If the software of the control unit 20 does not track movements in the images, then the person's face may be detected using the entire image, for example, by applying face detection processing in sequence to segments of the entire image.
For face detection, the control unit 20 may identify a static face in an image using known techniques that apply simple shape information (for example, an ellipse fitting or eigen-silhouettes) to conform to the contour in the image. Other structure of the face may be used in the identification (such as the nose, eyes, etc.), as well as the symmetry of the face and typical skin tones. A more complex modeling technique uses photometric representations that model faces as points in large multi-dimensional hyperspaces, where the spatial arrangement of facial features is encoded within a holistic representation of the internal structure of the face. Face detection is achieved by classifying patches in the image as either "face" or "non-face" vectors, for example, by determining a probability density estimate by comparing the patches with models of faces for a particular sub-space of the image hyperspace. This and other face detection techniques are described in more detail in the aforementioned Tracking Faces paper.
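The patch-classification idea can be sketched as a sliding window over the image, with each patch scored against a face model. The scoring function below is a stand-in supplied by the caller, not the density-estimation method of the Tracking Faces paper, and the window size, stride and threshold are illustrative assumptions:

    import numpy as np

    def detect_face_patches(image: np.ndarray, score_patch,
                            size: int = 32, stride: int = 16,
                            threshold: float = 0.5):
        """Yield (row, col) positions of patches classified as 'face'.
        score_patch: any function returning a face likelihood in [0, 1]."""
        rows, cols = image.shape[:2]
        for r in range(0, rows - size + 1, stride):
            for c in range(0, cols - size + 1, stride):
                if score_patch(image[r:r + size, c:c + size]) > threshold:
                    yield (r, c)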
Face detection may alternatively be achieved by training a neural network supported within the control unit 20 to detect frontal or near-frontal views. The network may be trained using many face images. The training images are scaled and masked to focus, for example, on a standard oval portion centered on the face images. A number of known techniques for equalizing the light intensity of the training images may be applied. The training may be expanded by adjusting the scale of the training face images and the rotation of the face images (thus training the network to accommodate the pose of the image). The training may also involve back-propagation of false-positive non-face patterns. The control unit 20 provides portions of the image to such a trained neural network routine in the control unit 20. The neural network processes the image portion and determines whether it is a face image based on its image training. The neural network technique of face detection is also described in more detail in the aforementioned Tracking Faces paper. Additional details of face detection (as well as detection of other facial sub-classifications, such as gender, ethnicity and pose) using a neural network are described in "Mixture of Experts for Classification of Gender, Ethnic Origin and Pose of Human Faces" by Gutta et al., IEEE Transactions on Neural Networks, vol. 11, no. 4, pp. 948-960 (July 2000), the contents of which are hereby incorporated by reference and referred to below as the "Mixture of Experts" paper.
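A sketch of the training-image preparation named above (fixed-size scaling, a standard oval mask, and light-intensity equalization). The 20x20 window, nearest-neighbour rescale and zero-mean/unit-variance normalization are illustrative assumptions, not parameters given in the patent or the cited papers:

    import numpy as np

    def prepare_training_face(gray: np.ndarray, out: int = 20) -> np.ndarray:
        """Scale a grayscale face image to out x out, mask a standard oval,
        and equalize intensity inside the oval."""
        # Nearest-neighbour rescale to a fixed window.
        rows = np.linspace(0, gray.shape[0] - 1, out).astype(int)
        cols = np.linspace(0, gray.shape[1] - 1, out).astype(int)
        win = gray[np.ix_(rows, cols)].astype(float)
        # Standard oval mask centered on the window.
        yy, xx = np.mgrid[:out, :out]
        oval = (((yy - out / 2) / (out / 2)) ** 2
                + ((xx - out / 2) / (out / 2.5)) ** 2) <= 1.0
        win[~oval] = 0.0
        # Simple intensity equalization: zero mean, unit variance in the oval.
        vals = win[oval]
        win[oval] = (vals - vals.mean()) / (vals.std() + 1e-8)
        return win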
Once a face is detected in the image, the control unit 20 applies image recognition processing to the face to identify the person. Thus, the image recognition processing is programmed to recognize particular faces, and each face is correlated to the identity of a person. For example, for the home represented in the embodiment of FIGS. 1 and 2, the image recognition processing is programmed to recognize the faces of the family members and/or other residents that reside in the home, and each face is correlated to the identity of the family member/resident. The neural network technique of face detection described above may be adapted for identification by training the network using the faces of those persons who must be identified. Faces of other persons may be used in the training as negative matches (for example, false-positive indications). Thus, a determination by the neural network that a portion of the image contains a face image will be based on a training image for a known (identified) person, thus simultaneously providing the identification of the person. So programmed, the neural network provides both face detection and identification of the person. Alternatively, where a face is detected in the image using a technique other than a neural network (such as those described above), the neural network procedure may be used to confirm detection of a face and to also provide identification of the face.
As another alternative technique of face recognition and processing that may be programmed in control unit 20, U.S. Pat. No. 5,835,616, "FACE DETECTION USING TEMPLATES" of Lobo et al., issued Nov. 10, 1998, hereby incorporated by reference herein, presents a two-step process for automatically detecting and/or identifying a human face in a digitized image, and for confirming the existence of the face by examining facial features. Thus, the technique of Lobo may be used in lieu of, or as a supplement to, the face detection and identification provided by the neural network technique after the initial tracking of a moving body (when utilized), as described above. The system of Lobo et al. is particularly well suited for detecting one or more faces within a camera's field of view, even though the view may not correspond to a typical position of a face within an image. Thus, control unit 20 may analyze portions of the image for an area having the general characteristics of a face, based on the location of flesh tones, the location of non-flesh tones corresponding to eyebrows, demarcation lines corresponding to chins, nose, and so on, as in the referenced U.S. Pat. No. 5,835,616.
If a face is detected, it is characterized for comparison with reference faces of the family members who reside in the home (which are stored in memory 24), as in the referenced U.S. Pat. No. 5,835,616. This characterization of the face in the image is preferably the same characterization process that is used to characterize the reference faces; this facilitates a comparison of faces based on characteristics, rather than an optical match, thereby obviating the need to have two identical images (current face and reference face) in order to locate a match. In a preferred embodiment, the number of reference faces is relatively small, typically limited to the number of people in a home, office, or other small-sized environment, thereby allowing the face recognition process to be effected quickly. The reference faces stored in memory 24 of control unit 20 have the identity of the person associated therewith; thus, a match between a face detected in the image and a reference face provides an identification of the person in the image.
Thus, the memory 24 and/or software of control unit 20 effectively includes a pool of reference images and the identities of the persons associated therewith. Using the images received from camera C1, the control unit 20 effectively detects and identifies a known person (or persons) located in room R1 by locating a face (or faces) in the image and matching it with an image in the pool of reference images. The "match" may be detection of a face in the image provided by a neural network trained using the pool of reference images, or the matching of facial characteristics in the camera image and reference images as in U.S. Pat. No. 5,835,616, as described above. Using the images received from camera C2, the control unit 20 likewise detects and identifies a known person (or persons) located in room R2.
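The pool-matching step can be sketched as a nearest-neighbour comparison of a characterized face against the small reference pool. The feature vectors and the distance threshold are illustrative assumptions, not the characterization process of U.S. Pat. No. 5,835,616:

    import numpy as np

    def identify_person(face_features: np.ndarray,
                        reference_pool: dict,
                        max_distance: float = 0.6):
        """reference_pool maps identity -> reference feature vector.
        Return the identity whose reference features are closest to the
        detected face, or None if no reference is close enough."""
        best_name, best_dist = None, float("inf")
        for name, ref in reference_pool.items():
            dist = float(np.linalg.norm(face_features - ref))
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name if best_dist <= max_distance else None

Because the pool is limited to the residents of the home, this comparison stays fast even on modest hardware, which matches the patent's stated rationale for keeping the number of reference faces small.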
When an image of a known person (such as a family member) located in a room is identified in the control unit 20 by applying the image recognition software to the images received from the camera in the room, the processor 22 then executes control software so that the servicing components of the ro
