(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2010/0066822 A1
     Steinberg et al.                  (43) Pub. Date: Mar. 18, 2010

(54) CLASSIFICATION AND ORGANIZATION OF CONSUMER DIGITAL IMAGES USING WORKFLOW, AND FACE DETECTION AND RECOGNITION

(75) Inventors: Eran Steinberg, San Francisco, CA (US); Peter Corcoran, Galway (IE); Petronel Bigioi, Galway (IE); Mihai Ciuc, Bucuresti (RO); Stefanita Ciurel, Bucuresti (RO); Constantin Vertran, Bucuresti (RO)

Correspondence Address:
Tessera/FotoNation
Patent Legal Dept.
3025 Orchard Parkway
San Jose, CA 95134 (US)

(73) Assignee: FotoNation Ireland Limited, Galway (IE)

(21) Appl. No.: 12/554,258

(22) Filed: Sep. 4, 2009

Related U.S. Application Data

(63) Continuation-in-part of application No. 10/764,335, filed on Jan. 22, 2004, now Pat. No. 7,587,068.

Publication Classification

(51) Int. Cl.
     H04N 7/18   (2006.01)
     G06K 9/00   (2006.01)
(52) U.S. Cl. .......... 348/77; 382/118; 382/224; 348/E07.085
(57) ABSTRACT

A processor-based system operating according to digitally embedded programming instructions performs a method including identifying a group of pixels corresponding to a face region within digital image data acquired by an image acquisition device. A set of face analysis parameter values is extracted from said face region, including a faceprint associated with the face region. First and second reference faceprints are determined for a person using reference images captured respectively in predetermined face-portrait conditions and under ambient conditions. The faceprints are analyzed to determine a baseline faceprint and a range of variability from the baseline associated with the person. Results of the analyzing are stored and used in subsequent recognition of the person in a subsequent image acquired under ambient conditions.
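The recognition scheme summarized in the abstract can be sketched in a few lines. This is an illustrative sketch only: the vector representation of a faceprint, the Euclidean distance metric, and the tolerance margin are assumptions for the example, not details taken from the specification.

```python
import math

def distance(a, b):
    """Euclidean distance between two faceprint vectors in face space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def baseline_and_range(reference_faceprints):
    """Reduce a set of reference faceprints (e.g., one extracted under
    predetermined face-portrait conditions and one or more under ambient
    conditions) to a baseline faceprint plus a radius of variability."""
    n = len(reference_faceprints)
    dim = len(reference_faceprints[0])
    # Baseline: component-wise mean of the reference faceprints.
    baseline = [sum(fp[i] for fp in reference_faceprints) / n for i in range(dim)]
    # Range of variability: largest distance of any reference from the baseline.
    radius = max(distance(fp, baseline) for fp in reference_faceprints)
    return baseline, radius

def matches(candidate, baseline, radius, margin=1.2):
    """Accept a faceprint from a subsequent ambient-light image when it lies
    within the stored range of variability (widened by a tolerance margin)."""
    return distance(candidate, baseline) <= radius * margin
```

In this sketch the stored results of the analysis are just the `(baseline, radius)` pair, which is what a later recognition pass consults.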
[Front-page drawing (FIG. 1(a)): block diagram showing a Query Browser, Slide Show, Publisher and Face Tools modules (1040-1080), workflow modules (1110, 1190), image and face detection, normalization and recognition modules (including Recog. Module 1150), a Personal Image Collection, and an Image Classification Database. The graphic content is not reproducible in text.]

Petitioner Apple Inc. - Ex. 1054, p. 1
[Drawing Sheets 1-5 of 29 (Patent Application Publication, Mar. 18, 2010, US 2010/0066822 A1): FIGS. 1(a)-1(c) and 2(a)-2(b). The graphic content of these sheets is not reproducible in text, and the OCR fragments they contain carry no recoverable information.]

[Sheet 6 of 29 — FIG. 3: Face Detection Workflow (Detection Module). Recoverable flowchart text:]

Wait for image from Main Workflow (idle state)
Scan image with Face Feature Pre-Filter; locate & mark face candidate regions
Locate face pixel groupings; cluster groupings to form complete face candidate regions (3150)
Cluster remaining face pixel groupings to form partial face candidates (3160)
Pass auto-recognition list to Workflow Module and return to idle
Pass manual & auto-recognition lists to Workflow Module and return to idle (3200)
Pass training list to Workflow Module

FIG. 3

[Sheet 7 of 29 — FIG. 4(a): Core System Workflow (Workflow Module). Recoverable flowchart text:]

Wait for next image
Send image to Face Detection Module
Get next face on Auto list (4150)
Send to Face Recog. Module (4155)
Got faceprint? If yes, record data (4170); update database; add to Faceprint Search list
If no, add face region to "Unidentified" list; prompt user for face identity
Search Faceprint Database for the "N" closest entries (4210)
Same FP class? (4220)
Same identity? If no, add to Manual list (4230, 4250)

FIG. 4(a)

[Sheet 8 of 29 — FIG. 4(b): Core System Workflow (Workflow Module), continued. Recoverable flowchart text:]

Same FP class? (4230) If yes, add faceprint to face class (4240)
If same identity, add faceprint to closest face class for this identity (4260)
From FIG. 4(a): add faceprint to most frequent face class for this identity
Link image to known person data (4270)
Cross-link person to image data (4280)
Print msg. to Message Module (4290)
Last faceprint? (4300) If not, get next faceprint in the search list
Load Manual Recognition list (4310)
Get next face on Manual list (4320)
Prompt user for face identity (4330)
User responds? (4340, 4350) If yes and the faceprint is valid, create a new face class and add it to the identity; cross-link image and identity data; print msg. to Message Module. If no, add face region to "Unidentified" list
Update database (4360)
Last entry? (4370)
Return to main workflow idle (4380)

FIG. 4(b)

[Sheet 9 of 29 — FIG. 5(a): Face Normalization Module (Mode A). Recoverable flowchart text:]

Wait for face region from Main Workflow (idle state)
Pre-filter? If yes, apply face feature location filter
Determine face orientation & pose
Mark face features (eyes, nose & mouth) (5024)
Semi-frontal face? (5040) If yes, apply 2-D transforms to generate frontal face region
Half-profile face? (5050) If yes, estimate stretch factor; map face region to 3-D normalized face model; rotate to frontal position (5056)
Apply 2-D transforms for illumination and scale
Pass normalized frontal face region to Workflow Module and return to idle

FIG. 5(a)

[Sheet 10 of 29 — FIG. 5(b): Face Normalization Module (Mode B). Recoverable flowchart text:]

Wait for face region from Main Workflow (idle state)
Pre-filter? If yes, apply face feature location filter (5120)
Determine face orientation & pose (5130)
Mark face features (eyes, nose & mouth)
Semi-frontal face? (5140) If yes, map onto 3-D face model and generate multi-view data
Apply normalization filters for scaling & orientation (5145)
Pass normalized rotated face region to Workflow Module with pose rotation data; return to idle

FIG. 5(b)

[Drawing Sheets 11-29 of 29: FIGS. 6(a)-6(e), 7(a)-7(f), 8(a)-8(c), 9(a)-9(b), 10(a)-10(b), 11(a)-11(c), 12(a)-12(c), 13(a)-13(e), 14(a)-14(d) and 15(a)-15(e). Only figure labels and scattered reference numerals (e.g., 10120, 11220, 12130, 13010-13060, 13110-13246) survive OCR; the graphic content is not reproducible in text.]

CLASSIFICATION AND ORGANIZATION OF CONSUMER DIGITAL IMAGES USING WORKFLOW, AND FACE DETECTION AND RECOGNITION

CROSS-REFERENCE TO RELATED APPLICATIONS
0001 This application is a continuation-in-part (CIP) of U.S. patent application Ser. No. 10/764,335, filed Jan. 22, 2004, which is one of a series of contemporaneously-filed patent applications including U.S. Ser. No. 10/764,339, now U.S. Pat. No. 7,551,755, entitled "Classification and Organization of Consumer Digital Images using Workflow, and Face Detection and Recognition"; U.S. Ser. No. 10/764,336, now U.S. Pat. No. 7,558,408, entitled "A Classification System for Consumer Digital Images using Workflow and User Interface Modules, and Face Detection and Recognition"; U.S. Ser. No. 10/764,335, entitled "A Classification Database for Consumer Digital Images"; U.S. Ser. No. 10/764,274, now U.S. Pat. No. 7,555,148, entitled "A Classification System for Consumer Digital Images using Workflow, Face Detection, Normalization, and Face Recognition"; and U.S. Ser. No. 10/763,801, now U.S. Pat. No. 7,564,994, entitled "A Classification System for Consumer Digital Images using Automatic Workflow and Face Detection and Recognition".
BACKGROUND

0002 1. Field of the Invention
0003 The invention relates to digital image processing, particularly to the field of automatic or semi-automatic grouping and classification of images in a database or image collection based on the occurrence of faces in the images and the identification and classification of such faces.
0004 2. Description of the Related Art
0005 The techniques of face detection and face recognition are each being explored by those skilled in the art, and a great many advancements have been made in those respective fields in recent years. Face detection has to do with the problem of locating regions within a digital image or video sequence which have a high probability of representing a human face. Face recognition involves the analysis of such a "face region" and its comparison with a database of known faces to determine if the unknown "face region" is sufficiently similar to any of the known faces to represent a high-probability match. The related field of tracking involves face or identity recognition between different frames in a temporal sequence of frames. A useful review of face detection is provided by Yang et al. in IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 24, No. 1, pages 34-58, January 2002. A review of face recognition techniques is given in Zhang et al., Proceedings of the IEEE, Vol. 85, No. 9, pages 1423-1435, September 1997.
0006 Other related art refers to the grouping, classification, management, presentation of, and access to collections of digital images in databases, file-systems or other storage mechanisms, based on image content, global image parameters, or image metadata. Such content-based approaches analyze the image content using spatial color distribution, texture, shape, object location and geometry, etc. However, they do not explicitly teach to utilize face recognition in conjunction with these techniques, or to initially detect faces in their images prior to applying a recognition process. It is recognized in the present invention that a system that provides automation in the detection, recognition and classification processing of digital images would be highly desirable.
0007 None of the prior art references cited throughout the description below provides this feature. Many of the classification techniques described are applied to entire images; they do not teach to detect faces in an image or to perform recognition of such faces. Many of these references concentrate on methods of storing or accessing images using databases, but they do not employ, in conjunction with these methods, the advantageous image processing techniques described by the inventors of the present invention.
0008 Some of the medical applications provide classification and archiving of images into particular groups that are associated with a single customer. For example, the customer may be a patient, and the classification may be particularly related to medical diagnosis or treatment applications where a large amount of image data (X-rays, ultrasound scans, etc.) related to a single patient may be gathered. However, these do not utilize face recognition as a means to compile or manage this image data; i.e., a user is expected to categorize the images according to the associated patient.
0009 Further references available in the literature of the related art describe multi-format transcoding applications for visual data. Others describe means for constructing digital photo albums. These references do not, however, teach to use image processing techniques in the management of, or access to, the data.
0010 At this point we note that the present invention is presented primarily in the context of collections of consumer digital images which would be generated by a typical user of a digital camera. Such an image collection is in a constant state of growth, as new sets of images are added every time the user off-loads pictures from the camera onto his computer. Because the image set is in a constant state of flux, it is often not practical to perform database-wide sorting, grouping or management operations every time a few images are added to the collection, because this would put an excessive load on the user's computer. Much of the related art literature describes how to function with and operate on a large static image collection. Thus, when a sizeable batch of new images is added, as will often happen when a camera is offloaded, these related art teachings do not describe how to perform significant image processing and database-wide testing to determine similarities between new and existing database images, and then group and store the new images, before the user can access and enjoy his pictures. In reality, the application of image processing techniques, or of other image-related tools, is understood by the inventors of the present invention as being an ongoing process for collections of consumer images, with these tools designed, where possible, to operate as automated or semi-automated background processes for applications in consumer imaging.
0011 There is a very compelling need for new and improved tools to manage collections of images. More particularly, there is a need for tools which can manage and organize image collections which are in a constant state of change and growth. It is also important that these tools can manage and organize such ad-hoc collections using methods which are easily understandable by the layman and, where possible, that such tools can function in semi- or fully-automatic modes so that their work of cataloging and organizing is almost imperceptible to the end-user.
BRIEF DESCRIPTION OF THE DRAWINGS

0012 FIGS. 1(a)-1(c) show an overview of the principal components of the invention;
0013 FIG. 1(a) is an outline of the main system components implemented as a computer program;
0014 FIG. 1(b) shows an alternative embodiment in which certain aspects of the system of the preferred embodiment, including the face detection and recognition modules, are implemented within an image capture appliance such as a digital camera, while the remaining aspects are implemented as a computer program on a desktop computer;
0015 FIG. 1(c) shows an embodiment wherein the system is entirely implemented within a digital camera;
0016 FIG. 2(a) describes an embodiment of a main system database;
0017 FIG. 2(b) gives additional detail of the face recognition data component of the database;
0018 FIG. 3 describes a main face detection workflow in accordance with a preferred embodiment;
0019 FIGS. 4(a)-4(b) describe a core system workflow in accordance with a preferred embodiment;
0020 FIGS. 5(a)-5(b) show a face normalization workflow in accordance with a preferred embodiment;
0021 FIGS. 6(a)-6(e) illustrate some of the different ways that a face candidate region, obtained from the detection module, can be distorted; these distortions should be corrected by the normalization module;
0022 FIG. 6(a) shows a frontal candidate region which is incorrectly oriented and must be rotated into an upright position prior to applying the face recognition module;
0023 FIG. 6(b) is a frontal candidate region which is of a reduced size and must be enlarged prior to applying the recognition module;
0024 FIG. 6(c) is a correct frontal face candidate region which does not require either orientation or size correction;
0025 FIGS. 6(d) and 6(e) illustrate two non-frontal face candidate regions which require pose normalization in addition to size and orientation normalization.
0026 FIGS. 7(a)-7(f) illustrate how a 3-D model can be applied to model a range of face candidate regions:
0027 FIGS. 7(a)-7(c) illustrate how a simple 1-D scaling of a normalized face model can be used to model the majority of face candidate regions with good accuracy;
0028 FIGS. 7(d)-7(f) illustrate how a 2-D face candidate region can be mapped onto such a 3-D normalized face model with 1-D scaling along the horizontal axis.
0029 FIGS. 8(a)-8(b) illustrate how three face regions (FR1, FR2 and FR3) may be mapped to faceprints (FP1, FP2 and FP3) in a 3-component face space.
0030 FIG. 8(c) illustrates multiple face regions extracted from digital images that have subtle pose, orientational, illumination and/or size distortions to be adjusted automatically upon detection in a normalization process in accordance with a preferred embodiment, prior to automatic or semi-automatic face recognition processing.
0031 FIG. 9(a) shows a graphical representation of how multiple, distinct face classes, formed from collections of closely collocated faceprints, can be used to define a unique region in face space which is associated with a particular person's identity.
0032 FIG. 9(b) illustrates two such identity spaces with their associated face classes and faceprints.
0033 FIG. 10(a) illustrates how a new faceprint creates a new face class for a person's identity when it is located at a distance further than a certain distance R from an existing face class.
0034 FIG. 10(b) illustrates how a new faceprint extends or grows an existing face class when it is within a distance R from the existing face class.
0035 FIG. 11(a) illustrates how an identity region associated with one person can grow to overlap with the identity region of another person.
0036 FIG. 11(b) describes how these overlapping identity regions can be separated from each other by shrinking the two identity regions into their component face classes.
0037 FIG. 11(c) illustrates a face class shrinking operation in accordance with a preferred embodiment.
0038 FIG. 12(a) shows a face class which has grown over time to incorporate a relatively large number of faceprints which exhibit localized clustering.
0039 FIG. 12(b) illustrates explicitly how these faceprints are clustered.
0040 FIG. 12(c) shows how each local cluster can be replaced by a single clustered face class which is composed of a centre faceprint location in face space and a cluster radius, R.
0041 FIG. 13(a) describes the recognition process where a newly detected faceprint lies in a region of face space between two "known" identity regions.
0042 FIG. 13(b) shows how, once the recognition process has associated the new faceprint with one of the two known identity regions, ID, that identity region is then grown to include the new faceprint as a new face class within ID.
0043 FIG. 13(c) shows a similar situation to FIG. 13(a), but in this case it is not clear which of the two identity regions should be associated with the new faceprint, and the system must ask the user to make this determination.
0044 FIG. 13(d) illustrates the case where the user chooses ID.
0045 FIG. 13(e) illustrates the case where the user chooses ID.
0046 FIGS. 14(a)-14(d) show a variety of aspects of the user interface to the main workflow module.
0047 FIG. 15(a) illustrates a faceprint associated with an acquired facial image;
0048 FIGS. 15(b) and 15(c) illustrate how a VAR vector can be used to align higher-order components of flash and non-flash feature vectors, or faceprints;
0049 FIG. 15(d) illustrates a combined, illumination-normalized feature vector for flash and non-flash faceprints;
0050 FIG. 15(e) illustrates a local baseline faceprint determined from a flash image and a radius of variability determined from the image in ambient illumination.
INCORPORATION BY REFERENCE

0051 What follows is a cite list of references, each of which is, in addition to that which is described as background, the invention summary, the abstract, the brief description of the drawings and the drawings themselves, hereby incorporated by reference into the detailed description of the preferred embodiments below, as disclosing alternative embodiments of elements or features of the preferred embodiments not otherwise set forth in detail below. A single one or a combination of two or more of these references may be consulted to obtain a variation of the preferred embodiments described in the detailed description herein:
0052 U.S. Pat. Nos. RE33682, RE31370, 4,047,187, 4,317,991, 4,367,027, 4,638,364, 5,291,234, 5,488,429, 5,638,136, 5,710,833, 5,724,456, 5,781,650, 5,812,193, 5,818,975, 5,835,616, 5,852,823, 5,870,138, 5,911,139, 5,978,519, 5,991,456, 6,072,904, 6,097,470, 6,101,271, 6,128,397, 6,148,092, 6,188,777, 6,192,149, 6,249,315, 6,263,113, 6,268,939, 6,282,317, 6,301,370, 6,332,033, 6,349,373, 6,351,556, 6,393,148, 6,404,900, 6,407,777, 6,421,468, 6,438,264, 6,456,732, 6,459,436, 6,473,199, 6,501,857, 6,502,107, 6,504,942, 6,504,951, 6,516,154, 6,526,161, 6,564,225, and 6,567,983;
0053 United States published patent applications Nos. 2003/0084065, 2003/0059121, 2003/0059107, 2003/0052991, 2003/0048950, 2003/0025812, 2002/0172419, 2002/0168108, 2002/0114535, 2002/0105662, and 2001/0031142;
0054 Japanese patent application No. JP5260360A2;
0055 British patent application No. GB0031423.7; and
0056 Yang et al., IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 24, No. 1, pp. 34-58 (January 2002).
Illustrative Definitions

0057 "Face Detection" involves the art of isolating and detecting faces in an image; Face Detection includes a process of determining whether a human face is present in an input image, and may include, or is preferably used in combination with, determining a position and/or other features, properties, parameters or values of parameters of the face within the input image;
0058 "Face Recognition" involves the art of matching an unknown facial region from an image with a set of "known" facial regions.
0059 "Image enhancement" or "image correction" involves the art of modifying a digital image to improve its quality. Such modifications may be "global", applied to the entire image, or "selective", applied differently to different portions of the image. Some main categories non-exhaustively include:
0060 (i) Contrast Normalization and Image Sharpening;
0061 (ii) Image Crop, Zoom and Rotate;
0062 (iii) Image Color Adjustment and Tone Scaling;
0063 (iv) Exposure Adjustment and Digital Fill Flash applied to a Digital Image;
0064 (v) Brightness Adjustment with Color Space Matching, and Auto-Gamma determination with Image Enhancement;
0065 (vi) Input/Output device characterizations to determine Automatic/Batch Image Enhancements;
0066 (vii) In-Camera Image Enhancement;
0067 (viii) Face-Based Image Enhancement.
0068 "Auto-focusing" involves the ability to automatically detect and bring a photographed object into the focus field.
0069 A "pixel" is a picture element or a basic unit of the composition of an image, or any of the small discrete elements that together constitute an image.
0070 "Digitally-Acquired Image" includes an image that is digitally located and held in a detector.
0071 "Digitally-Captured Image" includes an image that is digitally recorded in a permanent file and/or preserved in a more or less permanent digital form.
0072 "Digitally-Detected Image": an image comprising digitally detected electromagnetic waves.
0073 A "face region" is a region of a main image which has been determined to contain a human face. In particular, it may contain a substantially oval, skin-colored region which has physical features corresponding to eyes, nose and mouth, or some portion of a face or subset of these facial features.
0074 A face region is preferably "normalized" in accordance with the invention. Prior to extracting face classifier parameters (see definition below) from a face region, it is preferably first transformed into a normalized form. This may involve any or all of three principal steps: (i) resizing to a standard "size", e.g., based on the separation of eyes, nose and/or mouth; (ii) "orientation" in an upright or other selected direction, which may involve rotation of the face region; and (iii) orientation to compensate for up/down or left/right variations in the "pose" of the face. Note that these normalizations are usually performed in reverse order in accordance with a preferred embodiment: first pose normalization is implemented, followed by orientation normalization, and finally the face region is normalized for size. A fourth form of normalization that may preferably be performed is luminance normalization (see definition below), but it is treated or characterized separately from the above, which are referred to as spatial normalizations.
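The orientation and size normalizations described above can be illustrated with a minimal 2-D landmark-based sketch: rotate the region so the eye-to-eye line is horizontal, then scale so the eye separation matches a standard value. This is a hypothetical illustration, not the patent's method; the `target_eye_distance` parameter is an assumption, and pose normalization against a 3-D face model is beyond the scope of this sketch.

```python
import math

def normalize_orientation_and_size(face_points, left_eye, right_eye,
                                   target_eye_distance=100.0):
    """Rotate a set of (x, y) face landmarks upright about the left eye so the
    eye line becomes horizontal, then scale so the eye separation equals
    target_eye_distance. Returns the transformed landmark list."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = math.atan2(dy, dx)                    # tilt of the eye line
    scale = target_eye_distance / math.hypot(dx, dy)
    cos_a, sin_a = math.cos(-angle), math.sin(-angle)
    out = []
    for (x, y) in face_points:
        # Translate so the left eye is the origin, rotate upright, then scale.
        tx, ty = x - left_eye[0], y - left_eye[1]
        rx = tx * cos_a - ty * sin_a
        ry = tx * sin_a + ty * cos_a
        out.append((rx * scale, ry * scale))
    return out
```

After this transform, the left eye sits at the origin and the right eye at (`target_eye_distance`, 0), giving a consistent frame for classifier extraction.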
0075 "Face classifier parameters" are a set of values of vector and/or scalar classifiers extracted from a normalized face region. Typical examples of such a set of classifiers could be: (i) principal component vectors, (ii) independent component vectors, (iii) 2-D Fourier transform components, (iv) wavelet transform components, (v) Gabor components, etc. Note that several face classifier techniques may be combined to provide a definitive faceprint.
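As a minimal sketch of how such classifier parameters might be computed, the following projects a normalized face region (flattened to a vector) onto a set of precomputed basis vectors, such as principal components. This is illustrative only: the basis vectors would in practice be learned from training data (e.g., by PCA), which is not shown, and the names and dimensions here are assumptions.

```python
def faceprint(face_vector, component_vectors):
    """Project a flattened, normalized face region onto classifier basis
    vectors (e.g., principal or independent components). Each dot product
    yields one face classifier parameter; together they form the faceprint."""
    return [sum(f * c for f, c in zip(face_vector, comp))
            for comp in component_vectors]
```

Several such projections (e.g., PCA components plus wavelet coefficients) could be concatenated, matching the note above that classifier techniques may be combined.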
0076 The set of face classifier parameters associated with a particular face region is known as the "faceprint" of that face region. The faceprint is preferably a set of face classifier parameters and may be subdivided into two or more subsets of face classifier parameters, which may overlap.
0077 An "archived faceprint" is a set of face classifier parameters associated with a particular face region ultimately extracted from a parent image and preferably normalized, and stored in the main recognition database, preferably along with links to the parent image and the face region.
0078 A "known identity" is a set of (database) associations between a known person or other object and one or more face classes comprising one or more archived faceprints.
0079 The following process is referred to as "luminance normalization". It is common for horizontal and/or vertical variations in luminance levels to occur across a face region due to the ambient lighting at the time an image was captured, or other factors such as artificial sources or flashes. In this case, certain types of face classifiers may be distorted, and it may be advantageous to normalize luminance levels across the face region prior to extracting face classifier parameters in accordance with a preferred embodiment. As typical variations are linear in form, and as the variations manifest themselves principally in skin-colored pixels, it is relatively straightforward to adjust each image pixel of a face region to approximately compensate for such luminance variations caused by ambient lighting.
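Since the definition notes that typical variations are linear in form, a one-dimensional sketch of such compensation is a least-squares line fit over a row of luminance values followed by subtraction of the fitted gradient. This is an illustrative sketch, not the patent's algorithm; in practice the fit would be restricted to skin-colored pixels and applied along both horizontal and vertical directions.

```python
def normalize_luminance(row):
    """Remove a linear luminance gradient from a row of pixel luminance
    values: fit y = a*x + b by least squares, then subtract the slope term
    so the row is flattened about its mean luminance."""
    n = len(row)
    xs = range(n)
    mean_x = (n - 1) / 2
    mean_y = sum(row) / n
    denom = sum((x - mean_x) ** 2 for x in xs)
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, row)) / denom
    # Subtracting a*(x - mean_x) removes the gradient but keeps mean_y.
    return [y - a * (x - mean_x) for x, y in zip(xs, row)]
```

A row that is a pure gradient flattens to its mean, while an already-flat row is left unchanged.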
0080 When two or more faceprints lie within a certain geometric distance of each other in face space, they may preferably be grouped into a single face class. If a newly determined faceprint lies within this geometric distance of the face class, then this face class may be expanded to include the new faceprint, or the faceprint may be added to the face class without expansion if all of its face classifier values lie within the existing face class parameter value ranges. This existing face class is referred to as a "prior face class".
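The grouping rule above can be sketched as a simple distance-threshold assignment in face space. This is an illustrative sketch under assumed details: the class centre, the Euclidean metric, and the single radius `r_max` are assumptions, and the no-expansion case (all classifier values already inside the class's parameter ranges) is omitted.

```python
import math

def assign_to_face_class(new_faceprint, face_classes, r_max):
    """Attach a new faceprint to the nearest existing face class whose centre
    lies within r_max; otherwise start a new face class. Each face class is
    represented as the list of its member faceprints."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def centre(members):
        dim = len(members[0])
        return [sum(m[i] for m in members) / len(members) for i in range(dim)]

    best = None
    for fc in face_classes:
        d = dist(new_faceprint, centre(fc))
        if d <= r_max and (best is None or d < best[1]):
            best = (fc, d)
    if best is not None:
        best[0].append(new_faceprint)      # grow the existing (prior) face class
    else:
        face_classes.append([new_faceprint])  # start a new face class
    return face_classes
```

A faceprint near an existing class grows that class; a distant one seeds a new class, mirroring the behaviour illustrated for FIGS. 10(a) and 10(b).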
