US008463030B2

(12) United States Patent
     Boncyk et al.

(10) Patent No.:     US 8,463,030 B2
(45) Date of Patent: *Jun. 11, 2013
`(54)
`
`(75)
`
`IMAGE CAPTURE AND IDEVTIFICATION
`SYSTEM AND PROCESS
`
`Inventors: Wayne C. Boncyk, Evergreen, CA (US);
`Ronald H. Cohen, Pasadena, CA (US)
`
`
`
`(73) Assignee: Nant Holdings IP, LLC, Culver City,
`CA (US)
`
`( * ) Notice:
`
`Subject to any disclaimer, the term of this
`patent is extended or adjusted under 35
`U.S.C. 154(b) by 171 days.
This patent is subject to a terminal disclaimer.
(21) Appl. No.: 13/069,124

(22) Filed: Mar. 22, 2011

(65) Prior Publication Data
     US 2011/0228128 A1    Sep. 22, 2011
(52) U.S. Cl.
     USPC .......................................... 382/165
(58) Field of Classification Search
     USPC ........ 382/181, 162, 165, 100, 305, 224, 115-118; 705/26.1-27.2, 23; 348/239, 211.2, 211.6, 207.1, 460, 552; 713/186, 168; 455/414.2-414.3, 412.1, 411; 709/201-203, 217-219, 250
     See application file for complete search history.
(56) References Cited

U.S. PATENT DOCUMENTS

5,579,471 A   11/1996  Barber et al.
5,615,324 A    3/1997  Kuboyama
5,625,765 A    4/1997  Ellenby et al.
5,682,332 A   10/1997  Ellenby et al.
5,724,579 A    3/1998  Suzuki
5,742,521 A    4/1998  Ellenby et al.
5,751,286 A    5/1998  Barber et al.
5,768,633 A    6/1998  Allen et al.
5,815,411 A    9/1998  Ellenby et al.
5,926,116 A    7/1999  Kitano et al.
5,933,823 A    8/1999  Cullen et al.
5,933,829 A    8/1999  Durst et al.

(Continued)

Related U.S. Application Data
(60) Division of application No. 13/037,317, filed on Feb. 28, 2011, now Pat. No. 8,224,078, which is a division of application No. 12/333,630, filed on Dec. 12, 2008, now Pat. No. 7,899,243, which is a division of application No. 10/492,243, filed as application No. PCT/US02/35407 on Nov. 5, 2002, now Pat. No. 7,477,780, which is a continuation of application No. 09/992,942, filed on Nov. 5, 2001, now Pat. No. 7,016,532.

(60) Provisional application No. 60/246,295, filed on Nov. 6, 2000; provisional application No. 60/317,521, filed on Sep. 5, 2001.
(51) Int. Cl.
     G06K 9/00    (2006.01)
FOREIGN PATENT DOCUMENTS

EP   0920179       6/1999
EP   1012725 A1    6/2000

(Continued)
Primary Examiner — Ishrat I Sherali
(74) Attorney, Agent, or Firm — Fish & Associates, PC

(57) ABSTRACT

A digital image of the object is captured and the object is recognized from a plurality of objects in a database. An information address corresponding to the object is then used to access information and initiate communication pertinent to the object.

38 Claims, 7 Drawing Sheets
[Representative drawing: FIG. 1 (DATABASE MATCHING)]
U.S. PATENT DOCUMENTS

5,978,773 A    11/1999  Hudetz et al.
5,991,827 A    11/1999  Ellenby et al.
6,031,545 A     2/2000  Ellenby et al.
6,037,936 A     3/2000  Ellenby et al.
6,055,536 A     4/2000  Shimakawa et al.
6,064,398 A     5/2000  Ellenby et al.
6,081,612 A     6/2000  Gutkowicz-Krusin et al.
6,098,118 A     8/2000  Ellenby et al.
6,108,656 A     8/2000  Durst et al.
6,144,848 A    11/2000  Walsh et al.
6,173,239 B1    1/2001  Ellenby
6,181,817 B1    1/2001  Zabih et al.
6,182,090 B1    1/2001  Peairs
6,199,048 B1    3/2001  Hudetz et al.
6,208,749 B1    3/2001  Gutkowicz-Krusin et al.
6,256,409 B1    7/2001  Wang
6,278,461 B1    8/2001  Ellenby et al.
6,286,036 B1    9/2001  Rhoads
6,307,556 B1   10/2001  Ellenby et al.
6,307,957 B1   10/2001  Gutkowicz-Krusin et al.
6,393,147 B1    5/2002  Danneels et al.
6,396,475 B1    5/2002  Ellenby et al.
6,396,537 B1    5/2002  Squilla et al.
6,411,725 B1    6/2002  Rhoads
6,414,696 B1    7/2002  Ellenby et al.
6,430,554 B1    8/2002  Rothschild
6,434,561 B1    8/2002  Durst, Jr. et al.
6,453,361 B1    9/2002  Morris
6,522,292 B1    2/2003  Ellenby et al.
6,522,889 B1    2/2003  Aarnio
6,532,298 B1    3/2003  Cambier et al.
6,535,210 B1    3/2003  Ellenby et al.
6,542,933 B1    4/2003  Durst, Jr. et al.
6,567,122 B1    5/2003  Anderson et al.
6,651,053 B1   11/2003  Rothschild
6,674,923 B1    1/2004  Shih et al.
6,674,993 B1    1/2004  Tarbouriech
6,675,165 B1    1/2004  Rothschild
6,690,370 B2    2/2004  Ellenby et al.
6,691,914 B2    2/2004  Isherwood et al.
6,714,969 B1    3/2004  Klein et al.
6,724,914 B2    4/2004  Brundage et al.
6,738,630 B2    5/2004  Ashmore
6,766,363 B1    7/2004  Rothschild
6,804,726 B1   10/2004  Ellenby et al.
6,842,181 B2    1/2005  Acharya
6,865,608 B2    3/2005  Hunter
6,885,771 B2    4/2005  Takahashi
6,993,573 B2    1/2006  Hunter
7,016,532 B2    3/2006  Boncyk et al.
7,031,536 B2    4/2006  Kajiwara
7,031,875 B2    4/2006  Ellenby et al.
7,127,094 B1   10/2006  Elbaum et al.
7,245,273 B2    7/2007  Eberl et al.
7,362,922 B2    4/2008  Nishiyama et al.
7,383,209 B2    6/2008  Hudetz et al.
7,430,588 B2    9/2008  Hunter
7,641,342 B2    1/2010  Eberl et al.
7,696,905 B2    4/2010  Ellenby et al.
7,765,126 B2    7/2010  Hudetz et al.
7,775,437 B2    8/2010  Cohen
7,916,138 B2    3/2011  John et al.
8,218,874 B2*   7/2012  Boncyk et al. ............. 382/181

2001/0011276 A1    8/2001  Durst, Jr. et al.
2001/0032252 A1   10/2001  Durst et al.
2001/0044824 A1   11/2001  Hunter et al.
2001/0047426 A1   11/2001  Hunter
2002/0019819 A1    2/2002  Sekiguchi et al.
2002/0055957 A1    5/2002  Ohsawa
2002/0089524 A1    7/2002  Ikeda
2002/0090132 A1    7/2002  Boncyk et al.
2002/0102966 A1    8/2002  Lev et al.
2002/0103813 A1    8/2002  Frigon
2002/0140988 A1   10/2002  Cheatle et al.
2002/0156866 A1   10/2002  Schneider
2002/0163521 A1   11/2002  Ellenby et al.
2003/0095681 A1    5/2003  Burg et al.
2004/0208372 A1   10/2004  Boncyk et al.
2005/0015370 A1    1/2005  Stavely et al.
2005/0024501 A1    2/2005  Ellenby et al.
2005/0162523 A1    7/2005  Darrell et al.
2005/0185060 A1    8/2005  Neven, Sr.
2006/0161379 A1    7/2006  Ellenby et al.
2006/0190812 A1    8/2006  Ellenby et al.
2007/0109619 A1    5/2007  Eberl et al.
2007/0146391 A1    6/2007  Pentenrieder et al.
2007/0182739 A1    8/2007  Platonov et al.
2008/0021953 A1    1/2008  Gil
2008/0157946 A1    7/2008  Eberl et al.
2010/0045933 A1    2/2010  Eberl et al.
2010/0188638 A1    7/2010  Eberl et al.
FOREIGN PATENT DOCUMENTS

EP   1354260 A2    10/2003
EP   1355258       10/2003
EP   2264669       12/2010
GB   2407230        /2005
JP   10-91634       4/1998
JP   10-289243     10/1998
JP   2001-101191    4/2001
JP   2001-282825   10/2001
WO   97/49060      12/1997
WO   98/37811       9/1998
WO   99/16024       4/1999
WO   99/42946 A2    8/1999
WO   99/42947 A2    8/1999
WO   99/44010       9/1999
WO   01/24050       4/2001
WO   01/49056        /2001
WO   01/63487 A1     /2001
WO   01/71282 A1    9/2001
WO   01/73603      10/2001
WO   02/01143        /2002
WO   02/059716 A2    /2002
WO   02/073818 A1   9/2002
WO   02/082799     10/2002

* cited by examiner
[Sheet 1 of 7 — FIG. 1: top-level algorithm flowchart — INPUT IMAGE DECOMPOSITION with SYMBOLIC IMAGE and OBJECT IMAGE branches; DATABASE MATCHING; SELECT BEST MATCH (reference numerals 10, 40, 42).]
[Sheet 2 of 7 — FIG. 2: idealized view of image capture.]
[Sheet 3 of 7 — FIG. 3A: process flowchart — START; FOR EACH INPUT IMAGE SEGMENT GROUP; FOR EACH OBJECT IN DATABASE; FOR EACH VIEW OF THIS OBJECT; FOR EACH SEGMENT GROUP IN THIS VIEW; GREYSCALE COMPARISON; WAVELET COMPARISON.]
[Sheet 4 of 7 — FIG. 3B: process flowchart (continued) — CALCULATE COMBINED MATCH SCORE; NEXT SEGMENT GROUP IN THIS DATABASE VIEW; NEXT VIEW OF THIS DATABASE OBJECT; NEXT OBJECT IN DATABASE; NEXT INPUT IMAGE SEGMENT GROUP; FINISH.]
[Sheet 5 of 7 — FIG. 4: block diagram — TARGET OBJECT; IMAGE PROCESSING; BROWSER; IDENTIFICATION SERVER with OBJECT RECOGNITION and DATABASE; CONTENT SERVER; data flows IMAGE DATA and TARGET OBJECT INFORMATION (reference numerals 100-series).]
[Sheet 6 of 7 — FIG. 5: block diagram for cellular telephone/PDA applications — TERMINAL with CAMERA, IMAGE PROCESSING, and BROWSER; IDENTIFICATION SERVER with OBJECT RECOGNITION and DATABASE; CONTENT SERVER; data flows IMAGE DATA and TARGET OBJECT INFORMATION (reference numerals 200-series).]
[Sheet 7 of 7 — FIG. 6: block diagram for spacecraft applications — SPACECRAFT DATA SYSTEM with IMAGE PROCESSING; TARGET OBJECT; IDENTIFICATION SERVER with OBJECT RECOGNITION and DATABASE; data flows IMAGE DATA and TARGET OBJECT INFORMATION (reference numerals 300-series).]
IMAGE CAPTURE AND IDENTIFICATION SYSTEM AND PROCESS
This application is a divisional of Ser. No. 13/037,317, filed Feb. 28, 2011, which is a divisional of Ser. No. 12/333,630, filed Dec. 12, 2008, which is a divisional of Ser. No. 10/492,243, filed Apr. 9, 2004, which is a National Phase of PCT/US02/35407, filed Nov. 5, 2002. These and all other referenced patents and applications are incorporated herein by reference in their entirety. Where a definition or use of a term in a reference that is incorporated by reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein is deemed to be controlling.
TECHNICAL FIELD

The invention relates to an identification method and process for objects from digitally captured images thereof that uses image characteristics to identify an object from a plurality of objects in a database.
BACKGROUND ART
There is a need to provide hyperlink functionality in known objects without modification to the objects, through reliably detecting and identifying the objects based only on the appearance of the object, and then locating and supplying information pertinent to the object or initiating communications pertinent to the object by supplying an information address, such as a Uniform Resource Locator (URL), pertinent to the object.

There is a need to determine the position and orientation of known objects based only on imagery of the objects.

The detection, identification, determination of position and orientation, and subsequent information provision and communication must occur without modification or disfigurement of the object, without the need for any marks, symbols, codes, barcodes, or characters on the object, without the need to touch or disturb the object, without the need for special lighting other than that required for normal human vision, without the need for any communication device (radio frequency, infrared, etc.) to be attached to or nearby the object, and without human assistance in the identification process. The objects to be detected and identified may be 3-dimensional objects, 2-dimensional images (e.g., on paper), or 2-dimensional images of 3-dimensional objects, or human beings.

There is a need to provide such identification and hyperlink services to persons using mobile computing devices, such as Personal Digital Assistants (PDAs) and cellular telephones.

There is a need to provide such identification and hyperlink services to machines, such as factory robots and spacecraft.

Examples include:

identifying pictures or other art in a museum, where it is desired to provide additional information about such art objects to museum visitors via mobile wireless devices;

provision of content (information, text, graphics, music, video, etc.), communications, and transaction mechanisms between companies and individuals, via networks (wireless or otherwise) initiated by the individuals "pointing and clicking" with camera-equipped mobile devices on magazine advertisements, posters, billboards, consumer products, music or video disks or tapes, buildings, vehicles, etc.;

establishment of a communications link with a machine, such as a vending machine or information kiosk, by "pointing and clicking" on the machine with a camera-equipped mobile wireless device and then execution of communications or transactions between the mobile wireless device and the machine;

identification of objects or parts in a factory, such as on an assembly line, by capturing an image of the objects or parts, and then providing information pertinent to the identified objects or parts;

identification of a part of a machine, such as an aircraft part, by a technician "pointing and clicking" on the part with a camera-equipped mobile wireless device, and then supplying pertinent content to the technician, such as maintenance instructions or history for the identified part;

identification or screening of individual(s) by a security officer "pointing and clicking" a camera-equipped mobile wireless device at the individual(s) and then receiving identification information pertinent to the individuals after the individuals have been identified by face recognition software;

identification, screening, or validation of documents, such as passports, by a security officer "pointing and clicking" a camera-equipped device at the document and receiving a response from a remote computer;

determination of the position and orientation of an object in space by a spacecraft nearby the object, based on imagery of the object, so that the spacecraft can maneuver relative to the object or execute a rendezvous with the object;

identification of objects from aircraft or spacecraft by capturing imagery of the objects and then identifying the objects via image recognition performed on a local or remote computer;

watching movie previews streamed to a camera-equipped wireless device by "pointing and clicking" with such a device on a movie theatre sign or poster, or on a digital video disc box or videotape box;

listening to audio recording samples streamed to a camera-equipped wireless device by "pointing and clicking" with such a device on a compact disk (CD) box, videotape box, or print media advertisement;

purchasing movie, concert, or sporting event tickets by "pointing and clicking" on a theater, advertisement, or other object with a camera-equipped wireless device;

purchasing an item by "pointing and clicking" on the object with a camera-equipped wireless device and thus initiating a transaction;

interacting with television programming by "pointing and clicking" at the television screen with a camera-equipped device, thus capturing an image of the screen content and having that image sent to a remote computer and identified, thus initiating interaction based on the screen content received (an example is purchasing an item on the television screen by "pointing and clicking" at the screen when the item is on the screen);

interacting with a computer-system based game and with other players of the game by "pointing and clicking" on objects in the physical environment that are considered to be part of the game;

paying a bus fare by "pointing and clicking" with a mobile wireless camera-equipped device on a fare machine in a bus, and thus establishing a communications link between the device and the fare machine and enabling the fare payment transaction;

establishment of a communication between a mobile wireless camera-equipped device and a computer with an Internet connection by "pointing and clicking" with the device on the computer and thus providing to the mobile device an Internet address at which it can communicate with the computer, thus establishing communications with the computer despite the
absence of a local network or any direct communication between the device and the computer;

use of a mobile wireless camera-equipped device as a point-of-sale terminal by, for example, "pointing and clicking" on an item to be purchased, thus identifying the item and initiating a transaction.
DISCLOSURE OF INVENTION
The present invention solves the above stated needs. Once an image is captured digitally, a search of the image determines whether symbolic content is included in the image. If so, the symbol is decoded and communication is opened with the proper database, usually using the Internet, wherein the best match for the symbol is returned. In some instances, a symbol may be detected, but non-ambiguous identification is not possible. In that case, and when a symbolic image cannot be detected, the image is decomposed through identification algorithms where unique characteristics of the image are determined. These characteristics are then used to provide the best match or matches in the database, the "best" determination being assisted by the partial symbolic information, if that is available.
Therefore the present invention provides technology and processes that can accommodate linking objects and images to information via a network such as the Internet, which requires no modification to the linked object. Traditional methods for linking objects to digital information, including applying a barcode, radio or optical transceiver or transmitter, or some other means of identification to the object, or modifying the image or object so as to encode detectable information in it, are not required because the image or object can be identified solely by its visual appearance. The users or devices may even interact with objects by "linking" to them. For example, a user may link to a vending machine by "pointing and clicking" on it. His device would be connected over the Internet to the company that owns the vending machine. The company would in turn establish a connection to the vending machine, and thus the user would have a communication channel established with the vending machine and could interact with it.
The decomposition algorithms of the present invention allow fast and reliable detection and recognition of images and/or objects based on their visual appearance in an image, no matter whether shadows, reflections, partial obscuration, and variations in viewing geometry are present. As stated above, the present invention also can detect, decode, and identify images and objects based on traditional symbols which may appear on the object, such as alphanumeric characters, barcodes, or 2-dimensional matrix codes.

When a particular object is identified, the position and orientation of an object with respect to the user at the time the image was captured can be determined based on the appearance of the object in an image. This can be the location and/or identity of people scanned by multiple cameras in a security system, a passive locator system more accurate than GPS or usable in areas where GPS signals cannot be received, the location of specific vehicles without requiring a transmission from the vehicle, and many other uses.
When the present invention is incorporated into a mobile device, such as a portable telephone, the user of the device can link to images and objects in his or her environment by pointing the device at the object of interest, then "pointing and clicking" to capture an image. Thereafter, the device transmits the image to another computer ("Server"), wherein the image is analyzed and the object or image of interest is detected and recognized. Then the network address of information
corresponding to that object is transmitted from the ("Server") back to the mobile device, allowing the mobile device to access information using the network address so that only a portion of the information concerning the object need be stored in the system's database.
Some or all of the image processing, including image/object detection and/or decoding of symbols detected in the image, may be distributed arbitrarily between the mobile (Client) device and the Server. In other words, some processing may be performed in the Client device and some in the Server, without specification of which particular processing is performed in each, or all processing may be performed on one platform or the other, or the platforms may be combined so that there is only one platform. The image processing can be implemented in a parallel computing manner, thus facilitating scaling of the system with respect to database size and input traffic loading.
Therefore, it is an object of the present invention to provide a system and process for identifying digitally captured images without requiring modification to the object.

Another object is to use digital capture devices in ways never contemplated by their manufacturer.

Another object is to allow identification of objects from partial views of the object.

Another object is to provide communication means with operative devices without requiring a public connection therewith.

These and other objects and advantages of the present invention will become apparent to those skilled in the art after considering the following detailed specification, together with the accompanying drawings wherein:
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic block diagram top-level algorithm flowchart;
FIG. 2 is an idealized view of image capture;
FIGS. 3A and 3B are a schematic block diagram of process details of the present invention;
FIG. 4 is a schematic block diagram of a different explanation of invention;
FIG. 5 is a schematic block diagram similar to FIG. 4 for cellular telephone and personal data assistant (PDA) applications; and
FIG. 6 is a schematic block diagram for spacecraft applications.
BEST MODES FOR CARRYING OUT THE INVENTION
The present invention includes a novel process whereby information such as Internet content is presented to a user, based solely on a remotely acquired image of a physical object. Although coded information can be included in the remotely acquired image, it is not required since no additional information about a physical object, other than its image, needs to be encoded in the linked object. There is no need for any additional code or device, radio, optical or otherwise, to be embedded in or affixed to the object. Image-linked objects can be located and identified within user-acquired imagery solely by means of digital image processing, with the address of pertinent information being returned to the device used to acquire the image and perform the link. This process is robust against digital image noise and corruption (as can result from lossy image compression/decompression), perspective error, rotation, translation, scale differences, illumination variations
caused by different lighting sources, and partial obscuration of the target that results from shadowing, reflection or blockage.
Many different variations on machine vision "target location and identification" exist in the current art. However, they all tend to provide optimal solutions for an arbitrarily restricted search space. At the heart of the present invention is a high-speed image matching engine that returns unambiguous matches to target objects contained in a wide variety of potential input images. This unique approach to image matching takes advantage of the fact that at least some portion of the target object will be found in the user-acquired image. The parallel image comparison processes embodied in the present search technique are, when taken together, unique to the process. Further, additional refinement of the process, with the inclusion of more and/or different decomposition-parameterization functions, utilized within the overall structure of the search loops, is not restricted. The detailed process is described in the following. FIG. 1 shows the overall processing flow and steps. These steps are described in further detail in the following sections.
For image capture 10, the User 12 (FIG. 2) utilizes a computer, mobile telephone, personal digital assistant, or other similar device 14 equipped with an image sensor (such as a CCD or CMOS digital camera). The User 12 aligns the sensor of the image capture device 14 with the object 16 of interest. The linking process is then initiated by suitable means including: the User 12 pressing a button on the device 14 or sensor; by the software in the device 14 automatically recognizing that an image is to be acquired; by User voice command; or by any other appropriate means. The device 14 captures a digital image 18 of the scene at which it is pointed. This image 18 is represented as three separate 2-D matrices of pixels, corresponding to the raw RGB (Red, Green, Blue) representation of the input image. For the purposes of standardizing the analytical processes in this embodiment, if the device 14 supplies an image in other than RGB format, a transformation to RGB is accomplished. These analyses could be carried out in any standard color format, should the need arise.
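As one illustrative sketch of this standardization (assuming the Pillow and NumPy libraries; the function name load_as_rgb is hypothetical), the transformation to three raw RGB pixel matrices might look like:

    from PIL import Image
    import numpy as np

    def load_as_rgb(path):
        # Convert whatever color format the capture device supplies
        # (grayscale, CMYK, YCbCr JPEG, etc.) to raw RGB.
        img = Image.open(path).convert("RGB")
        rgb = np.asarray(img)  # shape (H, W, 3), dtype uint8
        # Three separate 2-D matrices of pixels, one per channel:
        return rgb[..., 0], rgb[..., 1], rgb[..., 2]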
If the server 20 is physically separate from the device 14, then user-acquired images are transmitted from the device 14 to the Image Processor/Server 20 using a conventional digital network or wireless network means. If the image 18 has been compressed (e.g., via lossy JPEG DCT) in a manner that introduces compression artifacts into the reconstructed image 18, these artifacts may be partially removed by, for example, applying a conventional despeckle filter to the reconstructed image prior to additional processing.
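A minimal sketch of such a despeckle step, assuming OpenCV (cv2) is available; a small median filter is one conventional choice for suppressing isolated compression artifacts:

    import cv2

    def despeckle(image):
        # 3x3 median filter: removes isolated artifact pixels
        # while largely preserving edges.
        return cv2.medianBlur(image, 3)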
The Image Type Determination 26 is accomplished with a discriminator algorithm which operates on the input image 18 and determines whether the input image contains recognizable symbols, such as barcodes, matrix codes, or alphanumeric characters. If such symbols are found, the image 18 is sent to the Decode Symbol 28 process. Depending on the confidence level with which the discriminator algorithm finds the symbols, the image 18 also may or alternatively contain an object of interest and may therefore also or alternatively be sent to the Object Image branch of the process flow. For example, if an input image 18 contains both a barcode and an object, depending on the clarity with which the barcode is detected, the image may be analyzed by both the Object Image and Symbolic Image branches, and that branch which has the highest success in identification will be used to identify and link from the object.
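That arbitration between the two branches could be sketched as follows (Python; discriminate_symbols, decode_symbol, match_object, and the confidence thresholds are all hypothetical stand-ins for the discriminator and the two branch processes):

    def identify(image):
        candidates = []
        symbol_confidence = discriminate_symbols(image)  # 0.0 .. 1.0
        if symbol_confidence > 0.2:
            # Symbolic Image branch: decode barcodes, matrix codes,
            # or alphanumeric characters.
            candidates.append(decode_symbol(image))
        if symbol_confidence < 0.8:
            # Object Image branch: decompose and match against the database.
            candidates.append(match_object(image))
        # The branch with the highest identification success wins.
        return max(candidates, key=lambda result: result.score)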
The image is analyzed to determine the location, size, and nature of the symbols in the Decode Symbol 28. The symbols are analyzed according to their type, and their content information
is extracted. For example, barcodes and alphanumeric characters will result in numerical and/or text information.
For object images, the present invention performs a "decomposition", in the Input Image Decomposition 34, of a high-resolution input image into several different types of quantifiable salient parameters. This allows for multiple independent convergent search processes of the database to occur in parallel, which greatly improves image match speed and match robustness in the Database Matching 36. The Best Match 38 from either the Decode Symbol 28, or the image Database Matching 36, or both, is then determined. If a specific URL (or other online address) is associated with the image, then an URL Lookup 40 is performed and the Internet address is returned by the URL Return 42.
The overall flow of the Input Image Decomposition process is as follows:

    Radiometric Correction
    Segmentation
    Segment Group Generation
    FOR each segment group
        Bounding Box Generation
        Geometric Normalization
        Wavelet Decomposition
        Color Cube Decomposition
        Shape Decomposition
        Low-Resolution Grayscale Image Generation
    FOR END
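In Python-style pseudocode, the same pipeline might be sketched as below; every helper function is a hypothetical stand-in for the correspondingly named step described in the following paragraphs:

    def decompose_input_image(image):
        image = radiometric_correction(image)
        segments = segmentation(image)
        results = []
        for group in segment_group_generation(segments):
            box = bounding_box_generation(group)
            normalized = geometric_normalization(image, group, box)
            results.append({
                "wavelet": wavelet_decomposition(normalized),
                "color_cube": color_cube_decomposition(normalized),
                "shape": shape_decomposition(group),
                "grayscale": low_resolution_grayscale(normalized),
            })
        return results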
Each of the above steps is explained in further detail below. For Radiometric Correction, the input image typically is transformed to an 8-bit per color plane, RGB representation. The RGB image is radiometrically normalized in all three channels. This normalization is accomplished by linear gain and offset transformations that result in the pixel values within each color channel spanning a full 8-bit dynamic range (256 possible discrete values). An 8-bit dynamic range is adequate but, of course, as optical capture devices produce higher resolution images and computers get faster and memory gets cheaper, higher bit dynamic ranges, such as 16-bit, 32-bit or more, may be used.
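A sketch of this per-channel linear gain/offset normalization (NumPy; assumes an 8-bit RGB input array):

    import numpy as np

    def radiometric_correction(rgb):
        out = np.empty_like(rgb)
        for c in range(3):
            channel = rgb[..., c].astype(np.float64)
            lo, hi = channel.min(), channel.max()
            if hi > lo:
                # Gain and offset chosen so the channel spans 0..255.
                channel = (channel - lo) * (255.0 / (hi - lo))
            out[..., c] = np.round(channel).astype(np.uint8)
        return out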
For Segmentation, the radiometrically normalized RGB image is analyzed for "segments," or regions of similar color, i.e., near-equal pixel values for red, green, and blue. These segments are defined by their boundaries, which consist of sets of (x, y) point pairs. A map of segment boundaries is produced, which is maintained separately from the RGB input image and is formatted as an x, y binary image map of the same aspect ratio as the RGB image.
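One simplified way to sketch this segmentation (coarse color quantization plus connected-component labeling; assumes NumPy and SciPy, and the quantization step size is an arbitrary placeholder):

    import numpy as np
    from scipy import ndimage

    def segment_boundary_map(rgb, step=32):
        # Near-equal R, G, B values fall into the same quantized bin.
        quantized = (rgb // step).astype(np.int32)
        codes = (quantized[..., 0] * 64 + quantized[..., 1] * 8
                 + quantized[..., 2])
        # Mark boundaries where adjacent pixels differ in bin.
        boundaries = np.zeros(codes.shape, dtype=bool)
        boundaries[:-1, :] |= codes[:-1, :] != codes[1:, :]
        boundaries[:, :-1] |= codes[:, :-1] != codes[:, 1:]
        # Label the interior regions; the binary boundary map has the
        # same aspect ratio as the RGB input.
        labels, count = ndimage.label(~boundaries)
        return boundaries, labels, count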
For Segment Group Generation, the segments are grouped into all possible combinations. These groups are known as "segment groups" and represent all possible potential images or objects of interest in the input image. The segment groups are sorted based on the order in which they will be evaluated. Various evaluation order schemes are possible. The particular embodiment explained herein utilizes the following "center-out" scheme: The first segment group comprises only the segment that includes the center of the image. The next segment group comprises the previous segment plus the segment which is the largest (in number of pixels) and which is adjacent to (touching) the previous segment group. Additional segments are added using the segment criteria above until no segments remain. Each step, in which a new segment is added, creates a new and unique segment group.
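A sketch of the "center-out" ordering (Python; sizes maps each segment label to its pixel count and adjacency maps each label to the set of labels touching it, both assumed precomputed from the segmentation step):

    def center_out_groups(labels, sizes, adjacency):
        h, w = labels.shape
        current = {labels[h // 2, w // 2]}   # segment containing the center
        groups = [frozenset(current)]
        remaining = set(sizes) - current
        while remaining:
            # Candidate segments touching the current group.
            touching = {s for s in remaining
                        if any(s in adjacency[g] for g in current)}
            if not touching:
                break
            # Add the largest adjacent segment; each addition yields
            # a new, unique segment group.
            nxt = max(touching, key=lambda s: sizes[s])
            current.add(nxt)
            remaining.discard(nxt)
            groups.append(frozenset(current))
        return groups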
For Bounding Box Generation, the elliptical major axis of the segment group under consideration (the major axis of an ellipse just large enough to contain the entire segment group)
is computed. Then a rectangle is constructed within the image coordinate system, with long sides parallel to the elliptical major axis, of a size just large enough to completely contain every pixel in the segment group.

For Geometric Normalization, a copy of the input image is modified such that all pixels not included in the segment group under consideration are set to mid-level gray. The result is then resampled and mapped into a "standard aspect" output test image space such that the corners of the bounding box are mapped into the corners of the output test image. The standard aspect is the same size and aspect ratio as the Reference images used to create the database.

For Wavelet Decomposition, a grayscale representation of the full-color image is produced from the geometrically normalized image that resulted from the Geometric Normalization step. The following procedure is used to derive the grayscale representation. Reduce the three color planes into one grayscale image by proportionately adding each R, G, and B pixel of the standard corrected color image using the following formula:

    L(x,y) = 0.34*R(x,y) + 0.55*G(x,y) + 0.11*B(x,y)

then round to nearest integer value. Truncate at 0 and 255, if necessary. The resulting matrix L is a standard grayscale image. This grayscale representation is at the same spatial resolution as the full color image, with an 8-bit dynamic range. A multi-resolution Wavelet Decomposition of the grayscale image is performed, yielding wavelet coefficients for several scale factors. The Wavelet coefficients at various ...
... image is produced by weighted averaging of pixels within each 3x3 cell. The result is contrast binned, by reducing the number of discrete values assignable to each pixel based upon substituting a "binned average" value for all pixels that fall within a discrete (TBD) number of brightness bins.

The above discussion of the particular decomposition methods incorporated into this embodiment is not intended to indicate that more, or alternate, decomposition methods may not also be employed within the context of this invention.

In other words:

    FOR each input image segment group
        FOR each database object
            FOR each view of this object
                FOR each segment group in this view of this database object
                    Shape Comparison
                    Grayscale Comparison
                    Wavelet Comparison
                    Color Cube Comparison
                    Calculate Combined Match Score
                END FOR
            END FOR
        END FOR
    END FOR
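A sketch of that nested search (Python; the four scoring helpers and the equally weighted sum are illustrative assumptions, since the combined match score calculation is described later in the specification):

    def best_match(input_groups, database):
        best = None
        for group in input_groups:
            for obj in database:
                for view in obj.views:
                    for db_group in view.segment_groups:
                        # Combine the four parallel comparisons into
                        # one match score for this pairing.
                        score = (shape_score(group, db_group)
                                 + grayscale_score(group, db_group)
                                 + wavelet_score(group, db_group)
                                 + color_cube_score(group, db_group))
                        if best is None or score > best[0]:
                            best = (score, obj, view)
        return best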
Each of the above steps is explained in further detail below.

FOR Each Input Image Segment Group

This loop considers each combinat...
