EXHIBIT C
US008326038B2

(12) United States Patent
Boncyk et al.

(10) Patent No.: US 8,326,038 B2
(45) Date of Patent: *Dec. 4, 2012
(54) OBJECT INFORMATION DERIVED FROM OBJECT IMAGES

(75) Inventors: Wayne C. Boncyk, Evergreen, CO (US); Ronald H. Cohen, Pasadena, CA (US)

(73) Assignee: Nant Holdings IP, LLC, Los Angeles, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

This patent is subject to a terminal disclaimer.
(21) Appl. No.: 13/207,230

(22) Filed: Aug. 10, 2011

(65) Prior Publication Data
US 2011/0295742 A1    Dec. 1, 2011
Related U.S. Application Data

(60) Division of application No. 13/037,330, filed on Feb. 28, 2011, now Pat. No. 8,218,873, which is a division of application No. 12/568,130, filed on Sep. 28, 2009, now Pat. No. 7,899,252, which is a division of application No. 11/204,901, filed on Aug. 15, 2005, now Pat. No. 7,680,324, which is a continuation-in-part of application No. 09/992,942, filed on Nov. 5, 2001, now Pat. No. 7,016,532.

(60) Provisional application No. 60/317,521, filed on Sep. 5, 2001, provisional application No. 60/246,295, filed on Nov. 6, 2000, provisional application No. 60/630,524, filed on Nov. 22, 2004, provisional application No. 60/625,526, filed on Nov. 4, 2004.
(51) Int. Cl.
G06K 9/00 (2006.01)
(52) U.S. Cl. .......... 382/181
(58) Field of Classification Search .......... 382/181, 382/224, 305, 100; 455/414.3, 411, 414.2, 455/412.1; 705/26.1, 27.1, 7.33; 348/239, 348/211.2-211.6, 207.1, 460, 552
See application file for complete search history.

(56) References Cited
`
U.S. PATENT DOCUMENTS

5,579,471 A 11/1996 Barber et al.
5,615,324 A 3/1997 Kuboyama
5,625,765 A 4/1997 Ellenby et al.
5,682,332 A 10/1997 Ellenby et al.
5,742,521 A 4/1998 Ellenby et al.
5,751,286 A 5/1998 Barber et al.
5,768,663 A 6/1998 Lin
5,815,411 A 9/1998 Ellenby et al.
5,926,116 A 7/1999 Kitano et al.
5,933,823 A 8/1999 Cullen et al.
5,933,829 A 8/1999 Durst et al.
5,978,773 A 11/1999 Hudetz et al.
5,991,827 A 11/1999 Ellenby et al.
6,031,545 A 2/2000 Ellenby et al.
(Continued)

FOREIGN PATENT DOCUMENTS
EP 0920179 9/2000
(Continued)
`
Primary Examiner — Ishrat I Sherali
(74) Attorney, Agent, or Firm — Fish & Associates, PC

(57) ABSTRACT

Search terms are derived automatically from images captured by a camera equipped cell phone, PDA, or other image capturing device, submitted to a search engine to obtain information of interest, and at least a portion of the resulting information is transmitted back locally to, or nearby, the device that captured the image.

24 Claims, 8 Drawing Sheets

[Front-page figure: excerpt of the FIG. 1 flowchart showing the SYMBOLIC and IMAGE DECOMPOSITION blocks.]
`
`U.S. PATENT DOCUMENTS
6,037,963 A 3/2000 Denton et al.
6,055,536 A 4/2000 Shimakawa et al.
6,064,398 A 5/2000 Ellenby et al.
6,081,612 A 6/2000 Gutkowicz-Krusin et al.
6,098,118 A 8/2000 Ellenby et al.
6,144,848 A 11/2000 Walsh et al.
6,182,090 B1 1/2001 Peairs
6,199,048 B1 3/2001 Hudetz et al.
6,208,749 B1 3/2001 Gutkowicz-Krusin et al.
6,256,409 B1 7/2001 Wang
6,278,461 B1 8/2001 Ellenby et al.
6,307,957 B1 10/2001 Gutkowicz-Krusin et al.
6,396,537 B1 5/2002 Squilla et al.
6,411,725 B1 6/2002 Rhoads
6,414,696 B1 7/2002 Ellenby et al.
6,430,554 B1 8/2002 Rothschild
6,434,561 B1 8/2002 Durst, Jr. et al.
6,522,889 B1 2/2003 Aarnio
6,532,298 B1 3/2003 Cambier et al.
6,535,210 B1 3/2003 Ellenby et al.
6,542,933 B1 4/2003 Durst et al.
6,567,122 B1 5/2003 Anderson et al.
6,651,053 B1 11/2003 Rothschild
6,674,923 B1 1/2004 Shih et al.
6,674,993 B1 1/2004 Tarbouriech
6,675,165 B1 1/2004 Rothschild
6,690,370 B2 2/2004 Ellenby et al.
6,691,914 B2 2/2004 Isherwood et al.
6,714,969 B1 3/2004 Klein et al.
6,724,914 B2 4/2004 Brundage et al.
6,738,630 B2 5/2004 Ashmore
6,766,363 B1 7/2004 Rothschild
6,804,726 B1 10/2004 Ellenby et al.
6,842,181 B2 1/2005 Acharya
6,865,608 B2 3/2005 Hunter
6,885,771 B2 4/2005 Takahashi
6,968,453 B2 * 11/2005 Doyle et al. .......... 713/168
6,993,573 B2 1/2006 Hunter
7,016,532 B2 3/2006 Boncyk et al.
7,103,772 B2 * 9/2006 Jorgensen et al. .......... 713/168
7,127,094 B1 10/2006 Elbaum et al.
7,301,536 B2 11/2007 Ellenby et al.
7,353,184 B2 * 4/2008 Kirshenbaum et al. .......... 705/7.33
7,356,705 B2 * 4/2008 Ting .......... 713/186
7,362,922 B2 4/2008 Nishiyama et al.
7,383,209 B2 6/2008 Hudetz et al.
7,696,905 B2 4/2010 Ellenby et al.
8,099,332 B2 * 1/2012 Lemay et al. .......... 705/26.1
8,218,874 B2 * 7/2012 Boncyk et al. .......... 382/181
2001/0011276 A1 8/2001 Durst, Jr. et al.
2001/0032252 A1 10/2001 Durst, Jr. et al.
2001/0044824 A1 11/2001 Hunter et al.
2001/0047426 A1 11/2001 Hunter
2002/0019819 A1 2/2002 Sekiguchi et al.
2002/0055957 A1 5/2002 Ohsawa
2002/0090132 A1 7/2002 Boncyk et al.
2002/0102966 A1 8/2002 Lev et al.
2002/0140988 A1 10/2002 Cheatle et al.
2002/0156866 A1 10/2002 Schneider
2002/0163521 A1 11/2002 Ellenby et al.
2003/0095681 A1 5/2003 Burg et al.
2004/0208372 A1 10/2004 Boncyk et al.
2005/0015370 A1 1/2005 Stavely et al.
2005/0024501 A1 2/2005 Ellenby et al.
2005/0162523 A1 7/2005 Darrell et al.
2005/0185060 A1 8/2005 Neven, Sr.
2007/0109619 A1 5/2007 Eberl et al.
2011/0173100 A1 * 7/2011 Boncyk et al. .......... 705/27.1

FOREIGN PATENT DOCUMENTS
EP 1355258 2/2003
EP 2264669 12/2010
GB 2407230 9/2003
JP 10-91634 4/1998
JP 10-289243 10/1998
JP 2001101191 4/2001
JP 2001282825 10/2001
WO 97/44737 11/1997
WO 97/49060 12/1997
WO 98/37811 9/1998
WO 99/16024 4/1999
WO 99/42946 8/1999
WO 99/42947 8/1999
WO 99/44010 9/1999
WO 01/24050 4/2001
WO 01/49056 7/2001
WO 01/63487 8/2001
WO 01/71282 9/2001
WO 01/73603 10/2001
WO 02/01143 1/2002
WO 02/082799 10/2002

* cited by examiner
`
`
`
[FIG. 1 (Sheet 1 of 8): top-level algorithm flowchart with blocks INPUT IMAGE CAPTURE, DETERMINE IMAGE TYPE, OBJECT IMAGE, SYMBOLIC IMAGE, INPUT IMAGE DECOMPOSITION, DECODE, DATABASE MATCHING, SELECT BEST MATCH, and URL RETURN.]
`
[FIG. 2 (Sheet 2 of 8): idealized view of image capture.]
`
[FIG. 3A (Sheet 3 of 8): process-detail flowchart with blocks FOR EACH INPUT IMAGE SEGMENT GROUP, FOR EACH OBJECT IN DATABASE, FOR EACH SEGMENT GROUP IN THIS VIEW, GREYSCALE COMPARISON, COLORCUBE COMPARISON, and WAVELET COMPARISON.]
`
[FIG. 3B (Sheet 4 of 8): continuation of the process-detail flowchart with blocks CALCULATE COMBINED MATCH SCORE, NEXT SEGMENT GROUP IN THIS DATABASE VIEW, NEXT VIEW OF THIS DATABASE OBJECT, NEXT OBJECT IN DATABASE, NEXT INPUT IMAGE SEGMENT GROUP, and FINISH.]
`
[FIG. 4 (Sheet 5 of 8): system block diagram (reference numerals 100-111) with elements TARGET OBJECT, CAMERA, IMAGE PROCESSING, IMAGE DATA, TERMINAL, BROWSER, OBJECT RECOGNITION, DATABASE, IDENTIFICATION SERVER, TARGET OBJECT INFORMATION, and CONTENT SERVER.]
`
[FIG. 5 (Sheet 6 of 8): corresponding system block diagram for cellular telephone and PDA applications (reference numerals 200-211).]
`
`
[FIG. 6 (Sheet 7 of 8): corresponding system block diagram for spacecraft applications (reference numerals 300-310), including a SPACECRAFT DATA SYSTEM.]
`
[FIG. 7 (Sheet 8 of 8): system diagram showing a portable device, the Internet/network 425, and a search engine with index 432.]
`
OBJECT INFORMATION DERIVED FROM OBJECT IMAGES
`
This application is a divisional of Ser. No. 13/037,330 filed Feb. 28, 2011, which is a divisional of Ser. No. 12/568,130 filed Sep. 28, 2009, which is a divisional of Ser. No. 11/204,901 filed Aug. 15, 2005, which is a continuation-in-part of Ser. No. 09/992,942 filed Nov. 5, 2001 and issued as 7,016,532 on Mar. 1, 2006, which claims priority to provisional application No. 60/317,521 filed Sep. 5, 2001 and provisional application No. 60/246,295 filed Nov. 6, 2000. Ser. No. 11/204,901 filed Aug. 15, 2005 and issued as U.S. Pat. No. 7,899,252 on Mar. 1, 2011 also claims priority to provisional application No. 60/630,524 filed Nov. 22, 2004 and provisional application No. 60/625,526 filed Nov. 4, 2004. These and all other referenced patents and applications are incorporated herein by reference in their entirety. Where a definition or use of a term in a reference that is incorporated by reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein is deemed to be controlling.
`
FIELD OF THE INVENTION

The field of the invention is digital imaging.
`
BACKGROUND

Several years ago the present inventors pioneered the concept of using digitally captured images to identify objects within the images, and then using such identifications to retrieve information from various databases. Examples include:

Using a local device (cell phone, digital camera, PDA or other device) to capture an image of an object in an art museum, identifying the object from the image data, and then providing the user with information regarding the object (i.e., about or relating to the object);

Using a local device (cell phone, digital camera, PDA or other device) to capture an image of an automobile as it drives along a road, identifying the make and model from the image data, and then providing a user with a link to a website relating to that particular make and model;

Using a local device (cell phone, digital camera, PDA or other device) to capture an image of a bar code, logo, or other indicia in a magazine, using information contained in the indicia to identify a product, and providing a telephone number or other contact information relating to that product;

Using a local device (cell phone, digital camera, PDA or other device) to photograph a billboard of a restaurant, identifying the restaurant from a bar code, special target, written language, or other information contained in the photograph, and using that information to access a database to provide the user with the restaurant's location, menu, or telephone number; and

Using a local device (cell phone, digital camera, PDA or other device) to capture an image of a sign at a sports stadium, using information extracted from the image to automatically purchase an entry ticket for the user, and providing the user with an entry code that can be used to bypass the long lines of ordinary ticket purchasers.

In such embodiments it was specifically contemplated that analysis of the images could be performed locally (i.e., on the cell phone, PDA or other device capturing the image), distally at a server, or more preferably using some combination of the
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`2
two. It was also contemplated that any available database could be accessed to provide the returned information, including publicly accessible databases on the Internet. It was not appreciated, however, that one could integrate these concepts with the searching capabilities of standard Search Engines.

In the 1990s Yahoo!™ introduced the idea of indexing web pages accessible on the Internet, and providing a Search Engine to access the index. Since that time dozens of other searching systems have been developed, which use all manner of various search methods, algorithms, hardware and/or software. All such systems and methods that accept user inputs of Key Information, and then utilize such Key Information to provide the user with information of interest, are referred to herein as Search Engines. The user, of course, can be a natural person, as well as a device (computing or otherwise), algorithm, system, organization, or any other entity. In searching for information, a Search Engine can utilize any suitable search domain, including for example:

A database (including for example a relational database, an object database, or an XML database);

A network of resources, including for example web pages accessible within the Internet; and

A public or private collection of documents or information (e.g., documents, information, and/or messages of a company or other organization(s)) such as that maintained by LEXIS™.

In a typical search, Key Information is provided to the Search Engine in the form of key words comprising text, numbers, strings, or other machine-readable information types. The Search Engine then searches its indices of web pages for matches, and returns to the user a hyperlinked listing of Internet Uniform Resource Locators ("URLs"), as well as some brief display of the context in which the key word(s) are used. The information of interest can sometimes be found in the hyperlinked listing, but is more frequently found by linking directly to the listed web pages.

Providing Key Information to Search Engines in the form of text strings has inherent difficulties. It involves strategy in the selection of the text to be entered, and even with respect to the format of the keywords (for example using wildcards). Another difficulty is that small computing and/or telephony devices (e.g. telephones, both mobile and non-mobile) have small and/or limited keyboards, thus making text entry difficult.
`
SUMMARY OF THE INVENTION

The present invention provides apparatus, systems and methods in which: (a) a digital photograph, video, MPEG, AVI, or other image is captured using a camera equipped cell phone, PDA, or other image capturing device; (b) key words or other search criteria are automatically extracted or derived from the image; (c) the search criteria are submitted to a Search Engine to obtain information of interest; and (d) at least a portion of the resulting information is transmitted back locally to, or nearby, the device that captured the image.

Some images so utilized will include symbolic content that is sufficient in and of itself to be relatively non-ambiguous. Such symbolic content, for example, can be a telephone number or a web-site address. In such instances the symbolic content can advantageously be utilized as a literal in the search criteria. In other instances significant additional processing can be needed. For example, an image of an automobile will likely need to be processed to determine the make and model, and that information (e.g. Mercedes™ S500™) can then be transmitted to the Search Engine to be
`4
`the invention, along with the accompanying drawings in
`which like numerals represent like components.
`
`BRIEF DESCRIPTION OF THE DRAWING
`
`10
`
`15
`
`FIG. 1 is a schematic block diagram top-level algorithm
`flowchart;
`FIG. 2 is an idealized view of image capture;
`FIGS. 3A and 3B are a schematic block diagram of process
`details of the present invention;
`FIG. 4 is a schematic block diagram of a different expla
`nation of invention;
`FIG. 5 is a schematic block diagram similar to FIG. 4 for
`cellular telephone and personal data assistant (PDA) applica
`tions; and
`FIG. 6 is a schematic block diagram for spacecraft appli
`cations.
`FIG. 7 is a schematic of a system in which a local device
`captures and image, a search term is automatically derived
`from an image, is Submitted to a search engine to produce a
`results set, and information from the results set is sent back to
`the device.
`
`3
used as key words for a search. It is also contemplated that processing of some images will result in only best guesses. Thus, a side view of an automobile might not be analyzable into a particular make and model, and in that case the system can provide more generic terms such as SUV or automobile.
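To make the capture-derive-submit-return sequence of elements (a)-(d) concrete, a minimal client-side sketch follows. It is an illustration only, not the patented implementation: the search endpoint URL and the two stub helpers (run_ocr, recognize_object) are assumptions standing in for real OCR and recognition components.

    # Hypothetical client-side sketch of the (a)-(d) pipeline described above.
    # The endpoint URL and helper stubs are illustrative assumptions.
    import requests

    SEARCH_URL = "https://search.example.com/query"  # assumed Search Engine endpoint

    def run_ocr(image_bytes: bytes) -> str:
        """Stub: a real device would run an OCR engine over the image here."""
        return ""

    def recognize_object(image_bytes: bytes) -> str:
        """Stub: a real recognizer might return 'Mercedes S500', or only a
        best guess such as 'SUV' or 'automobile'."""
        return "automobile"

    def derive_search_terms(image_bytes: bytes) -> str:
        # Symbolic content (a telephone number, a web-site address) is
        # unambiguous enough to be used as a literal search term.
        text = run_ocr(image_bytes).strip()
        if text:
            return text
        # Otherwise fall back to object recognition.
        return recognize_object(image_bytes)

    def capture_and_search(image_bytes: bytes) -> dict:
        terms = derive_search_terms(image_bytes)
        resp = requests.get(SEARCH_URL, params={"q": terms}, timeout=10)
        resp.raise_for_status()
        return resp.json()  # at least a portion is returned to the capturing device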
In general, the present invention provides technology and processes that can accommodate linking objects and images to information via a network such as the Internet, which require no modification to the linked object. Traditional methods for linking objects to digital information, including applying a bar code, radio or optical transceiver or transmitter, or some other means of identification to the object, or modifying the image or object so as to encode detectable information in it, are not required because the image or object can be identified solely by its visual appearance. The users or devices can even interact with objects by "linking" to them. For example, a user can link to a vending machine by "pointing and clicking" on it. His device would be connected over the Internet to the company that owns the vending machine. The company would in turn establish a connection to the vending machine, and thus the user would have a communication channel established with the vending machine and could interact with it.
The present invention contemplates any suitable decomposition algorithms. Clearly, faster and more accurate algorithms are preferred over slower and less accurate algorithms. It is especially preferred that algorithms be chosen such that at least some processing can take place locally to the device that captures the image. Such processing can in many instances eliminate the need to wirelessly transmit detailed images, and can eliminate reliance on a distal server that might be oversubscribed. Thus, some or all of the image processing, including image/object detection and/or decoding of symbols detected in the image, can be distributed arbitrarily between the mobile (client) device and the server. In other words, some processing can be performed in the client device and some in the server, without specification of which particular processing is performed in each, or all processing can be performed on one platform or the other, or the platforms can be combined so that there is only one platform. The image processing can be implemented in a parallel computing manner, thus facilitating scaling of the system with respect to database size and input traffic loading.
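One plausible reading of that client/server split is sketched below; the server URL and both helper stubs are assumptions, not the patent's design. The only point illustrated is that cheap symbolic decoding can happen on the handset, while compact numeric features, rather than the detailed image, are shipped to the distal server when local processing is insufficient.

    # Illustrative division of image processing between client and server.
    import requests

    ID_SERVER_URL = "https://id-server.example.com/recognize"  # assumed server

    def decode_symbol_locally(image_bytes: bytes) -> str | None:
        """Stub: an on-device pass such as a bar code decode; None on failure."""
        return None

    def extract_features(image_bytes: bytes) -> list[float]:
        """Stub: reduce the image to compact numeric features so the detailed
        image need not be transmitted wirelessly."""
        return [float(b) for b in image_bytes[:64]]

    def identify(image_bytes: bytes) -> str:
        # Try the cheap local path first; it avoids any network round trip.
        symbol = decode_symbol_locally(image_bytes)
        if symbol is not None:
            return symbol
        # Fall back to the distal server, sending derived features only.
        resp = requests.post(ID_SERVER_URL,
                             json={"features": extract_features(image_bytes)},
                             timeout=10)
        resp.raise_for_status()
        return resp.json()["label"]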
It is further contemplated that some suitable algorithms will take into account the position and orientation of an object with respect to the user at the time the image was captured, which can be determined based on the appearance of the object in an image. This can be the location and/or identity of people scanned by multiple cameras in a security system, a passive locator system more accurate than GPS or usable in areas where GPS signals cannot be received, the location of specific vehicles without requiring a transmission from the vehicle, and many other uses.
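The patent does not name an algorithm for recovering position and orientation from appearance. One standard way to do this today, given known 3D reference points on the object and their observed 2D locations in the image, is a perspective-n-point (PnP) solve; the sketch below uses OpenCV, and all point coordinates and camera intrinsics are invented for illustration.

    # Estimating an object's pose from its appearance in a single image via PnP.
    import numpy as np
    import cv2

    # Known 3D coordinates of four reference points on the object (object frame, metres).
    object_points = np.array([[0, 0, 0], [0.2, 0, 0], [0.2, 0.1, 0], [0, 0.1, 0]],
                             dtype=np.float64)
    # Where those points appear in the captured image (pixels).
    image_points = np.array([[320, 240], [410, 238], [412, 190], [322, 192]],
                            dtype=np.float64)
    # Assumed pinhole camera intrinsics (focal lengths and principal point, pixels).
    camera_matrix = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]],
                             dtype=np.float64)
    dist_coeffs = np.zeros(4)  # assume no lens distortion

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs)
    if ok:
        # tvec is the object's position relative to the camera; rvec is its
        # orientation as a Rodrigues vector. Distance follows directly.
        print("position (m):", tvec.ravel(), "distance:", np.linalg.norm(tvec))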
Therefore, it is an object of the present invention to provide a system and process for identifying digitally captured images without requiring modification to the object.
Another object is to use digital capture devices in ways never contemplated by their manufacturer.
Another object is to allow identification of objects from partial views of the object.
Another object is to provide communication means with operative devices without requiring a public connection therewith.
Various other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of preferred embodiments of the invention, along with the accompanying drawings in which like numerals represent like components.

BRIEF DESCRIPTION OF THE DRAWING

FIG. 1 is a schematic block diagram top-level algorithm flowchart;
FIG. 2 is an idealized view of image capture;
FIGS. 3A and 3B are a schematic block diagram of process details of the present invention;
FIG. 4 is a schematic block diagram of a different explanation of the invention;
FIG. 5 is a schematic block diagram similar to FIG. 4 for cellular telephone and personal data assistant (PDA) applications;
FIG. 6 is a schematic block diagram for spacecraft applications; and
FIG. 7 is a schematic of a system in which a local device captures an image, a search term is automatically derived from the image and submitted to a search engine to produce a results set, and information from the results set is sent back to the device.
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`DETAILED DESCRIPTION
`
FIGS. 1-6 are copied from the priority PCT application, PCT/US02/35407, filed Nov. 5, 2002. Discussion of those figures is set forth later in the application.
`Search Engine-Related Embodiments
In FIG. 7 a system 400 generally comprises a portable imaging device 410, a distal server 420, an electronic communications network 425, and a search engine 430.
In general, the portable device 410 captures an image 412 of an object 415, and transmits information 413 regarding the image to the server 420. At least one of the device 410 and the server 420 derives a search term 421A, 421B from at least one of the image 412 and the transmitted information 413, respectively. At least one of the device 410 and the server 420 causes the search term 421A, 421B to be submitted via a network 425 to a search engine 430 that uses an index 432 of web pages or other information. The search engine then uses the search term 421A, 421B to produce a results set 434, and causes at least a portion of the results set 434 to be transmitted back to the portable device 410. In the above discussion it should be appreciated that information regarding the image can include the entire image, one or more subsets of the image, as well as a name or other information derived from the image but not contained within the image. It should also be appreciated that one could use a proxy server between his/her portable device and the server. In short, the present application contemplates using any complexity of circuitous communication between the mobile client and server, not necessarily a direct connection.
Device 410 can be a cell phone, PDA, laptop computer, or any other portable device that optically captures an image. By "optically captures" is meant some sort of light sensitive array, the output of which can be processed to comprise a visually perceptible image. Viewed from another perspective, device 410 can be any camera having telephony capability, and especially having cell phone capability. With current technology, device 410 would usually have a lens or other light focusing mechanism, although it is contemplated that advances in electronics can eliminate the need for any physical focusing mechanism. The term "optically captures" is not satisfied by a device that has no optical components and is merely capable of downloading images from the Internet or other sources.
`
`
`
It is certainly contemplated that the cell phone or other device providing the services discussed herein would operate software permitting it to do so. That software could be resident on the device, in external memory (memory card), or paged in as needed.
Object 415 (referred to as a "Thing of Interest" in one or more of the priority applications) can be any visually perceptible object, regardless of dimension. Contemplated "two dimensional" objects include objects in which the relevant information is substantially in two dimensional format, which includes advertisements and articles in magazines or other print media, as well as photographs or designs on billboards, street signs, restaurant or other business signs, user manuals, paintings at a museum, and so forth.
Contemplated three dimensional objects include substantially all physical objects in which the relevant information is derived from the shape of the object and/or the appearance of the surface of the object. Thus, an automobile is considered herein to have three dimensions of relevance where the shape or other dimensions convey information about the make and model. Similarly, a window in a building can be considered to have three dimensions of relevance where the identity of the manufacturer or distributor can be gleaned from the overall physical dimensions, detail, and so forth. As another example, a beverage container can be considered to have three dimensions; information can be obtained from the shape of the container, but further information can also be obtained from the label, printing, logos, text, or other such visible markings on the container (obtaining information from visible markings on the container enables discrimination between different containers that have identical physical shape). Contemplated four dimensional objects include substantially all physical objects in which the relevant information is derived from changes over time. For example, the speed of a bird or its flight patterns, or a gesture of a person, can be captured in multiple images over a period of time, can be relevant information, and can be reduced to search terms (referred to as Key Information in one or more of the priority documents) for submission to a search engine. Of course, many objects will be considered to have two, three or four dimensions of relevance herein. Thus, relevant information for an automobile can be provided by each of a two dimensional logo on the side of the vehicle, the three dimensional shape of the vehicle, and its four dimensional acceleration or handling features.

It is especially contemplated that objects can include animate and inanimate objects. Among animate objects are included faces of people, and biometric information such as the fingerprint pattern on a human finger, an iris of a person, and so forth.
Image 412 is contemplated to be any array of pixels. In most cases the pixels will be regularly arranged, but that is not absolutely necessary. In most cases the pixels also will number greater than 19,200 (160x120), such as 76,800 (320x240), but they can number fewer than that. More preferred images have greater pixel counts, including for example 256,000 (640x400), more preferably at least 2 million, and even more preferably at least 4 million. It is not necessary that the image be actually constructed at the portable device. Thus, a statement that "the portable device captures an image of an object" includes situations where the device receives and derives data from light emitted or reflected from the object, even if the data is never presented to a user as a visually perceptible image, and even if the data is sent to a distal server without ever being collected into an image by the device.

The information transmitted to the server can comprise any relevant information regarding the contents of the image.
Thus, information 413 could comprise the entire image, or a portion of the image. For example, where a user takes a picture of a bar code (whether 2D, 3D, or any other configuration), the device 410 could process the image 412 to remove color and all background except the bar code itself, and then merely send the portion of the image containing the bar code as the transmitted information 413. In other cases it is contemplated that the device 410 could sufficiently process the image 412 to derive one or more keywords, and then send only the keyword(s) as the transmitted information 413. All possible combinations are also contemplated. Thus, a user might take a photograph of a Gucci™ handbag, the device 410 might derive the word "Gucci" from the image, subtract out background except for the handbag, and then transmit: (a) the word "Gucci"; and (b) the image of the handbag as the transmitted information 413. In such instances the process can be iterative. Thus, the device might initially transmit the word "Gucci" as the first transmitted information, receive a results set from the search engine indicating clothing accessories, and then subtract out background except for the handbag, and transmit the image of the handbag as the second transmitted information. As discussed above, it is specifically contemplated that the device 410 could send the server 420 numerical/digital data that is mathematically derived from the image. Examples include image features and characteristics that the server 420 could use in the server recognition process, without transmitting the original image.
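That iterative exchange might look like the following sketch, in which the endpoint, both stub helpers, and the "categories" field of the results set are all invented conventions rather than anything the patent specifies. Round one submits just the derived keyword; if the results come back as broadly "clothing accessories", round two submits the background-subtracted handbag image.

    # Illustrative two-round iterative search: keyword first, cropped image second.
    import requests

    SEARCH_URL = "https://search.example.com/query"  # assumed Search Engine endpoint

    def derive_keyword(image_bytes: bytes) -> str:
        return "Gucci"  # stub: a real recognizer would read the logo from the image

    def crop_to_foreground(image_bytes: bytes) -> bytes:
        return image_bytes  # stub: real code would subtract background, keep the handbag

    def iterative_search(image_bytes: bytes) -> dict:
        keyword = derive_keyword(image_bytes)
        # Round 1: submit only the derived keyword.
        first = requests.get(SEARCH_URL, params={"q": keyword}, timeout=10)
        first.raise_for_status()
        results = first.json()
        # Stop if round 1 was already specific enough.
        if "accessories" not in results.get("categories", []):
            return results
        # Round 2: refine with a background-subtracted crop of the object.
        second = requests.post(SEARCH_URL, params={"q": keyword},
                               files={"image": crop_to_foreground(image_bytes)},
                               timeout=10)
        second.raise_for_status()
        return second.json()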
As should be apparent by now, the transmitted information need not be limited to image information. Sights, sounds, text, and all sorts of other information can be included in the transmitted information, some of which can be derived directly from the image, and some of which can be derived indirectly from the image. In addition, the device 410 can also capture non-visual information such as sounds, and that information can also be transmitted. Thus, it is contemplated that the device could capture the sounds of a frog, capture an image of a lake or forest, and send both to be used as (or further analyzed into) search terms.
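A hedged sketch of how such multimodal inputs might be merged into one query; both classifier stubs are assumptions standing in for real recognizers.

    # Illustrative combination of image-derived and sound-derived search terms.
    def classify_image(image_bytes: bytes) -> list[str]:
        """Stub: a real scene classifier might return ['lake', 'forest']."""
        return ["lake", "forest"]

    def classify_sound(audio_bytes: bytes) -> list[str]:
        """Stub: a real audio classifier might return ['frog call']."""
        return ["frog call"]

    def multimodal_terms(image_bytes: bytes, audio_bytes: bytes) -> str:
        # Merge the two term lists, dropping duplicates while keeping order,
        # to form a single query such as "lake forest frog call".
        terms: list[str] = []
        for term in classify_image(image_bytes) + classify_sound(audio_bytes):
            if term not in terms:
                terms.append(term)
        return " ".join(terms)

    print(multimodal_terms(b"...", b"..."))  # -> "lake forest frog call"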
Distal server 420 is distal in the sense that it has no hard-wired link to device 410. Server 420 can be a single device, as well as any number of devices coupled together, as for example in a server farm. All manner of suitable servers are contemplated. Thus, servers can use any reasonable hardware, operate using any reasonable software, communications protocols, and so forth.
In terms of interaction with the device, the various analytical tasks discussed above can be allocated in any suitable manner between server 420 and device 410. For example, in the iterative operation discussed above with respect to the Gucci™ handbag, it is contemplated that the device 410 could analyze the image sufficiently to transmit the term "Gucci" as an initial search term to the search engine 430, and the server 420 could then undertake the tasks of subtracting out background of the image except for the handbag, and transmitting the image of the handbag as a second search term.
In another example, the server 420 could determine that the original image provided insufficient information, and send a message to the user through the device 410, directing the user to take another image (such as from another angle, closer, or with greater detail). Indeed, the server 420 could direct the user to take an image of another object entirely, in order to help determine the identity of the first object. Thus, the user could take a first image of a payment display at a ball game and provide that image to the server for identification; the server could then instruct the user to take an image of a credit card against which the user wants to be billed for entrance into the ball game. The server could then process the payment against that credit card, and
`
`10
`
`15
`
`25
`
`30
`
`35
`
`45
`
`55
`
`60
`
`65
`
`
`
`Case 2:20-cv-07872-MWF-PVC Document 1-3 Filed 08/27/20 Page 15 of 21 Page ID #:127
`
`US 8,326,038 B2
`
`25
`
`7
provide an entry code that the user could type to pass through an electronically controlled gate.

In still another example, a user could use his cell phone to capture an image of a screwdriver set at a hardware store, and the cell phone could transmit the information derived from the image to Google™ or some other search engine to find comparison prices. The server 420 could then instruct the user to turn over the packaging and take another image of the set, this time from the back side of the packaging. In this way there is iterative interaction among the user's device, the server, and the search engine.
`It should also be appreciated tha