(12) United States Patent
Richey et al.

(10) Patent No.: US 8,519,844 B2
(45) Date of Patent: *Aug. 27, 2013

(54) AUGMENTED REALITY AND LOCATION DETERMINATION METHODS AND APPARATUS

(75) Inventors: Luke Richey, Liberty Lake, WA (US); Allen Greaves, Spokane, WA (US)

(73) Assignee: Gravity Jack, Inc., Liberty Lake, WA (US)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 407 days. This patent is subject to a terminal disclaimer.

(21) Appl. No.: 12/847,790

(22) Filed: Jul. 30, 2010

(65) Prior Publication Data
US 2012/0025976 A1    Feb. 2, 2012

(51) Int. Cl.
G08B 1/08 (2006.01)

(52) U.S. Cl.
USPC ............ 340/539.13; 340/539.1; 340/539.16

(58) Field of Classification Search
USPC ............ 340/539.13
See application file for complete search history.
`
(56) References Cited

U.S. PATENT DOCUMENTS

5,952,969 A *    9/1999  Hagerman et al. ............ 342/457
6,064,335 A      5/2000  Eschenbach
6,124,825 A      9/2000  Eschenbach
8,094,833 B2     1/2012  Mao et al.
2004/0239756 A1   12/2004  Aliaga et al.
2005/0124293 A1    6/2005  Alicherry et al.
2005/0192024 A1*   9/2005  Sheynblat ............ 455/456.1
2007/0202838 A1    8/2007  Zancola et al.
2008/0211813 A1*   9/2008  Jamwal et al. ............ 345/426
2008/0280624 A1*  11/2008  Wrappe ............ 455/456.1
2008/0300854 A1   12/2008  Eibye
2009/0135002 A1    5/2009  Blinnikka et al.
2009/0244097 A1   10/2009  Estevez
2010/0109864 A1    5/2010  Haartsen et al.
2010/0277310 A1*  11/2010  Imae ............ 340/539.13
2010/0289640 A1   11/2010  Annamalai
2010/0315418 A1   12/2010  Woo
2012/0025974 A1    2/2012  Richey et al.
2012/0025975 A1    2/2012  Richey et al.

FOREIGN PATENT DOCUMENTS

WO    PCT/US2011/045586    3/2012
`
OTHER PUBLICATIONS

"Micello Indoor Maps Service for iPhone Users"; Dec. 30, 2009; 11 pp.; www.mydigitallife.info/2009/09/27/micello-indoor-maps-service-for-iphone-users/.
"apc Nokia working on indoor GPS system"; Flynn, D.; http://apcmag.com/print.aspx?id=3038&mode+print; Dec. 30, 2009; 1 pp.
"Fast Approximate Nearest Neighbors With Automatic Algorithm Configuration"; Muja et al.; 2009; 10 pp.
"Speeded-Up Robust Features (SURF)"; Bay et al.; Sep. 10, 2008; pp. 1-14.
`
`* cited by examiner
`Primary Examiner — Kerri McNally
`(74) Attorney, Agent, or Firm — Wells St. John P.S.
(57) ABSTRACT

Augmented reality and location determination methods and apparatus are disclosed according to some aspects of the description. In one aspect, an augmented reality method includes accessing first location information regarding a location of a user interaction device in a physical world, wherein the user interaction device is configured to generate an augmented reality representation with respect to the physical world, using the first location information, generating second location information which has increased accuracy regarding the location of the user interaction device in the physical world, and communicating augmented data to the user interaction device, and wherein the augmented data comprises the augmented reality representation.
`
`30 Claims, 11 Drawing Sheets
`
`
`
`
[Sheet 1 of 11: FIG. 1]
`
`
`
[Sheet 2 of 11: FIG. 2 — block diagram showing user interaction devices, a network, and a management device]
`
`
`
`
`
`
`
`
[Sheet 3 of 11: FIG. 3 — computing system block diagram including a user interface]
`
`
`
[Sheet 4 of 11: FIG. 4]
`
`
`
[Sheet 5 of 11: FIG. 5]
`
`
`
[Sheet 6 of 11: FIG. 6 — flow chart: determine initial location; output initial location; other devices near?; if yes, implement communications with devices and output information regarding communications; receive indication of marker(s) present?; if yes, capture images and perform image recognition operations; update location using results of communications and image processing; utilize refined location information]
`
`
`
[Sheet 7 of 11: FIG. 7 — flow chart: access initial location information; access database; is mapping empty? if yes, create new entry; compare timestamp; has device been a sender? reset entry; other devices near?; instruct device to be a waiter; implement operations with user interaction devices]
`
`
`
[Sheet 8 of 11: FIG. 8 — flow chart (sender side): communicate initial location information; access initial location information; generate unique identifier; search for other devices; communicate identifier to devices; access ready signals; communicate identifier; output signal and record time; communicate initiating time; access initiating time; access reception times; process data and output completion signal; access completion signal; communicate query as to next state; access response for state; enter specified state]
`
`
`
[Sheet 9 of 11: FIG. 9 — flow chart (receiver side): receive identifier; prepare for reception and output ready signal; wait for signal; time interval passed? output notification and output query for next mode; signal received from sender? timestamp reception of signal; output timestamp; enter instructed state]
`
`
`
[Sheet 10 of 11: FIGS. 10a-10c]
`
`
`
[Sheet 11 of 11: FIG. 11 — flow chart: access initial location information; identify marker(s); communicate markers; access images; process images to identify markers; compare images to determine location; store images and location; communicate location information]
`
`
`
AUGMENTED REALITY AND LOCATION DETERMINATION METHODS AND APPARATUS
`
`TECHNICAL FIELD
`
`This disclosure relates to augmented reality and location
`determination methods and apparatus.
`
`BACKGROUND
`
Computing systems have continually evolved and the popularity of computing systems continues to increase. The advancement of computing systems creates new uses and applications for the computing systems. For example, the processing speeds, storage capacities and network communication speeds are constantly increasing, enabling the use of computing systems in increasing numbers of applications.

Furthermore, computing systems have evolved from typical office or desk systems to smaller devices, some of which have increased portability, which further expands the possible applications of the computing systems. More specifically, notebook computers have evolved from desktop computers, and more recently, handheld portable devices have also advanced significantly. Personal digital assistants, media players, cellular telephones, smartphones, and other portable devices have increased processing power and storage capacities while communications networks have also been improved, allowing greater rates of data transfer between the computing systems.

Some computing systems and networks have evolved to a sufficient extent to perform augmented reality operations which augment the physical world with virtual computer-generated imagery in one example. In addition, some portable computing systems have sufficient processing, storage and communications capabilities to provide real-time augmented reality data for mobile users.

At least some aspects of the disclosure are directed to improved methods, apparatus and programming for implementing augmented reality operations.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`
FIG. 1 is an illustrative representation of a user interaction device implementing augmented reality operations according to one embodiment.
FIG. 2 is a functional block diagram of a media system according to one embodiment.
FIG. 3 is a functional block diagram of a computing system according to one embodiment.
FIG. 4 is a functional block diagram of communications circuitry of a user interaction device according to one embodiment.
FIG. 5 is a functional block diagram of a user interface of a user interaction device according to one embodiment.
FIG. 6 is a flow chart of a method implemented by a user interaction device to implement augmented reality operations according to one embodiment.
FIG. 7 is a flow chart of a method implemented by a management device with respect to a plurality of user interaction devices according to one embodiment.
FIG. 8 is a flow chart of a method of outputting wireless communications signals according to one embodiment.
FIG. 9 is a flow chart of a method of receiving wireless communications signals according to one embodiment.
FIGS. 10a-10c are illustrative representations of a method of determining refined location information of a user interaction device according to one embodiment.
FIG. 11 is a flow chart of a method implemented by a management device with respect to image recognition operations according to one embodiment.
`
`DETAILED DESCRIPTION
`
Attention is directed to the following commonly assigned applications, which are incorporated herein by reference: U.S. patent application Ser. No. 12/847,754 entitled "Augmented Reality and Location Determination Methods and Apparatus" by inventors Luke Richey and Allen Greaves, and U.S. patent application Ser. No. 12/847,771 entitled "Augmented Reality and Location Determination Methods and Apparatus" by inventors Luke Richey and Allen Greaves.
Some aspects of the disclosure described herein are directed towards apparatus, methods and programming for implementing augmented reality operations where the physical world is augmented with additional information, such as virtual objects. For example, images of the physical world observed through user interaction devices may be augmented or enhanced with augmented reality representations, for example in the form of visual and/or audio data which may be experienced by users. In one example embodiment, augmented reality representations may include virtual objects which augment a user's experience of the physical world. The virtual objects may be associated with physical world objects which may be static or dynamically moving. Some of the described embodiments include a media system configured to implement and co-ordinate or manage augmented reality operations of one user interaction device or a plurality of user interaction devices which may be interacting in a collaborative augmented reality session in one arrangement.

Some augmented reality systems use location information regarding locations of the user interaction devices and locations of physical objects in the physical world to accurately augment the physical world with the augmented reality representations. The location information may be used to associate virtual objects with respective objects of the physical world in one illustrative example. At least some aspects of the disclosure are directed towards increasing the accuracy of generated location information regarding the locations of user interaction devices, which location information may be used to implement augmented reality operations. At different times, a plurality of different techniques may be available to determine the location information of user interaction devices. In some embodiments, information from the different techniques may be utilized and/or combined to provide location information of the user interaction devices of increased accuracy compared with other available location information of the user interaction devices, perhaps obtained from a single source. Additional aspects are described in the following disclosure.
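The disclosure does not fix a particular rule for combining estimates from the different techniques. As one hedged illustration only, an inverse-variance weighted average merges a coarse fix (e.g., GPS) with a finer one (e.g., wireless ranging), weighting the result toward whichever technique reports better accuracy; the function name, coordinates, and accuracy figures below are illustrative assumptions:

    import numpy as np

    def fuse_estimates(estimates):
        """Inverse-variance weighted fusion of position estimates.

        estimates: iterable of (position, sigma) pairs, where position
        is a 2-element array in local metres and sigma is the
        technique's reported standard deviation (its accuracy).
        """
        positions = np.array([p for p, _ in estimates], dtype=float)
        weights = np.array([1.0 / (s * s) for _, s in estimates])
        fused = (weights[:, None] * positions).sum(axis=0) / weights.sum()
        return fused, weights.sum() ** -0.5

    # Hypothetical readings: a coarse GPS fix and a finer ranging fix.
    gps = (np.array([12.0, 7.0]), 10.0)      # +/- 10 m
    ranging = (np.array([11.2, 6.6]), 1.0)   # +/- 1 m
    print(fuse_estimates([gps, ranging]))    # dominated by the finer fix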
According to one embodiment, a location determination method includes accessing first location information regarding a location of a user interaction device in a physical world, wherein the user interaction device is configured to generate an augmented reality representation with respect to the physical world, using the first location information, identifying a plurality of wireless communication devices which are proximately located with respect to the user interaction device, initiating wireless communications between the user interaction device and the wireless communications devices, after the initiating, accessing information regarding the wireless communications of the user interaction device and the wireless communication devices, and using the information regarding the wireless communications, determining second location information regarding the location of the user interaction device, and wherein the second location information has increased accuracy with respect to the location of the user interaction device in the physical world compared with the first location information.
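The flow charts of FIGS. 8 and 9 (summarized above) suggest one way such an exchange may be orchestrated: a sender records the moment it emits a signal, nearby devices timestamp reception, and the collected times are then processed. A minimal sketch under those assumptions follows; the class and method names are illustrative, not the patented protocol, and synchronized clocks are assumed for brevity:

    import time
    import uuid
    from dataclasses import dataclass, field

    @dataclass
    class RangingExchange:
        """One signal exchange between a sending user interaction
        device and nearby receiving devices (FIG. 8/9 style flow)."""
        identifier: str = field(default_factory=lambda: uuid.uuid4().hex)
        initiating_time: float = 0.0
        reception_times: dict = field(default_factory=dict)

        def sender_emit(self):
            # Sender outputs the signal and records the initiating moment.
            self.initiating_time = time.time()

        def receiver_record(self, device_id):
            # Each proximately located device timestamps its reception.
            self.reception_times[device_id] = time.time()

        def flight_times(self):
            # Management device processes the timestamps into one-way
            # times of flight (clock synchronization assumed).
            return {d: t - self.initiating_time
                    for d, t in self.reception_times.items()}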
According to another embodiment, a location determination method comprises accessing first location information regarding a location of a user interaction device in a physical world, wherein the user interaction device is configured to generate an augmented reality representation with respect to the physical world, using the first location information, identifying a marker which is proximately located with respect to the location of the user interaction device, accessing an image generated by the user interaction device which includes the marker, and processing the image to determine second location information regarding the location of the user interaction device, and wherein the second location information has increased accuracy with respect to the location of the user interaction device in the physical world compared with the first location information.
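As a simplified illustration of how an image of a known marker can refine a coarse fix (a pinhole-camera sketch; the disclosure does not commit to this particular computation, and the marker size, focal length, and coordinates are assumed), the marker's apparent width in the image yields a range from the device to the marker's known position:

    import math

    def range_to_marker(marker_width_m, marker_width_px, focal_length_px):
        """Pinhole-camera range estimate: a marker of known physical
        width W appearing w pixels wide lies at roughly f * W / w."""
        return focal_length_px * marker_width_m / marker_width_px

    def refine_location(coarse_xy, marker_xy, marker_width_m,
                        marker_width_px, focal_length_px):
        """Pull the coarse device estimate onto the circle of the
        measured range about the marker's known position."""
        r = range_to_marker(marker_width_m, marker_width_px,
                            focal_length_px)
        dx = coarse_xy[0] - marker_xy[0]
        dy = coarse_xy[1] - marker_xy[1]
        d = math.hypot(dx, dy) or 1e-9
        return (marker_xy[0] + dx / d * r, marker_xy[1] + dy / d * r)

    # Hypothetical values: 0.3 m marker seen 120 px wide, f = 800 px.
    print(refine_location((10.0, 4.0), (0.0, 0.0), 0.3, 120, 800))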
According to yet another embodiment, an augmented reality method comprises accessing first location information regarding a location of a user interaction device in a physical world, wherein the user interaction device is configured to generate an augmented reality representation with respect to the physical world, using the first location information, generating second location information which has increased accuracy regarding the location of the user interaction device in the physical world, and communicating augmented data to the user interaction device, and wherein the augmented data comprises the augmented reality representation.
According to another embodiment, a location determination method comprises using a user interaction device, emitting a wireless communications signal at a first moment in time, using a plurality of wireless communication devices, receiving the wireless communications signal emitted by the user interaction device at a plurality of second moments in time, and using the first and second moments in time, determining information regarding a location of the user interaction device.
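A minimal sketch of this time-of-flight idea, assuming synchronized clocks, known 2-D receiver positions, and noise-free timestamps (none of which the disclosure guarantees): differencing the squared range equations gives a linear system for the emitter's position.

    import numpy as np

    C = 299_792_458.0  # propagation speed (speed of light), m/s

    def locate_emitter(receivers, t_emit, t_receive):
        """Estimate the emitting device's 2-D position from the first
        moment in time (emission) and the second moments in time
        (receptions). receivers: (n, 2) known positions, n >= 3."""
        d = C * (np.asarray(t_receive, dtype=float) - t_emit)  # ranges
        x, y = receivers[:, 0], receivers[:, 1]
        # Subtract the first range equation from the rest to linearize:
        # (px - xi)^2 + (py - yi)^2 = di^2  ->  A @ [px, py] = b
        A = np.column_stack([2.0 * (x[1:] - x[0]), 2.0 * (y[1:] - y[0])])
        b = (d[0] ** 2 - d[1:] ** 2
             + x[1:] ** 2 - x[0] ** 2
             + y[1:] ** 2 - y[0] ** 2)
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos

    # Synthetic check: three hypothetical receivers, known emitter.
    rx = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 30.0]])
    true_pos = np.array([12.0, 7.0])
    t_rx = np.linalg.norm(rx - true_pos, axis=1) / C  # t_emit = 0
    print(locate_emitter(rx, 0.0, t_rx))              # ~ [12. 7.]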
According to another embodiment, a computing system comprises communications circuitry configured to implement communications externally of the computing system, and processing circuitry coupled with the communications circuitry, and wherein the processing circuitry is configured to access first location information regarding a location of a user interaction device in a physical world, to use the first location information to identify a plurality of wireless communications devices which are proximately located with respect to the location of the user interaction device, to control the communications circuitry to output a control signal which is configured to initiate wireless communications between the user interaction device and the wireless communications devices, to access information regarding the wireless communications between the user interaction device and the wireless communications devices, and to use the information regarding the wireless communications to determine second location information regarding the location of the user interaction device and which has increased accuracy with respect to the location of the user interaction device in the physical world compared with the first location information.
According to another embodiment, a computing system comprises communications circuitry configured to implement communications externally of the computing system and processing circuitry coupled with the communications circuitry, and wherein the processing circuitry is configured to access first location information received by the communications circuitry regarding a location of a user interaction device in a physical world, to identify a marker which is proximately located with respect to the location of the user interaction device, to access a plurality of images generated by the user interaction device, and to process the images with respect to the marker to determine second location information regarding the location of the user interaction device and which has increased accuracy with respect to the location of the user interaction device in the physical world compared with the first location information.
According to another embodiment, a computing system comprises communications circuitry configured to implement communications externally of the computing system and processing circuitry coupled with the communications circuitry, and wherein the processing circuitry is configured to access first location information received by the communications circuitry regarding a location of a user interaction device in a physical world, to generate second location information which has increased accuracy regarding the location of the user interaction device in the physical world, and to control the communications circuitry to communicate augmented data comprising an augmented reality representation to the user interaction device.
According to another embodiment, an augmented reality user interaction device comprises a camera, a display system, communications circuitry configured to implement wireless communications externally of the user interaction device, and processing circuitry coupled with the camera, the display system, and the communications circuitry, wherein the processing circuitry is configured to control the display system to generate a plurality of images which comprise image data generated by the camera and augmented data which augments the image data with an augmented reality representation, wherein the processing circuitry is further configured to control the communications circuitry to communicate first location information regarding a location of the user interaction device in a physical world externally of the user interaction device, and wherein the processing circuitry is further configured to access second location information regarding the location of the user interaction device after the outputting of the first location information and to use the second location information to generate the plurality of images, and wherein the second location information has increased accuracy regarding the location of the user interaction device in the physical world compared with the first location information.
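A device-side sketch of one step this embodiment implies: before compositing each image, the processing circuitry must decide which augmented reality representations are anchored within the camera's current field of view, given the refined device location. All names and the 60-degree field of view are illustrative assumptions:

    import math
    from dataclasses import dataclass

    @dataclass
    class Augmentation:
        representation: str   # stand-in for the virtual-object data
        anchor_xy: tuple      # physical-world anchor location (m)

    def visible_overlays(augs, device_xy, heading_deg, fov_deg=60.0):
        """Return the augmentations whose anchors lie within the
        camera's horizontal field of view, with their off-axis angles
        (which would drive where each is drawn on the display)."""
        out = []
        for aug in augs:
            dx = aug.anchor_xy[0] - device_xy[0]
            dy = aug.anchor_xy[1] - device_xy[1]
            bearing = math.degrees(math.atan2(dy, dx))
            off_axis = (bearing - heading_deg + 180.0) % 360.0 - 180.0
            if abs(off_axis) <= fov_deg / 2.0:
                out.append((aug.representation, off_axis))
        return out

    augs = [Augmentation("puppy_3d", (5.0, 0.0)),
            Augmentation("banner_ad", (0.0, 5.0))]
    print(visible_overlays(augs, (0.0, 0.0), heading_deg=0.0))  # puppy only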
Referring to FIG. 1, one example of augmented reality aspects of the disclosure is described. FIG. 1 illustrates a user interaction device 10 which is used to generate an image of the physical world and which is augmented by an augmented reality representation. More specifically, in the example of FIG. 1, the user interaction device 10 includes a camera (not shown) which is configured to capture images of the physical world and which may be depicted using a display 12. As a user moves the user interaction device 10, a plurality of images are captured of different scenes viewed by the camera of the device 10.

In the illustrated example, the scene viewed by the device 10 includes a marker 14 on a wall of the physical world. The generated image depicted using the display 12 includes an augmented reality representation 18 which augments a user's experience of the physical world by replacing the physical world marker 14 with the representation 18. In the illustrated example, the augmented reality representation 18 is a virtual 3D object in the form of a puppy, which may be selected by another user to be associated with the marker 14.
`
The use of marker 14 is one example of augmented reality operations which may be implemented using the user interaction device 10, and other augmented reality operations may be implemented in other embodiments. For example, virtual objects may be associated with other physical objects of the physical world, such as other user interaction devices 10 (not shown), in images generated by device 10. In some embodiments, augmented reality representations 18 may entirely replace physical objects of the physical world.

In one more specific example, the augmented reality representations 18 may include advertising objects (e.g., a banner with a product name) and the representations 18 may be associated with famous physical structures of the physical world when observed through a user interaction device 10. For example, a user at a significant football game may view a virtual object banner draped between the physical world goal posts when a user of a device 10 captures images of the end zone during a football game. Companies may pay advertising fees to have augmented reality representations of advertisements of their products associated with physical world objects and which may be viewed by users using their user interaction devices 10 who are proximately located to the physical world objects in one embodiment.
Location information regarding the locations of the user interaction device 10 and other physical objects in the physical world may be used to generate augmented reality representations 18 in captured images. In one example, the location information may be used to depict the augmented reality representations 18 accurately associated with content (e.g., objects) of the physical world (e.g., other user interaction devices, buildings, structures, mountains, etc.).
For a static physical object which does not move (e.g., marker 14), location information may be included with the augmented data which determines where the augmented reality representations 18 are to be displayed with respect to content of the physical world when the static physical object is within the field of view of the camera. For portable physical objects (e.g., user interaction devices), the augmented data may be associated with an identifier of the portable physical object. Identification information of the portable physical object and location information of the portable physical object may be used to determine when the portable physical object is present within the field of view of the camera and where augmented reality representations 18 associated with the portable physical object should be shown in generated images. Location information regarding the user interaction devices 10 and/or physical objects may be used to accurately show the augmented reality representations 18 associated with the user interaction devices 10 and/or physical world objects in images generated by the user interaction devices 10 in one embodiment.
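A data-layout sketch of the association just described, distinguishing static anchors (location stored with the augmented data) from portable ones (location resolved through the object's identifier at display time); the identifiers and coordinates are hypothetical:

    from dataclasses import dataclass

    @dataclass
    class AugmentedEntry:
        data: str             # stand-in for the AR representation
        location: tuple = None  # fixed anchor if static; None if portable

    registry = {
        "wall_marker_14": AugmentedEntry("puppy_3d", (12.0, 7.0)),  # static
        "device_31ab": AugmentedEntry("banner_ad"),                 # portable
    }
    latest_locations = {"device_31ab": (3.0, 4.0)}  # updated as devices move

    def anchor_for(identifier):
        """Resolve (representation, anchor location); portable objects
        use their most recently reported location."""
        entry = registry[identifier]
        return entry.data, entry.location or latest_locations.get(identifier)

    print(anchor_for("wall_marker_14"))  # fixed anchor from the entry
    print(anchor_for("device_31ab"))     # anchor from the live table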
In some embodiments, one user interaction device 10 may be present and the user may be experiencing augmented reality representations with respect to physical objects of the physical world. In other examples (e.g., FIG. 1), a plurality of users having user interaction devices 10 may be present and proximately located to one another and experiencing augmented reality representations with respect to one another in a collaborative session and/or physical world objects. The augmented reality representations may be associated with user interaction devices 10 of users and/or with physical objects. Users and user interaction devices 10 may be free to enter and leave interactive augmented reality collaborative sessions in some embodiments.
It is desired to provide accurate information regarding the locations of user interaction devices 10 to correctly associate augmented reality representations with respect to physical world objects. As described further below, methods and apparatus are described which enable location information of the user interaction devices 10 to be determined with increased accuracy compared with, for example, arrangements which use conventional location determination methods, such as a global positioning system (GPS). In some embodiments disclosed below, the user interaction devices 10 may be configured to communicate (e.g., wirelessly) with one another as well as with external devices (e.g., a management device, Wi-Fi communications devices) to implement augmented reality operations including operations with respect to determining location information of the user interaction devices 10 of increased accuracy.
Referring to FIG. 2, one example of a media system 20 is shown. Media system 20 is configured to implement operations with respect to augmenting the physical world with augmented reality representations. For example, media system 20 is configured to assist user interaction devices 10 with the generation of augmented reality representations. In a more specific example, media system 20 is configured to perform operations with respect to determining locations of user interaction devices 10 (which may be portable) for use in accurately associating augmented reality representations with the physical world. In one embodiment, media system 20 may communicate augmented data with respect to the user interaction devices 10 which may be used by the devices 10 to generate augmented reality representations.
In the illustrated example configuration of FIG. 2, media system 20 includes a plurality of user interaction devices 10 and a management device 22. User interaction devices 10 may be configured to communicate with one another as well as with management device 22. For example, the user interaction devices 10 may communicate with management device 22 via a network 24. Network 24 may be considered to be a part of media system 20 or may be external of media system 20 in different embodiments. In some embodiments, the user interaction devices 10 may also implement wireless communications with respect to other wireless communications devices (e.g., Wi-Fi communications devices) which may be within the communications ranges of the devices 10 (the Wi-Fi communications devices are not shown in FIG. 2).
The user interaction devices 10 may be proximately located with respect to one another in a group (e.g., within communications ranges of the devices 10 to implement communications with respect to one another) or in different geographical locations and not proximately located to one another. For example, different groups of user interaction devices 10 may exist in different geographical locations. User interaction devices 10 which are proximately located to one another may participate in a collaborative augmented reality session where augmented reality representations may be associated with the devices 10 in one embodiment. Additionally, only one user interaction device 10 may be present in a given geographical location and may be implementing augmented reality operations with respect to static physical world objects.
User interaction devices 10 may be computing systems (e.g., one example is described with respect to FIG. 3) in one embodiment. The user interaction devices 10 may have substantially the same configurations or have different configurations in example embodiments. In some examples, user interaction devices 10 may be configured as portable media devices, personal digital assistants, cellular telephones, smartphones, personal computers, notebook computers, glasses worn by a user including a camera and display system capable of generating images, or any other device capable of capturing images of the physical world and generating images and/or other media content for consumption by a user which include visual images of the physical world which are augmented by one or more augmented reality representations (e.g., additional virtual image content and/or audible content which augments physical world content).
In one embodiment, management device 22 may be a server which is configured as a computing system, for example as described below with respect to FIG. 3. Management device 22 is configured to implement communications with respect to user interaction devices 10 in the described embodiment. Management device 22 may be configured to perform a plurality of operations with respect to the generation of augmented reality representations by the user interaction devices. Example operations performed include operations with respect to co-ordination and management of user interaction devices 10, co-ordination and management of communications between user interaction devices 10 and a plurality of other wireless communications devices, determining locations of user interaction devices 10 (which may be portable), and storing and communicating augmented data for use by the user interaction devices 10 to generate augmented reality representations.
In one example embodiment, the user interaction devices 10 may communicate augmented data of their respective augmented reality representations (e.g., the above-described puppy) to the management device 22, perhaps for storage, and the management device 22 may thereafter provide the augmented data to others of the user interaction devices 10 for use in generating the augmented reality representations with respect to the devices 10 which provided the augmented data. For example, in FIG. 1, the management device 22 may communicate augmented data which includes the puppy representation to user interaction device 10 which uses the augmented data to generate the augmented reality representation 18.
In one embodiment, identification data may be used to associate augmented data with respective appropriate objects of the physical world, such as user interaction devices 10 or other physical world objects. With respect to user interaction devices as described further below in one embodiment with respect to FIG. 1, initial location information regarding the location of a user interaction device 10 may be used to search a database of the management device 22 to identify other wireless communications devices (e.g., user interaction devices, Wi-Fi communications devices) within a communications range of the user interaction device 10. The database may include location information regarding the devices 10 which the initial location information is searched against, and identification information which uniquely