`
`(12) United States Patent
`Spivack
`
(10) Patent No.: US 8,745,494 B2
(45) Date of Patent: Jun. 3, 2014
`
(54) SYSTEM AND METHOD FOR CONTROL OF A SIMULATED OBJECT THAT IS ASSOCIATED WITH A PHYSICAL LOCATION IN THE REAL WORLD ENVIRONMENT
`
`(75) Inventor: Nova T. Spivack, San Francisco, CA
`(US)
`
`(73) Assignee: Zambala LLLP, Henderson, NV (US)
`(*) Notice:
`Subject to any disclaimer, the term of this
`patent is extended or adjusted under 35
`U.S.C. 154(b) by 707 days.
(21) Appl. No.: 12/473,143
(22) Filed: May 27, 2009
(65) Prior Publication Data
US 2010/0302143 A1    Dec. 2, 2010
`
(51) Int. Cl.
G06F 3/048 (2013.01)
(52) U.S. Cl.
USPC .......................... 715/706; 345/156
(58) Field of Classification Search
USPC .......... 345/156-184; 715/706, 764-862; 703/1-22; 434/11-27
`See application file for complete search history.
`
(56) References Cited

U.S. PATENT DOCUMENTS
`
4,519,490 A    5/1985   White
4,829,899 A    5/1989   Wilker et al.
5,600,777 A    2/1997   Wang et al.
5,604,907 A    2/1997   Conner et al.
5,623,657 A    4/1997   Conner et al.
6,023,270 A    2/2000   Brush, II et al.
6,028,593 A    2/2000   Rosenberg et al.
6,080,063 A    6/2000   Khosla
6,302,941 B1   10/2001  Oya et al.
6,314,167 B1   11/2001  Johnson
6,549,893 B1   4/2003   Lannert et al.
6,572,380 B1   6/2003   Buckley et al.
6,680,909 B1   1/2004   Bansal et al.
6,983,232 B2   1/2006   Nguyen et al.
7,054,848 B1   5/2006   Lannert et al.
7,065,553 B1   6/2006   Chesley et al.
7,072,919 B2   7/2006   Sexton et al.
7,155,496 B2   12/2006  Froyd et al.
7,353,160 B2   4/2008   Voigt
7,487,177 B2   2/2009   Kilian-Kehr et al.
`(Continued)
`
`FOREIGN PATENT DOCUMENTS
WO    WO-2006024856    3/2006
WO    WO-2009002879    12/2008
`
`OTHER PUBLICATIONS
`
International Search Report PCT/US2010/035282 dated Feb. 1, 2011; pp. 1-3.
`Written Opinion PCT/US2010/035282 dated Feb. 1, 2011; pp. 1-6.
`(Continued)
`
`Primary Examiner — Liliana Cerullo
`(74) Attorney, Agent, or Firm — Perkins Coie LLP
`
(57) ABSTRACT
Systems and methods for control of a simulated object that is associated with a physical location in the real world environment are herein disclosed. In one aspect, embodiments of the present disclosure include a method, which may be implemented on a system, of determining whether a location data and a timing data satisfy a criterion. Responsive to determining that the location data and the timing data satisfy the criterion, the method enables access of the simulated object in a simulated environment by a user via a device. The simulated object generally includes attributes that are perceived by the user via the device. In one embodiment, the location data includes a location of the device and the timing data includes a time when the device is located at the location.
`
`65 Claims, 20 Drawing Sheets
`
`
`
`
`
[Front-page drawing: client devices with user interfaces (104A-N) connected through a network (106) to a host server (124), a user repository (128), and a simulated object repository (130).]
`
`
`
(56) References Cited (Continued)

U.S. PATENT DOCUMENTS
`
7,516,052 B2      4/2009   Hatcherson et al.
7,546,225 B2      6/2009   Nguyen et al.
7,555,725 B2      6/2009   Abramson et al.
7,570,261 B1      8/2009   Edecker et al.
7,685,508 B2      3/2010   Froyd et al.
7,739,479 B2      6/2010   Bordes et al.
7,797,168 B2      9/2010   Kusumoto et al.
7,859,551 B2      12/2010  Bulman et al.
7,991,706 B2 *    8/2011   Mattern .................... 705/401
7,996,264 B2      8/2011   Kusumoto et al.
8,046,338 B2      10/2011  Basso et al.
8,117,281 B2      2/2012   Robinson et al.
8,181,152 B2      5/2012   Choi et al.
8,191,121 B2      5/2012   Ruppert et al.
8,192,283 B2      6/2012   Ruppert et al.
8,201,229 B2      6/2012   Ruppert et al.
8,246,467 B2      8/2012   Huang et al.
8,255,961 B2      8/2012   Ellis
8,279,862 B2      10/2012  Sbisa et al.
8,287,383 B1      10/2012  Etter et al.
8,307,273 B2      11/2012  Pea et al.
8,566,786 B2      10/2013  Choi et al.
2002/0010734 A1   1/2002   Ebersole et al.
2002/0029298 A1   3/2002   Wilson
2002/0042921 A1   4/2002   Ellis
2002/0057340 A1   5/2002   Fernandez et al.
2002/0133325 A1   9/2002   Hoare et al.
2002/0184516 A1   12/2002  Hale et al.
2003/0064712 A1   4/2003   Gaston et al.
2003/0217122 A1   11/2003  Roese et al.
2003/0221022 A1   11/2003  Sexton et al.
2004/0002843 A1   1/2004   Robarts et al.
2004/0027258 A1   2/2004   Pechatnikov et al.
2004/0053686 A1   3/2004   Pacey et al.
2004/0095311 A1   5/2004   Tarlton et al.
2004/0158455 A1   8/2004   Spivack et al.
2005/0009608 A1   1/2005   Robarts et al.
2005/0172018 A1   8/2005   Devine et al.
2005/0267731 A1   12/2005  Hatcherson et al.
2005/0268254 A1   12/2005  Abramson et al.
2006/0189386 A1 * 8/2006   Rosenberg ................... 463/37
2006/0192852 A1   8/2006   Rosenthal et al.
2006/0223635 A1   10/2006  Rosenberg
2006/0230073 A1   10/2006  Gopalakrishnan
2006/0235674 A1   10/2006  Voigt
2006/0287815 A1 * 12/2006  Gluck ....................... 701/208
2007/0024644 A1   2/2007   Bailey
2007/0117576 A1   5/2007   Huston
2007/0203903 A1   8/2007   Attaran Rezaei et al.
2007/0214449 A1   9/2007   Choi et al.
2007/0223675 A1 * 9/2007   Surin et al. ............. 379/202.01
2007/0265089 A1   11/2007  Robarts et al.
2007/0279494 A1   12/2007  Aman et al.
2007/0299559 A1   12/2007  Janssen et al.
2008/0026838 A1   1/2008   Dunstan et al.
2008/0031234 A1   2/2008   Sbisa et al.
2008/0036653 A1   2/2008   Huston
2008/0133189 A1   6/2008   Criswell et al.
2008/0162707 A1   7/2008   Beck et al.
2008/0189360 A1 * 8/2008   Kiley et al. ................ 709/203
2008/0220397 A1 * 9/2008   Capone et al. ................ 434/20
2008/0222295 A1   9/2008   Robinson et al.
2008/0247636 A1   10/2008  Davis et al.
2008/0261564 A1   10/2008  Logan
2008/0294663 A1   11/2008  Heinley et al.
2008/0320419 A1   12/2008  Matas et al.
2009/0005018 A1   1/2009   Forstall et al. ........... 455/414.1
2009/0005140 A1   1/2009   Rose et al.
2009/0036186 A1   2/2009   Benco et al.
2009/0069033 A1   3/2009   Karstens et al. .......... 455/456.3
2009/0089825 A1   4/2009   Coldwell
2009/0102616 A1   4/2009   Stone et al.
2009/0125823 A1   5/2009   Moll et al.
2009/0265257 A1 * 10/2009  Klinger et al. ............... 705/27
2009/0293011 A1   11/2009  Nassar
2010/0017820 A1   1/2010   Thevathasan et al.
2010/0217573 A1   8/2010   Hatcherson et al.
2010/0228776 A1   9/2010   Melkote et al.
2011/0161861 A1   6/2011   Abramson et al.
2011/0161872 A1   6/2011   Abramson et al.
2011/0225069 A1   9/2011   Cramer et al.
2012/0059720 A1   3/2012   Musabji et al.
2012/0174062 A1   7/2012   Choi et al.
2013/0174268 A1   7/2013   Wang et al.
2013/0179272 A1   7/2013   Bonev et al.
2013/0328933 A1   12/2013  Abramson et al.
`
OTHER PUBLICATIONS

"Object-Oriented Programming," as shown in http://en.wikipedia.org/wiki/Object-oriented_programming, dated Apr. 22, 2009, last accessed Nov. 4, 2013, pp. 1-9.

* cited by examiner
`
`
`
`
[FIG. 1 (Sheet 1 of 20)]
`
`
`
[FIG. 2 (Sheet 2 of 20)]
`
`
`
[FIG. 3A (Sheet 3 of 20)]
`
`
`
`
`
`
`
`
`
`
`
`
`
`
[FIG. 3B (Sheet 4 of 20)]
`
`
`
[FIG. 4A (Sheet 5 of 20)]
`
`
`
[FIG. 4B (Sheet 6 of 20)]
`
`
`
[FIG. 5A (Sheet 7 of 20)]
`
`
`
[FIG. 5B (Sheet 8 of 20)]
`
`
`
[FIG. 5C (Sheet 9 of 20)]
`
`
`
[FIG. 5D (Sheet 10 of 20)]
`
`
`
[FIG. 5E (Sheet 11 of 20)]
`
`
`
[FIG. 5F (Sheet 12 of 20)]
`
`
`
[FIG. 5G (Sheet 13 of 20)]
`
`
`
[FIG. 6 (Sheet 14 of 20): flow chart]
602  Determine a location data and/or a timing data
604  Does the timing data and location data satisfy a criterion? (If yes:)
606  Enable access of the simulated object to a user in a simulated environment via a device
608  Receive a request from the user to interact with the simulated object using the device
610  Is the user permitted to perform the requested action? (If yes:)
612  Perform the requested action on the simulated object
614  Update the attributes of the simulated object on the device according to the requested action that is performed, to be perceived by the user using the device
`
`
`
[FIG. 7A (Sheet 15 of 20): flow chart]
702, 704  ... for access based on the location data
706  Is the user authorized to access the simulated object? (If yes:)
708  Provide the simulated object for display on the device
710  Render a simulated environment in which the simulated object is located, the simulated environment to be presented on the device
712  Update characteristics of the simulated object presented on the device according to external stimuli to be perceived by the user
714  Receive a request from the user to perform a requested action on the simulated object
716  Is the user authorized to perform the requested action? (If yes:)
718  Update a portion of the characteristics of the simulated object presented on the device according to an effect of the requested action such that updates are perceived by the user
`
`
`
[FIG. 7B (Sheet 16 of 20): flow chart]
722  Detect velocity/acceleration
724  Sense a gesture
726  Detect motion of the device
728  Adjust a perspective of the simulated environment presented on the device according to the detected motion of the device
732  Continuously or periodically determine updated locations of the device
734  Identify an updated set of simulated objects available for access based on the updated locations
736  Present the updated set of the simulated objects in the simulated environment to the user through the device
742  Render a user interface for display on the device, the user interface having a map of the physical location in the simulated environment
744  Receive a selection of a region on the map made by the user via the user interface, the region corresponding to a set of selected physical locations
746  Detect the simulated objects that are available for access in the region selected by the user
748  Present the simulated objects to be perceived by the user via the device
`
`
`
[FIG. 8 (Sheet 17 of 20): flow chart]
802  Identify physical characteristics of the physical location where the real participant is located in the real world environment
804  Generate the simulated playing field for display on the device
806  Detect user interaction with the device
808  Identify a user requested action on a simulated object in the simulated playing field
810  Update a characteristic of the simulated object in the simulated playing field according to the user requested action
812  Present the simulated object via the device such that the updated characteristic of the simulated object is perceived by the user
814  Provide a simulated participant in the simulated playing field
`
`
`
[FIG. 9 (Sheet 18 of 20): flow chart]
902  Generate a gaming environment
904  Provide the gaming environment to the real user via the device
906  Detect movement of the real user
908  Update a characteristic of the simulated object in the gaming environment at least partially based on the movement of the real user
910  Detect user interaction with the device
912  Identify a user requested action on the simulated object in the gaming environment
914  Update the simulated object in the gaming environment according to the user requested action
916  Detect movement of a second real user
918  Update the second simulated object in the gaming environment at least partially based on the movement of the second real user
`
`
`
[FIG. 10 (Sheet 19 of 20): flow chart]
1002  Generate a simulated object for display on a device located in the physical location, the simulated object being controlled by a real performer giving a live performance
1004  Monitor the live performance given by the real performer
1006  Update the simulated object in real time or near real time according to the live performance
1008  Present updates to the simulated object on the device in the physical location
1010  Generate multiple simulated objects for display on devices located in multiple physical locations
`
`
`
[FIG. 11 (Sheet 20 of 20): block diagram of a computer system 1100 including a processor (with instructions), main memory (with instructions), non-volatile memory, a network interface device, a video display, an alpha-numeric input device, a cursor control device, a drive unit with a machine-readable medium (storing instructions), and a signal generation device]
`
`
`
SYSTEM AND METHOD FOR CONTROL OF A SIMULATED OBJECT THAT IS ASSOCIATED WITH A PHYSICAL LOCATION IN THE REAL WORLD ENVIRONMENT
`
`TECHNICAL FIELD
`
This technology relates generally to virtual reality and, in particular, to virtual realities representing and associated with a physical location, and applications thereof.
`
`
`BACKGROUND
`
Miniaturization of consumer electronics with sophisticated graphics capabilities and expansive computing power has augmented the activities one can engage in via consumer electronics and, in particular, portable electronics such as cell phones, PDAs, Blackberries, iPhones, and the like.

Further, portable electronics or other electronic devices now generally include GPS or other types of location-sensing capabilities. Thus, mobile application capabilities and user experiences can be enhanced with the awareness of location information, such as location data that includes the real-time or current location of the user or the device.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`
FIG. 1 illustrates an example block diagram of client devices able to communicate with a host server that generates and controls access to simulated objects through a network.
FIG. 2 depicts an example block diagram of the components of a host server that generates and controls simulated objects.
FIG. 3A depicts an example functional block diagram of the host server that generates and controls access to simulated objects.
FIG. 3B depicts an example block diagram illustrating the components of the host server that generates and controls access to simulated objects.
FIG. 4A depicts an example functional block diagram of a client device that presents simulated objects to a user and processes interactions with the simulated objects.
FIG. 4B depicts an example block diagram of the client device that presents simulated objects to a user and facilitates user interactions with the simulated objects.
FIG. 5A illustrates a diagrammatic example of a simulated playing field that is provided via a device.
FIG. 5B illustrates a diagrammatic example of virtual performances with a simulated object that is controlled by a real performer.
FIG. 5C illustrates an example screenshot on a device displaying a simulated environment with a simulated object associated with a physical object in a physical location in the real world environment.
FIG. 5D illustrates a diagrammatic example of an arcade game in a gaming environment that corresponds to a physical location and real players in a real world environment.
FIG. 5E illustrates a diagrammatic example of a virtual game having a simulated combat environment that is played by a real user in a real world environment via a device.
FIG. 5F illustrates a diagrammatic example of a simulated object representing an interactive puzzle or a component thereof.
FIG. 5G illustrates a diagrammatic example of simulated objects that represent real-time or near-real time information/data projected onto geographical locations in a map.
FIG. 6 depicts a flow chart illustrating an example process for time-based control/manipulation of a simulated object that is associated with a physical location in a real world environment.
FIG. 7A depicts a flow chart illustrating an example process for facilitating user interaction with a simulated object that is associated with a physical location in a real world environment.
FIG. 7B depicts a flow chart illustrating example processes for updating the simulated object and the simulated environment according to external stimulus.
FIG. 8 depicts a flow chart illustrating an example process for simulating a virtual sports game played by a real participant in a real world environment.
FIG. 9 depicts a flow chart illustrating an example process for simulating a virtual game played by a real user in a real world environment.
FIG. 10 depicts a flow chart illustrating an example process for simulating a virtual performance in a real world environment.
FIG. 11 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed, according to one embodiment.
`
`DETAILED DESCRIPTION
`
The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of the disclosure. However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to one or an embodiment in the present disclosure can be, but not necessarily are, references to the same embodiment; and such references mean at least one of the embodiments.
Reference in this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.
The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. Certain terms that are used to describe the disclosure are discussed below, or elsewhere in the specification, to provide additional guidance to the practitioner regarding the description of the disclosure. For convenience, certain terms may be highlighted, for example using italics and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted. It will be appreciated that the same thing can be said in more than one way.
`Consequently, alternative language and synonyms may be
`used for any one or more of the terms discussed herein, nor is
`any special significance to be placed upon whether or not a
`term is elaborated or discussed herein. Synonyms for certain
`terms are provided. A recital of one or more synonyms does
`not exclude the use of other synonyms. The use of examples
`
`
`
`
anywhere in this specification, including examples of any terms discussed herein, is illustrative only, and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification.
Without intent to further limit the scope of the disclosure, examples of instruments, apparatus, methods and their related results according to the embodiments of the present disclosure are given below. Note that titles or subtitles may be used in the examples for convenience of a reader, which in no way should limit the scope of the disclosure. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In the case of conflict, the present document, including definitions, will control.
Embodiments of the present disclosure include systems and methods for control of a simulated object that is associated with a physical location in the real world environment.
FIG. 1 illustrates an example block diagram of client devices 102A-N able to communicate with a host server 124 that generates and controls access to simulated objects through a network 106.
The client devices 102A-N can be any system and/or device, and/or any combination of devices/systems, that is able to establish a connection with another device, a server and/or other systems. The client devices 102A-N typically include a display and/or other output functionalities to present information and data exchanged between/among the devices 102A-N and the host server 124. For example, the client devices 102A-N can be any of, but are not limited to, a server desktop, a desktop computer, a computer cluster, or portable devices including a notebook, a laptop computer, a handheld computer, a palmtop computer, a mobile phone, a cell phone, a smart phone, a PDA, a Blackberry device, a Treo, an iPhone, cover headsets, heads-up displays, helmet-mounted displays, head-mounted displays, scanned-beam displays, wearable computers such as mobile-enabled watches, and/or any other mobile interfaces and viewing devices, etc. The client devices 102A-N may be location-aware devices that are able to determine their own location or identify location information from an external source. In one embodiment, the client devices 102A-N are coupled to a network 106. In some embodiments, the devices 102A-N and host server 124 may be directly connected to one another.
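For illustration only, a location-aware client device of this kind might report its position together with the time of the report. The following Python sketch uses hypothetical names (they are not part of the disclosure) to show one minimal way of bundling such location data and timing data:

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class LocationFix:
    """A hypothetical location/timing sample reported by a location-aware client device."""
    device_id: str
    latitude: float      # degrees
    longitude: float     # degrees
    timestamp: datetime  # time when the device was located at this position


def sample_fix(device_id: str, latitude: float, longitude: float) -> LocationFix:
    """Stamp a position report with the current UTC time."""
    return LocationFix(device_id, latitude, longitude, datetime.now(timezone.utc))


if __name__ == "__main__":
    fix = sample_fix("device-102A", 37.7749, -122.4194)
    print(fix)
```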
In one embodiment, the host server 124 is operable to provide simulated objects (e.g., objects, computer-controlled objects, or simulated objects) that correspond to real world physical locations to be presented to users on client devices 102A-N. The simulated objects are typically software entities or occurrences that are controlled by computer programs and can be generated upon request when certain criteria are met. The host server 124 also processes interactions of simulated objects with one another and actions on simulated objects caused by stimulus from a real user and/or the real world environment. Services and functions provided by the host server 124 and the components therein are described in detail with further references to the examples of FIGS. 3A-3B.
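As a rough illustration of the gating step described above and shown in FIG. 6, access to a simulated object could be enabled only when the device's location data and timing data satisfy a criterion. The minimal sketch below assumes a simple radius-plus-time-window criterion and illustrative names; the disclosure does not prescribe this particular implementation:

```python
import math
from datetime import datetime, timezone


def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def criterion_satisfied(location, timing, anchor, radius_m, start, end):
    """Example criterion: the device is within radius_m of the object's anchor
    location during the object's active time window."""
    lat, lon = location
    near = haversine_m(lat, lon, anchor[0], anchor[1]) <= radius_m
    in_window = start <= timing <= end
    return near and in_window


def enable_access_if_qualified(location, timing, simulated_object):
    """Return the simulated object for presentation only when the criterion holds."""
    if criterion_satisfied(location, timing,
                           simulated_object["anchor"],
                           simulated_object["radius_m"],
                           simulated_object["start"],
                           simulated_object["end"]):
        return simulated_object   # accessible in the simulated environment
    return None                   # not accessible at this place/time


if __name__ == "__main__":
    obj = {"name": "demo-banner",
           "anchor": (37.7955, -122.3937), "radius_m": 150.0,
           "start": datetime(2009, 5, 27, tzinfo=timezone.utc),
           "end": datetime(2009, 5, 28, tzinfo=timezone.utc)}
    here = (37.7957, -122.3936)
    now = datetime(2009, 5, 27, 12, 0, tzinfo=timezone.utc)
    print(enable_access_if_qualified(here, now, obj) is not None)
```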
The client devices 102A-N are generally operable to provide access (e.g., visible access, audible access) to the simulated objects to users, for example via user interface 104A-N displayed on the display units. The devices 102A-N may be able to detect simulated objects based on location and/or timing data and provide those objects authorized by the user for access via the devices. Services and functions provided by the client devices 102A-N and the components therein are described in detail with further references to the examples of FIGS. 4A-4B.
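A client-side refresh loop of the kind suggested by FIG. 7B (periodically determine the updated device location, ask the host which simulated objects are available there, and present the updated set) might look roughly like the following; the callables standing in for the GPS source, the host query, and the renderer are assumptions for illustration only:

```python
import time
from typing import Callable, Iterable


def refresh_accessible_objects(get_location: Callable[[], tuple],
                               query_host: Callable[[tuple, float], Iterable[dict]],
                               present: Callable[[Iterable[dict]], None],
                               interval_s: float = 5.0,
                               cycles: int = 3) -> None:
    """Periodically determine the device location, ask the host which simulated
    objects are available for access there, and present the updated set."""
    for _ in range(cycles):
        location = get_location()                  # e.g., from GPS
        timestamp = time.time()                    # time the device is at this location
        objects = query_host(location, timestamp)  # host applies location/timing criteria
        present(objects)                           # render in the simulated environment
        time.sleep(interval_s)


if __name__ == "__main__":
    # Stub callables standing in for device GPS, the host server, and the UI.
    refresh_accessible_objects(
        get_location=lambda: (37.7749, -122.4194),
        query_host=lambda loc, ts: [{"name": "demo-object", "anchor": loc}],
        present=lambda objs: print("accessible:", [o["name"] for o in objs]),
        interval_s=0.1,
        cycles=2,
    )
```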
The network 106, over which the client devices 102A-N and the host server 124 communicate, may be a telephonic network, an open network, such as the Internet, or a private network, such as an intranet and/or the extranet. For example, the Internet can provide file transfer, remote log in, email, news, RSS, and other services through any known or convenient protocol, such as, but not limited to, the TCP/IP protocol, Open System Interconnections (OSI), FTP, UPnP, iSCSI, NSF, ISDN, PDH, RS-232, SDH, SONET, etc.
The network 106 can be any collection of distinct networks operating wholly or partially in conjunction to provide connectivity to the client devices 102A-N and the host server 124 and may appear as one or more networks to the serviced systems and devices. In one embodiment, communications to and from the client devices 102A-N can be achieved by an open network, such as the Internet, or a private network, such as an intranet and/or the extranet. In one embodiment, communications can be achieved by a secure communications protocol, such as secure sockets layer (SSL), or transport layer security (TLS).
In addition, communications can be achieved via one or more wireless networks, such as, but not limited to, one or more of a Local Area Network (LAN), Wireless Local Area Network (WLAN), a Personal area network (PAN), a Campus area network (CAN), a Metropolitan area network (MAN), a Wide area network (WAN), a Wireless wide area network (WWAN), Global System for Mobile Communications (GSM), Personal Communications Service (PCS), Digital Advanced Mobile Phone Service (D-Amps), Bluetooth, Wi-Fi, Fixed Wireless Data, 2G, 2.5G, 3G networks, enhanced data rates for GSM evolution (EDGE), General packet radio service (GPRS), enhanced GPRS, messaging protocols such as TCP/IP, SMS, MMS, extensible messaging and presence protocol (XMPP), real time messaging protocol (RTMP), instant messaging and presence protocol (IMPP), instant messaging, USSD, IRC, or any other wireless data networks or messaging protocols.
The host server 124 may include or be coupled to a user repository 128 and/or a simulated object repository 130. The user data repository 128 can store software, descriptive data, images, system information, drivers, and/or any other data item utilized by other components of the host server 124 and/or any other servers for operation. The user data repository 128 may be managed by a database management system (DBMS), for example but not limited to, Oracle, DB2, Microsoft Access, Microsoft SQL Server, PostgreSQL, MySQL, FileMaker, etc.
The user data repository 128 and/or the simulated object repository 130 can be implemented via object-oriented technology and/or via text files, and can be managed by a distributed database management system, an object-oriented database management system (OODBMS) (e.g., ConceptBase, FastDB Main Memory Database Management System, JDOInstruments, ObjectDB, etc.), an object-relational database management system (ORDBMS) (e.g., Informix, OpenLink Virtuoso, VMDS, etc.), a file system, and/or any other convenient or known database management package.
In some embodiments, the host server 124 is able to provide data to be stored in the user data repository 128 and/or the simulated object repository 130 and/or can retrieve data stored in the user data repository 128 and/or the simulated object repository 130. The user data repository 128 can store user information, user preferences, access permissions associated with the users, device information, hardware information, etc. The simulated object repository 130 can store software entities (e.g., computer programs) that control simulated objects and the simulated environments in which they are presented for visual/audible access or control/manipulation. The simulated object repository 130 may further include simulated objects and their associated data structures with metadata defining the simulated object including its associated access permission.
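Under the assumptions above, one illustrative shape for a simulated object repository entry pairs the identifier of the controlling software entity with metadata such as the anchoring physical location, an active time window, and per-user access permissions. The field names below are hypothetical, not prescribed by the disclosure:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, Tuple


@dataclass
class SimulatedObjectRecord:
    """Illustrative repository entry: a simulated object plus the metadata that
    defines where/when it exists and who may access or manipulate it."""
    object_id: str
    controller: str              # reference to the software entity controlling the object
    anchor: Tuple[float, float]  # physical location (lat, lon) the object is associated with
    radius_m: float              # region around the anchor where the object is accessible
    active_from: datetime
    active_until: datetime
    permissions: Dict[str, set] = field(default_factory=dict)  # user_id -> allowed actions

    def allows(self, user_id: str, action: str) -> bool:
        """Check the object's access permission metadata for a requested action."""
        return action in self.permissions.get(user_id, set())


if __name__ == "__main__":
    record = SimulatedObjectRecord(
        object_id="obj-130-001",
        controller="programs/banner_v1",
        anchor=(36.0397, -114.9819),
        radius_m=200.0,
        active_from=datetime(2009, 1, 1, tzinfo=timezone.utc),
        active_until=datetime(2010, 1, 1, tzinfo=timezone.utc),
        permissions={"user-128": {"view", "move"}},
    )
    print(record.allows("user-128", "move"), record.allows("user-128", "delete"))
```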
FIG. 2 depicts an example block diagram of the components of a host server 224 that generates and controls simulated objects.
In the example of FIG. 2, the host server 224 includes a network controller 202, a firewall 204, a multimedia server 206, an application server 208, a web application server 212, a gaming server 214, and a database including a database storage 216 and database software 218.
In the example of FIG. 2, the network controller 202 can be a networking device that enables the host server 224 to mediate data in a network with an entity that is external to the host server 224, through any known and/or convenient communications protocol supported by the host and the external entity. The network controller 202 can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, bridge router, a hub, a digital media receiver, and/or a repeater.
The firewall 204 can, in some embodiments, govern and/or manage permission to access/proxy data in a computer network, and track varying levels of trust between different machines and/or applications. The firewall 204 can be any number of modules having any combination of hardware and/or software components able to enforce a predetermined set of access rights between a particular set of machines and applications, machines and machines, and/or applications and applications, for example, to regulate the flow of traffic and resource sharing between these varying entities. The firewall 204 may additionally manage and/or have access to an access control list which details permissions including, for example, the access and operation rights of an object by an individual, a machine, and/or an application, and the circumstances under which the permission rights stand.
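Purely as an illustration of the kind of access control list described above (permissions keyed by a principal, whether an individual, a machine, or an application, together with the operation and the object it applies to), such a structure could be modeled as follows; the layout is an assumption, not the patent's required design:

```python
from typing import Dict, Set, Tuple

# (principal, object) -> set of permitted operations; principals may be users,
# machines, or applications, mirroring the access control list described above.
ACL = Dict[Tuple[str, str], Set[str]]


def is_permitted(acl: ACL, principal: str, obj: str, operation: str) -> bool:
    """Return True when the ACL grants `principal` the `operation` on `obj`."""
    return operation in acl.get((principal, obj), set())


if __name__ == "__main__":
    acl: ACL = {
        ("user:alice", "simulated-object:42"): {"read", "interact"},
        ("app:gaming-server", "simulated-object:42"): {"read", "update"},
    }
    print(is_permitted(acl, "user:alice", "simulated-object:42", "interact"))   # True
    print(is_permitted(acl, "machine:kiosk-7", "simulated-object:42", "read"))  # False
```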
Other network security functions that can be performed or included in the functions of the firewall 204 can be, for example, but are not limited to, intrusion-prevention, intrusion detection, next-generation firewall, personal firewall, etc., without deviating from the novel art of this disclosure. In some embodiments, the functionalities of the network controller 202 and the firewall 204 are partially or wholly combined, and their functions can be implemented in any combination of software and/or hardware, in part or in whole.
In the example of FIG. 2, the host server 200 includes the multimedia server 206 or a combination of multimedia servers to manage images, photographs, animation, video, audio
`content, graphical content, documents, and/or other types of
`multimedia data for