(12) Patent Application Publication    (10) Pub. No.: US 2006/0105838 A1
Mullen    (43) Pub. Date: May 18, 2006

(54) LOCATION-BASED GAMES AND AUGMENTED REALITY SYSTEMS

(76) Inventor: Jeffrey D. Mullen, New York, NY (US)

Correspondence Address:
JEFFREY D. MULLEN

(21) Appl. No.: 11/281,812

(22) Filed: Nov. 16, 2005

Related U.S. Application Data

(60) Provisional application No. 60/628,475, filed on Nov. 16, 2004.

Publication Classification

(51) Int. Cl.
A63F 13/00 (2006.01)
(52) U.S. Cl. ................................................................ 463/31

(57) ABSTRACT
Handheld location-based games are provided in which a user's physical location correlates to the virtual location of a virtual character on a virtual playfield. Augmented Reality (AR) systems are provided in which video game indicia are overlaid onto a user's physical environment. A landscape detector is provided that may obtain information about the user's landscape, in addition to the user's location, in order to provide overlaying information to an AR head-mounted display and control information to non-user-controlled video game characters.

Niantic's Exhibit No. 1005
Page 001

Patent Application Publication    May 18, 2006    Sheets 1-14    US 2006/0105838 A1

[FIGS. 1-14: patent drawing sheets. The graphical content did not survive text extraction; the Brief Description of the Drawings lists what each figure depicts.]

US 2006/0105838 A1
May 18, 2006

LOCATION-BASED GAMES AND AUGMENTED REALITY SYSTEMS

CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Patent Application No. 60/628,475 (Docket No. JDM/008 PROV), filed on Nov. 16, 2004 and titled "Location-Based Games and Augmented Reality Systems," which is hereby incorporated by reference herein in its entirety.

BACKGROUND OF THE INVENTION

[0002] This invention relates to location-based game systems.

[0003] Virtual Reality (VR) systems have been developed in which a user is provided with a non-transparent head-mounted display. This display provides images to the user such that the user is immersed in a virtual, alternate reality. A user cannot see his/her physical environment while immersed in such a virtual, alternate reality. Accordingly, VR systems are deficient because a user cannot easily move around a physical environment while immersed in the virtual reality. If a user begins to physically move in his/her physical environment without being able to see it, the user may trip over, or bump into, a physical object (e.g., a rock or chair).
[0004] As a result of the mobility constraints of traditional VR systems, a user is traditionally placed on a platform that is surrounded by padded safety rails. A user cannot move outside of this protected platform and moves through the virtual, alternate reality created by the VR system through the use of a manual joystick. Such a VR system is deficient because it severely limits the way that a user may interact with the virtual, alternate reality provided by the VR system.
[0005] Traditional manual controls occasionally have a primary control and a number of supplemental controls. Such a primary control occasionally takes the form of a joystick. The primary control occasionally provides the main control signal to a video game. Traditionally, the main control signal controls the location of a video game character in a virtual world. Such controls, however, are deficient because they require unnatural user movement to generate the primary control signal. It is therefore desirable to provide a primary control device that does not require unnatural user movement to generate a primary control signal to a video game.

SUMMARY OF THE INVENTION

[0006] A handheld location-based game system is provided in which a user's physical position on a physical playfield (e.g., the physical world, a physical environment, or a defined physical playfield) correlates to a video game character's location in a virtual playfield. In this manner, a video game character may be controlled without the need for a joystick. A handheld location-based game system is also provided that includes manual controls. Such manual controls may be utilized during location-based gameplay. An Augmented Reality (AR) game system may also be provided as, for example, a location-based game system and may display virtual indicia on a semi-transparent head-mounted display such that a user can see both virtual indicia and his/her physical environment. Virtual indicia may also be provided that interact with a physical environment. For example, information may be provided to an AR game system about a user's physical environment. Furthering this example, the location of a doorway in a room may be provided to an AR video game such that a virtual character may be seen by a user of an AR display to walk through a doorway and disappear. Thus, a video game system may be provided that augments a user's environment to provide a video game. As the video game progresses, the user may interact with his/her physical environment in order to play a virtual game (e.g., by walking through his/her environment and shooting at virtual ducks flying through the air).
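The specification leaves the position-to-character correlation abstract. A rough illustrative sketch follows; the function name, grid dimensions, and clamping behavior are assumptions for illustration, not taken from the disclosure:

```python
def physical_to_virtual(pos, field_origin, field_size, grid_size):
    """Map a physical (x, y) position in meters, measured from the
    physical playfield's origin, to integer virtual-playfield cells."""
    gx = int((pos[0] - field_origin[0]) / field_size[0] * grid_size[0])
    gy = int((pos[1] - field_origin[1]) / field_size[1] * grid_size[1])
    # Clamp so the virtual character never leaves the virtual playfield.
    gx = max(0, min(grid_size[0] - 1, gx))
    gy = max(0, min(grid_size[1] - 1, gy))
    return gx, gy

# A 10 m x 10 m physical playfield mapped onto a 20 x 20 virtual grid:
# standing at (5.0, 2.5) places the character in cell (10, 5).
```

Because the mapping is pure scaling, the same virtual playfield can be paired with physical playfields of different sizes simply by changing the field dimensions.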
[0007] A switch may also be provided that allows a user to manually play a location-based game (e.g., an AR game). In this manner, a user may be able to obtain functionality from the location-based game system when the user is not able to move. Such a benefit may be utilized, for example, when the user is a passenger in a moving car or sick in bed.
[0008] A location-based game can, however, be provided while a player is in bed or in a car. For example, a location-based game could be implemented based on the movement of a car such that the movement of the car is translated into a control signal for the game (e.g., the location of a car in a game). Such a video game system may be embedded in a vehicle (e.g., a car). If a car is safely utilized on a large parking lot, a number of games may be realized as location-based games (e.g., as augmented reality games where game graphics are selectively provided on the vehicle's dash/windows). While in bed, a location-based game can be provided by translating small differences in the location of the system (or a controller for the system). The system (or a controller to the system) can, for example, be a fishing rod such that the movement of a flick of the system is used to generate a control signal for casting a virtual rod in a virtual video game system.
[0009] A playmat is provided that may be packaged with a handheld location-based game system. If the handheld system is dedicated to a single game (e.g., the handheld system is not enabled to download additional games, play additional games, or receive cartridge-based or disc-based games) then the playmat may be representative of that single game. Additionally, such a single-game dedicated system (or any system or any controller) may be shaped similar to, for example, the main character of the video game, or any character or characteristic of a video game, in order to increase the whimsical and festive nature of playing the game.
[0010] A playmat for a location-based system (e.g., an AR system) may correlate to a particular virtual playfield. Doing so may assist the user in playing the handheld location-based game. Using the classic game of FROGGER as an example, the virtual playfield of FROGGER may be the same for each level (disregarding the types of virtual computer-controlled objects used in each level). Thus, a FROGGER playmat may include indicia representative of this virtual playfield. Additional non-playfield indicia may be provided on the playmat to provide additional location-based game information to a user. For example, a "START" circle may be provided on the playmat. Such a "START" circle may correlate to the location where the user can start each level.

[0011] In this manner, the location-based game may be configured (e.g., scaled) such that user movements on the playmat playfield correlate to location-based game movements on the virtual playfield. Thus, the location-based game may utilize the knowledge of the origin location (e.g., the "START" location) and the playmat perimeters (e.g., physical playfield perimeters) to provide a location-based game. Such a playmat may also be beneficial to locating devices that may accumulate errors over time. As such, the location-based game and the playmat may be configured to reduce the chance of such accumulated errors. For example, if a locating device accumulates noticeable location errors after 30 seconds of gameplay, each game level may be limited to a particular period of time (e.g., 30 seconds). In configuring the size of the playmat, and appropriately scaling the location-based game to the playmat, the amount of time before noticeable errors accrue can be manipulated. Similarly, the average amount of time before noticeable errors accrue can be determined such that an appropriate time limit for finishing a level can be subsequently determined. Persons skilled in the art will appreciate that a number of location-based systems can be fabricated in which noticeable errors may never accrue. One such example may be a hybrid locating system that uses an inertial positioning system (e.g., any number of accelerometers and/or gyroscopes) to determine location between signals received from a positioning device (e.g., a GPS device).
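The relationship between drift rate, playfield scaling, and level time limits described above can be sketched numerically. The 0.01 m/s drift rate and 0.3 m cell size below are invented for illustration; only the resulting 30-second figure echoes the example in the text:

```python
def level_time_limit(drift_rate_m_per_s, cell_size_m, max_cell_error=1.0):
    """Seconds of gameplay before accumulated locating error exceeds
    the given number of virtual-playfield cells."""
    return (max_cell_error * cell_size_m) / drift_rate_m_per_s

# At 0.01 m/s of drift with 0.3 m cells, one full cell of error
# accrues after roughly 30 seconds, suggesting a 30-second level limit.
```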
[0012] Taking FROGGER as an example, a user may be given 30 seconds to travel between the "START" location and an "END" location on the playmat. After the user has completed a level (or dies), that user may be instructed to return to the origin position such that the locating device may reset so that the errors accumulated during the last level are removed but the location-based game knows where the user is. Multiple playmats may be packaged together with a location-based game. Playmats that are similar but have different sizes may also be packaged together with a location-based game. Persons skilled in the art will appreciate that a location-based game can measure the exact location of a device (e.g., via a positioning system such as a GPS system) and/or follow the movement of a device to determine changes in movement (e.g., via any number of accelerometers and/or gyroscopes) or a hybrid of different types of sensors.
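One hedged sketch of the hybrid exact-position/movement-following scheme follows. The class and method names are invented, and a real inertial locator would also track orientation and filter sensor noise:

```python
class HybridLocator:
    """GPS fixes are the master location signal; dead reckoning from an
    accelerometer fills the gaps, and its accumulated drift is discarded
    whenever a fresh fix (or an origin reset) arrives."""

    def __init__(self, origin=(0.0, 0.0)):
        self.pos = list(origin)
        self.vel = [0.0, 0.0]

    def inertial_update(self, accel, dt):
        # Double-integrate acceleration (m/s^2) over dt seconds;
        # this is where drift accumulates between fixes.
        for i in (0, 1):
            self.vel[i] += accel[i] * dt
            self.pos[i] += self.vel[i] * dt
        return tuple(self.pos)

    def gps_fix(self, pos):
        # A master fix replaces the drifted dead-reckoned estimate.
        self.pos = list(pos)
        self.vel = [0.0, 0.0]
        return tuple(self.pos)
```

The instructed return to a "START" origin between levels can be modeled as a call to the fix method with the known origin position.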
[0013] Playmats may also include holes such that they may be easily pegged into the ground or such that gaming components may be attached to the playmat at particular locations. For example, if the locating device is a local positioning system, positioning transmitters may be positioned at pre-determined locations on the playmat (e.g., holes cut into the playmat for receiving the transmitters) to expedite the correct setup of the local positioning system.
[0014] Systems and methods of scaling information from physical playfields to a video game system are also provided. Systems and methods of storing such information are also provided.
[0015] A location-based gaming system with a head-mounted display is provided such that video game indicia may be overlaid onto the user's physical playfield. Such a head-mounted display may be transparent such that a user can see through the display in areas where virtual indicia are not displayed. In this manner, the user may travel as fast as he/she wants without risk of bumping into, or tripping over, a physical object. The display may also be non-transparent. A camera, however, may be provided on, or around, the head-mounted display to capture the physical environment. The physical environment can then be manipulated by a processor such that, for example, virtual indicia (e.g., a video game character or component) are added to the physical environment. The combination of the physical environment and virtual indicia can then be displayed to a user on a head-mounted display (or any type of display) such that a user can still see his/her physical environment even with a non-transparent display. As such, a non-transparent video game system is provided that can provide both augmented reality and virtual reality functionalities.
[0016] Such overlays may include games in which only video game characters and interactive objects are overlaid onto the physical environment. As a result, the user's physical environment may be structurally kept the same.
[0017] To increase the safety of a game that allows for a high level of user mobility, such video game indicia may be provided with different contrasts (e.g., strengths) at different distances from a user such that at least close virtual objects/characters are semi-transparent, so that close physical hazards may be recognized by a user. Similarly, no virtual indicia may be allowed to come within a certain distance of a user (from the user's perspective). Thus, virtual indicia (e.g., a video game character) may never block a hazardous object that is close to a user.
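The distance-dependent contrast rule might be sketched as follows; the 1.5 m safety standoff, 6 m fade range, and the specific opacity curve are arbitrary assumed values, not taken from the disclosure:

```python
def indicia_alpha(distance_m, min_standoff_m=1.5, fade_range_m=6.0):
    """Opacity (0.0-1.0) for virtual indicia at a given distance from
    the user: suppressed entirely inside the safety standoff,
    semi-transparent while close, fully opaque once far away."""
    if distance_m < min_standoff_m:
        return 0.0  # never drawn, so close physical hazards stay visible
    t = min(1.0, (distance_m - min_standoff_m) / fade_range_m)
    return 0.25 + 0.75 * t
```

Returning zero opacity inside the standoff implements the rule that no virtual indicia may come within a certain distance of the user.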
[0018] A landscape detector may be provided with a location-based game system such that information on the physical terrain of the user's physical environment may be utilized by the gaming system. If the locating device is a GPS device (or if the area has already been scanned), landscape information may be retrieved from memory that stores such information.
[0019] The information provided by the landscape detector may be utilized, for example, to determine where portions of a video game playfield, objects, and characters may be positioned on a display (e.g., on a head-mounted display). Such information may also be used, for example, to control the movement characteristics of computer-controlled video game characters and indicia. A directional device may also be included to determine the direction and/or pitch that the user (e.g., the head-mounted display) is facing. Such information may be utilized to determine the rotation of a user's head as well as the user's visual perspective. Thus, the head-mounted display may, in itself, provide control signals to a video game.
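A directional device's heading and pitch can be converted into a visual-perspective (view) vector. A minimal sketch under assumed conventions (heading in degrees clockwise from north, pitch in degrees above the horizontal; the function name is invented):

```python
import math

def view_vector(heading_deg, pitch_deg):
    """Unit (east, north, up) view vector for a head-mounted
    directional device reporting heading and pitch."""
    h, p = math.radians(heading_deg), math.radians(pitch_deg)
    return (math.sin(h) * math.cos(p),  # east component
            math.cos(h) * math.cos(p),  # north component
            math.sin(p))                # up component
```

Such a vector is the kind of control signal a game could use to pick which portion of the virtual world to display.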
[0020] To simplify the process of providing an augmented video game system, a video game may be played in the game system in an ever-changing virtual world. Portions of this game, and perspectives of those portions, may be selectively displayed to the user at a particular time. In this manner, the complexity of an augmented reality game may be simplified such that the augmented system renders a non-visible game based on control signals and, for example, portions of this game are selected and provided to a display based on the control signals. Thus, any type of video game hardware, or software, may be utilized and a module may be provided for the image selection process. Such a module can also configure the image to be displayed to a user (e.g., aligned with a user's physical terrain) or another device (e.g., another module) can be utilized to correlate selected indicia on a physical terrain. Such modules can also convert control signals provided by an AR game controller (e.g., head-mounted movement sensors and/or positioning systems) to control signals understood by any game system. As such, a classic 3-dimensional video game (e.g., Goldeneye 007 or Pacman 3D) can be played on a classic video-game system (e.g., Nintendo 64 or PlayStation), yet be provided as an AR game system. This may be accomplished through the use of an add-on module that may, for example, translate control signals communicated between the game system and add-on module, select images (and audio) from the game system to display on the AR display, and determine how to display the selected images (and audio) on the AR display (and AR sound system).
[0021] Such an add-on module can interface via character control interfaces and audio/video outputs on a game system. In such a configuration, the add-on module may receive just static images/sounds. Thus, the add-on module may be provided with information to process the image, determine the location of video game indicia (e.g., a video game enemy), and determine the action of video game indicia (e.g., a video game enemy shooting). Such information may be provided in a look-up table on a remote database such that copies of images for a number of, or all, video game indicia for any particular game can be obtained and compared to the received still images. Thus, the module can determine how video game indicia are being presented in the game such that the video game indicia can be appropriately presented on an AR display. Furthermore, an AR add-on module can be interfaced to the control processor of a game system (or any component of a video game system such as the system's rendering device). Such an interface may, for example, directly skew what is being rendered and how it is being rendered for later use in an AR display. Such an interface may also, for example, monitor the movement and status of video game indicia directly from the video game system (e.g., directly from the processor).
[0022] An AR game system may be utilized in many applications. For example, an AR game system may be utilized in the military training of soldiers. To accommodate such an application, the landscape of a pre-determined area (e.g., a particular square area of a military base) may be scanned at a point before a game is played. Virtual objects may be generated using this scanned information that correspond to physical objects. Such a process may also be utilized when the landscape is being scanned as a game is playing. Thus, the video game system may construct a virtual world similar to the scanned physical world and generate computer-controlled characters (and interactive or impenetrable objects) on the physical world. Such a pre-scanned system may be relatively inexpensive. If the virtual game characters are computer-controlled enemy combatants (or characters manually controlled by instructors at a stationary game device, or by instructors using a location-based game device on a different, location-synched playfield) then military soldiers may be trained in a variety of safe, combat-realistic situations.
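Turning scanned physical objects into impenetrable virtual objects might be sketched as a rasterization step. The scan format assumed here (axis-aligned rectangles in meters) is invented for illustration; a real landscape detector would emit richer data:

```python
def build_occupancy(scanned_objects, grid_size, cell_size_m):
    """Rasterize pre-scanned physical obstacles into a grid of
    impenetrable virtual-playfield cells."""
    grid = [[False] * grid_size[0] for _ in range(grid_size[1])]
    for (x0, y0, x1, y1) in scanned_objects:
        for gy in range(int(y0 / cell_size_m), int(y1 / cell_size_m) + 1):
            for gx in range(int(x0 / cell_size_m), int(x1 / cell_size_m) + 1):
                if 0 <= gx < grid_size[0] and 0 <= gy < grid_size[1]:
                    grid[gy][gx] = True  # impenetrable cell
    return grid
```

Computer-controlled characters could then path around any cell marked impenetrable, so that virtual combatants respect physical walls.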
[0023] The systems and methods of the present invention may be utilized for any type of Augmented Reality (AR) application and are not limited to video games. For example, AR applications for wearable computers may be provided. In one such application, virtual advertisements can be displayed on a user's head-mounted display to augment these virtual advertisements over the real world. The display of virtual advertisements may be provided to such a wearable computer when the wearable computer reaches a particular location. The virtual advertisements can be displayed within a physical environment based on the characteristics of the physical environment (e.g., displayed in front of you if nobody is walking in front of you, or displayed above the heads of people walking in front of you). Similarly, AR phone calls may be realized such that the image of the person you are calling is displayed in your physical environment (e.g., the person you are having a telephone conversation with is displayed as walking beside you, or a two-dimensional video is displayed in front of you).
BRIEF DESCRIPTION OF THE DRAWINGS

[0024] The principles and advantages of the present invention can be more clearly understood from the following detailed description considered in conjunction with the following drawings, in which the same reference numerals denote the same structural elements throughout, and in which:
[0025] FIG. 1 is an illustration of a handheld location-based game system and accompanying playmat constructed in accordance with the principles of the present invention;

[0026] FIG. 2 is an illustration of a handheld location-based game system in the shape of the virtual character that the location-based game system controls constructed in accordance with the principles of the present invention;

[0027] FIG. 3 is an illustration of scaling a virtual playfield to a physical playfield constructed in accordance with the principles of the present invention;

[0028] FIG. 4 is an illustration of landscape detection constructed in accordance with the principles of the present invention;

[0029] FIG. 5 is an illustration of virtual impenetrable object construction on a playfield based on detected landscape objects constructed in accordance with the principles of the present invention;

[0030] FIG. 6 is an illustration of computer-controlled character positioning and movement based on detected landscape objects constructed in accordance with the principles of the present invention;

[0031] FIG. 7 is an illustration of virtual playfield mapping that includes physically detected impenetrable objects constructed in accordance with the principles of the present invention;

[0032] FIG. 8 is an illustration of computer controls for video game characters in a location-based game constructed in accordance with the principles of the present invention;

[0033] FIG. 9 is an illustration of different data storage structures for a location-based game constructed in accordance with the principles of the present invention;

[0034] FIG. 10 is an illustration of displaying video game characters in a semi-visible display for a location-based game constructed in accordance with the principles of the present invention;

[0035] FIG. 11 is an illustration of displaying video game characters with different transparencies for a location-based game constructed in accordance with the principles of the present invention;

[0036] FIG. 12 is an illustration of computer-controlled movement of video game characters in a virtual playfield constructed in accordance with the principles of the present invention;

[0037] FIG. 13 is an illustration of pre-scanning a physical playfield constructed in accordance with the principles of the present invention; and

[0038] FIG. 14 is an illustration of a location-based game topology constructed in accordance with the principles of the present invention.

DETAILED DESCRIPTION OF THE DRAWINGS

[0039] U.S. Provisional Patent Application No. 60/603,481 filed on Aug. 20, 2004 entitled "Wireless Devices With Flexible Monitors and Keyboards" (Docket No. JDM/007 PROV) and U.S. patent application Ser. No. 11/208,943 filed on Aug. 22, 2005 entitled "Wireless Devices With Flexible Monitors and Keyboards" (Docket No. JDM/007) are hereby incorporated by reference herein in their entirety.

[0040] U.S. Provisional Patent Application No. 60/560,435 filed on Apr. 7, 2004 entitled "Advanced Cooperative Defensive Military Tactics, Armor, and Systems" (Docket No. JDM/006 PROV) and U.S. patent application Ser. No. 11/101,782 filed on Apr. 7, 2005 entitled "Advanced Cooperative Defensive Military Tactics, Armor, and Systems" (Docket No. JDM/006) are hereby incorporated by reference herein in their entirety.

[0041] U.S. Provisional Application No. 60/560,435 filed on Sep. 2, 2003 entitled "Systems and Methods for Location-Based Games and Employment of the Same on Location-Enabled Devices" (Docket No. JDM/005 PROV) and U.S. patent application Ser. No. 10/932,536 filed on Sep. 1, 2004 entitled "Systems and Methods for Location-Based Games and Employment of the Same on Location-Enabled Devices" (Docket No. JDM/005) are hereby incorporated by reference herein in their entirety.

[0042] U.S. patent application Ser. No. 10/797,801 filed on Mar. 9, 2004 titled "Systems and Methods for Providing Remote Incoming Call Notification for Wireless Telephones" (Docket No. JDM/004) is hereby incorporated by reference herein in its entirety.

[0043] U.S. Provisional Patent Application No. 60/367,967 filed on Mar. 25, 2002 entitled "Systems and Methods for Locating Cellular Phones" (Docket No. JDM/002 PROV) and U.S. patent application Ser. No. 10/400,296 filed on Mar. 25, 2003 titled "Systems and Methods for Locating Wireless Telephones and Security Measures for the Same" (Docket No. JDM/002) are hereby incorporated by reference herein in their entirety.
[0044] Turning first to FIG. 1, gaming system 100 is provided that includes handheld game system 101 and playmat 150.
[0045] Gaming system 100 may be a location-based game system in which the physical location (or physical movement) of a user on a physical playfield determines the virtual location (or virtual movement) of a virtual character on a virtual playfield. Location information may be obtained through, for example, any type of triangulation technique such as a GPS system or a localized positioning system (LPS). For example, the time it takes multiple signals from multiple transmitters to reach device 101 may be utilized to determine the position of device 101. Location information may alternatively be obtained through various cell phone or wireless LAN location techniques. For example, a user's signal strength between multiple hubs or base stations may be utilized to determine that user's location. As per another example, inertial movement sensors such as accelerometers and/or gyroscopes may be utilized to keep track of a user's movement in a particular direction. In this manner, the user's location may be determined and updated based on the user's movements. Hybrids of such systems may also be utilized. For example, an accelerometer may be utilized to keep track of a user's position until a second locating signal is provided (e.g., by a GPS system). In this manner, a GPS signal may be the master locating signal while the accelerometer provides location updates between GPS signals. Generally, device 140 is the locating device (or locating devices) for game system 101.
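The time-of-flight positioning mentioned above can be sketched as textbook two-dimensional trilateration (not the patent's own algorithm; the linearization assumes three non-collinear transmitters, and passing a signal speed of 1 lets distances stand in for times in the example):

```python
def trilaterate(beacons, times, c=299_792_458.0):
    """Solve for the (x, y) receiver position from signal travel times
    to three transmitters at known positions."""
    d = [t * c for t in times]  # ranges = time of flight * signal speed
    (x1, y1), (x2, y2), (x3, y3) = beacons
    # Subtracting the first range equation from the other two yields a
    # 2x2 linear system A @ [x, y] = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d[0] ** 2 - d[1] ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    b2 = d[0] ** 2 - d[2] ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

The same math serves a localized positioning system with playmat-mounted transmitters, since only the beacon coordinates and signal speed change.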
[0046] Game system 101 may include manual controls 120 and manual control switch 132 that turns ON and OFF location-based controls. In this manner, a user may still obtain functionality from game system 101 while, for example, sitting on a park bench. ON/OFF switch 131 may control when device 101 is turned ON and OFF.
[0047] Persons skilled in the art will appreciate that controls similar to manual controls 120 and 131 may also be provided on an AR game system. Thus, a user may use manual controls to control the location of a video game character in an AR game (e.g., control what first-person perspective is displayed on an AR display) without physically moving. A user may also use manual controls similar to manual controls 120 and 131 to toggle between an AR and VR game, toggle between AR and VR configurations of a game, and toggle from a location-based control scheme to a manual control scheme after an AR game configuration has been toggled to a VR game configuration. Thus, if a user is located in an environment that makes location-based AR gameplay difficult (e.g., a small room or a car), the user can instruct the game system to provide a VR version of the game to be played with a manual controller. Thus, a user may instruct an AR/VR game system to display all virtual indicia on a head-mounted display (and/or render all virtual indicia) and not allow any areas of the display to become transparent. Thus, a user may instruct an AR/VR game system to switch from location-based control to manual input control. For systems with multiple control signals generated from multiple control devices, a switch for alternate control schemes may be provided.