US 20070242131A1

(19) United States
(12) Patent Application Publication
Sanz-Pastor et al.

(10) Pub. No.: US 2007/0242131 A1
(43) Pub. Date: Oct. 18, 2007
(54) LOCATION BASED WIRELESS COLLABORATIVE ENVIRONMENT WITH A VISUAL USER INTERFACE

(76) Inventors: Ignacio Sanz-Pastor, San Francisco, CA (US); David L. Morgan III, Leawood, CA (US); Javier Castellar, Truckee, CA (US)
Correspondence Address:
FENWICK & WEST LLP
SILICON VALLEY CENTER
801 CALIFORNIA STREET
MOUNTAIN VIEW, CA 94041 (US)
(21) Appl. No.: 11/618,672

(22) Filed: Dec. 29, 2006
Related U.S. Application Data

(60) Provisional application No. 60/755,732, filed on Dec. 29, 2005.
Publication Classification

(51) Int. Cl.
     H04N 7/14 (2006.01)
     H04M 3/42 (2006.01)
     H04Q 7/20 (2006.01)
(52) U.S. Cl. ................... 348/14.02; 455/416; 455/456.6; 455/457; 348/E07
(57) ABSTRACT

A wireless networked device incorporating a display, a video camera and a geo-location system receives geo-located data messages from a server system. Messages can be viewed by panning the device, revealing each message's real-world location as icons and text overlaid on top of the camera input on the display. The user can reply to a message from her location, add data to an existing message at its original location, send new messages to other users of the system, or place a message at a location for other users. World Wide Web geo-located data can be explored using the system's user interface as a browser. The server system uses the physical location of the receiving device to limit the messages and data sent to each device according to range and filtering criteria, and can determine line of sight between the device and each actual message to simulate occlusion effects.
Niantic's Exhibit No. 1004
Page 001
[FIG. 1 (Sheet 1 of 9): System 100, showing wireless client devices 300a, 300b and 300c in communication with server 102]
[FIG. 2 (Sheet 2 of 9): Server 102, comprising Message Management Module 202, Device-Specific Message Queues 204, Elevation Database 206, Global Message Database 208 and Network Server Module 210]
[FIG. 3 (Sheet 3 of 9): Device 300, comprising High Resolution Screen 302, User Interface 304 with UI Button 314, Video Camera 306, Geo-Location Device (GPS) 308, Wireless Network Interface 310 and View Tracking Device (Inertial Sensor) 312]
[FIG. 4 (Sheet 4 of 9): emergency response application screen showing overlaid message 402]
[FIG. 5 (Sheet 5 of 9): expanded message viewed in an emergency response application]
[FIG. 6 (Sheet 6 of 9): software components of Device 300: Client Message Management Module 602 connected to Camera Image Capture Module 604 (Video Camera 306), Graphics Rendering Engine 606, Geo-Location Manager 608 (Geo-Location Device 308), View Tracking Manager 610 (View Tracking Device 312), UI Manager 612 (UI 304), Local Message DB 614 and Client Network Manager 616 (Wireless Network Interface 310)]
[FIG. 7 (Sheet 7 of 9): flow chart of the server processing loop, with steps including: server receives client location and update request 702; retrieve current messages from message DB 704; generate client target message range sorted list 705; for each message in target list 706; determine message line of sight using server elevation database 724; compare message and line-of-sight ranges and determine visibility 726; occlude? 710; update dynamic message attributes 712; message active?; add to message queue 716; last message in target list? 718; send active message queue or required subset to client device 720; receive client message updates and store in DB 722]
[FIG. 8 (Sheet 8 of 9): flow chart of the client device main loop, with steps including: determine client location and view direction 802; send location and update request to server 806; capture camera input and display as background 808; receive message data from server and store in local DB 810; capture camera input and display as background 812; set camera transform for location and view direction 814; loop for all messages in local DB 816; message occluded or filtered out? 818; determine 3D transformation for message current location 828; is transformed position visible from current location and view direction?; render message header text and icons at transformed position 832; last message in local DB? 820; update user interface 822; message updates required? 824; send message update return data to server 826; end]
[FIG. 9 (Sheet 9 of 9): airport departure procedures example, showing elements 902, 904 and 906]
LOCATION BASED WIRELESS COLLABORATIVE ENVIRONMENT WITH A VISUAL USER INTERFACE
CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of U.S. Provisional Application Ser. No. 60/755,732, filed on Dec. 29, 2005, which is incorporated by reference herein in its entirety.

BACKGROUND OF THE INVENTION

[0002] 1. Field of the Invention

[0003] The present invention relates generally to the fields of computer graphics and networked wireless communication.

[0004] 2. Description of the Related Art

[0005] The World Wide Web, e-mail and instant messaging have revolutionized team collaboration, together with applications such as Lotus Notes and Microsoft Groove. The availability of mobile computing devices, together with access to the Global Positioning System (GPS) network, is starting to bring the tools available for fixed-location team collaboration to any place at any time. Devices such as the RIM Blackberry and the Palm Treo have brought e-mail to mobile devices with great success, but their effectiveness is limited compared to the environment available to systems designed for use in fixed locations, in many cases due to interface and usability problems. Tools that work exceptionally well in an office environment prove grossly inadequate for the same tasks when they must be performed in the field and, in many cases, under pressure and adverse circumstances, as is often the case for rescue teams, military operations, law enforcement, infrastructure repair crews, and other teams that need to get a job done quickly and with efficient coordination. Currently, those teams typically rely on radio or cellular network communications without the capability of storing information to be shared, or they use text-based messaging and electronic mail systems that are hard to integrate with what is happening in the field and where it is taking place.

[0006] In addition, browsing the World Wide Web on mobile devices is a much less rewarding experience than doing so on larger computers. Small screens, cumbersome interfaces and slow update speeds limit the usability of mobile devices.

[0007] Recent trends in Internet content generation have seen the appearance of geotags: XML fields added to a web page that provide exact latitude and longitude coordinates. All of this is fostering developments in Internet mapping and cartography, from the original Internet maps to advanced applications such as Google Maps, Google Earth and Microsoft's TerraServer.

[0008] These applications use traditional maps and computer graphics renderings of real-world satellite imagery to allow users to view and navigate locations, access content and interact with the information available on the web with geographic locality. The appearance of geolocation tags on web content is enabling applications to be used not just for mapping but to display Internet search results for localized areas: services such as Yahoo! and Google Local allow the user to search for restaurants close to a given location, and display the returned results on a map.

[0009] Maps are ideal for fixed-location computing with large displays, but the small screen sizes and interfacing constraints of mobile devices can limit their usability in mobile applications. In addition, a map has to be interpreted by the user and reconciled with her actual position in the real world, sometimes requiring significant effort to understand fully the information represented on the map.
[0010] Military aircraft have long incorporated a different type of display, the Heads Up Display (HUD), where a representation of the aircraft instruments is displayed on a see-through mirror and superimposed over the out-the-window scene the pilot sees through the aircraft's canopy. HUD systems have repeatedly been proven to increase pilot effectiveness and response time. Recently, HUD systems have appeared on civil aircraft and even in automobiles.

[0011] Augmented reality is a branch of computer graphics that focuses on the incorporation of interactively rendered imagery into real-world scenes. In most cases, it is implemented using see-through head-mounted displays (HMDs), where the user can see both the real world surrounding her and a perspective-matched computer graphics rendering of objects in the scene. The field was pioneered by Ivan Sutherland, who introduced the first see-through HMD in 1968.

[0012] Augmented reality has been used for applications such as aircraft maintenance training and navigation in complex environments such as a factory floor, where the user can see information displayed over the real scene, annotating the real world. Some recent projects, such as "A Touring Machine" developed at Columbia University in 1997, allow annotation of real-world locations and interaction with geographically tagged database content on a transportable computing device.

[0013] While some existing wireless data communication tools such as text messaging, e-mail and instant messaging can be useful, making use of them while deployed in the field is cumbersome and inefficient. A limitation of these systems is that even though the information shared might have relevance to a specific physical location, they do not adapt the presentation of the information according to the perspective from one's location. Representing geographically tagged data on a map can improve efficiency, and has been used by certain DARPA military unit test wireless communication systems, but this forces team members to constantly re-interpret the map and its correspondence to the real-world scenario around them as they move, something made harder by the small screen real estate available on mobile devices.
SUMMARY OF THE INVENTION

[0014] The present invention provides a system having the advantages associated with a heads-up display as well as augmented reality technology, allowing interaction within a collaborative environment similar to e-mail or instant messaging but with geographic locality, enabling teams to share information while on location with the same flexibility and immediacy that e-mail and instant messaging have brought to fixed-location, office-based teams.
[0015] A system in accordance with the present invention includes a server in communication with one or more client devices over a network such as a cellular telephone network. A client device includes a video capture device such as a video camera, which displays a live captured image on a screen of the client device. Data received from the server or other devices is overlaid on the live image in real time. In this way, the client device functions as a window between the real world and the virtual world of a networked collaborative environment by fusing data from the virtual world with live video from the device's camera. Users of the system can gaze through this window by pointing their device at areas of interest in their real environment and viewing the scene on the device's screen as with a common video camera's viewfinder, but with messages and data from the virtual world overlaid on the real scene. The user can interact with others in the collaborative environment by accessing and creating messages and data presented via the client device window.
[0016] The present invention simplifies team collaboration on mobile devices by allowing users to access and create geographically tagged information. A system in one embodiment uses a wireless computing device as its delivery platform, connected to a server system and/or other wireless devices over a network. The wireless device is also equipped with a high-resolution display capable of rendering real-time graphics, a video camera, a geo-location device that provides its current position (such as a GPS receiver or a radiolocation device using triangulation of cell phone network base station signals), and a view tracking system (such as an inertial tracker or a software-based image tracker) that determines the orientation of its camera in real time. In one embodiment, the present invention includes a networked client device and a server-side application; alternatively, the functionality provided by the server can be carried out by client devices in the case of a peer-to-peer network.
BRIEF DESCRIPTION OF THE DRAWINGS

[0017] FIG. 1 illustrates wireless clients in communication with a server in accordance with an embodiment of the present invention.

[0018] FIG. 2 is a block diagram of a message server in accordance with an embodiment of the present invention.

[0019] FIG. 3 illustrates a wireless client device in accordance with an embodiment of the present invention.

[0020] FIG. 4 illustrates an emergency response application in accordance with an embodiment of the present invention.

[0021] FIG. 5 illustrates a message viewed in an emergency response application in accordance with an embodiment of the present invention.

[0022] FIG. 6 is a block diagram of a wireless client device in accordance with an embodiment of the present invention.

[0023] FIG. 7 is a flow chart illustrating a method of operation of a message server in accordance with an embodiment of the present invention.

[0024] FIG. 8 is a flow chart illustrating a main loop flow for a client device in accordance with an embodiment of the present invention.

[0025] FIG. 9 illustrates airport departure procedures provided as an example of an embodiment of the present invention.

[0026] The figures depict preferred embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0027] FIG. 1 illustrates a system 100 for providing wireless collaboration in accordance with an embodiment of the present invention. System 100 includes a server 102 and wireless client devices 300a, 300b, 300c. As illustrated, server 102 is in contact with client devices 300a, 300b and 300c via a wireless interface, for example a cellular network. Multiple client devices 300a, 300b and 300c are illustrated to indicate that server 102 may be in contact with a plurality of client devices. For clarity of description, we refer generally to client device 300, though any number of client devices may be in operation and communication with server 102. The operation of and interaction between server 102 and client device 300 are described further below.
[0028] In one embodiment, geographically tagged messages are received and sent by client device 300 via its wireless network interface 310. Messages in one embodiment include latitude, longitude and elevation coordinates, and in one embodiment Extensible Markup Language (XML) standard geo-location tags are used.
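As a concrete illustration of such a payload, the sketch below builds and parses a hypothetical geo-tagged XML message carrying latitude, longitude and elevation. The element and attribute names are illustrative assumptions, not the actual schema used by the system.

```python
import xml.etree.ElementTree as ET

# Hypothetical geo-tagged message; tag and attribute names are illustrative only.
raw = """
<message id="42" sender="joe@rescue1" sent="2006-12-29T15:40:03Z">
  <subject>Gas Leak</subject>
  <geo>
    <lat>37.7749</lat>
    <lon>-122.4194</lon>
    <elev>12.0</elev>
  </geo>
  <body>Strong smell of gas near the intersection.</body>
</message>
"""

def parse_message(xml_text: str) -> dict:
    """Extract the fields a client would need to place the message on screen."""
    root = ET.fromstring(xml_text)
    geo = root.find("geo")
    return {
        "id": root.get("id"),
        "sender": root.get("sender"),
        "subject": root.findtext("subject"),
        "lat": float(geo.findtext("lat")),
        "lon": float(geo.findtext("lon")),
        "elev": float(geo.findtext("elev")),
    }

msg = parse_message(raw)
print(msg["sender"], msg["subject"], msg["lat"])
```

Keeping the coordinates in a dedicated `geo` element mirrors the patent's description of standard XML geo-location tags attached to otherwise ordinary message content.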
[0029] Client device 300 presents those messages to a user on screen 302 as a graphics overlay on top of input from video camera 306. As the user pans the device around her environment, she can see basic information about each message at its actual physical location on screen 302, combined with the real-world image captured by the camera 306. Such information can include, for example, a user-selected icon, message subject, coordinates, range and time information for each message. Messages in one embodiment are color- and size-coded and filtered according to distance, time, priority level, category, sender and other user-set criteria to facilitate navigation. Device 300 can also determine occlusion information for each message, and not present occluded messages or present them in an attenuated fashion by making them transparent when drawn or using a different color coding.
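The user-set filtering described above can be sketched as a simple predicate chain. The message fields, threshold values and criteria names below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    priority: int      # higher value = more important
    distance_m: float  # range from the device to the message location
    age_s: float       # seconds since the message was sent

@dataclass
class Filters:
    max_distance_m: float = 1000.0
    max_age_s: float = 3600.0
    min_priority: int = 0
    senders: tuple = ()  # empty tuple = accept any sender

def visible(msg: Message, f: Filters) -> bool:
    """Apply the user-set criteria: distance, time, priority and sender."""
    if msg.distance_m > f.max_distance_m:
        return False
    if msg.age_s > f.max_age_s:
        return False
    if msg.priority < f.min_priority:
        return False
    if f.senders and msg.sender not in f.senders:
        return False
    return True

msgs = [
    Message("joe@rescue1", 2, 160.0, 300.0),
    Message("mark@rescue1", 1, 800.0, 7200.0),  # over an hour old: filtered out
]
shown = [m for m in msgs if visible(m, Filters())]
print(len(shown))  # → 1
```

In practice the same predicate could also drive the attenuated rendering path, presenting near-miss messages transparently instead of hiding them outright.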
[0030] At any time, a user can expand a particular message and see its contents in more detail overlaid in front of the scene, for example by centering the desired message icon on the screen 302 by pointing the camera 306 at it, and pushing a button 314 while the message is contained inside a selection target box located at the center of the screen 302. Other UI implementations for selecting and displaying a message can be used, as will be apparent to those of skill in the art. Similar to e-mail messages, the geographically tagged messages can contain, for example, text, audio, video, pictures or a hyperlinked URL to additional content, in addition to their coordinates and time and date information.
[0031] After receiving a message, the user can add information to the message at a specific location for other users to see, edit its content, reply to the user who sent the message from her current location, send a new message to a given user, or post a message for a given group of target users to see at a specific location. In addition, if she desires, her current location can be sent as a continuously updated message, so other users can find her location just by panning their devices around from their current position, or she can find the location of any other members who are currently broadcasting their position by panning her camera around and seeing their icons over the real-world camera image.
[0032] To facilitate input on the go, data in addition to text is supported in one embodiment. For example, a voice message can be recorded as audio; preset simple messages that cover commonly conveyed information, or a predetermined icon with a given contextual meaning, can also be attached to a message without requiring keyboard input. Once read, the message can be closed to return to the camera input message interface, and deleted if desired.
[0033] Client wireless device 300 captures input from video camera 306 in real time and paints it as a background on screen 302. Client device 300 also determines its current location using geo-location device 308, and its current view direction using view tracking device 312.
[0034] As noted, messages (including text or other data items) preferably include geo-location information and may be marked as active or not active. In one embodiment, server 102 determines the active status of each message and communicates the status to device 300. For each received message that is active and meets display criteria selected by the device user, e.g., based on the time the message was sent, the message's recipients, messages of a certain importance, etc., a screen-space position is determined from the received coordinates and client device 300's current location and view direction. Using color in one embodiment to represent range, priority, age or other attributes of the message, if the computed screen-space position is contained on screen 302, client device 300 renders the message source, subject, time and date at the proper screen position. As the user pans and moves the camera, the message locations follow their real-world screen-projected positions, allowing the user to associate each message with its location by looking around with the device.
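The screen-space computation described above can be sketched as follows, assuming a simple pinhole-camera model with the device's heading as the view direction. The flat-earth east/north conversion, the 60-degree horizontal field of view and all names are simplifying assumptions, and only the horizontal coordinate is shown.

```python
import math

def screen_x(device_lat, device_lon, heading_deg, msg_lat, msg_lon,
             screen_width_px=320, fov_deg=60.0):
    """Project a message's bearing relative to the view direction onto a
    horizontal pixel coordinate; returns None when it falls off screen."""
    # Flat-earth approximation: meters east/north from device to message.
    lat0 = math.radians(device_lat)
    north = (msg_lat - device_lat) * 111_320.0
    east = (msg_lon - device_lon) * 111_320.0 * math.cos(lat0)
    bearing = math.degrees(math.atan2(east, north))  # 0 = north, 90 = east
    # Angle of the message relative to where the camera points, in (-180, 180].
    rel = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    if abs(rel) > fov_deg / 2:
        return None  # outside the camera's horizontal field of view
    # Linear mapping of angle to pixels across the screen.
    return (rel / fov_deg + 0.5) * screen_width_px

# A message due north of the device, with the camera facing north,
# lands at the center of a 320-pixel-wide screen.
print(screen_x(37.0, -122.0, 0.0, 37.001, -122.0))  # → 160.0
```

Re-running this projection every frame as the heading changes is what makes the overlaid icons track their real-world positions while the user pans the device.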
[0035] In one embodiment, messages are sent to client device 300 either when the user requests an update, at periodic intervals, or whenever a certain threshold for positional change is exceeded, for example whenever device 300 moves more than 20 meters in any direction. In the embodiment illustrated in FIG. 1, messages are received from server 102; in a peer-to-peer environment, server 102 is not present, and messages are received from other client devices. A peer-to-peer embodiment is described further below.
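The positional-change trigger can be sketched as below. The 20-meter threshold matches the example in the text, while the haversine great-circle distance and the function names are illustrative assumptions.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude fixes."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def needs_update(last_fix, current_fix, threshold_m=20.0):
    """True when the device has moved far enough to request fresh messages."""
    return haversine_m(*last_fix, *current_fix) > threshold_m

print(needs_update((37.0, -122.0), (37.0, -122.0)))    # → False
print(needs_update((37.0, -122.0), (37.001, -122.0)))  # → True (about 111 m)
```

The client would evaluate this check against each new geo-location fix, falling back to the periodic or user-requested update paths when the device is stationary.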
[0036] FIG. 2 illustrates an additional view of server 102. Server 102 maintains a database 208 of all active messages and their coordinates. When client device 300 requests an update and sends its current location to server 102, server 102 creates an active message list for that device by determining the range of all messages targeted for that client device 300 or any of the user groups it belongs to, and adding the messages that are proximately located, i.e. that are closer than a reception radius threshold, to a range-sorted list. The reception radius threshold may be selected by a user of device 300 or by an operator of server 102, or some combination of the two.
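A sketch of how a server might build the range-sorted list for one client follows. For brevity it uses planar east/north offsets in meters rather than full geodetic ranges, and the message identifiers and function names are illustrative.

```python
import math

def range_sorted_list(messages, device_xy, radius_m):
    """Keep messages within the reception radius, sorted nearest first.

    `messages` is a list of (msg_id, (x_m, y_m)) positions in a local
    east/north frame; `device_xy` is the client's position in the same frame."""
    dx0, dy0 = device_xy
    in_range = []
    for msg_id, (x, y) in messages:
        dist = math.hypot(x - dx0, y - dy0)
        if dist <= radius_m:
            in_range.append((dist, msg_id))
    in_range.sort()  # nearest messages come first
    return [msg_id for _, msg_id in in_range]

msgs = [("gas-leak", (100.0, 0.0)),
        ("structural-damage", (0.0, 700.0)),
        ("far-away", (5000.0, 5000.0))]
print(range_sorted_list(msgs, (0.0, 0.0), 1000.0))  # → ['gas-leak', 'structural-damage']
```

Sorting nearest-first also makes it cheap to send only the head of the list when bandwidth is constrained, which dovetails with the partial updates described below.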
[0037] In one embodiment, for each message in the range-sorted list, server 102 performs a line-of-sight query from the device's position to the message coordinates, using geometric database 206 of terrain elevation and three-dimensional models of structures and vegetation specific to that location, and then updates an occlusion attribute for the message that is specific to each device's settings.
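One simple way to answer such a line-of-sight query is to step along the sight line and test it against sampled terrain heights. The height-field callback here is a simplifying stand-in for geometric database 206, and the eye height, step count and all names are illustrative assumptions.

```python
def line_of_sight(elev, start, end, eye_h=1.8, steps=100):
    """Return True if `end` is visible from `start` over the height field.

    `elev(x, y)` returns terrain height in meters at a ground point; `start`
    and `end` are (x, y) positions, each raised by `eye_h` above the surface."""
    (x0, y0), (x1, y1) = start, end
    z0 = elev(x0, y0) + eye_h
    z1 = elev(x1, y1) + eye_h
    for i in range(1, steps):
        t = i / steps
        x = x0 + t * (x1 - x0)
        y = y0 + t * (y1 - y0)
        ray_z = z0 + t * (z1 - z0)     # height of the sight line at this sample
        if elev(x, y) > ray_z:         # terrain pokes above the ray: occluded
            return False
    return True

# Toy terrain: a 50 m ridge running along x in [450, 550].
ridge = lambda x, y: 50.0 if 450.0 <= x <= 550.0 else 0.0
print(line_of_sight(ridge, (0.0, 0.0), (1000.0, 0.0)))  # → False (ridge blocks)
print(line_of_sight(ridge, (0.0, 0.0), (300.0, 0.0)))   # → True
```

A production system would trace against real triangle meshes of structures and vegetation rather than a sampled height field, but the occlusion decision has the same shape.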
[0038] Once updated, the message is placed into a message queue 204 to be sent to client device 300 via its wireless network interface 310. To economize bandwidth, partial updates are possible, where only messages that change are sent, and where indices to currently stored local messages can be sent to re-order the list according to the current device location and selected filtering criteria.
[0039] Client device 300 in one embodiment can participate in a peer-to-peer network without the presence of server 102. In such an embodiment, client devices pass messages to each other until they reach the desired device(s), and each client (peer) device performs the operations that would otherwise be performed by the server, including range-sorting and filtering messages according to its current location. In one embodiment, no occlusion information is generated in the peer-to-peer protocol if the client devices do not contain a geometric database of the area to query against.
[0040] In another embodiment, messages can be sent to the client devices not only from other devices in the collaborative environment, but from any networked computer, by adding geographic tag information to the text of any e-mail, instant messenger post or web mail. In this case, server 102 receives the global network traffic and translates the incoming data into the proper message format. In a similar fashion, client devices can send geo-located messages to any networked computer as e-mail, and server 102 translates those into POP3, IMAP, SMTP or other suitable network data.
[0041] User interface 304 of device 300 can also be used to view web content that includes geographical tags, thus providing a browser interface that can simplify interaction with any data that has inherent locality.
System Architecture
[0042] FIG. 3 is a diagram of a wireless client device 300 in accordance with an embodiment of the present invention. Device 300 is a computing device with a graphics-capable screen 302 and a user interface 304, and preferably includes at least one button 314, a video camera 306 that can support live video input, and a wireless network interface 310 supporting a connection such as Wi-Fi, WiMAX, EDGE or WCDMA. Device 300 also preferably has a geo-location subsystem 308 that provides the latitude, longitude, approximate heading and altitude of the device 300 at regular intervals, in one embodiment at least once per second. In one embodiment this is supplied by a GPS receiver; alternatively, device 300 can use radiolocation by triangulating cell tower signals. One example of geo-location technology is Rosum Inc.'s TV-GPS triangulation-based GPS. Device 300 also includes a view tracking device 312 that determines the spatial orientation of the device, i.e. which direction the camera is looking, in real time. View tracking device 312 in one embodiment includes an inertial three-degree-of-freedom tracker such as those made by Intersense Inc. of Bedford, Mass.; alternatively, a software-based image tracker operating on the captured video, a magnetic tracker, a gyroscope or any other method of determining real-world orientation is suitable.
[0043] Device 300 may be a tablet PC, laptop, pocket PC, PDA, smart phone, digital video camera, digital binoculars, laser range finder, GPS navigation device or other equipment that incorporates the described sub-components and functionality, including a graphics-capable screen and user interface.
[0044] User interface 304 supports panning the device around in the same way a handheld camera is used. As the user points the camera 306 in different directions, messages are shown overlaid with the camera input on screen 302 at their actual locations in the real world. In one embodiment, the message representation includes information on the subject, sender, time and distance. For example, FIG. 4 illustrates an example in which device 300 is used as part of an emergency response operation. A real-time video display 406 shows a flooded area, with two messages 402, 404 overlaid on the image 406. One message 402 indicates that it is from joe@rescue1, sent at 15:40:03, with the text "Gas Leak" at a distance of 0.1 miles; the other message 404 indicates that it is from mark@rescue1, sent at 12:30:00, reads "Structural Damage" and is located at a distance of 0.5 miles.
[0045] As the user pans the device 300 around the scene, the message icon moves to reflect its actual real-world position in relation to the current location of device 300, as determined from the GPS positioning device 308 and orientation tracking device 312. User-determined sorting criteria filter messages down to only the relevant subset, based on distance, type, priority, sender, recipient list, time and other factors. In FIG. 4, for example, the full message 402 descriptor can be seen now that the user has panned the device, and reads "mark@rescue1 12:30:00 Structural Damage 0.5 miles".
[0046] In one embodiment, by centering a message on the crosshair at the center of the screen and clicking button 314 in the user interface 304, the user can expand it and see the message's full contents overlaid on the camera input. This is illustrated, for example, in FIG. 5. The user can then view any attached files, edit the message, post a reply from her current location, at the message location or at a different place, or remove the message.
[0047] Client device 300 sends update requests containing current geo-location and view data to server 102, and server 102 responds with updated range-sorted message data according to the client device's current location. Device 300 can also send message updates for server 102 to store in global message database 208 if required, including new messages added by the user on the client device, or existing message updates or replies. Server 102 can also be connected to the Internet for interfacing with other messaging systems and accessing other geo-located web content that can be displayed on the scene as well.
[0048] In one embodiment, server 102 and client device 300 use XML-based message data for networked communications, which is transmitted using standard Internet protocols such as HTTP, SOAP and WSDL. In one embodiment, the delivery system uses an HTTP server such as the Apache HTTP Server.
[0049] In one embodiment, client device software components map to the FIG. 3 hardware components as illustrated in FIG. 6. FIG. 6 includes a local message database 614 that caches the messages pertinent to each client, and a central message manager 602 that arbitrates the use of all the other components. A geo-location manager 608 interfaces with the geo-location device 308 of FIG. 3, and a view tracking manager 610 interacts with the view tracking device 312; a camera image capture manager 604, a graphics rendering engine 606, a user interface manager 612 and a wireless client network manager 616 are all connected to central message manager 602.
[0050] Server 102, in addition to global message database 208 that stores messages for all users, has a message queue 204 specific to each client, described further below with respect to FIG. 7, which is sorted by the range from each message's position to the client device's current location, and in which each message is particularized for the client, including view occlusion information.
[0051] In order to be able to determine view occlusion factors, server 102 includes elevation database 206 for terrain, buildings and other cultural features such as bridges and water towers. Message management module 202 determines a geometric intersection from a device's location to the coordinates of each message, and by comparing the resulting range with the actual distance between the two points, determines whether the message is visible from the device's position. This can be used to occlude or modify the appearance of the message when displayed.
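Paragraph [0051]'s visibility test, comparing the range at which the sight line first intersects geometry against the actual distance to the message, can be sketched as follows. The sampled height field is a simplifying stand-in for elevation database 206, and all names and parameters are illustrative assumptions.

```python
import math

def first_hit_range(elev, start, end, eye_h=1.8, steps=200):
    """Range at which the sight line from `start` toward `end` first
    intersects the height field, or None if it never does."""
    (x0, y0), (x1, y1) = start, end
    z0 = elev(x0, y0) + eye_h
    z1 = elev(x1, y1) + eye_h
    total = math.hypot(x1 - x0, y1 - y0)
    for i in range(1, steps + 1):
        t = i / steps
        x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
        if elev(x, y) > z0 + t * (z1 - z0):
            return t * total  # geometry rises above the sight line here
    return None

def message_visible(elev, device, message):
    """Visible when nothing intersects the sight line short of the message."""
    hit = first_hit_range(elev, device, message)
    dist = math.hypot(message[0] - device[0], message[1] - device[1])
    return hit is None or hit >= dist

# Toy scene: a 30 m wall occupying x in [100, 110].
wall = lambda x, y: 30.0 if 100.0 <= x <= 110.0 else 0.0
print(message_visible(wall, (0.0, 0.0), (50.0, 0.0)))   # → True (wall is beyond it)
print(message_visible(wall, (0.0, 0.0), (200.0, 0.0)))  # → False (wall in front)
```

The intersection-range-versus-distance comparison is what distinguishes geometry behind a message (harmless) from geometry in front of it (occluding).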
[0052] Message management module 202 arbitrates the interaction of all the components of the server message system, including building each device's current target message queue 204, and network server module 210 asynchronously communicates with all the target devices 300 as data becomes available. Network server module 210 provides message data updates to the global message database 208, but uses each device's specific message queue 204 to send data to the client device 300.
[0053] The role of server 102 can in alternative embodiments be fulfilled by client devices, building a peer-to-peer network where clients share all the messages that they have in common, combining their local message databases.
[0054] FIG. 7 illustrates a flow chart for a processing loop executed by server 102 for a given device 300 update. Server 102 initially receives 702 an update request from the client device 300 that includes the device's current geographic location c