US008326327B2

Hymel et al.

(10) Patent No.: US 8,326,327 B2
(45) Date of Patent: Dec. 4, 2012
(54) SYSTEM AND METHOD FOR DETERMINING ACTION SPOT LOCATIONS RELATIVE TO THE LOCATION OF A MOBILE DEVICE

(75) Inventors: James Allen Hymel, Kitchener (CA); Jean Philippe Bouchard, Waterloo (CA)

(73) Assignee: Research In Motion Limited, Waterloo (CA)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 174 days.
(21) Appl. No.: 12/870,676

(22) Filed: Aug. 27, 2010

(65) Prior Publication Data
US 2012/0052880 A1    Mar. 1, 2012

(51) Int. Cl.
H04W 4/02    (2009.01)

(52) U.S. Cl. .................................. 455/456.3; 455/456.1

(58) Field of Classification Search ............... 455/456.1, 455/456.3, 456.6, 457, 566, 95, 550.1
See application file for complete search history.
(56) References Cited

U.S. PATENT DOCUMENTS

5,757,290 A *    5/1998   Watanabe et al. ........ 340/995.14
[entry illegible in source] ..................... 701/455
2008/0045138 A1    2/2008   Milic-Frayling et al.
2008/0163057 A1*   7/2008   Lohi et al. ..................... 715/718
2009/0051785 A1    2/2009   Kamada et al.
2009/0098888 A1*   4/2009   Yoon .......................... 455/456.2
[entries illegible in source; include Davis et al.]
2009/0189811 A1    7/2009   Tysowski et al.
2009/0319595 A1   12/2009   Millmore et al.
2010/0004005 A1*   1/2010   Pereira et al. ................. 455/457
2011/0288770 A1*  11/2011   Greasby ........................ 701/208

FOREIGN PATENT DOCUMENTS

WO    2007036737    4/2007
OTHER PUBLICATIONS

Extended European Search Report dated May 18, 2011. In corresponding application No. 10174308.6.
Francesca, Carmagnola et al. "Tag-based user modeling for social multi-device adaptive guides", User Modeling and User-Adapted Interaction, Kluwer Academic Publishers, DO, vol. 18, No. 5, Jul. 29, 2008, pp. 497-538, XP019650064, ISSN: 1573-1391. DOI: 10.1007/s11257-008-9052-2 abstract; pp. 498-500; pp. 510-515.

(Continued)
Primary Examiner - Cong Tran
(74) Attorney, Agent, or Firm - Novak Druce + Quigg LLP

(57) ABSTRACT

A system and method for determining action spot locations relative to the location of a mobile device includes a display and a processor module communicatively coupled to the display. The processor module is configured to receive executable instructions to display a graphical user interface on the display; receive data indicative of the current location of the mobile device; and determine at least one action spot relative to the current location of the mobile device. The processor module can signify the action spot on the graphical user interface and provide an indication of activity level occurring at the action spot. The action spot can be a location where at least one other mobile device has engaged in documenting action within a predetermined period of time from when the mobile device arrived at the current location.

20 Claims, 10 Drawing Sheets
[Front-page figure, FIG. 1 flowchart: 1010 Display a graphical user interface; 1020 Receive data indicative of current location; 1030 Determine at least one action spot within a predetermined distance from the current location; 1040 Signify the at least one action spot on the graphical user interface; 1050 Provide an indication of the activity level at the at least one action spot.]

Snap Inc. Ex. 1001 Page 0001
OTHER PUBLICATIONS

Jesper Kjeldskov, Jeni Paay: "Just-for-us: a context-aware mobile information system facilitating sociality", ACM, 2 PENN Plaza, suite 701, New York USA, Sep. 19, 2005-Sep. 22, 2005, pp. 23-30, XP040026719, Salzburg, DOI: 10.1145/1085777.1085782, ISBN: 1-59593-089-2 abstract; figures 3-9; section 4.
Presselite. Twitter 360. http://www.twitter-360.com. Retrieved Nov. 29, 2010.
Association for Computing Machinery. Inferring generic activities and events from image content and bags of geo-tags; http://portal.acm.org/citation.cfm?id=1386361&dl=GUIDE&coll=GUIDE&CFID=76303014&CFTOKEN=93381868. Retrieved Nov. 29, 2010.
IEEE. Annotating collections of photos using hierarchical event and scene models. http://ieeexplore.ieee.org/Xplore/login.jsp?url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F4558014%2F4587335%2F04587382.pdf%3Farnumber%3D4587382&authDecision=-203. Retrieved Nov. 29, 2010.
Bongwon Suh. Semi-automatic photo annotation strategies using event based clustering and clothing based person recognition. http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6V0D-4N68NFK-1&_user=10&_coverDate=07%2F31%2F2007&_alid=1561701294&_rdoc=1&_fmt=high&_orig=search&_origin=search&_zone=rslt_list_item&_cdi=5644&_sort=r&_st=13&_docanchor=&view=c&_ct=1&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=d1fd8b6eb5d6ef3ebf18835ddc41e761&searchtype=a. Retrieved Nov. 30, 2010.
Partial European Search Report mailed Jan. 28, 2011. In corresponding application No. 10174308.6.

* cited by examiner
[Sheet 1 of 10, FIG. 1: flowchart of method 1000: display a graphical user interface (1010); receive data indicative of current location (1020); determine at least one action spot within a predetermined distance from the current location (1030); signify the at least one action spot on the graphical user interface (1040); provide an indication of the activity level at the at least one action spot (1050).]
[Sheet 2 of 10, FIG. 2: mobile device display 102 showing a map 206 of the area around the device; legible labels include STATUE GARDEN and PARK.]
[Sheet 3 of 10, FIG. 3: map 206 with the current location 302 and action spots 304, 306; legible labels include STATUE GARDEN, ROSE GARDEN, PARKING, LAKESIDE, and MUSEUM.]
[Sheet 4 of 10, FIG. 4: display 102 signifying a plurality of action spots; legible labels include STORE FRONT, LAKESIDE, and PARKING.]
[Sheet 5 of 10, FIG. 5: venue-specific map with action spots.]
[Sheet 6 of 10, FIG. 6: documenting action associated with an action spot; legible labels include PARKING ON STREET and EAST PARKING.]
[Sheet 7 of 10, FIG. 7: compass view on display 102 showing HEADING: 234°, DISTANCE: 0.2 Mi, ETA: 2 Min, ACTIVITY COUNT: 300, ACTIVITY TYPE: Camera.]
[Sheet 8 of 10, FIG. 8: camera viewfinder view showing ETA: 4 Min, BEARING: 90°, DISTANCE: 0.3 Mi, ACTIVITY: 2,000.]
[Sheet 9 of 10, FIG. 9: block diagram of the mobile device 100 interacting in a communication network.]
[Sheet 10 of 10, FIG. 10: block diagram of a plurality of resources, a mobile device, and a processor configured to determine action spots relative to the location of the mobile device.]
SYSTEM AND METHOD FOR DETERMINING ACTION SPOT LOCATIONS RELATIVE TO THE LOCATION OF A MOBILE DEVICE

FIELD OF TECHNOLOGY

The subject matter herein generally relates to mobile devices, and more specifically relates to a system and method for determining an action spot based on the location of a mobile device.
BACKGROUND

With the advent of more robust electronic systems, advancements of mobile devices are becoming more prevalent. Mobile devices can provide a variety of functions including, for example, telephonic, audio/video, and gaming functions. Mobile devices can include mobile stations such as cellular telephones, smart telephones, portable gaming systems, portable audio and video players, electronic writing or typing tablets, handheld messaging devices, personal digital assistants, and handheld computers.

Mobile devices allow users to have an integrated device which can perform a variety of different tasks. For example, a mobile device can be enabled for each of or some of the following functions: voice transmission (cell phones), text transmission (pagers and PDAs), sending and receiving data for viewing of Internet websites, multi-media messages, videography and photography. Additionally, mobile devices can include one or more applications such as a map application or a navigation application for retrieving maps and directions to locations relative to the mobile device.
BRIEF DESCRIPTION OF THE DRAWINGS

Implementations of the present technology will now be described, by way of example only, with reference to the attached figures, wherein:

FIG. 1 is an illustrative flow chart of a method for determining a mobile device's current location and signifying an action spot, in accordance with an exemplary implementation of the present technology;

FIG. 2 is an illustrative implementation of an electronic device with a map displayed in accordance with the present technology;

FIG. 3 is an illustrative implementation of a graphical user interface displaying an action spot within a predetermined distance from a current location of a mobile device shown in FIG. 2;

FIG. 4 is an illustrative implementation of a display of a mobile device signifying a plurality of action spots present within the vicinity of the current location of the mobile device, in accordance with the present technology;

FIG. 5 is an illustrative implementation of a graphical user interface of a mobile device displaying a venue-specific map and action spots in accordance with the present technology;

FIG. 6 is an illustrative implementation of a graphical user interface of a mobile device displaying the documenting action associated with an action spot within a predetermined distance from the current location of the mobile device;

FIG. 7 is an illustrative implementation of a graphical user interface of a mobile device having a compass showing at least the distance and direction to an action spot proximate to the mobile device;

FIG. 8 is an illustrative implementation of a graphical user interface for determining action spots that utilizes a camera viewfinder of an integrated camera of the mobile device;
FIG. 9 is a block diagram representing a mobile device interacting in a communication network in accordance with an exemplary implementation of the present technology; and

FIG. 10 is a block diagram representing the interaction between a plurality of resources, a mobile device, and a processor configured to determine action spots relative to the location of the mobile device in accordance with an exemplary implementation of the present technology.
DETAILED DESCRIPTION

For simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the implementations described herein. However, those of ordinary skill in the art will understand that the implementations described herein can be practiced without these specific details. In other instances, methods, procedures and components have not been described in detail so as not to obscure the related relevant feature being described. Also, the description is not to be considered as limiting the scope of the implementations described herein.
Several definitions that apply throughout this disclosure will now be presented. The word "coupled" is defined as connected, whether directly or indirectly through intervening components, and is not necessarily limited to physical connections. The term "communicatively coupled" is defined as connected whether directly or indirectly through intervening components, is not necessarily limited to a physical connection, and allows for the transfer of data. The term "mobile device" is defined as any electronic device that is capable of at least accepting information entries from a user and includes the device's own power source. A "wireless communication" means communication that occurs without wires using electromagnetic radiation. The term "highlight" refers to altering the appearance of a graphical item displayed on the display screen to indicate that the graphical item has been selected for execution. For example, highlighting can include changing the color of the graphical item, changing the font or appearance of the graphical item, applying a background color to the graphical item, superimposing a block of semi-transparent color over the graphical item, placing a border around the graphical item, enlarging the graphical item as compared to other graphical items proximate to the highlighted graphical item, or other similar and known methods of highlighting graphical items or text items displayed on a display screen. The term "memory" refers to transitory memory and non-transitory memory. For example, non-transitory memory can be implemented as Random Access Memory (RAM), Read-Only Memory (ROM), flash, ferromagnetic, phase-change memory, and other non-transitory memory technologies.

The term "activity" refers to an action taken by a mobile device. For example, an activity can include but is not limited to a documenting action (such as text messaging, emailing, blogging, posting a message on a social networking internet site, or any other documenting actions), a recording action (such as video recording, audio recording, or photographing taken by a mobile device) or any other action where the mobile device is being used to observe and make note of a location or an event currently occurring at the location of the mobile device. The term "action spot" refers to a location or an event where at least one activity is occurring relative to the current location of another mobile device.
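The "activity" and "action spot" definitions above can be made concrete with a small data model. The sketch below is illustrative only; the class and field names (`Activity`, `ActionSpot`, `activity_level`) are assumptions introduced for this example, not structures the disclosure prescribes.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    """A documenting or recording action taken by a mobile device."""
    kind: str         # e.g. "text", "email", "photo", "video", "post"
    lat: float
    lon: float
    timestamp: float  # seconds since epoch

@dataclass
class ActionSpot:
    """A location where at least one activity is occurring relative
    to the current location of another mobile device."""
    lat: float
    lon: float
    activities: List[Activity] = field(default_factory=list)

    @property
    def activity_level(self) -> int:
        # One simple measure: the number of documenting actions logged here.
        return len(self.activities)

spot = ActionSpot(lat=43.47, lon=-80.54)
spot.activities.append(Activity("photo", 43.47, -80.54, 1_290_000_000.0))
print(spot.activity_level)  # 1
```
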
When mobile devices are enabled for navigational functions, mobile devices can retrieve and display maps and directions to locations relative to the current location of the mobile device. Typically, the maps and directions are limited in information. For example, maps are limited to displaying the streets within a city. In order to find information relating to events and happenings currently occurring proximate to the mobile device's present location, the user of the mobile device will have to search an external resource, such as an electronic events calendar, internet sites, internet calendars of individual business or event holders (stores, restaurants, concert venues, bars, etc.), and compare the locations of the found events and happenings to the mobile device's current location. Such a process of manually researching events and happenings, determining the location of the events and happenings, and comparing the location of the events and happenings to the user's current location is tedious and results in user frustration. Moreover, the results of the user's research of current events and happenings can be incomplete and inaccurate, and the user can miss certain happenings that are close in proximity to the current location of the user's mobile device.
The present disclosure provides a system and method of determining action spot locations relative to the location of a mobile device. In one implementation, a mobile device includes a display and a processor module communicatively coupled to the display. The processor can be configured to receive executable instructions to: determine a current location of the mobile device; determine at least one action spot within a predetermined distance from the current location of the mobile device; signify the at least one action spot with a graphical item on the display of the mobile device; and mark the graphical item according to an activity level of the at least one action spot. The action spot can include a location relative to the current location of the mobile device where at least one other mobile device has engaged in documenting action within a predetermined period of time.
FIG. 1 is an illustrative implementation of a flow chart of a method 1000 for determining action spots relative to the location of a mobile device. The method 1000 can be implemented on any mobile device, such as a cell phone, a smart phone, a netbook, a global positioning system (GPS) device, an electronic tablet, an electronic pad, a personal digital assistant (PDA), or any other similar electronic device which includes a display and a processor communicatively coupled to the display. In FIG. 1, a graphical user interface can be displayed on the display of a mobile device (Block 1010). For example, the graphical user interface can be a map, an interactive map, a graphical user interface associated with an application configured to retrieve maps and directions, a graphical user interface associated with an application configured to determine action spot locations, a graphical user interface of a camera application, or any other similar graphical user interface where the location of the mobile device and action spots relative to the location of the mobile device can be displayed.

Data indicative of the current location of the mobile device is received (Block 1020) and can be displayed on the graphical user interface. In the illustrated implementation, a processor of the mobile device can receive the data indicative of the current location of the mobile device. In at least some implementations, the data indicative of the current location of the mobile device can be received from a satellite positioning system, a communications network system, a triangularization system, or any other system that allows for determining the location or position of a mobile device.
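Block 1020 leaves the location source open (satellite positioning, communications network, or triangularization). One plausible reading is an ordered fallback across providers; the sketch below assumes hypothetical provider callables and is not the patent's implementation.

```python
from typing import Callable, Iterable, Optional, Tuple

Fix = Tuple[float, float]  # (latitude, longitude)

def current_location(providers: Iterable[Callable[[], Optional[Fix]]]) -> Optional[Fix]:
    """Return the first fix offered by an ordered list of location
    providers (e.g. satellite, communications network, triangularization)."""
    for provider in providers:
        fix = provider()
        if fix is not None:
            return fix
    return None

def gps() -> Optional[Fix]:
    return None  # e.g. no satellite fix available indoors

def cell_network() -> Optional[Fix]:
    return (43.4643, -80.5204)  # coarse network-derived fix

print(current_location([gps, cell_network]))  # (43.4643, -80.5204)
```
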
The processor can determine at least one action spot located within a predetermined distance from the current location of the mobile device (Block 1030). In at least one implementation, the at least one action spot can be determined as a location where at least one other mobile device has engaged in a documenting action within a predetermined period of time from the time the mobile device arrived at the current location of the mobile device. For example, the processor can determine the at least one action spot as the location where at least one other mobile device is composing an email, composing a text message, messaging on an instant messenger application, posting messages, pictures, or videos on a social networking site, posting on a virtual posting mechanism, or any other similar documenting action. Alternatively, the at least one action spot can be determined based on at least one other mobile device performing a recording action, such as video recording, audio recording, or photographing, within a predetermined distance from the current location of the mobile device. In another implementation, the at least one action spot can be determined by monitoring the number of data packet transmissions occurring within a particular geographical area or the number of data packets being transmitted from at least one other mobile device. In yet other implementations, the at least one action spot can be the location where at least one other mobile device has documented, recorded, accounted, chronicled, or otherwise has taken note of a location or a current happening occurring at the location.
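Read this way, Block 1030 filters recent documenting actions from other devices by distance and recency. The sketch below is one possible realization under that reading; the haversine distance, the 1 km radius, and the one-hour window are illustrative assumptions, not values from the disclosure.

```python
import math
import time

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def action_spots(events, here, max_km=1.0, window_s=3600, now=None):
    """Events are (lat, lon, timestamp) documenting actions by other
    devices; keep those both close enough and recent enough."""
    now = time.time() if now is None else now
    return [e for e in events
            if haversine_km(here[0], here[1], e[0], e[1]) <= max_km
            and now - e[2] <= window_s]

here = (43.47, -80.54)
events = [(43.471, -80.541, 1000.0),   # nearby and recent: kept
          (44.00, -81.00, 1000.0)]     # tens of km away: dropped
print(action_spots(events, here, now=1500.0))  # [(43.471, -80.541, 1000.0)]
```
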
The at least one action spot is signified on the graphical user interface (Block 1040). For example, the processor can execute instructions to display the at least one action spot on the graphical user interface as a graphical item such as an icon, a picture, a text representation, a drawing, an image, a symbol, or any other graphical item that is representative of the at least one action spot. The at least one action spot can also be displayed relative to the current location of the mobile device. The processor can determine the level of activity at the at least one action spot and can provide an indication of the activity level at the at least one action spot on the graphical user interface (Block 1050). With a graphical indication of the action spots and activity levels associated with the action spots, a user can review information related to current happenings within the vicinity of the user's mobile device. Additionally, information relating to the popularity of and the current event occurring within the vicinity surrounding or associated with the current position of mobile devices is readily available to the mobile device without having to use an external device or a manual search engine, such as an internet search engine.

Exemplary implementations of the method 1000 for determining action spot locations relative to the location of a mobile device will be described in relation to FIGS. 2-8.
FIG. 2 is an exemplary implementation of the system and method of determining an action spot location implemented on a mobile device that is a mobile communication device. The mobile device 100 includes a housing which encases internal components of the device, such as a microprocessor 110 (shown in FIG. 9), a printed circuit board (not shown), and other operational components. One of ordinary skill in the art will understand that other operational components can be included in the mobile device 100, but the present disclosure will not discuss such operational components in detail for the sake of brevity. The present disclosure provides details as to the components utilized in the implementation of the system and method of determining an action spot location on a mobile device.

The mobile device 100 includes a display screen 102 for displaying graphical user interfaces associated with applications programmed on the mobile device 100. The display screen 102 can be a liquid crystal display (LCD) screen, a light emitting diode (LED) screen, an organic light emitting diode (OLED) screen, an active-matrix organic light emitting
diode (AMOLED) screen, a nanocrystal display, a nanotube display, a touch-sensitive display screen, or any display screen on which graphical or visual elements can be displayed. Above the display screen 102 is a speaker 106 for emitting sound from the mobile device 100. Below the display screen 102 is a navigation tool 103. The navigation tool 103 can be an omnidirectional pad, a jogball, a trackball, an omnidirectional joystick, a scroll wheel, an optical navigation tool, an optical trackball, or any other navigation tool. Below the navigation tool 103 is a keyboard 104 having a plurality of keys 105. In the illustrated implementation, each key 105 of the keyboard 104 bears at least one of an alphabetic, numeric, symbolic, or functional indicia. The indicia signify the data input to be input upon actuation of the key 105 bearing the indicia. In FIG. 2, the keyboard 104 is a reduced keyboard, where at least one key 105 is associated with more than one alphabetic indicia. In an alternative implementation, the keyboard 104 can be a full keyboard having each key 105 associated with an alphabetic indicia. The indicia on the keys 105 of the keyboard 104 are arranged in a QWERTY keyboard layout 107; however, one of ordinary skill in the art will appreciate that the keyboard layout 107 can be an AZERTY layout, a QWERTZ layout, a DVORAK layout, a pinyin Chinese keyboard layout, or any other keyboard layout that allows a user to input alphabetic, numeric, symbolic, and functional indicia. The keys 105 can be press-actuable keys, touch-sensitive keys, capacitive keys, or any other similar key that allows for the input of data to a processor of the mobile device upon user-engagement with the key 105.
In the illustrated implementation of FIG. 2, a graphical user interface 206 in the form of a map is displayed on the display screen 102. The map 206 can be a representation of the vicinity surrounding the current location of the mobile device 100. In at least one implementation, the map 206 can be displayed in response to the selection and execution of a map application, a navigation application, an application for determining action spots, or any other similar application that provides directions, maps, and information relating to geographical locations on the mobile device 100. In an alternative implementation, a graphical user interface such as a dialogue box can be displayed in response to the launching of an application for determining action spots. In another implementation, an interactive map can be displayed allowing the user of the mobile device 100 to select graphical items, manipulate the map, or otherwise alter the map displayed on the mobile device 100.
FIG. 3 is an exemplary implementation of a graphical user interface associated with a system for determining an action spot relative to the location of a mobile device. More specifically, FIG. 3 illustrates a screenshot of a graphical user interface 206 displayed on the display 102 of a mobile device 100 that is an interactive map. In the illustrated implementation, the map 206 can be displayed after an application configured to determine action spots has been selected and launched. The map 206 can include graphical representations 308 of venues, locations, monuments, buildings, streets, lakes, and other locational landmarks representing the vicinity and area surrounding the current location of the mobile device 100.

Upon selection and execution of an application to display the map 206, a processor 110 (shown in FIG. 9) can execute instructions to determine the current location of the mobile device 100 by retrieving positional data at a position module 101 (shown in FIG. 9) communicatively coupled to the processor 110. The position module 101 can gather the positional data from a GPS system, a triangularization system, a communications network system, or any other system that can determine the position of a mobile device. The current loca-
tion 302 of the mobile device 100 is identified on the map 206
by a graphical item. In FIG. 3, the current location 302 of the mobile device 100 is identified by a graphical item that is a star. In alternative implementations, the current location 302 can be a graphical item that is a circle, a square, or any other shape, a human-shaped icon, a text representation, a picture or photo, or any other graphical or textual item that signifies the current location 302 of the mobile device 100.

The processor 110 can determine whether there are action spots 304, 306 relative to the current location 302 of the mobile device 100 and signify the action spots 304, 306 on the map 206. In the illustrated implementation, the processor 110 identifies two action spots within a predetermined distance from the current location 302 of the mobile device 100. The action spots 304, 306 are signified on the map 206 by graphical items that are clouds. However, one of ordinary skill in the art will appreciate that the graphical items can be any other shape, a picture, any graphical item, a textual representation, a symbolic representation, or any other graphical representation that signifies the presence of an action spot within a predetermined distance from the current location 302 of the mobile device 100.
Also illustrated in FIG. 3, the action spots 304, 306 can have different sizes to indicate the level of activity associated with the action spot 304, 306. For example, the larger in size the graphical item is compared to other graphical items representing action spots, the more activity is occurring at the location identified by the graphical item. In the specific implementation in FIG. 3, the graphical item associated with action spot 304 proximate to the lake is larger than the graphical item associated with action spot 306 proximate to the venue entitled Celebrity Court. The larger graphical item associated with action spot 304 can indicate that more documenting activity has occurred at the lake than at the Celebrity Court, and thus, the action spot 304 is a more active, a more popular, or a more lively location than action spot 306. The level of activity associated with the action spot 304, 306 can also be represented by varying the colors of the graphical items representing the action spots 304, 306. For example, a graphical item that is yellow can represent a moderate amount of documenting action; while a graphical item of green represents a large amount of documenting action, and thus an increased likelihood that the action spot associated with a green graphical item is a more happening location, a more popular location, or a location where a large number of people have gathered to witness and document a current event or happening. In other words, the indication of the level of activity includes coloring the graphical item in accordance with a range of activity occurring at the at least one action spot 304, 306.
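The size and color cues described above amount to mapping an action spot's documenting-activity count onto marker attributes. A minimal sketch follows, assuming invented thresholds and sizes; the disclosure only says that yellow indicates a moderate amount and green a large amount of documenting action.

```python
def marker_style(activity_count):
    """Map an action spot's documenting-activity count to a marker
    size (pixels) and color band. Thresholds here are illustrative."""
    if activity_count >= 1000:
        color = "green"    # large amount of documenting action
    elif activity_count >= 100:
        color = "yellow"   # moderate amount of documenting action
    else:
        color = "gray"     # little documented activity
    # Bigger marker = more activity, capped so it stays readable on a map.
    size = min(16 + activity_count // 50, 64)
    return size, color

print(marker_style(300))    # (22, 'yellow')
print(marker_style(2000))   # (56, 'green')
```
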
The implementation of the present technology illustrated in FIG. 3 illustrates the results of the processor's 110 determination of action spots 304, 306, where the action spots 304, 306 are based on locations where at least one other mobile device has engaged in documenting action within a specific period of time. Reference will now be made with respect to FIG. 10 in regards to the processor's 110 determination of the action spots 304, 306. FIG. 10 is a block diagram of the processor's 110 interaction and communication with the mobile device 100 and a plurality of resources from which the processor 110 can retrieve data representative of documenting actions occurring within a predetermined distance from the mobile device 100. In at least one implementation, the processor 110 can retrieve the data from a resource 1110, 1130, 1140 configured to monitor the documenting actions of mobile devices within a predefined geographical location. For example, the resource can be an external server 1110 of the communica-
tions network provider of the mobile device 100. The external
server 1110 can monitor the documenting actions of other mobile devices 1120 on the same communications network provider as the mobile device 100 and transmit data to the mobile device 100 indicative of action spots located within a predetermined distance from the current location 302 of the mobile device 100. For example, the server 1110 can monitor and log where other mobile devices 1120 are capturing images, capturing videos, or transmitting messages, such as text messages, instant messages, virtual posts, or any combination thereof, and identify the locations as action spots. The server 1110 can also monitor the number of images, videos, messages, and posts being captured or transmitted at various locations to determine the level of documenting activity occurring at the various action spots based on at least one of the aforementioned monitored activities. In at least one implementation, the processor 110 can transmit the current location 302 of the mobile device 100 to the server 1110, and a second processor (not shown) coupled to the server 1110 can determine which action spots are proximate to or in the same vicinity as the current location 302 of the mobile device 100. The server 1110 can also transmit the action spot locations and levels of activity to the processor 110 of the mobile de