(12) United States Patent
Hymel et al.

(10) Patent No.: US 8,825,084 B2
(45) Date of Patent: Sep. 2, 2014

(54) SYSTEM AND METHOD FOR DETERMINING ACTION SPOT
     LOCATIONS RELATIVE TO THE LOCATION OF A MOBILE DEVICE

(71) Applicant: Research in Motion Limited, Waterloo (CA)

(72) Inventors: James Allen Hymel, Kitchener (CA);
     Jean Philippe Bouchard, Waterloo (CA)

(73) Assignee: BlackBerry Limited, Waterloo (CA)

( * ) Notice: Subject to any disclaimer, the term of this patent is
     extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 13/648,167

(22) Filed: Oct. 9, 2012

(65) Prior Publication Data
     US 2013/0035116 A1    Feb. 7, 2013

Related U.S. Application Data
`
(63) Continuation of application No. 12/870,676, filed on
     Aug. 27, 2010, now Pat. No. 8,326,327.

(51) Int. Cl.
     H04W 24/00        (2009.01)
(52) U.S. Cl.
     USPC ..................................... 455/456.3; 455/456.1
(58) Field of Classification Search
     USPC ................ 455/404.2, 408, 409, 456.1, 456.2,
                         455/456.3, 456.5
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     5,757,290 A          5/1998   Watanabe et al.
     6,853,911 B1         2/2005   Sakarya
     8,140,403 B2 *       3/2012   Ramalingam et al. ....... 705/26.1
     8,228,234 B2 *       7/2012   Paulson et al. .......... 342/451
     8,229,458 B2 *       7/2012   Busch ................... 455/456.1
     8,290,513 B2 *      10/2012   Forstall et al. ......... 455/456.3
     2003/0076808 A1 *    4/2003   McNiff et al. ........... 370/345
     2005/0073443 A1      4/2005   Sheha et al.
     2008/0045138 A1      2/2008   Milic-Frayling et al.
     2008/0102809 A1 *    5/2008   Beyer ................... 455/420
     2008/0163057 A1      7/2008   Lohi et al.

     (Continued)

     FOREIGN PATENT DOCUMENTS

     WO   2007036737     4/2007
     WO   20070036737    4/2007

     OTHER PUBLICATIONS

Presselite. Twitter 360. http://www.twitter-360.com. Retrieved
Nov. 29, 2010.

     (Continued)

Primary Examiner - Cong Tran
(74) Attorney, Agent, or Firm - Novak Druce Connolly Bove +
Quigg LLP

(57) ABSTRACT

A system, server, mobile device, and method for determining
action spot location. The action spot location can be
determined relative to the location of a mobile device. The
mobile device can include a display and a processor module
communicatively coupled to the display. The system, server,
mobile device, and method can receive data indicative of the
current location of the mobile device, and determine at least
one action spot relative to the current location of the mobile
device. The action spot can be a location where at least one
other mobile device has engaged in documenting action within a
predetermined period of time from when the mobile device
arrived at the current location.

17 Claims, 10 Drawing Sheets
`
[Representative drawing: flow chart (blocks 1010 through 1050):
receive data indicative of current location; display a
graphical user interface; determine at least one action spot
within a predetermined distance from the current location;
signify the at least one action spot on the graphical user
interface; provide an indication of the activity level at the
at least one action spot.]
`
`Snap Inc. EX. 1001 Page 0001
`
`
`
`
US 8,825,084 B2
Page 2

(56) References Cited

     U.S. PATENT DOCUMENTS

     2009/0006994 A1      1/2009   Forstall et al.
     2009/0047972 A1 *    2/2009   Neeraj .................. 455/456.1
     2009/0051785 A1      2/2009   Kamada et al.
     2009/0098888 A1      4/2009   Yoon
     2009/0132941 A1      5/2009   Pilskalns et al.
     2009/0176509 A1      7/2009   Davis et al.
     2009/0189811 A1      7/2009   Tysowski et al.
     2009/0319595 A1     12/2009   Millmore et al.
     2010/0004005 A1      1/2010   Pereira et al.
     2010/0035596 A1      2/2010   Nachman et al.
     2010/0125492 A1 *    5/2010   Lin et al. .............. 705/14.5
     2010/0248746 A1 *    9/2010   Saavedra et al. ......... 455/456.3
     2011/0040603 A1 *    2/2011   Wolfe ................... 705/10
     2011/0238517 A1 *    9/2011   Ramalingam et al. ....... 705/26.1
     2011/0288770 A1     11/2011   Greasby
OTHER PUBLICATIONS

Association for Computing Machinery. Inferring generic activities
and events from image content and bags of geo-tags. http://portal.
acm.org/citation.cfm?id=1386361&dl=GUIDE&coll=GUIDE
&CFID=76303014&CFTOKEN=93381868. Retrieved Nov. 29, 2010.
IEEE. Annotating collections of photos using hierarchical event and
scene models. http://ieeexplore.ieee.org/Xplore/login.
jsp?url=http%3A%2F%2Fieeexplore.ieee.org%2F
iel5%2F4558014%2F4587335%2F04587382.
pdf%3Farnumber%3D4587382&authDecision=-203. Retrieved
Nov. 29, 2010.
Bongwon Suh. Semi-automatic photo annotation strategies using
event based clustering and clothing based person recognition. http://
www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6V0D-
4N68NFK-1&_user=10&_coverDate=07%2F31%2F2007&
_alid=1561701294&_rdoc=1&_fmt=high&_orig=search&
_origin=search&_zone=rslt_list_item&_cdi=5644&_sort=r&
_st=13&_docanchor=&view=c&_ct=1&_acct=C000050221&
_version=1&_urlVersion=0&_userid=10&md5=d1fd8b6eb5d6ef
3ebf18835ddc41e761&searchtype=a. Retrieved Nov. 30, 2010.
Jesper Kjeldskov, Jeni Paay: "Just-for-us: a context-aware mobile
information system facilitating sociality", ACM, 2 Penn Plaza, Suite
701, New York, USA, Sep. 19, 2005-Sep. 22, 2005, pp. 23-30,
XP040026719, Salzburg. DOI: 10.1145/1085777.1085782. ISBN:
1-59593-089-2. Abstract; figures 3-9; section 4.
Extended European Search Report dated May 18, 2011, in corre-
sponding application No. 10174308.6.
Francesca Carmagnola et al., "Tag-based user modeling for social
multi-device adaptive guides", User Modeling and User-Adapted
Interaction, Kluwer Academic Publishers, vol. 18, No. 5, Jul. 29,
2008, pp. 497-538, XP019650064, ISSN: 1573-1391. DOI:
10.1007/s11257-008-9052-2. Abstract; pp. 498-500; pp. 510-515.
Partial European Search Report mailed Jan. 28, 2011, in correspond-
ing application No. 10174308.6.
Examination report mailed Feb. 14, 2013, in corresponding European
patent application No. 10174308.6.
Notice of Allowance and Fee(s) Due mailed Sep. 12, 2013, in corre-
sponding European patent application No. 10174308.6.
Extended European Search Report mailed Dec. 18, 2013, in corre-
sponding European patent application No. 13183343.6.
Office Action mailed Dec. 12, 2013, in corresponding Canadian
patent application No. 2,748,971.
Extended European Search Report mailed Sep. 18, 2013, in corre-
sponding European patent application No. 10174308.6.
`
`* cited by examiner
`
`
`
`
U.S. Patent    Sep. 2, 2014    Sheet 1 of 10    US 8,825,084 B2

[FIG. 1: flow chart of method 1000 (blocks 1010 through 1050):
display a graphical user interface; receive data indicative of
current location; determine at least one action spot within a
predetermined distance from the current location; signify the
at least one action spot on the graphical user interface;
provide an indication of the activity level at the at least one
action spot.]
`
`
`
U.S. Patent    Sep. 2, 2014    Sheet 2 of 10    US 8,825,084 B2

[FIG. 2: mobile device 100 displaying a map 206 on its display
screen; labeled landmarks include Store Front, Garden Avenue,
Office Park, Lakeside, and Tennis Court; reference numerals
include 103 and 208.]
`
`
`
U.S. Patent    Sep. 2, 2014    Sheet 3 of 10    US 8,825,084 B2

[FIG. 3: interactive map showing action spots relative to the
current location of the mobile device; labels include Store
Front, Garden Avenue, Neighbor, Statue Garden, Parking, Rose
Garden, and Lakeside; reference numerals include 308.]
`
`
`
U.S. Patent    Sep. 2, 2014    Sheet 4 of 10    US 8,825,084 B2

[FIG. 4: display of a mobile device signifying a plurality of
action spots; labels include Store Front.]
`
`
`
U.S. Patent    Sep. 2, 2014    Sheet 5 of 10    US 8,825,084 B2

[FIG. 5: venue-specific map with action spots displayed on a
mobile device.]
`
`
`
U.S. Patent    Sep. 2, 2014    Sheet 6 of 10    US 8,825,084 B2

[FIG. 6: graphical user interface displaying the documenting
action associated with an action spot; labels include Parking
and Street.]
`
`
`
U.S. Patent    Sep. 2, 2014    Sheet 7 of 10    US 8,825,084 B2

[FIG. 7: compass-style interface to an action spot; readouts
include ETA: 2 Min; Heading: 234°; Activity Count: 300;
Distance: 0.2 Mi; Activity Type: Camera.]
`
`
`
U.S. Patent    Sep. 2, 2014    Sheet 8 of 10    US 8,825,084 B2

[FIG. 8: camera-viewfinder interface for determining action
spots; overlay readouts include an activity count, a bearing,
Distance: 0.3 Mi, and an ETA.]
`
`
`
U.S. Patent    Sep. 2, 2014    Sheet 9 of 10    US 8,825,084 B2

[FIG. 9: block diagram of the mobile device interacting in a
communication network.]
`
`
`
`
`
`
`
U.S. Patent    Sep. 2, 2014    Sheet 10 of 10    US 8,825,084 B2

[FIG. 10: block diagram of the interaction between a plurality
of resources, a mobile device, and a processor configured to
determine action spots relative to the location of the mobile
device.]
`
`
`
`
`
`
`
`
`SYSTEM AND METHOD FOR DETERMINING
`ACTION SPOT LOCATIONS RELATIVE TO
`THE LOCATION OF A MOBILE DEVICE
`
`CROSS-REFERENCE TO RELATED
`APPLICATIONS
`
This application is a continuation of U.S. application Ser.
No. 12/870,676, filed Aug. 27, 2010. U.S. application Ser.
No. 12/870,676 is incorporated by reference in its entirety.
`
`FIELD OF TECHNOLOGY
`
`The subject matter herein generally relates to mobile
`devices, and more specifically relates to a system and method
`for determining an action spot based on the location of a
`mobile device.
`
`BACKGROUND
`
`With the advent of more robust electronic systems,
`advancements of mobile devices are becoming more preva-
lent. Mobile devices can provide a variety of functions includ-
ing, for example, telephonic, audio/video, and gaming func-
tions. Mobile devices can include mobile stations such as
cellular telephones, smart telephones, portable gaming sys-
`tems, portable audio and video players, electronic writing or
`typing tablets, handheld messaging devices, personal digital
`assistants, and handheld computers.
`Mobile devices allow users to have an integrated device
`which can perform a variety of different tasks. For example, a
mobile device can be enabled for each of or some of the
following functions: voice transmission (cell phones), text
`transmission (pagers and PDAs), sending and receiving data
`for viewing of Internet websites, multi-media messages,
`videography and photography. Additionally, mobile devices
`can include one or more applications such as a map applica-
`tion or a navigation application for retrieving maps and direc-
`tions to locations relative to the mobile device.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`Implementations of the present technology will now be
`described, by way of example only, with reference to the
`attached figures, wherein:
FIG. 1 is an illustrative flow chart of a method for deter-
mining a mobile device’s current location and signifying an
action spot, in accordance with an exemplary implementation
`of the present technology;
`FIG. 2 is an illustrative implementation of an electronic
`device with a map displayed in accordance with the present
`technology;
`FIG. 3 is an illustrative implementation of a graphical user
`interface displaying an action spot within a predetermined
`distance from a current location of a mobile device shown in
`FIG. 2;
`FIG. 4 is an illustrative implementation of a display of a
`mobile device signifying a plurality of action spots present
`within the vicinity of the current location of the mobile
`device, in accordance with the present technology;
`FIG. 5 is an illustrative implementation of a graphical user
`interface of a mobile device displaying a venue-specific map
`and action spots in accordance with the present technology;
`FIG. 6 is an illustrative implementation of a graphical user
`interface of a mobile device displaying the documenting
`action associated with an action spot within a predetermined
`distance from the current location of the mobile device;
`FIG. 7 is an illustrative implementation of a graphical user
`interface of a mobile device having a compass showing at
`least the distance and direction to an action spot proximate to
`the mobile device;
`FIG. 8 is an illustrative implementation of a graphical user
`interface for determining action spots that utilizes a camera
`viewfinder of an integrated camera of the mobile device;
`FIG. 9 is a block diagram representing a mobile device
`interacting in a communication network in accordance with
`an exemplary implementation of the present technology; and
`FIG. 10 is a block diagram representing the interaction
`between a plurality of resources, a mobile device, and a
`processor configured to determine action spots relative to the
`location of the mobile device in accordance with an exem-
`
`plary implementation of the present technology.
`
`DETAILED DESCRIPTION
`
`For simplicity and clarity of illustration, where appropri-
`ate, reference numerals have been repeated among the differ-
`ent figures to indicate corresponding or analogous elements.
`In addition, numerous specific details are set forth in order to
`provide a thorough understanding of the implementations
`described herein. However, those of ordinary skill in the art
`will understand that the implementations described herein
`can be practiced without these specific details. In other
`instances, methods, procedures and components have not
been described in detail so as not to obscure the related
`relevant feature being described. Also, the description is not
`to be considered as limiting the scope of the implementations
`described herein.
`
`Several definitions that apply throughout this disclosure
`will now be presented. The word “coupled” is defined as
`connected, whether directly or indirectly through intervening
`components, and is not necessarily limited to physical con-
`nections. The term “communicatively coupled” is defined as
connected, whether directly or indirectly through intervening
`components, is not necessarily limited to a physical connec-
`tion, and allows for the transfer of data. The term “mobile
device” is defined as any electronic device that is capable of at
least accepting information entries from a user and includes
`the device’s own power source. A “wireless communication”
`means communication that occurs without wires using elec-
`tromagnetic radiation. The term “highlight” refers to altering
`the appearance of a graphical item displayed on the display
`screen to indicate that the graphical item has been selected for
`execution. For example, highlighting can include changing
`the color of the graphical item, changing the font or appear-
ance of the graphical item, applying a background color to the
`graphical item, superimposing a block of semi-transparent
`color over the graphical item, placing a border around the
`graphical item, enlarging the graphical item as compared to
`other graphical items proximate to the highlighted graphical
`item, or other similar and known methods of highlighting
graphical items or text items displayed on a display screen. The
`term “memory” refers to transitory memory and non-transi-
`tory memory. For example, non-transitory memory can be
`implemented as Random Access Memory (RAM), Read-
`Only Memory (ROM), flash, ferromagnetic, phase-change
`memory, and other non-transitory memory technologies.
`The term “activity” refers to an action taken by a mobile
`device. For example, an activity can include but is not limited
to a documenting action (such as text messaging, emailing,
blogging, posting a message on a social networking internet
`site, or any other documenting actions), a recording action
`(such as video recording, audio recording, or photographing
`taken by a mobile device) or any other action where the
`mobile device is being used to observe and make note of a
`location or an event currently occurring at the location of the
`mobile device. The term “action spot” refers to a location or
`an event where at least one activity is occurring relative to the
`current location of another mobile device.
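The two activity classes defined above, documenting actions and recording actions, could be modeled with a small classifier. This is an illustrative sketch; the specific action identifiers and the enum are assumptions, and only the documenting/recording split comes from the disclosure.

```python
from enum import Enum
from typing import Optional

class ActivityKind(Enum):
    DOCUMENTING = "documenting"  # e.g. text messaging, emailing, blogging, social posts
    RECORDING = "recording"      # e.g. video recording, audio recording, photographing

# Hypothetical action identifiers; the patent does not define a wire format.
DOCUMENTING_ACTIONS = {"text_message", "email", "blog_post", "social_post"}
RECORDING_ACTIONS = {"video", "audio", "photo"}

def classify(action: str) -> Optional[ActivityKind]:
    """Map an action identifier to its activity class, or None if unknown."""
    if action in DOCUMENTING_ACTIONS:
        return ActivityKind.DOCUMENTING
    if action in RECORDING_ACTIONS:
        return ActivityKind.RECORDING
    return None
```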
`
`When mobile devices are enabled for navigational func-
`tions, mobile devices can retrieve and display maps and direc-
tions to locations relative to the current location of the mobile
`
`device. Typically, the maps and directions are limited in infor-
`mation. For example, maps are limited to displaying the
`streets within a city. In order to find information relating to
`events and happenings currently occurring proximate to the
mobile device’s present location, the user of the mobile
`device will have to search an external resource, such as an
electronic events calendar, internet sites, internet calendars of
`individual business or event holders (stores, restaurants, con-
`cert venues, bars, etc.), and compare the locations of the
`found events and happenings to the mobile device’s current
`location. Such a process of manually researching events and
`happenings, determining the location of the events and hap-
`penings, and comparing the location of the events and hap-
`penings to the user’s current location is tedious and results in
user frustration. Moreover, the results of the user’s research of
`current events and happenings can be incomplete and inac-
`curate, and the user can miss certain happenings that are close
`in proximity to the current location of the user’s mobile
`device.
`
`The present disclosure provides a system and method of
`determining action spot locations relative to the location of a
`mobile device.
`In one implementation, a mobile device
`includes a display and a processor module communicatively
`coupled to the display. The processor can be configured to
receive executable instructions to: determine a current loca-
tion of the mobile device; determine at least one action spot
`within a predetermined distance from the current location of
`the mobile device; signify the at least one action spot with a
graphical item on the display of the mobile device; and mark
`the graphical item according to an activity level of the at least
one action spot. The action spot can include a location
relative to the current location of the mobile device where at
least one other mobile device has engaged in documenting
`action within a predetermined period of time.
`FIG. 1 is an illustrative implementation of a flow chart of a
`method 1000 for determining action spots relative to the
`location of a mobile device. The method 1000 can be imple-
`mented on any mobile device, such as a cell phone, a smart
phone, a netbook, a global positioning system (GPS) device, an
electronic tablet, an electronic pad, a personal digital assistant
`(PDA), or any other similar electronic device which includes
`a display and a processor communicatively coupled to the
`display. In FIG. 1, a graphical user interface can be displayed
`on the display of a mobile device (Block 1010). For example,
`the graphical user interface can be a map, an interactive map,
`a graphical user interface associated with an application con-
`figured to retrieve maps and directions, a graphical user inter-
`face associated with an application configured to determine
`action spot locations, a graphical user interface of a camera
`application, or any other similar graphical user interface
`where the location of the mobile device and action spots
`relative to the location of the mobile device can be displayed.
Data indicative of the current location of the mobile device
is received (Block 1020) and can be displayed on the graphi-
`cal user interface. In the illustrated implementation, a proces-
`sor of the mobile device can receive the data indicative of the
`
`current location of the mobile device. In at least some imple-
`mentations, the data indicative of the current location of the
`mobile device can be received from a satellite positioning
`system, a communications network system, a triangulariza-
`tion system, or any other system that allows for determining
`the location or position of a mobile device.
`The processor can determine at
`least one action spot
`located within a predetermined distance from the current
`location of the mobile device (Block 1030). In at least one
`implementation, the at least one action spot can be deter-
mined as a location where at least one other mobile device has
engaged in a documenting action within a predetermined
`period of time from the time the mobile device arrived at the
`current location of the mobile device. For example, the pro-
`cessor can determine the at least one action spot as the loca-
`tion where at least one other mobile device is composing an
`email, composing a text message, messaging on an instant
`messenger application, posting messages, pictures, or videos
`on a social networking site, posting on a virtual posting
`mechanism, or any other similar documenting action. Alter-
`natively, the at least one action spot can be determined based
`on at least one other mobile device performing a recording
`action, such as video recording, audio recording, or photo-
`graphing, within a predetermined distance from the current
`location of the mobile device. In another implementation, the
`at least one action spot can be determined by monitoring the
`number of data packet transmissions occurring within a par-
`ticular geographical area or the number of data packets being
`transmitted from at least one other mobile device. In yet other
implementations, the at least one action spot can be the location
`where at least one other mobile device has documented,
`recorded, accounted, chronicled, or otherwise has taken note
`of a location or a current happening occurring at the location.
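The first of the determinations above, qualifying documenting actions that occurred within a predetermined distance of the device and within a predetermined time window of its arrival, can be sketched in a few lines. This is an illustrative sketch only; the DocEvent structure, the haversine distance calculation, and the 1 km and one-hour defaults are assumptions, not values taken from the patent.

```python
import math
from dataclasses import dataclass
from typing import List

@dataclass
class DocEvent:
    """A documenting action reported by another mobile device (assumed shape)."""
    lat: float        # degrees
    lon: float        # degrees
    timestamp: float  # seconds since epoch

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    earth_radius_km = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

def find_action_spots(cur_lat, cur_lon, arrival_time, events: List[DocEvent],
                      max_km=1.0, window_s=3600.0):
    """Keep events that qualify as action-spot evidence: a documenting action
    within max_km of the device's current location and within window_s of
    the time the device arrived there."""
    return [e for e in events
            if haversine_km(cur_lat, cur_lon, e.lat, e.lon) <= max_km
            and abs(e.timestamp - arrival_time) <= window_s]
```

In practice the filtering would run server-side against events reported by many devices; the sketch only shows how the distance and time predicates combine.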
`The at least one action spot is signified on the graphical
`user interface (Block 1040). For example, the processor can
`execute instructions to display the at least one action spot on
`the graphical user interface as a graphical item such as an
`icon, a picture, a text representation, a drawing, an image, a
`symbol, or any other graphical item that is representative of
`the at least one action spot. The at least one action spot can
also be displayed relative to the current location of the mobile
device. The processor can determine the level of activity at the
at least one action spot and can provide an indication of the
activity level at the at least one action spot on the
`graphical user interface (Block 1050). With a graphical indi-
`cation of the action spots and activity levels associated with
`the action spots, a user can review information related to
`current happenings within the vicinity of the user’s mobile
`device. Additionally, information relating to the popularity of
`and the current event occurring within the vicinity surround-
`ing or associated with the current position of mobile devices
`is readily available to the mobile device without having to use
`an external device or a manual search engine, such as an
`internet search engine.
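One plausible way to turn qualifying documenting actions into discrete action spots with activity levels (Blocks 1030 and 1050) is to snap each event to a coarse grid cell and count events per cell. The grid-cell approach and the cell size are illustrative assumptions; the patent does not prescribe a particular grouping technique.

```python
from collections import Counter

def cluster_action_spots(events, cell_deg=0.005):
    """Snap each (lat, lon) documenting-action event to a coarse grid cell
    (roughly 550 m of latitude at cell_deg=0.005) and count events per cell.
    Each cell, with its count, is a candidate action spot and activity level."""
    counts = Counter(
        (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)
        for lat, lon in events
    )
    # Most active spot first, so the largest graphical item can be drawn first.
    return sorted(counts.items(), key=lambda kv: -kv[1])
```

A real implementation would likely use a proper spatial clustering algorithm and server-side aggregation; the grid merely illustrates how per-location activity counts can fall out of the event stream.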
`Exemplary implementations of the method 1000 for deter-
`mining action spot locations relative to the location of a
`mobile device will be described in relation to FIGS. 2-8.
`
FIG. 2 is an exemplary implementation of the system and
`method of determining an action spot location implemented
`on a mobile device that is a mobile communication device.
`
`The mobile device 100 includes a housing which encases
`internal components of the device, such as a microprocessor
`110 (shown in FIG. 9), a printed circuit board (not shown),
`and other operational components. One of ordinary skill in
`the art will understand that other operational components can
`be included in the mobile device 100, but the present disclo-
`sure will not discuss such operational components in detail
for the sake of brevity. The present disclosure provides details
as to the components utilized in the implementation of the
system and method of determining an action spot location on a
mobile device.

The mobile device 100 includes a display screen 102 for
displaying graphical user interfaces associated with
applications programmed on the mobile device 100. The display
screen 102 can be a liquid crystal display (LCD) screen, a
light emitting diode (LED) screen, an organic light emitting
diode (OLED) screen, an active-matrix organic light emitting
diode (AMOLED) screen, a nanocrystal display, a nanotube
display, a touch-sensitive display screen, or any display
screen on which graphical or visual elements can be displayed.
Above the display screen 102 is a speaker 106 for emitting
sound from the mobile device 100. Below the display screen 102
is a navigation tool 103. The navigation tool 103 can be an
omnidirectional pad, a jogball, a trackball, an omnidirectional
joystick, a scroll wheel, an optical navigation tool, an
optical trackball, or any other navigation tool. Below the
navigation tool 103 is a keyboard 104 having a plurality of
keys 105. In the illustrated implementation, each key 105 of
the keyboard 104 bears at least one of an alphabetic, numeric,
symbolic, or functional indicia. The indicia signify the data
input to be input upon actuation of the key 105 bearing the
indicia. In FIG. 2, the keyboard 104 is a reduced keyboard,
where at least one key 105 is associated with more than one
alphabetic indicia. In an alternative implementation, the
keyboard 104 can be a full keyboard having each key 105
associated with an alphabetic indicia. The indicia on the keys
122 of the keyboard 104 are arranged in a QWERTY keyboard
layout 107; however, one of ordinary skill in the art will
appreciate that the keyboard layout 107 can be an AZERTY
layout, a QWERTZ layout, a DVORAK layout, a pinyin Chinese
keyboard layout, or any other keyboard layout that allows a
user to input alphabetic, numeric, symbolic, and functional
indicia. The keys 105 can be press-actuable keys,
touch-sensitive keys, capacitive keys, or any other similar key
that allows for the input of data to a processor of the mobile
device upon user-engagement with the key 105.

In the illustrated implementation of FIG. 2, a graphical user
interface 206 in the form of a map is displayed on the display
screen 102. The map 206 can be a representation of the vicinity
surrounding the current location of the mobile device 100. In
at least one implementation, the map 206 can be displayed in
response to the selection and execution of a map application, a
navigation application, an application for determining action
spots, or any other similar application that provides
directions, maps, and information relating to geographical
locations on the mobile device 100. In an alternative
implementation, a graphical user interface such as a dialogue
box can be displayed in response to the launching of an
application for determining action spots. In another
implementation, an interactive map can be displayed allowing
the user of the mobile device 100 to select graphical items,
manipulate the map, or otherwise alter the map displayed on the
mobile device 100.

FIG. 3 is an exemplary implementation of a graphical user
interface associated with a system for determining an action
spot relative to the location of a mobile device. More
specifically, FIG. 3 illustrates a screenshot of a graphical
user interface 206 displayed on the display 102 of a mobile
device 100 that is an interactive map. In the illustrated
implementation, the map 206 can be displayed after an
application configured to determine action spots has been
selected and launched. The map 206 can include graphical
representations 308 of venues, locations, monuments, buildings,
streets, lakes, and other locational landmarks representing the
vicinity and area surrounding the current location of the
mobile device 100.

Upon selection and execution of an application to display the
map 206, a processor 110 (shown in FIG. 9) can execute
instructions to determine the current location of the mobile
device 100 by retrieving positional data at a position module
101 (shown in FIG. 9) communicatively coupled to the processor
110. The position module 101 can gather the positional data
from a GPS system, a triangularization system, a communications
network system, or any other system that can determine the
position of a mobile device. The current loca-
tion 302 of the mobile device 100 is identified on the map 206
by a graphical item. In FIG. 3, the current location 302 of the
mobile device 100 is identified by a graphical item that is a
star. In alternative implementations, the current location 302
can be a graphical item that is a circle, a square, or any other
shape, a human-shaped icon, a text representation, a picture
or photo, or any other graphical or textual item that signifies
the current location 302 of the mobile device 100.
`
`The processor 110 can determine whether there are action
`spots 304, 306 relative to the current location 302 of the
`mobile device 100 and signify the action spots 304, 306 on the
`map 206. In the illustrated implementation, the processor 110
identifies two action spots within a predetermined distance
`from the current location 302 of the mobile device 100. The
`
`action spots 304, 306 are signified on the map 206 by graphi-
`cal items that are clouds. However, one of ordinary skill in the
`art will appreciate that the graphical items can be any other
`shape, a picture, any graphical item, a textual representation,
`a symbolic representation, or any other graphical representa-
`tion that signifies the presence of an action spot within a
`predetermined distance from the current location 302 of the
`mobile device 100.
`
`Also illustrated in FIG. 3, the action spots 304, 306 can
`have different sizes to indicate the level of activity associated
`with the action spot 304, 306. For example, the larger in size
`the graphical item is compared to other graphical items rep-
`resenting action spots, the more activity is occurring at the
`location identified by the graphical
`item. In the specific
`implementation in FIG. 3, the graphical item associated with
`action spot 304 proximate to the lake is larger than the graphi-
`cal item associated with action spot 306 proximate to the
`venue entitled Celebrity Court. The larger graphical item
`associated with action spot 304 can indicate that more docu-
`menting activity has occurred at the lake than at the Celebrity
`Court, and thus, the action spot 304 is a more active, a more
`popular, or a more lively location than action spot 306. The
`level of activity associated with the action spot 304, 306 can
`also be represented by varying the colors of the graphical
`items representing the action spots 304, 306. For example, a
`graphical item that is yellow can represent a moderate amount
`of documenting action; while a graphical item of green rep-
`resents a large amount of documenting action, and thus an
`increased likelihood that the action spot associated with a
`green graphical item is a more happening location, a more
popular location, or a location where a large number of people
`have gathered to witness and document a current event or
`happening. In other words, the indication of the level of
`activity includes coloring the graphical item in accordance
`with a range of activity occurring at the at least one action
spot 304, 306.
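The size-and-color scheme described for FIG. 3 can be sketched as a mapping from a spot's documenting-action count to a marker color and relative size. The numeric thresholds and the gray fallback color are illustrative assumptions, not values from the patent.

```python
def marker_for_activity(count, moderate=100, high=250):
    """Choose a marker color and relative size for an action spot from its
    documenting-action count, mirroring the size/color scheme of FIG. 3.
    The thresholds and the gray fallback are illustrative assumptions."""
    if count >= high:
        color = "green"    # large amount of documenting action
    elif count >= moderate:
        color = "yellow"   # moderate amount of documenting action
    else:
        color = "gray"     # little recent activity
    size = 1.0 + count / high  # larger graphical item for more active spots
    return color, size
```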
`The implementation of the present technology illustrated
`in FIG. 3 illustrates the results of the processor’s 110 deter-
`mination of action spots 304, 306, where the action spots 304,
306 are based on locations where at least one other mobile device
`
`has engaged in documenting action within a specific period of
`time. Reference will now be made with respect to FIG. 10 in
`regards to the processor’s 110 determination of the action
spots 304, 306. FIG. 10 is a block diagram of the processor’s
`110 interaction and communication with the mobile device
`
100 and a plurality of resources from which the processor 110
`can retrieve data representative of documenting actions
`occurring within a predetermined distance from the mobile
`device 100. In at least one implementation, the processor 110
`can retrieve the data from a resource 1110, 1130, 1140 con-
`figured to monitor the do