`
US008279173B2

United States Patent
Brown et al.

(10) Patent No.: US 8,279,173 B2
(45) Date of Patent: *Oct. 2, 2012
`
(54) USER INTERFACE FOR SELECTING A
PHOTO TAG
`
(75) Inventors: Michael S. Brown, Kitchener (CA);
`Gerhard D. Klassen, Waterloo (CA);
`Terrill Dent, Waterloo (CA)
`
(73) Assignee: Research In Motion Limited, Waterloo,
Ontario (CA)
`
(*) Notice: Subject to any disclaimer, the term of this
patent is extended or adjusted under 35
U.S.C. 154(b) by 0 days.

This patent is subject to a terminal dis-
claimer.
`(21) Appl. No.: 13/252,807
`
(22) Filed: Oct. 4, 2011

(65) Prior Publication Data

US 2012/0023436 A1    Jan. 26, 2012
`
`Related U.S. Application Data
(63) Continuation of application No. 11/746,285, filed on
May 9, 2007, now Pat. No. 8,031,170.
`
(51) Int. Cl.
G09G 5/00 (2006.01)
`
6,650,889 B1 *   11/2003  Evans et al. .............. 455/412.1
7,636,450 B1     12/2009  Bourdev
7,693,906 B1      4/2010  Amidon et al.
7,735,018 B2      6/2010  Bakhash
7,840,907 B2     11/2010  Kikuchi et al.
8,031,170 B2 *   10/2011  Brown et al. ............... 345/156
2003/0088582 A1   5/2003  Pflug
2004/0039988 A1   2/2004  Lee et al.
2004/0252119 A1  12/2004  Hunleth et al.
2005/0030588 A1   2/2005  Reese et al.
2005/0039108 A1   2/2005  Hudson
2005/0057576 A1   3/2005  Shen et al.
(Continued)
`
FOREIGN PATENT DOCUMENTS
CN    1193632 A    1/2002
(Continued)
`
OTHER PUBLICATIONS
Second Office Action mailed Jun. 22, 2011. In Chinese patent appli-
cation No. 200810144678.1.
(Continued)
`
`Primary Examiner — Nitin Patel
`(74) Attorney, Agent, or Firm — Novak Druce + Quigg LLP
`
(52) U.S. Cl. ........................................ 345/156
(58) Field of Classification Search .............. 345/156;
455/412.1
See application file for complete search history.

(57)                  ABSTRACT

There is disclosed a user interface for selecting a photo tag. In
an embodiment, the user interface embodies a method of
selecting a photo tag for a tagged photo, comprising: provid-
ing a tag entry field for entering a photo tag; in dependence
upon a string entered by a user, displaying in a matching tag
list any tags from one or more selected tag sources matching
the entered string. The method may further comprise display-
ing a tag type for each tag appearing in the matching tag list.
The method may further comprise allowing user selection of
a tag in the matching tag list to complete the tag entry field.
`
`(56)
`
`References Cited
`
`5,479,602 s EN points
`aecker et al.
`6,002,401 A 12, 1999 Baker
`6,317,142 B1
`1 1/2001 Decoste et al.
`
`
`
`408
`
`20 Claims, 10 Drawing Sheets
`
`410
`
`404
`
`404
`
`406
`
`412
`
`ext i typed before
`Thee Boukeluth
`Tim Jackson
`
`
`
`Facebook's Exhibit No. 1001
`Page 1
`
`
`
`
U.S. PATENT DOCUMENTS

2005/0193010 A1   9/2005  DeShan et al.
2006/0173918 A1   8/2006  Nakase et al.
2006/0262116 A1  11/2006  Moshiri et al.
2008/0021876 A1   1/2008  Ahern et al. ...................... 707/3
2008/0106594 A1   5/2008  Thrun
2008/0215583 A1   9/2008  Gunawardena et al.
2008/0282177 A1  11/2008  Brown et al.
2008/0306921 A1  12/2008  Rothmuller et al.
2008/0309617 A1  12/2008  Kong et al.
2009/0225178 A1   9/2009  Nakase et al.
`
FOREIGN PATENT DOCUMENTS

CN    1933643 A      3/2007
WO    2006102656 A   9/2006
`
OTHER PUBLICATIONS
English translation of Second Office Action mailed Jun. 11, 2011. In
Chinese patent application No. 200810144678.1.
Third Office Action mailed Feb. 29, 2012. In Chinese patent appli-
cation No. 200810144678.1.
English translation of Third Office Action mailed Feb. 29, 2012. In
Chinese patent application No. 200810144678.1.
Office Action mailed Jan. 10, 2012. In Canadian patent application
No. 2,630,947.
Office Action mailed Nov. 18, 2010. In Canadian patent application
No. 2,630,947.
First Office Action mailed May 5, 2010. In corresponding Chinese
patent application No. 200810144678.1.
English translation of First Office Action mailed May 5, 2010. In
corresponding Chinese patent application No. 200810144678.1.
Ballagas, R. et al.; The smart phone: A ubiquitous input device,
Pervasive Computing, IEEE (online) vol. 5, No. 1, Jan. 2006, pp.
70-77, XP002447195, ISSN: 1536-1268. Retrieved (by EPO) from
the internet (on Aug. 17, 2007): URL:http://ieeexplore.ieee.org/ie15/
7756/33539/01593574.pdf?tp=&isnumber=arnumber=1593574.
Ahern et al., ZoneTag: Designing context-aware mobile media cap-
ture to increase participation, Yahoo! Research Berkeley, Sep. 6,
2006. http://groups.ischool.berkeley.edu/pics/papers Ahern et
al Zonetag pics06.pdf.
Communication Pursuant to Article 94(3) EPC mailed Jan. 30, 2008.
In European patent application No. 07107866.1.
Summons to Attend Oral Proceedings Pursuant to Rule 115(1) EPC
mailed Aug. 27, 2008. In European patent application No.
07107866.1.
Decision of the Examining Division mailed Dec. 12, 2008. In Euro-
pean patent application No. 07107866.1.
Extended European Search Report mailed February 23, 2009. In Euro-
pean patent application No. 08170532.9.
Communication Pursuant to Article 94(3) EPC mailed Jan. 11, 2010.
In European patent application No. 08170532.9.
Wiseman, Josh; "iPhoto, meet Facebook"; Mar. 15, 2007; http://blog.
facebook.com/blog.php?post=22536571.
Extended European Search Report mailed Aug. 28, 2007. In Euro-
pean patent application No. 07107866.1.
First Office Action mailed May 2010. In corresponding Chinese
patent application No. 200810144678.1.
Ballagas, R. et al., The smart phone: A ubiquitous input device, Perva-
sive Computing, IEEE (online) vol. 5, No. 1, Jan. 2006, pp. 70-77,
XP002447195, ISSN: 1536-1268. Retrieved (by EPO) from the
internet (on Aug. 17, 2007): URL:http://ieeexplore.ieee.org/ie15/
7756/33539/01593574.pdf?tp=&isnumber=arnumber=1593574.
Ahern et al., ZoneTag: Designing context-aware mobile media capture
to increase participation, Yahoo! Research Berkeley, Sep. 6, 2006,
http://groups.ischool.berkeley.edu/pics/papers Ahern et al
Zonetag pics06.pdf.
`* cited by examiner
`
`
`
`
U.S. Patent    Oct. 2, 2012    Sheet 1 of 10    US 8,279,173 B2

[FIG. 1: schematic block diagram of handheld mobile communication device 100 — main processor 102, communication subsystem 104, flash memory 108, display 110, auxiliary I/O 112, data port 114, keyboard 116, trackball 117, short-range communications 122, GPS subsystem, camera module 126, battery 130, battery interface 132, SIM/RUIM, software applications 134, messaging application 136, Image Applications 148]
`
`
`
`
[FIG. 2 (Sheet 2 of 10): front view of communication device 100 with camera lens 127]
`
`
`
`
[FIGS. 3A and 3B (Sheet 3 of 10): user interface screens 300A and 300B — photo 301 of subject 302, tag list 304, "Add" button 306, cross-hair pointer 308]
`
`
`
`
[FIGS. 3C and 3D (Sheet 4 of 10): user interface screens 300C and 300D — subject 302, added tag 310, grey square pointer 312]
`
`
`
`
[FIG. 3E (Sheet 5 of 10): user interface screen 300E — free-form text tag 314, circle pointer 316]
`
`
`
`
`
`
[FIGS. 4A and 4B (Sheet 6 of 10): tag selection user interface screens 400A and 400B — tag selection user interface 404, tag entry field 406, reference numerals 410, 412]
`
`
`
`
[FIGS. 4C and 4D (Sheet 7 of 10): user interface screens 400C and 400D — tag entry field 406, matching tag list with tags 412a, 412b, 412c]
`
`
`
`
[FIGS. 4E and 4F (Sheet 8 of 10): user interface screens 400E and 400F — tag selection user interface 404, tag entry field 406, selected tag 412c, reference numeral 410]
`
`
`
`
[FIG. 5 (Sheet 9 of 10): flowchart of method 500 for editing photo tags]
Start
- Display a photo in a display
- Provide a photo tagging mode for editing tags for the displayed photo
- In the photo tagging mode, constrain the navigation device pointer to be navigable only within the boundaries of the photo
- For each tag used to tag a subject or object in the photo, identify the tag type and associate a unique pointer for each tag type to highlight the corresponding tagged subject or object
- Display a tag list with the photo
- Identify the tag type of each tag in the tag list using a visual identifier
- Upon user selection of a tag in the tag list, highlight the associated tagged subject or object in the photo using the unique pointer
- Upon user selection of a tagged subject or object in the photo, highlight the associated tag in the tag list
- Display context data associated with the tag
End
`
`
`
`
[FIG. 6 (Sheet 10 of 10): flowchart of method 600 for selecting a photo tag]
Start
- Provide a tag entry field for entering a photo tag (602)
- In dependence upon a string entered by the user, display in a matching tag list any tags from one or more selected tag sources matching the entered string (604)
- Display a tag type for each tag appearing in the matching tag list (606)
- Order tags in the tag list by tag type (608)
- Allow user selection of a tag in the matching tag list to complete the tag entry field (610)
- If there are no tags remaining in the matching tag list, add a new free-form text string to a free-form text cache (612)
End
`
`
`
`
`1.
`USER INTERFACE FOR SELECTINGA
`PHOTO TAG
`
`US 8,279,173 B2
`
`CROSS-REFERENCE TO RELATED
`APPLICATIONS
`
This application is a continuation of U.S. application Ser.
No. 11/746,285 entitled “USER INTERFACE FOR
SELECTING A PHOTO TAG” and filed on May 9, 2007, now
U.S. Pat. No. 8,031,170. U.S. application Ser. No. 11/746,285
is fully incorporated by reference herein.
`
`10
`
`FIELD OF THE INVENTION
`
`The present invention relates generally to a user interface
`for selecting a photo tag.
`
`15
`
`BACKGROUND
`
Identifying people or objects in photographs is popular in
many online contexts, such as photo sharing, social network-
ing, etc. Selecting a “tag” to associate with an identified point
in a photograph can be a complicated task if there are many
potential tags to choose from. In addition, on a wireless mobile
communication device, where there are constraints on the size
of the display and the flexibility of the input method, some of
the common techniques used on desktops and laptops with
full-sized screens do not work as well.
What is needed is an improved user interface for selecting
tags on a smaller display, such as may be found on a wireless
mobile communication device.
`
`25
`
`30
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`In the figures which illustrate exemplary embodiments:
`FIG. 1 is an illustration of a device in accordance with an
`embodiment;
`FIG. 2 is a schematic block diagram of a communication
`subsystem component in the device of FIG. 1;
`FIGS. 3A to 3E are illustrative user interface screens for
`editing photo tags in accordance with an embodiment;
`FIGS. 4A to 4F are illustrative user interface screens for
`selecting a photo tag in accordance with an embodiment;
`FIG. 5 is an illustrative flowchart of a method for editing
`photo tags in accordance with an embodiment; and
`FIG. 6 is an illustrative flowchart of a method for selecting
`a photo tag in accordance with an embodiment.
`
`DETAILED DESCRIPTION
`
`35
`
`40
`
`45
`
`50
`
`55
`
As noted above, the present invention relates to a user
interface for selecting a photo tag, particularly within a
smaller display, such as may be found on a wireless mobile
communication device.
In an illustrative embodiment, the invention may be prac-
ticed with a handheld mobile communication device in a
wireless operating environment. Shown in FIG. 1 is a sche-
matic block diagram of an illustrative handheld mobile com-
munication device 100. The communication device 100 may
comprise a number of components, including a main proces-
sor 102 which controls the overall operation of communica-
tion device 100. Communication functions, including data
and voice communications, may be performed through a
communication subsystem 104. The communication sub-
system 104 may receive messages from and send messages to
a wireless network 200.
`
`2
The main processor 102 may also interact with additional
subsystems such as a random access memory (RAM) 106, a
flash memory 108, a display 110, an auxiliary input/output
(I/O) subsystem 112, a data port 114, a keyboard 116, a
trackball 117, a speaker 118, a microphone 120, short-range
communications 122, a GPS subsystem 124, a camera mod-
ule 126, and associated camera logic 128.
Some of the subsystems of the communication device 100
may perform communication-related functions, whereas
other subsystems may provide “resident” or on-device func-
tions. By way of example, the display 110 and the keyboard
116 may be used for both communication-related functions,
such as entering a text message for transmission over the
network 200, and device-resident functions such as a calcu-
lator or task list. The trackball 117 may be used for various
navigation functions, such as navigating through a graphical
user interface (GUI) menu displayed on display 110. The
trackball 117 may also be configured with a secondary actua-
tion feature, such as allowing a user to depress the trackball,
to allow selection of a highlighted item.
The camera module 126 may be adapted to capture an
image through a lens onto a light-sensitive image sensor such
as a charge-coupled device (CCD) sensor array or a comple-
mentary metal oxide semiconductor (CMOS) sensor array.
The camera lens may be a fixed focus lens, or a variable focus
lens with or without zoom features, controlled by camera
logic 128 to focus an image onto the CCD or CMOS sensor
array. The size and pixel density of the CCD or CMOS sensor
array may be suitably selected for the image resolution
required for a particular application. Camera logic 128 may
also control the camera lens aperture and/or shutter speed by
incorporating a suitable light exposure meter. Image capture
using camera module 126 may be initiated by a user controlling
a dedicated camera shutter, or a context-dependent program-
mable button or key (on keyboard 116, for example) that may
act as a camera shutter button.
Once captured by the CCD or CMOS sensor array, the
image may then be processed by camera logic 128 into a
suitable digital image file format such as Joint Photographic
Experts Group (JPEG), Tagged-Image File Format (TIFF),
Bit Mapping (BMP), different variations on these standard
image file formats, or a vendor-proprietary RAW image for-
mat. The image file format may allow for the addition of
image metadata to an image file in the industry-standard
exchangeable image file format (EXIF), or in some vendor-
proprietary metadata format. The image file may then be
stored in available device storage such as RAM 106 or flash
memory 108, and displayed on display 110.
`Still referring to FIG. 1, operating system software used by
`the main processor 102 is typically stored in a persistent store
such as flash memory 108. Those skilled in the art will appre-
ciate that the operating system, specific device applications,
`or parts thereof, may be temporarily loaded into a volatile
`store, such as the RAM 106, for processing by main processor
`102.
The communication device 100 may send and receive com-
munication signals over the wireless network 200 after
`required network registration or activation procedures have
`been completed. Network access may be associated with a
`subscriber or user of the communication device 100.
The communication device 100 may be a battery-powered
device and may include a battery interface 132 for receiving
one or more rechargeable batteries 130. In some embodi-
ments, the battery 130 may be a smart battery with an embed-
ded microprocessor. The battery interface 132 is coupled to a
regulator (not shown), which assists the battery 130 in pro-
viding power V+ to the communication device 100. The bat-
tery 130 may be used to power all components and modules in
the communication device 100, including the camera module
126 and associated camera logic 128.
The main processor 102, in addition to its operating system
functions, enables execution of various software applications
134 on the communication device 100. A subset of the software
applications 134 that control basic device operations, includ-
ing data and voice communication applications, will nor-
mally be installed on the communication device 100 during
its manufacture.
The software applications 134 may include a messaging
application 136. The messaging application 136 can be any
suitable software program that allows a subscriber or user of
the communication device 100 to send and receive wireless
text communications. Various alternatives exist for the mes-
saging application 136, as is well known to those skilled in the
art. Messages that have been sent or received by the user are
typically stored in local storage such as flash memory 108 of
the communication device 100, or in some other suitable
storage element in the communication device 100. In an alter-
native embodiment, some of the sent and received messages
may be stored remotely from the communication device 100,
such as in a data store of an associated host system that the
communication device 100 communicates with. In an
embodiment, the messaging application 136 may include a
Message List user interface that is configured to allow a user
to see a list of message objects (i.e., email messages) in a
convenient list form. This will be described in detail further
below.
Still referring to FIG. 1, communication device 100 may
execute an Image Applications Module 148 that may be
operatively integrated with camera module 126, camera logic
128, main processor 102, RAM 106, display 110 and various
other modules and components to provide various image
application functions for the images captured by the camera
module 126. Image Applications Module 148 may include
various sub-modules which may interact with each other, and
with other application modules such as the messaging appli-
cation 136, Internet browser module 138, address book mod-
ule 142, etc. in order to perform various functions.
In an embodiment, one of the sub-modules of the Image
Applications Module 148 may be a photo tagging module
148A configured to allow a user to identify various subjects
and objects within a photo. Photo tagging module 148A will
be discussed in more detail further below.
In an embodiment, another of the sub-modules of the Image
Applications Module 148 may be a photo tag selection module
148B configured to allow a user to select a photo tag from a
list of tags associated with various subjects and objects within
a photo. Photo tag selection module 148B will be discussed in
more detail further below.
The communication device 100 may further include a
device state module 140, an address book 142, a Personal
Information Manager (PIM) 144, and various other modules
150. Additional software applications may also be loaded
onto the communication device 100 through at least one of the
wireless network 200, the auxiliary I/O subsystem 112, the
data port 114, the short-range communications subsystem
122, or other device subsystem 124.
Now referring to FIG. 2, shown is an illustrative front view
of a handheld mobile communication device 100 that may
provide a suitable operating environment. As shown, the com-
munication device 100 may include a display 110, a keyboard
116, and other input or navigation means such as a trackball
117. The display 110 may be configured to display various
screens allowing the user of device 100 to view screen outputs
from the various software applications 134, including the
image applications 148. Display 110 may also be configured
to provide a touch-sensitive screen input in response to a
prompt or query displayed on display 110. The communica-
tion device 100 may further include a camera lens that may be
used to capture an image as described above with reference to
FIG. 1. In an embodiment, the integrated camera 126 may
provide a camera lens 127 on the back of the communication
device 100, such that a user may use the display 110 as a
camera viewfinder for framing an image.
Now referring to FIG. 3A, shown is an illustrative user
interface screen 300A in which photo tagging module 148A
may be configured for tagging a photograph in accordance
with an embodiment. As shown, a photo 301 of a subject 302
is displayed within the boundaries of the user interface. With
this user interface, a tag list 304 may include various tags
associated with subject 302 or other subjects or objects within
the photo 301. The user may click an “Add” button 306 in order
to enter a photo tagging mode as described below.
Now referring to FIG. 3B, as shown in screen 300B, once
in the photo tagging mode, the photo tagging module 148A
may be configured to display to the user a pointer, such as
cross-hair pointer 308, on the photo 301. The user may move
the cross-hair pointer 308 around the photo 301, but unlike
common web browser interfaces, the cross-hair pointer 308
may be constrained by the photo tagging module 148A to be
navigable only within the boundaries of the photo 301. The
user does not have the ability to move the cross-hair pointer
around the rest of the screen and perform other tasks, and the
navigation device (e.g. trackball 117) is thus dedicated to this
photo tagging function until the user exits tagging mode. As
will be appreciated, this may allow users to avoid inadvert-
ently clicking on some button or menu option just outside of
the borders of photo 301 when attempting to tag a subject or
object near an edge of the photo. This may be particularly
useful on a smaller display, where greater dexterity may be
required in order to navigate within a small area using a small
navigation device.
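The constrained-pointer behaviour described above amounts to clamping each proposed pointer position to the photo's bounding rectangle. The following Python sketch illustrates one way this could work; the class and field names are illustrative, not taken from the patent:

```python
def clamp(value, low, high):
    """Pin a coordinate to the inclusive range [low, high]."""
    return max(low, min(high, value))

class PhotoTaggingMode:
    """Keeps the tagging cross-hair navigable only within the photo."""

    def __init__(self, photo_left, photo_top, photo_width, photo_height):
        self.left = photo_left
        self.top = photo_top
        self.right = photo_left + photo_width
        self.bottom = photo_top + photo_height
        # Start the cross-hair at the centre of the photo.
        self.x = photo_left + photo_width // 2
        self.y = photo_top + photo_height // 2

    def move(self, dx, dy):
        """Apply a trackball delta, constrained to the photo area."""
        self.x = clamp(self.x + dx, self.left, self.right)
        self.y = clamp(self.y + dy, self.top, self.bottom)
        return self.x, self.y

mode = PhotoTaggingMode(photo_left=10, photo_top=20,
                        photo_width=200, photo_height=100)
mode.move(-500, 0)  # a large leftward move stops at the photo's left edge
```

Because every move is clamped, no sequence of trackball deltas can carry the pointer onto buttons or menus outside the photo's borders.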
When in tagging mode, the user may cancel tagging mode
using the escape key and return to simply viewing the existing
tag information. Alternatively, the user may choose to add a
tag at the current location of the cross-hair pointer 308 using an
enter key or by clicking on the navigation device (e.g. trackball
117).
If the user chooses to add a tag, the user may be presented
with another user interface to select the tag type and the
context information associated with the tag. As an illustrative
example, the tag types could include a free-form alphanu-
meric string, Facebook™ friends, address book entries (in
address book 142), browser bookmarks (in Internet browser
module 138), etc.
`Now referring to FIG. 3C, as shown by way of illustration
`in screen 300C, when a tag has been added by a user using
`photo tagging module 148A, the added tag 310 associated
`with subject 302 may appear in the tag list 304 near the photo
`301.
In an embodiment, the tag 310 could also include an icon,
or some other type of visual indicator adjacent the tag 310,
indicating what type of tag it is. Thus, many different types of
tags may be used to tag subjects or objects in the photo 301.
Now referring to FIG. 3D, as shown by illustration in
screen 300D, when the user scrolls over the tag 310 in the tag
list 304, the corresponding subject 302 in the photo may be
highlighted by the photo tagging module 148A in some way.
This highlighting could be different, depending on what type
of tag it is. For example, for a Facebook™ friend the pointer
could be a grey square pointer 312 (e.g. like Facebook uses
online). As another example, shown in screen 300E of FIG.
`
`25
`
`35
`
`45
`
`50
`
`55
`
`60
`
`65
`
`Facebook's Exhibit No. 1001
`Page 14
`
`
`
`US 8,279,173 B2
`
`10
`
`30
`
`35
`
`25
`
`5
`3E, for a free-form text tag 314, the tagged point in the photo
`301 could be indicated with a circle pointer 316.
When the user scrolls over a tag 310 in the tag list 304, a
menu may be activated with options associated with the tag.
For example, there may be menu items to edit or delete the
tag. There may also be context-sensitive menu options asso-
ciated with the specific tag type. For example, for a Facebook
friend there may be an item to view the friend's Facebook
profile. For an address book entry, there may be an item to
view the user's address card. For a browser bookmark, there
may be an item to visit that website.
In another embodiment, once subjects or objects have been
tagged in the photo 301, photo tagging module 148A may be
configured such that upon user selection of a tag 310 in the tag
list 304, the corresponding tagged subject or object in the
photo 301 may be highlighted using the corresponding
pointer identifying the tag type.
In another embodiment, once subjects (e.g. subject 302) or
objects have been tagged in the photo 301, photo tagging
module 148A may be configured such that upon user selec-
tion of or scrolling over a tagged subject or object in the photo
301, the corresponding tag 310 may be highlighted in the tag
list 304. As well, context data associated with the tag may be
displayed (e.g. in a pop-up window) upon user selection, or
automatically if so configured.
Thus, using the above described user interface, a plurality
of tag types may be used to tag subjects and objects in a photo,
and type-specific data may be associated with each tag, such
as the visual indicator or symbol used to highlight the
tagged subject or object in the photo, the custom actions
available for the tag, etc.
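The type-specific data described above can be modelled as a small registry keyed by tag type, bundling the pointer symbol used to highlight the tagged subject with the context-sensitive menu actions. A Python sketch follows; the type names, pointer labels, and menu strings are illustrative, since the patent does not prescribe a particular data structure:

```python
from dataclasses import dataclass, field

@dataclass
class TagTypeInfo:
    """Type-specific data associated with every tag of a given type."""
    pointer: str                                  # symbol used to highlight the subject
    actions: list = field(default_factory=list)   # context-sensitive menu items

TAG_TYPES = {
    "facebook_friend": TagTypeInfo(pointer="grey square",
                                   actions=["Edit", "Delete", "View profile"]),
    "address_book":    TagTypeInfo(pointer="grey square",
                                   actions=["Edit", "Delete", "View address card"]),
    "bookmark":        TagTypeInfo(pointer="grey square",
                                   actions=["Edit", "Delete", "Visit website"]),
    "free_form":       TagTypeInfo(pointer="circle",
                                   actions=["Edit", "Delete"]),
}

def menu_for(tag_type):
    """Menu options offered when the user scrolls over a tag of this type."""
    return TAG_TYPES[tag_type].actions
```

Registering a new tag source then only requires adding one entry to the table; the highlighting and menu code stays unchanged.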
`Now referring to FIG. 4A, shown in screen 400A is an
`illustrative tag selection user interface 404 for displaying a
`tag search facility as may be presented by photo tag selection
`module 148B. As shown in FIG. 4A, the user is initially
`presented with a tag entry field 406 indicating that he should
`start typing a tag. Upon completion of typing, the user may
`click “OK” 408 to select the tag.
In an embodiment, as the user begins to type, photo tag
selection module 148B may be configured to search one or
more selected “tag sources” for tags that match the currently
entered text. As shown by way of illustration in screen 400B
of FIG. 4B, these tag sources could include, for example, a list
of friends from an online service like Facebook™, a list of
contacts from the user's address book 142, a list of the user's
browser bookmarks (in Internet browser 138), a cache of
recent free-form text entries, etc.
As shown in screen 400C of FIG. 4C and screen 400D of FIG. 4D,
photo tag selection module 148B may be configured to dis-
play any matching tags (e.g. 412a, 412b, 412c) from one of
the tag sources matching the tag being typed by the user in the tag
entry field 406, in a matching tag list 412. Each tag may have
an icon or some other visual identifier associated with it that
clearly indicates its type, and allows the user to quickly dis-
tinguish between different types of tags.
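The incremental search just described is essentially a case-insensitive prefix match run over every selected tag source, with each hit carrying its source's tag type so the list can show a per-type icon. A sketch under those assumptions (the function name and the sample source data are illustrative):

```python
def match_tags(entered, tag_sources):
    """Return (tag, tag_type) pairs whose tag starts with the entered string.

    tag_sources maps a tag type (e.g. "facebook_friend", "address_book",
    "bookmark", "free_form") to a list of candidate tag strings.
    """
    prefix = entered.casefold()
    matches = []
    for tag_type, tags in tag_sources.items():
        for tag in tags:
            if tag.casefold().startswith(prefix):
                matches.append((tag, tag_type))
    # Group the matching tag list by tag type, then alphabetically within a type.
    return sorted(matches, key=lambda m: (m[1], m[0]))

sources = {
    "address_book": ["Terrill Dent", "Tim Jackson"],
    "facebook_friend": ["Terrill Dent", "Theo Bakker"],
}
match_tags("t", sources)    # all four entries, grouped by tag type
match_tags("ter", sources)  # the two "Terrill Dent" entries
```

Sorting on the tag type first also gives the by-type ordering of the matching tag list that FIG. 6 describes.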
If the user types text that does not match any tag from the
tag sources in the matching tag list 412, the photo tag selec-
tion module 148B may create a new free-form tag entry and
add it to a free-form text cache as a new tag entry. The
free-form text cache may then become one of the tag sources
for any subsequent tag selection by the user.
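This free-form fallback can be sketched as a commit step: if the entered string matched nothing, it is stored in a free-form text cache, which is itself just another tag source consulted on later searches. The class and names below are illustrative assumptions, not part of the patent:

```python
class FreeFormTextCache:
    """Remembers free-form tags so they can be offered as matches later."""

    def __init__(self):
        self.entries = []

    def commit(self, entered, had_matches):
        """Store the entered text as a new free-form tag if nothing matched."""
        if not had_matches and entered and entered not in self.entries:
            self.entries.append(entered)
        return self.entries

cache = FreeFormTextCache()
cache.commit("Peggy's Cove", had_matches=False)
# On the next search, cache.entries participates as the free-form tag source.
```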
As the user continues to type, if a tag that the user wishes to
select appears in the matching tag list 412, the user can scroll
to the tag in the matching tag list 412 and select it by pressing
enter or clicking on the navigation device (e.g. trackball 117).
For example, as shown in screen 400E of FIG. 4E, the user
may select a tag 412c which may then be placed into the tag
`
`40
`
`55
`
`6
entry field 406. The matching tag list 412 then disappears, and
the selected tag may appear beside the photo (e.g. tag 310 as
shown in FIG. 3C associated with subject 302 in the photo
301).
Significantly, as the matching tag list 412 includes possible
tags that may be used from various selected tag sources (such
as the user's Facebook friends, the user's address book 142, a
list of the user's browser bookmarks from Internet browser
138, a cache of the recent free-form text entries, etc.), the user
is provided with a simple way to associate subjects or objects
in a photo with a predefined “tag” from one of a number of
selected tag sources, as may be defined by the user. Thus, the
free-form text cache would just be one of many possible tag
sources, all of which contribute matching tag entries to the
matching tag list 412.
`Now referring to FIG. 4F, once a tag has been entered into
`the tag entry field 406, photo tag selection module 148B may
be configured to allow the user to choose to view some con-
text data associated with the tag (e.g. an address card if the tag
`identifies a contact in the user's address book 142). Finally,
`photo tag selection module 148B may be configured to allow
`the user to accept the new tag and return to the photo tagging
`user interface (described above with respect to FIGS. 3A to
`3E), or cancel the tag selection and return to the photo tagging
`user interface.
In an embodiment, in addition to the tag sources mentioned
above, another type of tag source may be landmark tags with
associated geographic location information. For example, if a
photo contains a number of distinctive landmarks, it may be
possible for each landmark to be tagged with a unique geo-
graphic location tag (e.g. specific latitude and longitude coor-
dinates for each landmark). Such a list of geographic location
tags may be obtained, for example, as a user visits each of the
landmarks identified in the photo.
For example, an aerial photo of the National Mall in Wash-
ington, D.C. may show a number of famous landmarks such as
the Lincoln Memorial, Vietnam Veterans Memorial, and the
Washington Monument in the same photo. A user who has
visited each of these landmarks, provided appropriate names,
and recorded geographic location information at each land-
mark location may then subsequently select a landmark tag
by name from the prepared landmark tag list in order to tag the
appropriate points in the photo at which each of the landmarks
appear. Once a user tags each landmark appearing in the
photo using the appropriate landmark tag, the corresponding
geographic coordinates also become available as context-
based information accessible through the tag.
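A landmark tag source of this kind pairs each recorded name with coordinates, so tagging a point in the photo also attaches the geographic context to the tag. A sketch under those assumptions (the coordinate values shown are approximate and illustrative, as is the dictionary representation of a tag):

```python
# Landmark tags recorded as the user visits each location (or prepared
# by a third party): name -> (latitude, longitude).
LANDMARKS = {
    "Lincoln Memorial":          (38.8893, -77.0502),
    "Vietnam Veterans Memorial": (38.8913, -77.0477),
    "Washington Monument":       (38.8895, -77.0353),
}

def tag_landmark(photo_tags, point, name):
    """Tag a point in the photo with a named landmark; the recorded
    coordinates become context data accessible through the tag."""
    lat, lon = LANDMARKS[name]  # KeyError if the landmark was never recorded
    photo_tags.append({"point": point, "tag": name,
                       "type": "landmark", "context": (lat, lon)})
    return photo_tags

tags = tag_landmark([], point=(120, 45), name="Washington Monument")
```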
In an alternative embodiment, a list of famous landmarks
for various cities may be prepared by a third party such that a
user need not be at each location to record the geographic
coordinates. In this case a landmark tag may be selected