US 20080133526A1

(19) United States
(12) Patent Application Publication     (10) Pub. No.: US 2008/0133526 A1
     Haitani et al.                     (43) Pub. Date: Jun. 5, 2008

(54) METHOD AND SYSTEM FOR PROCESSING IMAGES USING TIME AND LOCATION FILTERS

(75) Inventors: Robert Y. Haitani, Menlo Park, CA (US); Richard Dellinger, San Jose, CA (US); Paul Chambers, San Jose, CA (US); Mitch Allen, Mountain View, CA (US); Matthew W. Crowley, Los Altos, CA (US)

Correspondence Address:
FOLEY & LARDNER LLP
777 EAST WISCONSIN AVENUE
MILWAUKEE, WI 53202-5306

(73) Assignee: Palm, Inc.

(21) Appl. No.: 11/726,709

(22) Filed: Mar. 22, 2007

Related U.S. Application Data

(60) Provisional application No. 60/873,066, filed on Dec. 5, 2006.

Publication Classification

(51) Int. Cl.
     G06F 7/08    (2006.01)
     G06F 7/00    (2006.01)
(52) U.S. Cl. ................... 707/7; 707/E17.033

(57) ABSTRACT

A device may process images (e.g. sort, group, file, e-mail, etc.) using various filters. The filters may relate to non-image data in the image files to be processed. The filters may include time and location filters.
[Front-page drawing: the block diagram of the device, reproduced as FIG. 1 on Sheet 1 of 8.]

`

FIG. 1 (Sheet 1 of 8): Block diagram of the system, showing device 10 with camera, display and display driver, image processing circuit 16, microprocessor 26, NVM controller, audio circuit and audio driver, location circuit 24, user input devices 31, cellular transceiver 36, network transceiver 44, removable and dedicated memory, power supply circuit, battery, external power supply, connector/data port 40, and connections to external devices and servers 46, 48 (including a web hosting server).
`

FIG. 2 (Sheet 2 of 8): Functional diagram of device 10, showing calendar information, personal settings, user inputs, image file memory, an Internet browser, and time and date information.

FIG. 3 (Sheet 3 of 8): Flow diagram of image handling: capturing (202) or importing (222) an image, processing it, and storing an image file (230); obtaining time and location information; identifying objects in the image; associating images with events, with groups, and with each other; obtaining information from outside sources; transmitting images to a remote device; uploading images; and filtering, displaying, and otherwise processing images in response to user inputs and calendar and contact data.

FIG. 4 (Sheet 4 of 8): Flow diagram of image uploading: storing pre-stored generic settings, configuring and storing individual settings, accessing upload formats, formatting the image file, and uploading from device 10 to server 46.

FIG. 5 (Sheet 5 of 8): Screen shot of a location filter listing locations such as 2 Maude Ave., Sunnyvale, CA; Mountain View, CA; New York, NY; Paris, France; San Francisco, CA; Sunnyvale, CA; and Fredonia, NY. FIG. 6: Screen shot of a time filter listing periods such as August 2006, "This month," and September 2006.

FIGS. 7-9 (Sheet 6 of 8): Screen shots of time and location filters, including time selections such as Any Time; September 11-27, 2006; July 8-19, 2005; August 3-13, 2004; and "Choose time...", and a filter combination of New York with Any Time.

FIG. 10 (Sheet 7 of 8): Screen shot of a calendar application day view with scheduled events such as 9:00 ALL HANDS (BOARD ROOM), 12:00 LUNCH MEETING, and 2:00 FINANCE BOARD (CONFERENCE ROOM), through which images may be accessed.

FIGS. 11A-11F (Sheet 8 of 8): Diagrams of a smartphone according to one exemplary embodiment of the device of FIG. 1.

METHOD AND SYSTEM FOR PROCESSING IMAGES USING TIME AND LOCATION FILTERS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims priority to U.S. Provisional Pat. App. 60/873,066 filed Dec. 5, 2006 under 35 U.S.C. § 119(e), the disclosure of which is hereby incorporated by reference in its entirety. The present application is related to a US patent application filed on the same day as the present application, titled “METHOD FOR PROCESSING IMAGE FILES USING NON-IMAGE APPLICATIONS,” and is related to a US patent application filed on the same day as the present application titled “AUTO-BLOG FROM A MOBILE DEVICE,” both of which claim priority to U.S. Provisional Pat. App. 60/873,066. The disclosures of these two applications are hereby incorporated by reference in their entirety.
BACKGROUND
[0002] Users obtain digital pictures and movies from a variety of sources including digital cameras, digitization of photographs taken with film cameras, etc. These digital cameras may be stand-alone cameras or may be integrated into other devices such as cell phones (including Smartphones).

[0003] A user may capture hundreds or thousands (or more) pictures and movies over the course of time using these various devices. The task of organizing these pictures often falls to the user of the device. Some systems provide a user interface that allows a user to sort through pictures using a timeline. Other systems allow a user to manually label and organize pictures into virtual albums. The software that creates the album may include a drag and drop user interface or may include labeling pictures taken with a common album (folder) name. Some systems have allowed a user to search by location on a map if a user takes the time to label the location of each picture.

[0004] Many devices add various non-image data to an image file which can be viewed by subsequent devices. For example, many devices might include a time and date stamp, a make and model of the camera used to capture the image, shutter speed, an indication whether a flash was used, etc. One standard image file format used by digital cameras is the EXIF file format standard. The EXIF format includes defined fields for defined types of data and includes open fields which can be used to enter non-defined data.
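By way of illustration only (this example is not part of the original disclosure), the defined fields of an EXIF file may be read as in the following Python sketch, which assumes the Pillow library and a hypothetical file name; which fields are present varies by camera.

    # Sketch: reading defined EXIF fields from an image file (assumes Pillow).
    from PIL import Image
    from PIL.ExifTags import TAGS

    def read_exif_fields(path):
        """Return the image's EXIF data as a {tag name: value} dict."""
        exif = Image.open(path).getexif()
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

    fields = read_exif_fields("photo.jpg")  # hypothetical file name
    for name in ("DateTime", "Make", "Model", "Flash"):  # defined EXIF fields
        print(name, "=", fields.get(name))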
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] FIG. 1 is a block diagram of some portions of a system and apparatus according to one embodiment;

[0006] FIG. 2 is a functional diagram according to one embodiment, which may be used with the system of FIG. 1;

[0007] FIG. 3 is a diagram according to one embodiment, which may be used with the system of FIG. 1;

[0008] FIG. 4 is a diagram according to one embodiment, which may be used with the system of FIG. 1;

[0009] FIGS. 5-9 are screen shots of a filter and image display function according to one embodiment, which may be used with the system of FIG. 1;

[0010] FIG. 10 is a screen shot of a calendar application which may be used to access and/or organize images according to one embodiment, which may be used with the system of FIG. 1; and

[0011] FIGS. 11A-F are diagrams of a smartphone according to one exemplary embodiment of the device described in FIG. 1.

DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
[0012] Referring to FIGS. 1 and 2, a system 8 includes a portable hand-held device 10. The portable hand-held device 10 may be a cell phone (such as a Smartphone) that includes a cellular transceiver 36. Portable hand-held device 10 may include a camera 12 to capture images. Camera 12 may be configurable to capture still images (pictures), moving images (movies), or both still and moving images. Device 10 may use display 14 as a digital viewfinder that allows a user to preview a shot before capturing an image and/or to view a movie as it is being captured.
`[0013]
`Images captured by camera 12 may be processed by
`processing circuit 32 (e.g. microprocessor 26 and/or image
`processing hardware 16). Imagefiles based on the captured
`images may be saved in memory 34,38, transmitted to other
`systems 46,48 (e.g. by transmitters 36,44 or data port 40), or
`otherwise processed by device 10.
`[0014]
`Processing circuit 32 may be configured to run one
`or more applications. For instance, device 10 may be used to
`capture images from camera 12 using an image application
`112 run byprocessing circuit 32. As explained below, images
`captured by camera 12 may be formed into imagefiles con-
`taining various data relating to the captured image.
`[0015]
`Image application 112 may be used to enhance an
`amount of information recorded in the imagefile relating to
`the image captured by camera 12. For example, image appli-
`cation 112 may use information from other applications run
`by device 10 to add data to the imagefiles created by the
`image application 112. For example, an image application
`112 maybe configured to obtain information from a location
`application 114, a calendarapplication 116, and/or a contacts
`application 118 running on device 10 and, based on the infor-
`mation obtained, add data to an imagefile.
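A minimal sketch of this enrichment step follows; the record layout and the dict-based stand-ins for the location, calendar, and contacts applications are assumptions for illustration, as the disclosure does not define these interfaces.

    # Sketch: merging non-image data from other applications into an image
    # record. The dict-based "applications" are hypothetical stand-ins for
    # the location (114), calendar (116), and contacts (118) applications.
    from datetime import datetime

    def enrich(record, location_name, events, contacts):
        """Add location, event, and people data to an image record (a dict)."""
        record["location"] = location_name
        for event in events:  # each event: {"name", "start", "end", "attendees"}
            if event["start"] <= record["captured_at"] <= event["end"]:
                record["event"] = event["name"]
                record["people"] = [contacts.get(a, a) for a in event["attendees"]]
                break
        return record

    photo = {"file": "img_0001.jpg", "captured_at": datetime(2006, 9, 12, 14, 5)}
    events = [{"name": "All Hands", "start": datetime(2006, 9, 12, 14, 0),
               "end": datetime(2006, 9, 12, 15, 0), "attendees": ["rh"]}]
    print(enrich(photo, "Sunnyvale, CA", events, {"rh": "R. Haitani"}))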
[0016] Additionally, image application 112 may be designed to enhance user functionality once images have been obtained. For example, image application 112 may also be configured to display images on display 14. Image application 112 may include various filters used to limit the number of images displayed. As discussed below, these filters may be user selectable, may use the data in the image file obtained from non-image applications including any of the non-image applications discussed below, may be configured based on data in the image files 104 stored on device 10, etc. As another example, similar filters may also be used to group images into folders (such as virtual albums, system file folders, etc.). As still another example, image application 112 may use data stored in the image files 104, contact information 118, calendar information 116, and/or upload information 260 (FIGS. 3 and 4) to increase the ease of sharing images.
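The filtering described above could be implemented as in the following sketch, which narrows a list of image records by the time and location values in their non-image data; the record fields are assumed for illustration.

    # Sketch: user-selectable time and location filters over image records.
    # The record fields ("captured_at", "location") are assumptions.
    from datetime import datetime

    def filter_images(records, start=None, end=None, location=None):
        """Keep records whose non-image data matches every active filter."""
        kept = []
        for r in records:
            if start is not None and r["captured_at"] < start:
                continue
            if end is not None and r["captured_at"] > end:
                continue
            if location is not None and r.get("location") != location:
                continue
            kept.append(r)
        return kept

    records = [
        {"file": "a.jpg", "captured_at": datetime(2006, 8, 3), "location": "New York, NY"},
        {"file": "b.jpg", "captured_at": datetime(2006, 9, 12), "location": "Sunnyvale, CA"},
    ]
    print(filter_images(records, start=datetime(2006, 9, 1), location="Sunnyvale, CA"))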
[0017] The images operated on by image application 112 may include images captured by camera 12, and/or may include images obtained from sources other than camera 12. For example, images may be transferred to device 10 using one or more of data port 40, transceiver 36, transceiver 44, and memory 38. As another example, a number of images stored on a remote storage (e.g. on a server 46, 48), a personal computer, or other remote device may be accessed by device 10.
`[0018]
`Image application 112 maybe limitedto a particular
`type of image (e.g. still
`images (photographs), moving
`
`10
`
`10
`
`

`

`US 2008/0133526 Al
`
`Jun. 5, 2008
`
`images (movies), etc.) or may be configured to handle mul-
`tiple types of images. Image application 112 may be a stand-
`alone application, or may be integrated into other applica-
`tions. Image application 112 may be formed by acombination
`of functions of separate, distinct programs of device 10.
`[0019] Referring to FIG. 3, an image application 112 may
`handle images obtained (captured) by camera 12 of device 10
`at block 202 and/or images obtained (imported) from a source
`outside of device 10 at block 222.
`
[0020] At block 202, an image may be captured by device 10 such as by using camera 12 (or by some other device such as an external camera controlled by device 10 through data port 40). Capturing an image at block 202 may be performed under the control of processing circuit 32 and/or in response to a user input registered on a user input device 31. For example, processing circuit 32 may execute an image capturing application 112 (FIG. 2) which includes a command portion that allows users to input a command to capture an image using a button or touch screen input.
[0021] An image captured on camera 12 at block 202 can have any standard image processing performed on it at block 204 (e.g. format conversion, white balancing, tone correction, edge correction, red-eye reduction, compression, CFA interpolation, etc.) and remain essentially the same image. This image processing at block 204 may be performed by a microprocessor 26 (FIG. 1) and/or by dedicated hardware such as an image processing circuit 16 (FIG. 1).
[0022] An image file may be formed at block 230 using the image data captured by the camera at block 202 and/or processed at block 204. The image file may use a standard image file format (e.g. EXIF, JFIF, GIF, PICT, MPEG, AVI, motion JPEG, etc.) or may use a non-standard format. The image data in the image file may be compressed at block 230 (such as by JPEG compression, MPEG compression, LZW compression, other DCT-based compression, etc.), including highly compressed with a lossy-type image compression, but still convey essentially the same image. Compression may be performed by a microprocessor 26, by an image processing circuit 16, or by some other processing circuitry of processing circuit 32.
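A sketch of this file-forming step, assuming the Pillow library; the JPEG quality setting is an arbitrary example of a lossy compression level, not a value from the disclosure.

    # Sketch: writing image data into a standard file format with lossy,
    # DCT-based (JPEG) compression. Assumes Pillow; quality is illustrative.
    from PIL import Image

    def store_image_file(image, path, quality=85):
        """Save an in-memory image as a compressed JPEG file."""
        image.save(path, format="JPEG", quality=quality)

    store_image_file(Image.new("RGB", (1600, 1200)), "img_0001.jpg")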
[0023] The full size image in the image file may be an image having about the same resolution as camera 12. In some embodiments, the image in the image file may have a resolution smaller than the resolution of the camera 12 (e.g. a full set of data is acquired from camera 12 and image processing circuit 16 reduces the resolution of the image data received from the camera to form the full size image in the file). In some embodiments, the user may be given an option to choose the resolution of the full size image.
[0024] A thumbnail version of the image (a reduced size version of the image, almost always smaller than the full size version) may also be added to the image file at block 230. Like the other processing on the image data, the thumbnail may be formed using microprocessor 26, image processing circuit 16, or some other processing circuitry of processing circuit 32. The thumbnail of the image generally conveys essentially the same image as the full size version of the image (even when they are image-processed—see block 204 of FIG. 3—separately).
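A sketch of the thumbnail step, again assuming Pillow; the 160-pixel bound is an assumed size, and embedding the thumbnail into the file's metadata block is omitted.

    # Sketch: forming a reduced-size thumbnail that conveys essentially
    # the same image. Assumes Pillow; the 160 px bound is an assumption.
    from PIL import Image

    def make_thumbnail(path, max_px=160):
        """Return a thumbnail copy of the full size image at `path`."""
        thumb = Image.open(path).copy()
        thumb.thumbnail((max_px, max_px))  # resizes in place, keeps aspect ratio
        return thumb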
Adding Information to Image Files
[0025] Once an image file is formed at block 230 (which may be before or after some or all of the image data has been added to the image file), additional data (e.g. non-image data) can be added to the image file corresponding to the image that was captured to enhance the amount of information stored about the image. Enhancing the amount of data stored about the image can increase the number of techniques (discussed below) able to be applied to the images in some embodiments. This additional information may be added to the file before or after the image data is added to the image file.
`[0026]
`Information relating to the time at which the image
`was obtained (based on data retrieved at block 208) is typi-
`cally added to the imagefile.
[0027] Also, location information can be obtained at block 206 (such as from a location application 114—FIG. 2—and/or location circuit 24—FIG. 1) and added to the image file at block 230. Location information can include coordinate information such as latitude and longitude coordinates; text information such as one or more of the name of the street, city, state, province, country and/or other location designation at which the image was obtained; information regarding the cell towers in the vicinity of device 10, etc. In many embodiments, the location information is retrieved automatically from a location determining circuit 24 (FIG. 1) or based on data from a location determining circuit 24 compared to a location name (e.g. map) database of a location application 114 (FIG. 2). Location information can also be obtained by comparing the network address (e.g. MAC addresses or other information) from a point used to access a network (e.g. a WiFi network) compared to a database (which may be on or remote from device 10) that identifies the location of the access point (identified based on the MAC address recorded when the image was captured).
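Coordinate information of the kind described above may be written into a file's EXIF GPS fields as in the following sketch, which assumes the piexif library; the degree-to-rational conversion follows the EXIF convention, and the file name and coordinates are illustrative.

    # Sketch: writing latitude/longitude into a JPEG's EXIF GPS fields.
    # Assumes the piexif library; file name and coordinates are illustrative.
    import piexif

    def to_dms(deg):
        """Decimal degrees -> EXIF ((deg,1), (min,1), (sec*100,100)) rationals."""
        d = int(abs(deg))
        m_f = (abs(deg) - d) * 60
        m = int(m_f)
        s = round((m_f - m) * 60 * 100)
        return ((d, 1), (m, 1), (s, 100))

    def tag_location(path, lat, lon):
        """Add GPS latitude/longitude to the image file at `path`."""
        exif = piexif.load(path)
        exif["GPS"] = {
            piexif.GPSIFD.GPSLatitudeRef: b"N" if lat >= 0 else b"S",
            piexif.GPSIFD.GPSLatitude: to_dms(lat),
            piexif.GPSIFD.GPSLongitudeRef: b"E" if lon >= 0 else b"W",
            piexif.GPSIFD.GPSLongitude: to_dms(lon),
        }
        piexif.insert(piexif.dump(exif), path)

    tag_location("img_0001.jpg", 37.3688, -122.0363)  # Sunnyvale, CA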
[0028] Where location name information is to be added, device 10 may be configured to store the location name information (e.g. in memory 34, 38, hard-coded, etc.) for a range of locations, including the location at which the image is captured. In some embodiments (particularly for a portable hand-held device such as a smartphone), device 10 may not store this information for every (or any) location, and may need to retrieve this location information. In embodiments where information needs to be retrieved, it can be retrieved from a remote database (e.g. a database on server 46) or some other source. Device 10 may obtain information from the remote database using a wireless transceiver 36, 44 to access a WAN (e.g. the Internet) to which the remote database is connected. Device 10 could be configured to obtain this information only when (or additionally when) making a wired connection to a database (e.g. when syncing to a user’s personal computer). In some embodiments, such as some of the embodiments requiring a wired connection, location name information may not be added until well after a picture is captured.
`[0029]
`Insome embodiments, device 10 may be configured
`to automatically update the location informationit has stored.
`For example, device 10 may be configured to receive location
`coordinates based on data from location circuit 24, determine
`thatit does not have location name information for the region
`whereit is located, and obtain location name information for
`that region from the remote database (e.g. by sending its
`coordinates to the remote database). Device 10 may be con-
`tinuously updating its stored location name information or
`mayupdate this information in responseto a user opening the
`imageapplication (e.g. a picture or video capturing applica-
`tion).
[0030] In some embodiments, rather than (or in addition to) continuously updating location name information, device 10 may obtain location name information in response to an image being captured. For example, device 10 may be configured to capture an image, obtain coordinate information from a location circuit 24 in response to the image being captured, send the coordinate information (or other non-name location information) to a remote database, and receive location name information associated with the coordinate information from the remote database.
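The capture-time lookup might resemble the following sketch; the endpoint URL and the JSON response shape are hypothetical, as the disclosure does not specify the remote database interface.

    # Sketch: sending coordinates to a remote database and receiving
    # location name information. URL and response shape are hypothetical.
    import json
    import urllib.request

    def lookup_location_name(lat, lon):
        """Ask a remote database for the place name at (lat, lon)."""
        url = f"https://geo.example.com/reverse?lat={lat}&lon={lon}"  # hypothetical
        with urllib.request.urlopen(url, timeout=10) as resp:
            return json.load(resp).get("name")  # e.g. "Mountain View, CA"

    def on_image_captured(record, lat, lon):
        record["coords"] = (lat, lon)
        record["location"] = lookup_location_name(lat, lon)
        return record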
[0031] In some embodiments, a combination of the two previously discussed techniques may be used. For example, city, region, and country location name information may be obtained automatically in the background. However, street level location name information may not be downloaded until a picture is captured.
`[0032]
`In some embodiments, the amount of data down-
`loaded for an area may depend on how manypictures are
`being obtained in the area. For example, if a large number of
`pictures are being taken closely in time in a city, then more
`information might be downloaded and saved to device 10
`(e.g. automatically). As another example,ifpictures are being
`taken in a close time range in a tight geographical area then
`less information is downloaded, whereas if pictures are being
`taken in the same time framein a larger geographic area, then
`more information is downloaded and saved (e.g. automati-
`cally).
[0033] In some embodiments, the detail of information downloaded might change (and might change automatically). For example, in a user’s home area, more detailed information might be downloaded. As another example, in a more densely populated area more detailed information might be downloaded. As still another example, the detail of information downloaded may be user selectable.
`[0034]
`Inaddition (oras an alternative) to information from
`a location application 114, in some embodimentsthe location
`information may be information that is manually input by a
`user on a user input device 40. Further, in other embodiments
`location information is retrieved from another source with
`whichthe imagefile is associated (e.g. the location informa-
`tion stored for an event associated with the image—seedis-
`cussion of block 210, below—maybeused as the location
`information for the image).
`[0035]
`In addition to adding location information at block
`206, images maybe associated at block 214 andthis associa-
`tion maybe used to add data to the imagefile. Files may be
`associated at block 214 by any number of means. Asa first
`example ofa meansfor associating images, processing circuit
`32 may automatically associate images based on similar data
`(e.g. non-image data) within the image files. Common non-
`image data may include that the images of the imagefiles
`were captured at a common location, were captured during
`the same time period (such as during an event listed in the
`calendar application, see block 210 below), that images are
`clustered together in time, and/or other data associated with
`the image (such asdata in the imagefiles that indicate that the
`imagefiles include images of one or more people from an
`associated group of people). Multiple criteria may be used to
`associate images (e.g. images are required to have been taken
`at a commontime and at a commonlocation).
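One such association rule, grouping images clustered together in time, might look like the sketch below; the 30-minute gap threshold is an assumption, not a value taken from the disclosure.

    # Sketch: associating images whose capture times cluster together.
    # The 30-minute gap threshold is an assumed value.
    from datetime import timedelta

    def cluster_by_time(records, gap=timedelta(minutes=30)):
        """Group image records whose capture times fall within `gap` of a neighbor."""
        clusters = []
        for r in sorted(records, key=lambda r: r["captured_at"]):
            if clusters and r["captured_at"] - clusters[-1][-1]["captured_at"] <= gap:
                clusters[-1].append(r)
            else:
                clusters.append([r])
        return clusters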
`[0036]
`Thecriteria used to associate images at block 214
`mayvary based on the user’s location. For example, in an area
`around a user’s home town the images may be required to
`have a closer link than images acquired while a user was on
`vacation. This may be a closer link on onecriteria or on a
`combination ofcriteria.
`
[0037] The criteria for association at block 214 may also vary based on the device from which an image was captured. For example, images captured on the hand-held device 10 may be freely associated based solely on a factor relating to a time at which the image was captured. However, device 10 may be configured to associate images not captured by device 10 based on a combination of time with another factor such as location, names of people associated with the image, etc.

[0038] Also, the criteria for association at block 214 may differ depending on the number of and which criteria of the pictures match. For example, a less strict time criterion may be used if the images were all taken at a similar location. As another example, a less strict location criterion might be used if the images largely included the same group of people in the images.
[0039] As a second example of a means for associating images, images may be associated at block 214 based on actions of a user (e.g. a user assigning the images to a common folder, a user selecting a number of images and choosing a command to associate the selected images, etc.).
[0040] Once images are associated at block 214, non-image data can be added to the image files at block 230 based on the association of images at block 214. As one example, the non-image data representing the fact that the images are associated could be added to the image file. As another example, non-image data from one image file may be added to another image file based on the association. For instance, event information associated with one image could be added to the image file of an associated image, names of people associated with one image could be added to the image file of an associated image, location information associated with one image could be added to the image file of an associated image, etc. If a common folder is used to associate images at block 214, a user may assign data to the folder to signify common properties of images in the folder, which data assigned to the folder will be added at block 230 to all image files in that folder.
[0041] Another source of non-image data to be added to an image file at block 230 is non-image data that is based on the image in the image file. An image may be subjected to an image recognition program at block 212 that recognizes objects (e.g. people) in an image. According to one embodiment, the image recognition program is used to identify people located in an image. The image recognition program may be pre-trained to identify certain individuals (such as individuals the user may photograph regularly) and then look for those people in the images of device 10.
[0042] Data based on the object recognition can be added to the image files. As one example, the names or other identifications of the people recognized in the image at block 212 may be added to the image file. As another example, a user may set one or more pre-defined groups of individuals in a configuration phase. These groups may be accessed at block 218. If a user identified in the image is associated with a group (e.g. family, school friends, co-workers, etc.) then a label corresponding to that group may be added to the image file data.
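The group-labeling step could be sketched as follows; the group table is an illustrative user configuration, not part of the disclosure.

    # Sketch: adding group labels for people recognized in an image.
    # The GROUPS table is an illustrative user configuration.
    GROUPS = {"family": {"Ann", "Ben"}, "co-workers": {"Rich", "Paul"}}

    def group_labels(recognized_names):
        """Return labels of every pre-defined group containing a recognized person."""
        return sorted(label for label, members in GROUPS.items()
                      if members & set(recognized_names))

    print(group_labels(["Ann", "Paul"]))  # -> ['co-workers', 'family']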
[0043] The image recognition application may be run by hand held device 10, or may be run on a device 46 (FIG. 1) that is remote from hand held device 10. If the recognition application is remote from device 10, then some or all of the image file may be transmitted to the remote device 46 at block 216. Remote device 46 may be configured to transmit the file back to hand held device 10 at block 216, and/or hand held device 10 may be configured to access remote device 46 and obtain the recognition data at block 216.
[0044] Another source of non-image data that can be added to the image file is event data. An image may be associated with an event at block 210. Hand-held device 10 may be configured to automatically associate an image with the event, or a user might manually associate an image with the event. Hand-held device 10 may automatically associate an image with an event by comparing non-image data of the image with one or more events in a calendar application 116 (FIG. 2). For example, an image may be associated with an event by comparing the time (e.g. date and time of day) at which the image was obtained to the time of the event. As another example, an image might be associated with an event based on the location of the event recorded in the calendar application compared to the location at which the image was captured.
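Automatic association by time could be sketched as below, comparing the capture time against each event's scheduled period; the event layout is assumed for illustration.

    # Sketch: associating an image with a calendar event by comparing the
    # capture time to the event's scheduled period (event fields assumed).
    from datetime import datetime

    def event_for_image(captured_at, events):
        """Return the first event whose scheduled period contains the capture time."""
        for event in events:
            if event["start"] <= captured_at <= event["end"]:
                return event
        return None

    events = [{"name": "Lunch Meeting", "start": datetime(2006, 9, 12, 12, 0),
               "end": datetime(2006, 9, 12, 13, 0)}]
    print(event_for_image(datetime(2006, 9, 12, 12, 30), events))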
[0045] If event data is automatically obtained and/or entered as non-image data in the image file, the image application 112 may be configured to access the calendar application 116 (FIG. 2) of device 10 and search the calendar application 116 for events that might be related.
[0046] Also, if event data is automatically obtained and/or entered as non-image data in the image file, a hierarchy may be used to determine which event corresponds to an image. As one example, an event that was scheduled to occur for a period of time that includes the time at which the image was captured might be given the highest priority; an event that is close in time to (but does not encompass) the time of the picture might be given a second priority. The calendar application may also have all day events scheduled, which have less specificity of time than the defined time events such as the first and second priority events. All day events scheduled the date the image was captured may be given a third priority.
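This three-level hierarchy could be scored as in the following sketch; the one-hour window used to judge an event "close in time" is an assumption, as the disclosure leaves the closeness criteria variable.

    # Sketch: ranking candidate events for an image. Priority 1: a timed
    # event covering the capture time; 2: a timed event close in time
    # (one-hour window assumed); 3: an all-day event on the capture date.
    from datetime import timedelta

    def event_priority(captured_at, event, close=timedelta(hours=1)):
        """Lower number = higher priority; None = not a candidate."""
        if event.get("all_day"):
            return 3 if event["date"] == captured_at.date() else None
        if event["start"] <= captured_at <= event["end"]:
            return 1
        gap = min(abs(captured_at - event["start"]), abs(captured_at - event["end"]))
        return 2 if gap <= close else None

    def best_event(captured_at, events):
        ranked = [(p, i, e) for i, e in enumerate(events)
                  if (p := event_priority(captured_at, e)) is not None]
        return min(ranked)[2] if ranked else None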
`[0047]
`For events that are close in time but not exact, the
`criteria used to judge closeness might be pre-set or might be
`variable. For example, the criteria might be morestrict if the
`user has a lot of events scheduledin the calendar application
`(e.g. on a particular day), and less strict if there are fewer
`events. Other criteria may be used to generate a hierarchy as
`well, including a complicated hierarchy based on more than
`one factor (e.g. more than just time). Exemplary factors
`include timeof the event versus time of the picture, location
`ofthe event versus location ofthe picture, people associated
`with the event versus people associated with the picture,
`association with pictures that have been associated with the
`event(e.g. clusters of photos), etc.
[0048] The location at which the picture was taken compared to the location of the event might be used to exclude association with an improper event.
`[0049]
`Ifan imagefile is associated with an event at block
`210, data entered for the event in the calendar application 116
`may be addedto the imagefile at block 230. This may include
`the name ofthe event, other attendees of the event, a classi-
`fication of the event (business, personal, etc.), a location at
`which the event took place, a tag which associates the event
`with the imagefile, and/or other information entered for or
`related to the event.
`
[0050] An event stored on device 10 may be an event associated with a user of the device (e.g. a user’s personal calendar) or could be an event associated with someone with whom the user is associated (e.g. a family member, a co-worker, etc.). One or more calendar applications 116 (FIG. 2) running on device 10 may be configured to store a user’s event information along with event information from other people.
[0051] In addition to obtaining information relating to an event stored on device 10, calendar information may be obtained from sources remote from device 10. For example, a user may have a database set up for family member calendars which can be accessed from device 10 over a network, when a user synchronizes their device with a personal computer, etc. As another example, a “buddy” of the user of device 10 may have the user of device 10 listed as an attendee at an event on their calendar. Device 10 may be configured to access the buddy’s event information (e.g. on a remote database, from a device within range of the user—e.g. within a Bluetooth connection range—etc.) and add event information based on the buddy’s event that lists the user as an attendee. As another example, a system may be used to track movement of device 10 and other users (e.g. a central tracking system that uses GPS positions from devices carried by the users). If the user of device 10 is in proximity to another user during an event listed by the other user, the event information listed by the other user may be added to images captured by device 10.
`[0052]
`In additionto private events, one or more databases
`may be scannedfor a list of public events that were taking
`place at about the same time and about the samelocation at
`which the image was captured.
[0053] Thus, even if a user does not have an event listed, device 10 may be configured to access the remote database (e.g. the family member calendars, buddy list events, public events, etc.) and look for event information in the remote database.
[0054] Any of the differentiators listed above may be used to determine whether the image is associated with the event not listed in a calendar application 116 on device 10 and/or not directly associated with the user. For example, the location at which the image was taken, the time at which the image was taken, people identified in the images, the locations of other individuals, and other information may be examined to determine whether a user was really attending an event obtained from a non-user source (i.e. whether these other sources of information are consistent with information regarding the non-user obtained event).
[0055] As discussed above, images associated with an event at block 210 may then be associated with each other at block 214. Conversely, images associated with each other at block 214 (particularly where the images were captured at about the same time period—e.g. clustered together) may then be associated with the event at block 210 even though some of the associated pictures were not themselves captured during the time period listed for the event in the calendar application 116.
`[0056]
`Inaddition to obtaining information from device 10,
`information maybe obtainedat block 211 from sources out-
`side of device 10. Information may include event informa-
`tion, location information, and other information not con-
`tained on device 10. For example, the time and location at
`which an image was taken can be compared to times and
`locations ofpublic events (e.g. from a database, from a search
`of the Internet, etc.). If an image appears to have been taken
`close in time andlocationto the time andlocation ofthe event,
`information may be added to the imagefile based on the
`event.
`
[0057] As another example, information relating to businesses located where the image was captured can be obtained from a remote database. This information may be associated with the image. This information can also be used t
