US007831913B2

(12) United States Patent
MacLaurin

(10) Patent No.: US 7,831,913 B2
(45) Date of Patent: Nov. 9, 2010

(54) SELECTION-BASED ITEM TAGGING

(75) Inventor: Matthew B. MacLaurin, Woodinville, WA (US)

(73) Assignee: Microsoft Corporation, Redmond, WA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 442 days.

(21) Appl. No.: 11/193,586

(22) Filed: Jul. 29, 2005

(65) Prior Publication Data: US 2007/0028171 A1, Feb. 1, 2007

(51) Int. Cl.: G06F 3/00 (2006.01)

(52) U.S. Cl.: 715/708; 715/705; 715/710; 715/816; 715/825; 715/968; 715/230; 715/231; 715/232; 715/233; 704/251; 706/12; 706/59; 706/934; 707/696; 707/740; 707/741

(58) Field of Classification Search: 715/705, 715/230-233, 708, 710, 816, 825, 968; 707/696, 707/736-747; 704/251; 706/12, 59, 934
See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS
5,309,359 A *   5/1994   Katz et al.               707/102
5,404,295 A     4/1995   Katz et al.               707/2
5,422,984 A *   6/1995   Iokibe et al.             706/12
5,544,360 A     8/1996   Lewak et al.              707/1
5,548,739 A *   8/1996   Yung                      711/204
5,600,775 A *   2/1997   King et al.               715/203
5,685,003 A *   11/1997  Peltonen et al.           715/202
5,832,474 A *   11/1998  Lopresti et al.           707/2
5,835,959 A *   11/1998  McCool et al.             711/171
5,864,339 A *   1/1999   Bedford-Roberts           345/173
6,026,177 A *   2/2000   Mong et al.               382/156
6,044,365 A     3/2000   Cannon et al.             707/2
6,169,983 B1 *  1/2001   Chaudhuri et al.          707/2
6,208,339 B1 *  3/2001   Atlas et al.              715/780
6,243,699 B1    6/2001   Fish                      1/1
6,295,387 B1 *  9/2001   Burch                     382/311
6,297,824 B1 *  10/2001  Hearst et al.             715/848
6,356,891 B1 *  3/2002   Agrawal et al.            707/2
6,377,965 B1 *  4/2002   Hachamovitch et al.       715/203
6,408,301 B1 *  6/2002   Patton et al.             707/741
6,496,828 B1    12/2002  Cochrane et al.           707/10
6,519,603 B1 *  2/2003   Bays et al.               707/102
6,711,585 B1 *  3/2004   Copperman et al.          707/104.1
6,731,312 B2 *  5/2004   Robbin                    715/792
6,751,600 B1 *  6/2004   Wolin                     706/12
6,757,692 B1 *  6/2004   Davis et al.              707/692

(Continued)
Primary Examiner: Weilun Lo
Assistant Examiner: Eric Wiener
(74) Attorney, Agent, or Firm: Wolf, Greenfield & Sacks, P.C.

(57) ABSTRACT
Item selections along with user inputs are leveraged to provide users with automated item tagging. Further user interaction with additional windows and other interfacing techniques are not required to tag the item. In one example, a user selects items and begins typing a tag which is automatically associated with the selected items without further user action. Tagging suggestions can also be supplied based on a user's selection, be dynamically supplied based on a user's input action, and/or be formulated automatically based on user data and/or tags and the like associated with selections by an external source. Machine learning can also be utilized to facilitate in tag determination. This increases the value of the tagged items by providing greater item access flexibility and allowing multiple associations (or tags) with each item.

20 Claims, 13 Drawing Sheets
[Representative drawing: user interface (800) showing item tagging]
U.S. PATENT DOCUMENTS

6,766,069 B1 *  7/2004   Dance et al.              382/309
6,795,094 B1 *  9/2004   Watanabe et al.           715/762
6,810,149 B1 *  10/2004  Squilla et al.            382/224
6,810,272 B2 *  10/2004  Kraft et al.              455/566
6,820,094 B1 *  11/2004  Ferguson et al.           707/200
6,826,566 B2 *  11/2004  Lewak et al.              707/4
6,898,586 B1 *  5/2005   Hlava et al.              1/1
7,010,751 B2 *  3/2006   Shneiderman               715/232
7,013,307 B2 *  3/2006   Bays et al.               707/102
7,032,174 B2 *  4/2006   Montero et al.            715/257
7,051,277 B2 *  5/2006   Kephart et al.            715/229
7,275,063 B2    9/2007   Horn                      1/1
7,293,231 B1 *  11/2007  Gunn et al.               345/179
7,392,477 B2 *  6/2008   Plastina et al.           715/764
7,395,089 B1 *  7/2008   Hawkins et al.            455/556.1
7,401,064 B1    7/2008   Arone et al.              1/1
7,437,005 B2 *  10/2008  Drucker et al.            382/224
7,506,254 B2 *  3/2009   Franz                     715/259
7,587,101 B1 *  9/2009   Bourdev                   382/291
2002/0016798 A1 *  2/2002   Sakai et al.           707/517
2002/0069218 A1 *  6/2002   Sull et al.            707/501.1
2002/0107829 A1 *  8/2002   Sigurjonsson et al.    707/1
2002/0152216 A1 *  10/2002  Bouthors               707/10
2003/0120673 A1 *  6/2003   Ashby et al.           707/100
2003/0172357 A1 *  9/2003   Kao et al.             715/529
2004/0039988 A1    2/2004   Lee et al.             715/505
2004/0083191 A1 *  4/2004   Ronnewinkel et al.     706/20
2004/0123233 A1 *  6/2004   Cleary et al.          715/513
2004/0172593 A1 *  9/2004   Wong et al.            715/512
2004/0199494 A1 *  10/2004  Bhatt                  707/3
2005/0033803 A1    2/2005   Vleet et al.           709/203
2005/0114357 A1 *  5/2005   Chengalvarayan et al.  707/100
2005/0132079 A1 *  6/2005   Iglesia et al.         709/230
2005/0192924 A1 *  9/2005   Drucker et al.         707/1
2005/0262081 A1 *  11/2005  Newman                 707/9
2006/0031263 A1    2/2006   Arrouye et al.         707/200
2006/0224959 A1 *  10/2006  McGuire et al.         715/700

* cited by examiner
[Sheet 1 of 13, FIG. 1: block diagram of a selection-based tagging system]
[Sheet 2 of 13, FIG. 2: another block diagram of a selection-based tagging system]
[Sheet 3 of 13, FIG. 3: yet another block diagram of a selection-based tagging system]
[Sheet 4 of 13, FIG. 4: user interface with selected items]
[Sheet 5 of 13, FIG. 5: user interface with a tag input by a user for selected items]
[Sheet 6 of 13, FIG. 6: user interface showing a user-input tag added to an item tag list]
[Sheet 7 of 13, FIG. 7: user interface displaying items with a specific item tag]
[Sheet 8 of 13, FIG. 8: user interface with a suggested tag in response to a user input]
[Sheet 9 of 13, FIG. 9: flow diagram of a method of facilitating item tagging]
[Sheet 10 of 13, FIG. 10: another flow diagram of a method of facilitating item tagging]
[Sheet 11 of 13, FIG. 11: yet another flow diagram of a method of facilitating item tagging]
[Sheet 12 of 13, FIG. 12: example operating environment in which an embodiment can function]
[Sheet 13 of 13, FIG. 13: another example operating environment in which an embodiment can function]
SELECTION-BASED ITEM TAGGING

BACKGROUND
With the proliferation of computing devices has come a dramatic increase in available information that seems to be growing exponentially each year. This requires that storage technology keep pace with the growing demand for data storage. Vast amounts of data can now be stored on very small devices that are easily transported and accessible almost anywhere in the world via the Internet. Data retrieval techniques have expanded in scale to also meet the growth of stored data. Advances in search engines and other data mining techniques facilitate the extraction of relevant data. Easy retrieval of information is paramount in the utilization of stored data. The harder the data is to retrieve, the more likely it is that it will not be accessed and utilized. On the far end of the retrieval spectrum, if the data cannot be found and retrieved at all, then technology has failed despite the ability to store the data. Its value will lie dormant until technology once again advances to allow full access to the data.

Frequently, it is the timeliness of information that makes its value substantial. The value of retrieving information at a desired point in time can be profound. A doctor operating on a patient may need access to additional surgical procedures or patient information during the surgery, making information retrieval a possible life-and-death action at that moment. Although this is an extreme example, it shows that the patient information, such as allergies to medicines, may be of much lesser value to the doctor after the surgery should the patient die on the operating table due to an allergic reaction. Thus, having vast amounts of data is of little value if the data is not organized in some fashion to allow its retrieval. Therefore, data storage techniques such as databases utilize various methods to store the data so that it can be retrieved easily. Database search engines also utilize different techniques to increase the speed of data retrieval.

Most people familiar with an office environment will readily recognize an office filing cabinet. It typically has four or five drawers that contain paper files stored in folders inside the cabinet. This office concept of organizing was carried over into the computer realm in order to more easily transition new users to computer technology. Thus, computer files are typically stored in folders on a computer's hard drive. Computer users organize their files by placing related files in a single folder. Eventually, this too became unwieldy because a folder might have several hundred or even a thousand files. So, users began to use a hierarchy of folders, or folders-within-folders, to further break down the files for easier retrieval. This aided retrieval but also required users to "dig" deeply into the folders to extract the folder with the desired information. This was frequently a daunting task if there were large hierarchies of folders.

The folder concept, however, is often challenged by those users who do not agree that an item only belongs to a single folder. They frequently desire to associate a file with several folders to make it easier to find. Some just copy a file into different folders to alleviate the problem. That, however, uses more storage space and, thus, is not highly desirable for large quantities of information. To circumvent this, users have begun to "mark" or "tag" the files or data to indicate an association rather than placing them in a folder. A tag is generally an arbitrary text string associated with an item that is utilized to recall that item at a later time. By tagging the item, the user is not required to place it in a folder and force it into a single category. A user has the flexibility of tagging and, thus, associating different types of items such as graphics,
text, and/or data and the like. It also allows a user to apply multiple tags to the same item. Thus, a user can tag a picture of a mountain as a vacation picture to enable recalling it as a vacation photo and also as desktop wallpaper to enable recalling it as a background image on a computer screen. This is accomplished without requiring the actual item to be moved or placed into a folder, etc.

Despite the apparent power and flexibility afforded by tagging in contrast to utilizing folders, the folder concept still dominates among most of today's computer users. The folder concept is easy to understand and to implement. It is "intuitive" for those who work or have worked in office environments and only requires a user to drag and drop an item into a folder to associate it with other items. In sharp contrast, current tagging techniques are cumbersome and require a user to dig deeply into associated data of the item, typically requiring opening several computer windows and having expert-like knowledge in order to correctly tag the item. For these reasons, tagging has not been well received by most users, despite its powerful potential. To overcome a user's unwillingness to utilize complicated implementation procedures, tagging has to be as intuitive and easy as the folder concept. Only then will users begin to embrace tagging as a replacement for the filing concept that originated from the traditional office environment.
SUMMARY
The following presents a simplified summary of the subject matter in order to provide a basic understanding of some aspects of subject matter embodiments. This summary is not an extensive overview of the subject matter. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the subject matter. Its sole purpose is to present some concepts of the subject matter in a simplified form as a prelude to the more detailed description that is presented later.

The subject matter relates generally to information retrieval, and more particularly to systems and methods for tagging items based on user selections of items. The item selections along with user inputs are leveraged to provide users with automated item tagging with minimal impact to the user, allowing easy recall of the tagged items at another time. Further user interaction with additional windows and other interfacing techniques is not required to save the tag with the item. Thus, for example, the user can select items and begin typing a tag which is automatically associated with the selected items. In other instances, tagging suggestions can be supplied based on a user's selection. For example, if the items selected are known to be dog related, a tag of "dog" can be suggested to the user based on the selection of the dog-related items. In another instance, tagging suggestions can be dynamically supplied based on a user's input action. For example, if a user types "gr," a tag of "graphics" can be suggested to the user. Tagging suggestions can also be formulated automatically based on user data and/or tags and the like associated with selections by an external source. For example, if a user is determined to be a doctor, medical-related terminology tag sets can be downloaded from the Internet and included in the supplied tag suggestions. Thus, the systems and methods herein provide an extremely convenient manner in which to add tags to items and can, if desired, employ machine learning to facilitate tag determination. This increases the value of the tagged items by providing greater item access flexibility and allowing multiple associations (or tags) with each item.
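To make the external-source idea above concrete, the following is a minimal sketch, not taken from the patent, of how a suggestion list might be seeded from tag sets keyed on user data such as a profession. The PROFESSION_TAG_SETS table and its contents are invented placeholders for whatever tag sets an implementation might actually obtain from an external source.

# Assumed, illustrative tag sets; the description only says such sets could be
# downloaded based on user data (e.g., a doctor receiving medical terminology).
PROFESSION_TAG_SETS = {
    "doctor": ["diagnosis", "radiology", "allergy"],
    "lawyer": ["deposition", "docket", "exhibit"],
}

def seed_suggestions(user_profile, base_tags):
    """Merge locally known tags with tag sets matched to the user's profession."""
    external = PROFESSION_TAG_SETS.get(user_profile.get("profession", ""), [])
    return list(dict.fromkeys(base_tags + external))  # de-duplicate, keep order

print(seed_suggestions({"profession": "doctor"}, ["vacation", "graphics"]))
# ['vacation', 'graphics', 'diagnosis', 'radiology', 'allergy']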
To the accomplishment of the foregoing and related ends, certain illustrative aspects of embodiments are described herein in connection with the following description and the annexed drawings. These aspects are indicative, however, of but a few of the various ways in which the principles of the subject matter may be employed, and the subject matter is intended to include all such aspects and their equivalents. Other advantages and novel features of the subject matter may become apparent from the following detailed description when considered in conjunction with the drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of a selection-based tagging system in accordance with an aspect of an embodiment.
FIG. 2 is another block diagram of a selection-based tagging system in accordance with an aspect of an embodiment.
FIG. 3 is yet another block diagram of a selection-based tagging system in accordance with an aspect of an embodiment.
FIG. 4 is an illustration of a user interface with selected items in accordance with an aspect of an embodiment.
FIG. 5 is an illustration of a user interface with a tag input by a user for selected items in accordance with an aspect of an embodiment.
FIG. 6 is an illustration of a user interface showing a user-input tag added to an item tag list in accordance with an aspect of an embodiment.
FIG. 7 is an illustration of a user interface displaying items with a specific item tag in accordance with an aspect of an embodiment.
FIG. 8 is an illustration of a user interface with a suggested tag in response to a user input in accordance with an aspect of an embodiment.
FIG. 9 is a flow diagram of a method of facilitating item tagging in accordance with an aspect of an embodiment.
FIG. 10 is another flow diagram of a method of facilitating item tagging in accordance with an aspect of an embodiment.
FIG. 11 is yet another flow diagram of a method of facilitating item tagging in accordance with an aspect of an embodiment.
FIG. 12 illustrates an example operating environment in which an embodiment can function.
FIG. 13 illustrates another example operating environment in which an embodiment can function.

DETAILED DESCRIPTION

The subject matter is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the subject matter. It may be evident, however, that subject matter embodiments may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the embodiments.

As used in this application, the term "component" is intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a computer component.
One or more components may reside within a process and/or thread of execution, and a component may be localized on one computer and/or distributed between two or more computers. A "thread" is the entity within a process that the operating system kernel schedules for execution. As is well known in the art, each thread has an associated "context," which is the volatile data associated with the execution of the thread. A thread's context includes the contents of system registers and the virtual address belonging to the thread's process. Thus, the actual data comprising a thread's context varies as it executes.
Ad-hoc item tags are simple text-based strings that are a useful form of organization for end users. Existing systems today that apply tags require cumbersome dialog boxes and/or menus that interrupt the user's thought process and workflow. The systems and methods herein provide an improved user interface for applying tags automatically when the user has made a selection of items to be tagged and/or provides an input such as, for example, typing any character on a keyboard. Tags can be added to items without entering a complex mode and/or substantially interrupting current activity. The type of tag that the user is typing is determined based on factors that can include the item selected, other tags applied to similar items and/or used recently, and/or the most commonly used tags and the like. In one instance, if the user has selected one or more items and begins to type, tagging mode is entered automatically and a tag buffer collects keystrokes to facilitate determination of the tag type.
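As one illustration of that behavior, the following is a minimal sketch, with assumed class and method names that do not come from the patent, of a handler that enters tagging mode on the first keystroke after a selection and collects subsequent keystrokes in a tag buffer.

class SelectionTagger:
    def __init__(self):
        self.selected_items = []   # items currently highlighted in the UI
        self.tag_buffer = ""       # collects keystrokes once typing begins
        self.tagging = False       # True while a tag is being composed

    def on_select(self, items):
        """Record the user's current selection."""
        self.selected_items = list(items)

    def on_keystroke(self, char):
        """Enter tagging mode on the first keystroke after a selection."""
        if not self.selected_items:
            return  # nothing selected, so the keystroke is not a tag
        self.tagging = True
        if char == "\n":           # Enter commits the buffered tag
            self.commit()
        else:
            self.tag_buffer += char

    def commit(self):
        """Associate the buffered tag with every selected item."""
        tag = self.tag_buffer.strip()
        if tag:
            for item in self.selected_items:
                item.setdefault("tags", set()).add(tag)
        self.tag_buffer, self.tagging = "", False

# Example: selecting two items and typing "funny" followed by Enter tags both.
tagger = SelectionTagger()
photos = [{"name": "cat.png"}, {"name": "dog.png"}]
tagger.on_select(photos)
for ch in "funny\n":
    tagger.on_keystroke(ch)
print(photos[0]["tags"], photos[1]["tags"])  # {'funny'} {'funny'}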
In FIG. 1, a block diagram of a selection-based tagging system 100 in accordance with an aspect of an embodiment is shown. The selection-based tagging system 100 is comprised of a selection-based tagging component 102 that interfaces with a user 104 and an item source 106. The selection-based tagging component 102 interacts with the user 104 and provides a means for the user 104 to select items from the item source 106. When a selection is detected by the selection-based tagging component 102, it 102 provides the user with a suggested tag for that selection. In other instances, the selection-based tagging component 102 can wait for the user 104 to provide an input subsequent and/or prior (if associated with the subsequent selection) to the selection before the selection-based tagging component 102 responds with a suggested tag. In that scenario, the selection-based tagging component 102 can respond dynamically to the user's input and relay tag suggestions as the user 104 provides inputs. For example, the selection-based tagging component 102 can respond with tag suggestions that utilize each character that the user 104 types into a keyboard, providing a list of tag suggestions that utilize at least some of the typed characters. The selection-based tagging component 102 can also provide tag suggestions by heuristically determining the tag based on a selected item, a tag associated with a similar item, a recently utilized tag, a commonly used tag, a rule-based criterion, and/or a heuristic-based criterion. The input provided by the user 104 can be a mouse click, a keyboard keystroke as mentioned, a visual indicator (e.g., eye-scanning techniques that determine where and at what a user is looking), and/or an audible indicator (e.g., verbal commands and the like to instruct a computing device what to select, what to input, and what choices to select, etc.). The item source 106 can be a local and/or remote depository of data and the like. Typically, databases are utilized for information storage and retrieval. The tags provided by the user 104 and generated by the selection-based tagging component 102 can be stored with the associated data in the item source 106 if desired. Tags can also be associated with newly created data not yet stored in the item source 106.
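The storage relationship described for the item source 106 can be pictured with the following minimal sketch; the in-memory ItemSource class and its method names are assumptions standing in for whatever database or store an actual implementation would use.

from collections import defaultdict

class ItemSource:
    def __init__(self):
        self.items = {}                       # item_id -> item payload
        self.tags_by_item = defaultdict(set)  # item_id -> {tag, ...}
        self.items_by_tag = defaultdict(set)  # tag -> {item_id, ...}

    def add_item(self, item_id, payload):
        self.items[item_id] = payload

    def tag_item(self, item_id, tag):
        """Associate a tag with an item; an item may carry many tags."""
        self.tags_by_item[item_id].add(tag)
        self.items_by_tag[tag].add(item_id)

    def recall(self, tag):
        """Return every stored item associated with the given tag."""
        return [self.items[i] for i in self.items_by_tag.get(tag, ()) if i in self.items]

# Example: one item recalled later through either of its two tags.
source = ItemSource()
source.add_item("img1", "mountain.jpg")
source.tag_item("img1", "vacation")
source.tag_item("img1", "desktop wallpaper")   # multiple tags per item
print(source.recall("vacation"))               # ['mountain.jpg']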
Turning to FIG. 2, another block diagram of a selection-based tagging system 200 in accordance with an aspect of an embodiment is illustrated. The selection-based tagging system 200 is comprised of a selection-based tagging component 202 that interfaces with a user 204 and an item source 206. The selection-based tagging component 202 is comprised of a user interface 208 and a tagging component 210. The user interface 208 provides the user 204 with a means to view and/or select items from the item source 206. The user 204 can obtain tag suggestions for item selections from the tagging component 210 via the user interface 208. The user 204 can also input tags for a selection of items to the tagging component 210 via the user interface 208. The tagging component 210 can also access the item source 206 to locate additional tag information, like tags, other associated tags, and/or other associated items and the like to facilitate tag determinations and/or storage. When the user 204 selects at least one item via the user interface 208, the tagging component 210 determines a suggested tag based on, in part, the selected item itself. It 210 can look for other similar tags that are related to the item and provide those as suggestions. It 210 can also suggest commonly used tags, most recently used tags, and/or tags based on user data such as, for example, preferences, profession, work topic (e.g., a graphics designer working on a project is most likely working on graphics, etc.), and/or activity and the like.

The tagging component 210 can also utilize the user interface 208 to detect when the user 204 is providing an input such as a keystroke and/or mouse click and the like (described supra). This input, which is subsequent and/or prior to the selection of the item or items, allows the tagging component 210 to attempt guesses for possible tag suggestions for the user 204. For example, if the user 204 inputs a "g," the tagging component 210 can list possible tags that begin with the letter "g" such as, for example, "graphics," "group A," "group B," "green," and/or "garage" and the like. As the user 204 types more characters (i.e., inputs), the tagging component 210 dynamically responds by providing tag suggestions that can mimic the characters disclosed up to that point. In a similar fashion, if the tagging component 210 recognizes a sequence of characters that has associations other than based directly on the characters, it 210 can display those tag suggestions as well. For example, the user 204 can type "hom" for home and the tagging component 210 can respond with a tag suggestion that was previously used by the user 204 and/or is synonymous, such as "house" and the like.
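The prefix-driven behavior just described can be sketched as follows. This is an illustrative approximation rather than the patented method; the tag inventory and the association table (mapping "home" to the previously used tag "house") are invented for the example.

# Assumed tag inventory and association data for illustration only.
KNOWN_TAGS = ["graphics", "group A", "group B", "green", "garage", "house"]
ASSOCIATIONS = {"home": ["house"]}

def suggest(prefix, known_tags=KNOWN_TAGS, associations=ASSOCIATIONS):
    """Return candidate tags for the characters typed so far."""
    prefix = prefix.lower()
    # Direct matches: known tags that start with the typed characters.
    matches = [t for t in known_tags if t.lower().startswith(prefix)]
    # Indirect matches: words the prefix could be starting, mapped to related tags.
    for word, related in associations.items():
        if word.startswith(prefix):
            matches.extend(t for t in related if t not in matches)
    return matches

print(suggest("g"))    # ['graphics', 'group A', 'group B', 'green', 'garage']
print(suggest("hom"))  # ['house']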
Looking at FIG. 3, yet another block diagram of a selection-based tagging system 300 in accordance with an aspect of an embodiment is depicted. The selection-based tagging system 300 is comprised of a selection-based tagging component 302 that interfaces with a user 304, an item source 306, optional user data 312, optional machine learning 314, and optional external tag sources 316. The selection-based tagging component 302 is comprised of a user interface 308 and a tagging component 310. The user interface 308 interacts with the user 304 to receive and/or provide information related to items from the item source 306. The item source 306 can be local and/or remote to the interface and/or the selection-based tagging component 302. In a typical interaction, the user interface 308 detects a selection of at least one item by the user 304. The information relating to what items are selected is passed to the tagging component 310. The tagging component 310 determines at least one tag suggestion based on various parameters and/or data. The user 304 can then respond by selecting a suggested tag, and/or the user 304 can provide a user input such as, for example, typing various characters on a keyboard and the like. The user input obtained by the tagging component 310 via the user interface 308 is utilized to form additional tag suggestions for relaying
to the user 304 via the user interface 308. The input-based tag suggestions are then utilized by the user 304 to make a tag selection, and/or the user 304 can directly input a different tag altogether. The selected and/or directly input tag is then obtained by the tagging component 310 and utilized to tag the selected items. The utilized tags are then relayed to the user via the user interface 308 at appropriate times to facilitate the user 304 in recalling items based on tag information. The tagging component 310 can also directly store the tags with the selected items in the item source 306 if desired.
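Pulling the pieces together, the interaction just described (detect a selection, refresh suggestions as input arrives, then apply and persist the chosen tag) might look roughly like the following. The function and its reuse of the earlier ItemSource and suggest() sketches are assumptions for illustration, not the patent's implementation.

def tag_selection(item_ids, keystrokes, source, suggester):
    """Drive one select-type-commit cycle and return the last suggestions shown."""
    shown = []
    buffer = ""
    for ch in keystrokes:
        buffer += ch
        shown = suggester(buffer)             # suggestions refreshed per keystroke
    final_tag = shown[0] if shown else buffer  # accept top suggestion or raw input
    for item_id in item_ids:
        source.tag_item(item_id, final_tag)    # persist the tag with each item
    return shown

# Usage, reusing the ItemSource and suggest() sketches above (assumed in scope):
# tag_selection(["img1", "img2"], "gr", source, suggest)  # both items tagged "graphics"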
The tagging component 310 can also heuristically determine the tag based on a selected item, a tag associated with a similar item, a recently utilized tag, a commonly used tag, a rule-based criterion, and/or a heuristic-based criterion. Optional machine learning 314 can also be employed to learn tag suggestions. Optional user data 312 (e.g., user environment data, data directly entered by the user 304, and/or indirectly derived data and the like) can also be utilized by the tagging component 310 to determine tag suggestions. The tagging component 310 is not limited to only utilizing internally obtained and/or local information. Optional external tag sources 316 (e.g., global network connections, local network connections, and/or manually entered data and the like) can also be employed to provide additional information to facilitate tag suggestions. For example, if the user 304 is determined to be a lawyer (determined from the optional user data 312), the tagging component 310 can obtain tag information related to attorneys via the Internet. The Interne

This document is available on Docket Alarm but you must sign up to view it.


Or .

Accessing this document will incur an additional charge of $.

After purchase, you can access this document again without charge.

Accept $ Charge
throbber

Still Working On It

This document is taking longer than usual to download. This can happen if we need to contact the court directly to obtain the document and their servers are running slowly.

Give it another minute or two to complete, and then try the refresh button.

throbber

A few More Minutes ... Still Working

It can take up to 5 minutes for us to download a document if the court servers are running slowly.

Thank you for your continued patience.

This document could not be displayed.

We could not find this document within its docket. Please go back to the docket page and check the link. If that does not work, go back to the docket and refresh it to pull the newest information.

Your account does not support viewing this document.

You need a Paid Account to view this document. Click here to change your account type.

Your account does not support viewing this document.

Set your membership status to view this document.

With a Docket Alarm membership, you'll get a whole lot more, including:

  • Up-to-date information for this case.
  • Email alerts whenever there is an update.
  • Full text search for other cases.
  • Get email alerts whenever a new case matches your search.

Become a Member

One Moment Please

The filing “” is large (MB) and is being downloaded.

Please refresh this page in a few minutes to see if the filing has been downloaded. The filing will also be emailed to you when the download completes.

Your document is on its way!

If you do not receive the document in five minutes, contact support at support@docketalarm.com.

Sealed Document

We are unable to display this document, it may be under a court ordered seal.

If you have proper credentials to access the file, you may proceed directly to the court's system using your government issued username and password.


Access Government Site

We are redirecting you
to a mobile optimized page.





Document Unreadable or Corrupt

Refresh this Document
Go to the Docket

We are unable to display this document.

Refresh this Document
Go to the Docket