US007853972B2

(12) United States Patent
Brodersen et al.

(10) Patent No.: US 7,853,972 B2
(45) Date of Patent: Dec. 14, 2010

(54) MEDIA PREVIEW USER INTERFACE

(75) Inventors: Rainer Brodersen, San Jose, CA (US); Rachel Clare Goldeen, Mountain View, CA (US); Jeffrey Ma, Redwood City, CA (US); Mihnea Calin Pacurariu, Los Gatos, CA (US); Eric Taylor Seymour, San Jose, CA (US); Steven Jobs, Palo Alto, CA (US); David Alan Pound, San Carlos, CA (US)

(73) Assignee: Apple Inc., Cupertino, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 135 days.

(21) Appl. No.: 11/530,630

(22) Filed: Sep. 11, 2006

(65) Prior Publication Data
    US 2008/0066110 A1    Mar. 13, 2008

(51) Int. Cl.
    G06F 3/00 (2006.01)
    G06F 13/00 (2006.01)
    H04N 5/445 (2006.01)
(52) U.S. Cl. .......................... 725/40; 725/39; 725/52
(58) Field of Classification Search ....... 725/37-61; 715/729, 799, 726
    See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS

5,619,249 A *   4/1997  Billock et al. ............ 725/43
5,717,879 A     2/1998  Moran et al.
5,822,123 A *  10/1998  Davis et al. ............. 725/5
5,880,768 A     3/1999  Lemmons et al.
5,945,987 A *   8/1999  Dunn ..................... 715/718
6,006,227 A    12/1999  Freeman et al.
6,335,737 B1    1/2002  Grossman et al.
6,448,987 B1    9/2002  Easty et al.
6,638,313 B1   10/2003  Freeman et al.
6,725,427 B2    4/2004  Freeman et al.
6,768,999 B2    7/2004  Prager et al.
6,944,632 B2    9/2005  Stern
7,292,243 B1   11/2007  Burke
7,362,331 B2    4/2008  Ording
7,363,591 B2    4/2008  Goldthwaite et al.
7,743,341 B2 *  6/2010  Brodersen et al. ........ 715/810
2002/0033848 A1  3/2002  Sciammarella et al.
2002/0083469 A1  6/2002  Jeannine et al.

(Continued)

FOREIGN PATENT DOCUMENTS

EP    1 469 375 A1    10/2004

(Continued)

OTHER PUBLICATIONS

U.S. Appl. No. 11/530,824, filed Sep. 11, 2006, Madden et al.

(Continued)

Primary Examiner - Scott Beliveau
Assistant Examiner - Alexander Q Huerta
(74) Attorney, Agent, or Firm - Fish & Richardson P.C.

(57) ABSTRACT

A menu having menu items is arranged in an interface environment. A first abstraction is arranged proximate to the menu, the first abstraction being based on a highlighted menu item. A second abstraction is transitioned into the interface environment upon the occurrence of an event, the second abstraction being proximate to the menu.

25 Claims, 6 Drawing Sheets
[Front-page figure: example preview interface environment 500, showing a menu 302 headed "iTunes Store Presents," a highlighted menu item 306 ("A Scanner Darkly"), and the menu items "Superman Returns," "Talladega Nights," "Pirates of the Caribbean:...," "Lilo & Stitch," "All The President's Men," and "George of the Jungle."]

Apple Exhibit 4235
Apple v. SightSound Technologies
CBM2013-00020
Page 00001
US 7,853,972 B2
Page 2

U.S. PATENT DOCUMENTS

2002/0175931 A1   11/2002  Holtz et al.
2003/0110450 A1    6/2003  Sakai
2003/0117425 A1    6/2003  O'Leary et al.
2003/0142751 A1    7/2003  Hannuksela
2003/0174160 A1    9/2003  Deutscher et al.
2004/0008211 A1    1/2004  Soden et al.
2004/0100479 A1    5/2004  Nakano et al.
2004/0140995 A1    7/2004  Goldthwaite et al.
2004/0150657 A1    8/2004  Wittenburg et al.
2004/0221243 A1   11/2004  Twerdahl et al.
2004/0250217 A1   12/2004  Tojo et al.
2004/0261031 A1   12/2004  Tuomainen et al.
2005/0041033 A1    2/2005  Hilts
2005/0044499 A1    2/2005  Allen et al.
2005/0091597 A1    4/2005  Ackley
2005/0160375 A1    7/2005  Sciammarella et al.
2005/0246654 A1   11/2005  Hally et al.
2005/0278656 A1   12/2005  Goldthwaite et al.
2006/0020962 A1    1/2006  Stark et al.
2006/0031776 A1    2/2006  Glein et al.
2006/0265409 A1   11/2006  Neumann et al.
2007/0162853 A1    7/2007  Weber et al.
2007/0288863 A1   12/2007  Ording et al.
2008/0052742 A1*   2/2008  Kopf et al. ............. 725/34
2008/0062894 A1*   3/2008  Ma et al. ............... 370/263
2008/0065638 A1*   3/2008  Brodersen et al. ........ 707/7
2008/0065720 A1    3/2008  Brodersen et al.
2008/0066010 A1*   3/2008  Brodersen et al. ........ 715/810
2008/0066013 A1    3/2008  Brodersen et al.
2008/0092168 A1*   4/2008  Logan et al. ............ 725/44
2008/0122870 A1    5/2008  Brodersen et al.
2008/0263585 A1*  10/2008  Gell et al. ............. 725/32
2009/0282372 A1*  11/2009  Jerding et al. .......... 715/867
2010/0077338 A1*   3/2010  Matthews et al. ......... 715/779

FOREIGN PATENT DOCUMENTS

EP    1 510 911 A2    3/2005

OTHER PUBLICATIONS

USPTO Final Office Action in U.S. Appl. No. 11/530,824, mailed May 8, 2009.
USPTO Final Office Action in U.S. Appl. No. 11/530,808, mailed May 13, 2009.
USPTO Final Office Action in U.S. Appl. No. 11/530,643, mailed Jun. 5, 2009.
"Fading Image Rollovers," http://web.archive.org/web/20060111080357/http://www.javascript-fx.com/fade_rollovers/general_help/help.html, Jan. 11, 2006, 1 page.
"Animated Image Blur," http://web.archive.org/web/20060430062528/http://www.tutorio.com/tutorial/animated-image-blur, Apr. 30, 2006, 2 pages.

* cited by examiner

Page 00002
U.S. Patent    Dec. 14, 2010    Sheet 1 of 6    US 7,853,972 B2

[FIG. 1: block diagram of example media processing system 100 - data store 102, processing device 104, I/O device 106, remote control device 108 with rotational input device 110, controller engine 112, UI engine 114, and media engines 116-1, 116-2, ... 116-n.]

[FIG. 2: block diagram of example remote control device 108 - rotational input device 110, processing device 150, wireless communication subsystem 152, and actuation areas including 162 and 164.]

[FIG. 3: example network environment 200 - computing device, I/O device 204, content providers 214-1 and 214-2, and a data store.]

Page 00003
U.S. Patent    Dec. 14, 2010    Sheet 2 of 6    US 7,853,972 B2

[FIG. 4: example interface environment 300 with menu 302, highlighted menu item ("A Scanner Darkly"), and menu items "All The President's Men," "Superman Returns," "Talladega Nights," "Pirates of the Caribbean:...," "Lilo & Stitch," and "George of the Jungle."]

[FIG. 5: example preview interface environment 400 with menu 302, highlighted menu item 306 ("A Scanner Darkly"), menu item abstraction 402, and the heading "iTunes Store Presents."]

Page 00004
U.S. Patent    Dec. 14, 2010    Sheet 3 of 6    US 7,853,972 B2

[FIG. 6: example preview interface environment 500 with menu 302, highlighted menu item 306 ("A Scanner Darkly"), and scaled first abstraction 502 under the heading "iTunes Store Presents."]

[FIG. 7: example interface environment 500 with menu 302 and scaled first abstraction 502.]

Page 00005
U.S. Patent    Dec. 14, 2010    Sheet 4 of 6    US 7,853,972 B2

[FIG. 8: example preview interface environment with menu 302 and highlighted menu item 306 ("A Scanner Darkly") under the heading "iTunes Store Presents."]

[FIG. 9: example interface environment 600 with menu 302 and scaled first abstraction 602.]

Page 00006
U.S. Patent    Dec. 14, 2010    Sheet 5 of 6    US 7,853,972 B2

[FIG. 10 (process 700): provide an interface environment (702); provide a first abstraction (704); receive an event (706); transition the interface environment to include a second abstraction (708).]

[FIG. 11 (process 800): generate display environment (802); generate a first abstraction (804); receive an event (806); transition the display environment to include a second abstraction (808).]

Page 00007
U.S. Patent    Dec. 14, 2010    Sheet 6 of 6    US 7,853,972 B2

[FIG. 12 (process 900): generate menu (902); present interface environment (904); generate first abstraction (908); present first abstraction; generate second abstraction associated with highlighted menu item (914); transition second abstraction into interface environment (920); "present content?" decision (922); if yes, request content associated with selected menu item (924); otherwise, maintain interface environment.]

Page 00008
MEDIA PREVIEW USER INTERFACE

BACKGROUND

This disclosure is related to media processing systems and methods.

Media devices, such as digital video and audio players, can include multiple functions and capabilities, such as playing stored content, browsing and selecting from recorded content, storing and/or receiving content selected by a user, and the like. These various functions can often be grouped according to content types, e.g., movies, music, television programs, photos, etc. The user interface can include both graphical and textual features. It is desirable that the user interface convey information to the user in an intuitive manner, and readily provide access to various features. One such feature is a media preview feature. However, current media devices provide inadequate information regarding content and/or poorly organize the information that is provided in preview features.

SUMMARY

Disclosed herein are systems and methods for previewing content associated with menu items. In one implementation, an interface environment includes a menu arranged in the interface environment, the menu including a list of menu items associated with corresponding content. The interface environment further includes a first abstraction of a highlighted menu item, the first abstraction being proximate to the menu. The interface environment is further configured to transition to include a second abstraction of the highlighted menu item based upon an event, the second abstraction being proximate to the menu.

In another implementation, one or more computer readable media are used to cause a processor to perform the operations comprising: generating a display environment comprising a menu, the menu comprising a plurality of menu items including a highlighted menu item, each of the menu items associated with corresponding content; generating a first abstraction arranged within the display environment, the first abstraction being associated with the highlighted menu item; receiving an event; and, transitioning the first abstraction to a second abstraction responsive to the event, the second abstraction being associated with the highlighted menu item.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of an example media processing system.
FIG. 2 is a block diagram of an example remote control device for the media processing system.
FIG. 3 is an example network environment in which a media processing system in accordance with FIG. 1 can be implemented.
FIG. 4 is a block diagram of an example interface environment.
FIG. 5 is a block diagram of an example preview interface environment.
FIGS. 6 and 7 are block diagrams of an example interface environment transition.
FIGS. 8 and 9 are block diagrams of another example media menu interface environment transition.
FIGS. 10-12 are flow diagrams of example media presentation processes.

DETAILED DESCRIPTION

FIG. 1 is a block diagram of an example media processing system 100. The media processing system 100 can transmit and receive media data and data related to the media data. The media data can be stored in a data store 102, such as a memory device, and be processed by a processing device 104 for output on a display device, such as a television, a computer monitor, a game console, a hand held portable device, and the like, and/or an audio device, such as a multi-channel sound system, a portable media player, a computer system, and the like. The media processing system 100 may be used to process media data, for example, video data and audio data received over one or more networks by an input/output (I/O) device 106. Such media data may include metadata, e.g., song information related to audio data received, or programming information related to a television program received.

The media data and related metadata may be provided by a single provider, or may be provided by separate providers. In one implementation, the media processing system 100 can be configured to receive media data from a first provider over a first network, such as a cable network, and receive metadata related to the video data from a second provider over a second network, such as a wide area network (WAN). Example media data include video data, audio data, content payload data, or other data conveying audio, textual and/or video data.
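The two-provider arrangement described above (media data over one network, metadata over another) amounts to a join keyed on a shared content identifier. The sketch below illustrates that idea only; the class names, fields, and `merge_feeds` helper are invented for the example and are not part of the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MediaItem:
    content_id: str   # key shared between the two provider feeds
    payload: bytes    # media data from the first provider

@dataclass
class Metadata:
    content_id: str
    title: str
    summary: str

def merge_feeds(media: list, metadata: list) -> list:
    """Attach second-provider metadata to first-provider media items."""
    by_id = {m.content_id: m for m in metadata}
    # An item with no matching metadata is paired with None.
    return [(item, by_id.get(item.content_id)) for item in media]

media = [MediaItem("ep-101", b"...")]
meta = [Metadata("ep-101", "Pilot", "Episode one of the series.")]
merged = merge_feeds(media, meta)
print(merged[0][1].title)  # -> Pilot
```

A real system would key on whatever identifier the providers agree on (e.g., a catalog ID), but the join structure is the same.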
In another implementation, the media processing system 100 can be configured to receive media data and metadata from a computing device, such as a personal computer. In one example of this implementation, a user manages one or more media access accounts with one or more content providers through the personal computer. For example, a user may manage a personal iTunes® account with iTunes® software, available from Apple Computer, Inc. Media data, such as audio and video media data, can be purchased by the user and stored on the user's personal computer and/or one or more data stores. The media data and metadata stored on the personal computer and/or the one or more data stores can be selectively pushed and/or pulled for storage in the data store 102 of the media processing system 100.

In another implementation, the media processing system 100 can be used to process media data stored in several data stores in communication with a network, such as wired and/or wireless local area network (LAN), for example. In one implementation, the media processing system 100 can pull and/or receive pushed media data and metadata from the data stores over the network for presentation to a user. For example, the media processing system 100 may be implemented as part of an audio and video entertainment center having a video display device and an audio output device, and can pull media data and receive pushed media data from one or more data stores for storage and processing. At the entertainment center, a user can, for example, view photographs that are stored on a first computer while listening to music files that are stored on a second computer.

In one implementation, the media processing system 100 includes a remote control device 108. The remote control device 108 can include a rotational input device 110 configured to sense touch actuations and generate remote control signals therefrom. The touch actuations can include rotational actuations, such as when a user touches the rotational input device 110 with a digit and rotates the digit on the surface of the rotational input device 110. The touch actuations can also include click actuations, such as when a user presses on the

Page 00009

rotational input device 110 with enough pressure to cause the remote control device 108 to sense a click actuation.

In one implementation, the functionality of the media processing system 100 is distributed across several engines. For example, the media processing system 100 may include a controller engine 112, a user interface (UI) engine 114, and one or more media engines 116-1, 116-2, and 116-n. The engines may be implemented in software as software modules or instructions, or may be implemented in hardware, or in a combination of software and hardware.

The control engine 112 is configured to communicate with the remote control device 108 by a link, such as a wireless infrared signal or radio frequency signal. The remote control device 108 can transmit remote control signals generated, for example, from touch actuations of the rotational input device 110 to the control engine 112 over the link. In response, the control engine 112 is configured to receive the remote control signals and generate control signals in response. The control signals are provided to the processing device 104 for processing.

The control signals generated by the control engine 112 and processed by the processing device 104 can invoke one or more of the UI engine 114 and media engines 116-1-116-n. In one implementation, the UI engine 114 manages a user interface to facilitate data presentation for the media engines 116-1-116-n and functional processing in response to user inputs.

In one implementation, the media engines 116 can include one or more content-specific engines, such as a movies engine, television program engine, music engine, and the like. Each engine 116 can be instantiated to support content-specific functional processing. For example, a movie engine to support movie-related functions can be instantiated by selecting a "Movies" menu item. Example movie-related functions include purchasing movies, viewing movie previews, viewing movies stored in a user library, and the like. Likewise, a music engine to support music-related functions can be instantiated by selecting a "Music" menu item. Example music-related functions include purchasing music, viewing music playlists, playing music stored in a user library, and the like.

The media processing system 100 of FIG. 1 can also implement different functional distribution architectures that have additional functional blocks or fewer functional blocks. For example, the engines 116 can be implemented in a single monolithic engine.
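The content-specific instantiation described above (a movies engine created when the "Movies" menu item is selected, and so on) can be sketched as a small registry mapping menu items to engine classes. This is a minimal illustration; the class and function names below are hypothetical, not an API the patent defines:

```python
class MediaEngine:
    """Base for content-specific engines (cf. engines 116-1 ... 116-n)."""
    def __init__(self, content_type: str):
        self.content_type = content_type

    def functions(self) -> list:
        return []

class MoviesEngine(MediaEngine):
    def __init__(self):
        super().__init__("movies")

    def functions(self) -> list:
        return ["purchase", "preview", "view library"]

class MusicEngine(MediaEngine):
    def __init__(self):
        super().__init__("music")

    def functions(self) -> list:
        return ["purchase", "playlists", "play library"]

# The UI engine's menu items map to engine classes; an engine is
# instantiated only when its menu item is selected.
ENGINES = {"Movies": MoviesEngine, "Music": MusicEngine}

def select_menu_item(label: str) -> MediaEngine:
    return ENGINES[label]()  # instantiate for content-specific processing

engine = select_menu_item("Movies")
print(engine.content_type)  # -> movies
```

Collapsing `ENGINES` to a single class would correspond to the "single monolithic engine" alternative the text mentions.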
FIG. 2 is a block diagram of an example remote control device 108 for the media processing system 100. The remote control device 108 includes a rotational input device 110, a processing device 150, and a wireless communication subsystem 152. The rotational input device 110 defines a surface that can sense a touch actuation, such as the presence of a finger on the surface, and can further generate a control signal based on a rotation of the finger on the surface. In one implementation, a touch sensitive array is disposed beneath the surface of the rotational input device 110. The touch sensitive array can be disposed according to polar coordinates, i.e., r and θ, or can be disposed according to Cartesian coordinates, i.e., x and y.

The rotational input device areas 160, 162, 164, 166 and 168 are receptive to press actuations. In one implementation, the areas include a menu area 160, a reverse/previous area 162, a play/pause area 164, a forward/next area 166, and a select area 168. The areas 160-168, in addition to generating signals related to their descriptive functionalities, can also generate signals for context-dependent functionality. For example, the menu area 160 can generate signals to support the functionality of dismissing an onscreen user interface, and the play/pause area 164 can generate signals to support the function of drilling down into a hierarchal user interface. In one implementation, the areas 160-168 comprise buttons disposed beneath the surface of the rotational input device 110. In another implementation, the areas 160-168 comprise pressure sensitive actuators disposed beneath the surface of the rotational input device 110.

The processing device 150 is configured to receive the signals generated by the rotational input device 110 and generate corresponding remote control signals in response. The remote control signals can be provided to the communication subsystem 152, which can wirelessly transmit the remote control signals to the media processing system 100.

Although shown as comprising a circular surface, in another implementation, the rotational input device 110 can comprise a rectangular surface, a square surface, or some other shaped surface. Other surface geometries that accommodate pressure sensitive areas and that can sense touch actuations may also be used, e.g., an oblong area, an octagonal area, etc.

Other actuation area configurations may also be used. For example, in another implementation, the remote control device 108 can also include a separate actuation button 170. In this implementation, the areas comprise a "+" or increase area 160, a reverse/previous area 162, a "-" or decrease area 164, a forward/next area 166, a play/pause area 168, and a menu area 170.
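A rotational actuation of the kind described (a digit rotating on the touch surface) reduces to comparing successive touch angles. The sketch below converts Cartesian touch samples to polar angles and accumulates angular travel into discrete scroll steps; the function names and the 15-degree step size are illustrative assumptions, not values from the patent:

```python
import math

STEP = math.radians(15)  # hypothetical: one scroll step per 15 degrees

def angle(x: float, y: float) -> float:
    """Touch position -> polar angle, as for a sensor laid out in r and theta."""
    return math.atan2(y, x)

def rotation_steps(samples: list) -> int:
    """Accumulate angular travel across touch samples and emit signed steps."""
    steps = 0
    acc = 0.0
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        d = angle(x1, y1) - angle(x0, y0)
        # Unwrap jumps across the -pi/pi boundary.
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        acc += d
        while acc >= STEP:
            steps += 1
            acc -= STEP
        while acc <= -STEP:
            steps -= 1
            acc += STEP
    return steps

# A ~95-degree counter-clockwise sweep sampled every 5 degrees
# yields six full 15-degree steps.
sweep = [(math.cos(t), math.sin(t)) for t in
         [i * math.radians(5) for i in range(20)]]
print(rotation_steps(sweep))  # -> 6
```

Click actuations on the areas 160-168 would bypass this path entirely and map directly to named button signals.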
FIG. 3 is an example network environment 200 in which a media processing system 100 in accordance with FIG. 1 may be implemented. The media processing system 100 receives, for example, user input through a remote control device 108 and media data over a network 202, such as a wired or wireless LAN. In one implementation, the network 202 communicates with a wide area network 212, such as the Internet, through an I/O device 203, such as a router, server, cable modem, or other computing and/or communication processing device. The media processing system 100 processes the media data for output to one or more output devices 204. The media processing system 100 can receive the media data from one or more data stores connected to the network 202, such as computing devices 206 and 208, and a data store 210.

The media data can be received through the network 212 by one of the computing devices, such as computing device 208. The network 212 can include one or more wired and wireless networks, such as the Internet. The media data is provided by one or more content providers 214. For example, the content provider 214-1 may provide media data that is processed by the media processing system 100 and output through the output devices 204, and the content provider 214-2 may provide metadata related to the media data for processing by the media processing system 100. Such metadata may include episodic content, artist information, and the like. A content provider 214 can also provide both media data and related metadata.

In one implementation, the media processing system 100 can also communicate with one or more content providers 214 directly. For example, the media processing system 100 can communicate with the content providers through the wireless network 202, the I/O device 203, and the network 212. The media processing system 100 can also communicate with the content providers 214 through other network configurations, e.g., through a direct connection to a cable modem, through a router, or through one or more other communication devices. Example communications can include receiving sales information, preview information, or communications related to commercial transactions, such as purchasing audio files and video files.

Page 00010

In another implementation, the media processing system 100 can receive content from any of the computing devices 206 and 208, and other such computing devices or data stores 210 available on the network 202 through sharing. Thus, if any one or more of the computing devices or data stores are unavailable, media data and/or metadata on the remaining computing devices or other such computing devices or data stores can still be accessed.

FIG. 4 is a block diagram of an example interface environment 300. The interface environment 300 can include a menu 302 arranged within the interface environment. The menu 302 can include any number of menu items, and can be arranged, for example, on the right side of the interface environment 300. However, in other examples the menu 302 can be arranged in other ways within the interface environment 300. The menu items can correspond to available content (e.g., downloadable content, stored content, etc.), whereby selection of a menu item can cause the media system 100 to present the content.

The interface environment 300 can also have a menu item abstraction 304 arranged within the interface environment 300. The menu item abstraction 304 can be selected based upon an association with a highlighted menu item 306. In some implementations, the menu item abstraction can be a first abstraction 304. The first abstraction can be, for example, a digital representation of art associated with the movie. In various examples, art can include one or more movie posters, one or more production stills, or any other promotional material, or combinations thereof. The type of menu item abstraction displayed can depend on the type of content associated with the highlighted menu item 306. For example, if the content is a movie, then the menu item abstractions can be digital representations of movie posters or movie stills or a thumbnail associated with a portion of video content. Likewise, if the content is audio books, then the menu item abstractions can be digital representations of book jackets. Other menu item abstractions can also be displayed dependent upon the content associated with the highlighted menu item 306. For example, a menu item abstraction for a photo can include a representative photo associated with a group of photos, or a collage of the group of photos. In other examples, a menu item abstraction for audio content can include album cover art or a related still.

FIG. 5 is a block diagram of another example interface environment 400. In some implementations, the interface environment 400 results from a user highlighting a menu item 306 from the menu 302 for a predetermined period of time (e.g., more than a few seconds). A transition between the interface environment 300 of FIG. 4 and the interface environment 400 of FIG. 5 can include wiping out the menu item abstraction 304 of FIG. 4 and wiping in menu item abstractions 402, 404 of FIG. 5. In other implementations, the transition can include fading out of the abstraction 304 of FIG. 4 and fading in the abstractions 402, 404 of FIG. 5. In still further implementations, the interface environment 400 can be used instead of the interface environment 300 of FIG. 4. Other animations or transitions between the interface environments can be used in various example implementations.

The interface environment 400 can include the menu 302 arranged within the interface environment 400. The interface environment 400 can further include the menu item abstractions 402, 404 associated with a highlighted menu item 306. A first abstraction 402 can include a digital representation of promotional media (e.g., movie poster(s), preview(s), production stills, etc.). In this example, the first abstraction 402 is a preview (e.g., a movie trailer, episode clip, etc.) associated with the highlighted menu item 306. A second abstraction 404 can include additional information associated with the content related to the highlighted menu item 306. In various examples, the additional information can include metadata about the content associated with the highlighted menu item 306. The metadata in various examples can include any of actor(s), director(s), producer(s), genre(s), summary description, a recommended minimum maturity level (e.g., Motion Picture Association of America (MPAA) rating) associated with the content, critical review(s), release date(s), episode title, episode number, audio or video format, movie poster(s), production still(s), duration or length, along with subsets and combinations thereof.

FIGS. 6 and 7 are block diagrams depicting an example interface environment transition. In some implementations, the interface environment 300 of FIG. 4 can transition to a preview interface environment 500 of FIGS. 6 and 7 by scaling a first abstraction 304 to a scaled first abstraction 502 based upon an event. As an example, the scaling can be performed by one or more of the media engines 116-1, 116-2, 116-n of FIG. 1, such as, for example, a presentation engine that can be configured to receive data and render menus and other graphics for display on a display device.

The event causing the transition between user interface 300 of FIG. 4 and user interface 500 of FIGS. 6 and 7, for example, can be a highlighting of a menu item for a predetermined period of time (e.g., a few seconds). Alternatively, the event can be based upon input (e.g., a request, a selection, etc.) received from the user. During and following the transition, the menu 302 of FIG. 4 and list of menu items can remain arranged within the interface environment 500 as they were in the interface environment 300 of FIG. 4. Selection of a menu item from the interface environment 500 can cause a media system 100 to present content associated with the selected menu item.

FIG. 7 is a block diagram of another example of the interface environment 500. As mentioned above, the interface environment 300 of FIG. 4 can transition to a preview interface environment 500 of FIGS. 6 and 7 by scaling a first abstraction 304 to a scaled first abstraction 502 based upon, for example, an event. After scaling the first abstraction, a second abstraction 504 can be transitioned into the interface environment 500 of FIG. 7. As an example, the transition of the second abstraction 504 into the interface environment 500 can be performed by a media engine 116-1, 116-2, 116-n, such as, for example, a presentation engine.

Transitioning the second abstraction 504 into the interface environment 500 can include a fade-in, a wipe-in, pixilation in, or a reveal from behind the first abstraction, among many other types of transitions. In various examples, the transition can be based upon a preference received from a user for a particular kind of transition selected from multiple types of transitions. The preference can be received, for example, through a user interface.

As an example, the second abstraction 504 can include additional information associated with a highlighted menu item 306. In some implementations, the additional information can be metadata associated with the highlighted menu item 306. Metadata in various instances can include any of actor(s), director(s), producer(s), genre(s), summary description, a recommended minimum maturity level (e.g., Motion Picture Association of America (MPAA) rating) associated with the content, critical review(s), release date(s), episode title, episode number, audio or video format, movie poster(s), production still(s), along with subsets and combinations thereof. In further examples, the second abstraction 504 may include a preview (e.g., theatrical trailer, episode highlights, etc.) of the highlighted menu item 306.

Page 00011
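The event model described for FIGS. 6 and 7 (a highlight held for a predetermined period triggers scaling of the first abstraction, after which the second abstraction transitions in) can be sketched as a small state machine. The dwell threshold, state names, and class structure below are illustrative assumptions, not taken from the patent:

```python
DWELL_SECONDS = 3.0  # hypothetical "few seconds" highlight threshold

class PreviewInterface:
    """States: 'menu' (first abstraction at full size) ->
    'preview' (first abstraction scaled, second abstraction shown)."""

    def __init__(self):
        self.state = "menu"
        self.highlight_started = None
        self.abstractions = ["first"]

    def highlight(self, item: str, now: float) -> None:
        # Highlighting (re)starts the dwell timer for that item.
        self.highlighted = item
        self.highlight_started = now

    def tick(self, now: float) -> None:
        # Dwell event: highlight held long enough -> transition.
        if (self.state == "menu" and self.highlight_started is not None
                and now - self.highlight_started >= DWELL_SECONDS):
            self.state = "preview"
            self.abstractions = ["first (scaled)", "second (faded in)"]

ui = PreviewInterface()
ui.highlight("A Scanner Darkly", now=0.0)
ui.tick(now=1.0)
print(ui.state)  # -> menu
ui.tick(now=3.5)
print(ui.state)  # -> preview
```

A direct user input (a request or selection) would simply trigger the same state change without waiting out the dwell timer, matching the alternative event the text describes.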
`US 7,853,972 B2
`
`7
FIGS. 8 and 9 are block diagrams depicting another example interface environment transition. In some implementations, the interface environment 300 of FIG. 4 can transition to a preview interface environment 600 of FIGS. 8 and 9 by scaling a first abstraction 304 to a scaled first abstraction 602 based upon an event. As an example, the scaling can be performed by a media engine 116-1, 116-2, 116-n, such as, for example, a presentation engine.

The event causing the transition between user interface 300 of FIG. 4 and user interface 600 of FIGS. 8 and 9, for example, can be a highlighting of a menu item for a period of time (e.g., a few seconds). Alternatively, the event can be based upon input (e.g., a request, a selection, etc.) received from the user. During and following the transition, the menu 302 of FIG. 4 and list of menu items can remain arranged within the interface environment 600 as they were in the interface environment 300 of FIG. 4. Selection of a menu item from the interface environment 300 can cause a media system 100 to present content associated with the selected menu item.
FIG. 9 is a block diagram of another example interface environment 600. As mentioned above, the interface environment 300 of FIG. 4 can transition to a preview interface environment 600 of FIGS. 8 and 9 by scaling a first abstraction 304 to a scaled first abstraction 602 based upon an event. After scaling the first abstraction 304, a second abstraction 604 can be transitioned into the interface environment 600 of FIG. 9. As an example, the transition of the second abstraction 604 into the interface environment 600 can be performed by a media engine 116-1, 116-2, 116-n, such as, for example, a presentation engine.
Transitioning the second abstraction 604 into the interface environment 600 can include any of a number of different types of transitions (e.g., fade-in, pixilation in, wipe-in, reveal, page turn, etc.). In various examples, the transition can be based upon a preference received from a user for a particular kind of transition selected from multiple types of transitions. The preference can be received, for example, through a user interface engine 114.
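A fade-in, for example, reduces to ramping the second abstraction's opacity across the transition's duration; the linear easing and frame count below are illustrative assumptions rather than anything the patent prescribes.

```python
def fade_in_opacities(frames: int) -> list[float]:
    """Linear opacity ramp for fading the second abstraction in,
    from fully transparent (0.0) to fully opaque (1.0)."""
    if frames < 2:
        return [1.0]
    return [i / (frames - 1) for i in range(frames)]

# A five-frame fade steps through 0.0, 0.25, 0.5, 0.75, 1.0.
steps = fade_in_opacities(5)
```

A presentation engine would apply each value to the second abstraction's layer on successive frames; other transition styles (wipe, reveal, page turn) would interpolate position or clipping instead of opacity.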
As a
