(19) United States
(12) Patent Application Publication (10) Pub. No.: US 2007/0192818 A1
Bourges-Sevenier et al.
(43) Pub. Date: Aug. 16, 2007
`
(54) SYSTEM AND METHOD FOR CREATING, DISTRIBUTING, AND EXECUTING RICH MULTIMEDIA APPLICATIONS

(76) Inventors: Mikael Bourges-Sevenier, Cupertino, CA (US); Paul Collins, Oakland, CA (US)

Correspondence Address:
HELLER EHRMAN LLP
4350 LA JOLLA VILLAGE DRIVE #700, 7TH FLOOR
SAN DIEGO, CA 92122 (US)

(21) Appl. No.: 11/250,003
`
(22) Filed: Oct. 12, 2005

Related U.S. Application Data

(60) Provisional application No. 60/618,455, filed on Oct. 12, 2004. Provisional application No. 60/618,365, filed on Oct. 12, 2004. Provisional application No. 60/618,333, filed on Oct. 12, 2004. Provisional application No. 60/634,183, filed on Dec. 7, 2004.
`
`
`
Publication Classification

(51) Int. Cl.
H04N 7/173 (2006.01)
H04N 7/16 (2006.01)
(52) U.S. Cl. ......................... 725/132; 725/100; 725/131; 725/139; 725/151
`
(57) ABSTRACT

The aim of this invention is to provide a complete system to create, to deploy, and to execute rich multimedia applications on various terminals and in particular embedded devices. A rich multimedia application is made of one or more media objects, being audio or visual, synthetic or natural, metadata, and their protection, being composed and rendered on a display device over time in response to preprogrammed logic and user interaction. We describe the architecture of such a terminal, how to implement it on a variety of operating systems and devices, how it executes downloaded rich, interactive, multimedia applications, and the architecture of such applications.
`
[Drawing sheets 1-31 of the published application (Figures 1-32) are image pages; only stray label fragments survive text extraction. Recoverable labels include: Figure 5, "Programmatic compositor"; Figures 7-8, component and buffer diagrams ("Component", "Component A", "Buffer", "Component B"); Figure 9, "Vertex data" and "Pixel data"; Figure 10, Mindego components / OSGi services / Native layering; Figure 12, "Mindego Application Manager" and "Terminal"; Figure 27, "Command buffer structure"; Figures 28-29, EGL interface diagrams and the EGL teardown sequence (eglDestroySurface, eglDestroyContext); Figure 32, the Joystick interface with JOY_BUTTON1 through JOY_BUTTON32 bit-flag constants (0x00000001 through 0x80000000), JOYERR constants, and JOY_AXIS_X/JOY_AXIS_Y. Each figure is described in the Brief Description of Drawings below.]
`
US 2007/0192818 A1
Aug. 16, 2007
`
SYSTEM AND METHOD FOR CREATING, DISTRIBUTING, AND EXECUTING RICH MULTIMEDIA APPLICATIONS

REFERENCE TO PRIORITY DOCUMENT

[0001] This application claims priority of co-pending U.S. Provisional Application Ser. No. 60/618,455 entitled "System and Method for Creating, Distributing, and Executing Rich Multimedia Applications" by Mikael Bourges-Sevenier filed Oct. 12, 2004; U.S. Provisional Application Ser. No. 60/618,365 entitled "System and Method for Low-Level Graphic Methods Access for Distributed Applications" by Mikael Bourges-Sevenier filed Oct. 12, 2004; U.S. Provisional Application Ser. No. 60/618,333 entitled "System and Method for Efficient Implementation of MPEG-Based Terminals with Low-Level Graphic Access" by Mikael Bourges-Sevenier filed Oct. 12, 2004; and U.S. Provisional Application Ser. No. 60/634,183 entitled "A Multimedia Architecture for Next Generation DVDs" by Mikael Bourges-Sevenier et al. filed Dec. 7, 2004. Priority of the filing dates of these applications is hereby claimed, and the disclosures of the Provisional Applications are hereby incorporated by reference.
`
COMPUTER PROGRAM LISTING APPENDIX

[0002] Two identical compact discs (CDs) are being filed with this document. The content of the CDs is hereby incorporated by reference as if fully set forth herein. Each CD contains three files of computer code used in a non-limiting embodiment of the invention. The files on each CD are listed in the File Listing Appendix at the end of the specification.
`
COPYRIGHT NOTIFICATION

[0003] A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
`
BACKGROUND

[0004] A multimedia application executing on a terminal is made of one or more media objects that are composed together in space (i.e. on the screen or display of the terminal) and time, based on the logic of the application. A media object can be:

[0005] Audio objects: a compressed or uncompressed representation of a sound that is played on the terminal's speakers.

[0006] Visual objects: objects that provide a visual representation that is typically drawn or rendered onto the screen of the terminal. Such objects include still pictures and video (also called natural objects) and computer graphics objects (also called synthetic objects).

[0007] Metadata: any type of information that may describe audio-visual objects.

[0008] Scripted logic: whether expressed in a special representation (e.g. a scene graph) or a computer language (e.g. native code, bytecodes, scripts).

[0009] Security information (e.g. rights management, encryption keys, and so on).

Audio-visual objects can be:

[0010] Natural: their description comes from natural means via a transducer or capture device such as a microphone or a camera.

[0011] Synthetic: their description is a "virtual" specification that comes from a computer. This includes artwork made with a computer and vector graphics.
[0012] Each media object may be transported by means of a description or format that may be compressed or not, encrypted or not. Typically, such a description is carried in parts in a streaming environment from a stored representation on a server's file system. Such file formats may also be available on the terminal.

[0013] In early systems, a multimedia application consisted of a video stream and one or more audio streams. Upon reception of such an application, the terminal would play the video using a multimedia player and allow the user to choose between audio streams. In such systems, the logic of the application is embedded in the player that is executed by the terminal; no logic is stored in the content of the application. Moreover, the logic of the application is deterministic: the movie (application) is always played from a start point to an end point at a certain speed.
[0014] With the need for more interactive and customizable content, DVDs were the first successful consumer systems to propose a finite set of commands that allow the user to navigate among many audio-video contents on a DVD. Unfortunately, being finite, this set of commands doesn't provide much interactivity besides simple buttons. Over time, the DVD specification was augmented with more commands, but few titles were able to use them because titles needed to be backward compatible with existing players on the market. DVD commands create a deterministic behavior: the content is played sequentially and may branch to one content or another depending on anchors (or buttons) the user can select.

[0015] On the other hand, successful advanced multimedia applications, such as games, are often characterized by non-deterministic behavior: running the application multiple times may produce different output. In general, interactive applications are non-deterministic as they more closely resemble living systems; life is non-deterministic.
[0016] With the advent of the Internet era, more flexible markup languages were invented, typically based on the XML language or other textual description programming languages. The XML language provides a simple and generic syntax to describe practically anything, as long as its syntax is used to create an extensible language. However, such a language has the same limitations as those with a finite set of commands (e.g. like DVDs). Recently, standards such as MPEG-4/7/21 used XML to describe composition of media. Using a set of commands or descriptors or tags to represent multimedia concepts, the language grew quickly to encompass so many multimedia possibilities that it became impractical or unusable. An interesting fact often mentioned is that applications may use different commands but typically only 10% would be needed. As such, implementing terminals or devices with all commands would become a
`
huge waste of time and resources (both in terms of hardware/software and engineering time).
[0017] Today, a new generation of web applications uses APIs available in the web browser directly or from applications available to the web browser. This enables quick creation of applications by reusing other applications as components and, since these components have been well tested, such aggregate applications are cheaper to develop. This also allows components to evolve separately without recompiling the applications, as long as their API doesn't change. The invention described in this document is based on the same principle but with a framework dedicated to multimedia entertainment rather than documents (as for web applications).
[0018] On the other hand, the explosion of mobile devices (in particular phones) followed a different path. Instead of supporting a textual description (e.g. XML), compressed or not, they provide a runtime environment and a set of APIs. The Java language environment is predominant on mobile phones and cable TV set-top boxes. The terminal downloads and starts a Java application. It interprets bytecode in a sandbox environment for security reasons. Using bytecodes instead of machine language instructions makes such programs OS (Operating System) and CPU (Central Processing Unit) independent. More importantly, using a programming language enables developers to create virtually any application; developers are only limited by their imagination and the APIs on the device. Using a programming language, non-deterministic concepts such as threads can be used and hence enhance the realism and appeal of contents.
[0019] In view of this discussion, it should be apparent that with a programmatic approach, one can create an application that reads textual descriptions, interprets them in the most optimized manner (e.g. just for the commands used in the textual descriptions), and uses whatever logic it sees fit for this application. And, contrary to textual description applications, programmatic applications can evolve over time and may be located in different locations (e.g. applications may be distributed), independently on each axis:

[0020] Data representation

[0021] Application logic

[0022] Application features (including streaming, user interaction, and so on)

[0023] API
[0024] For example, a consumer buys a DVD today and enjoys a movie with some menus to navigate the content and special features to learn more about the DVD title. Over time, the studio may want to add new features to the content, maybe a new look and feel for the menus, maybe allow users with advanced players to have better-looking exclusive contents. Today, the only way to achieve that would be to produce new DVD titles. With an API approach, only the logic of the application may change, and extra materials may be needed for the new features. If these updates were downloadable, production and distribution costs would be drastically reduced, content would be created faster, and consumers would remain anchored to a title longer.
[0025] Even though runtime environments require more processing power for the interpreter, the power of embedded devices for multimedia today is not an issue. The APIs available on such systems for multimedia applications are, on the other hand, very important. The invention described in this document concerns an extensible, programmatic, interactive multimedia system.
`
SUMMARY

[0026] In accordance with an embodiment of the invention, a multimedia terminal for operation in an embedded system includes: a native operating system that provides an interface for the multimedia terminal to gain access to native resources of the embedded system; an application platform manager that responds to execution requests for one or more multimedia applications that are to be executed by the embedded system; a virtual machine interface comprising a byte code interpreter that services the application platform manager; and an application framework that utilizes the virtual machine interface and provides management of class loading, of data object life cycle, and of application services and services registry, such that a bundled multimedia application received at the multimedia terminal in an archive file for execution includes a manifest of components needed for execution of the bundled multimedia application by native resources of the embedded system, wherein the native operating system operates in an active mode when a multimedia application is being executed and otherwise operates in a standby mode, and wherein the application platform manager determines presentation components necessary for proper execution of the multimedia applications and requests the determined presentation components from the application framework, and wherein the application platform manager responds to the execution requests regardless of the operating mode of the native operating system.

[0027] It should be noted that, although a Java environment is described, any scripting or interpreted environment could be used. The system described has been successfully implemented on embedded devices using a Java runtime environment.
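The relationship summarized above, in which the application platform manager resolves the components named in a bundle's manifest through the application framework, can be sketched in Java. This is an illustrative sketch only: the names (ApplicationFramework, Bundle, AppPlatformManager) and the string-keyed service lookup are assumptions, not the actual API of the described system.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for the application framework's service registry.
interface ApplicationFramework {
    Object getService(String componentName);
}

// A bundled application: its manifest lists the components it needs.
final class Bundle {
    final List<String> manifest;
    Bundle(List<String> manifest) { this.manifest = manifest; }
}

final class AppPlatformManager {
    private final ApplicationFramework framework;
    AppPlatformManager(ApplicationFramework framework) { this.framework = framework; }

    // On an execution request, resolve every presentation component named in
    // the bundle's manifest, regardless of the native OS's active/standby mode.
    Map<String, Object> resolve(Bundle bundle) {
        Map<String, Object> components = new HashMap<>();
        for (String name : bundle.manifest) {
            components.put(name, framework.getService(name));
        }
        return components;
    }
}
```

In an OSGi-based implementation, as described later in this document, the framework role would be played by the OSGi service registry.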
`
BRIEF DESCRIPTION OF DRAWINGS

[0028] FIG. 1 is a block diagram of a terminal constructed in accordance with the invention.

[0029] FIG. 2 is a typical player data flow.

[0030] FIG. 3 is an example of local/unicast/multicast playback data flow (e.g. for IP-based services).

[0031] FIG. 4 is the same as FIG. 3 with the DOM description replaced by scripted logic.

[0032] FIG. 5 is a high-level view of a programmatic interactive multimedia system.

[0033] FIG. 6 is a multimedia framework: APIs (gray boxes) and components (green ovals). This shows passive and active objects a multimedia application can use.

[0034] FIG. 7 is the anatomy of a component: a lightweight interface in Java, a heavyweight implementation in native code (i.e. OS specific). Components can also be pure Java. The Java part is typically used to control native processing.

[0035] FIG. 8 is a buffer that holds a large amount of native information between two components.

[0036] FIG. 9 is the OpenGL order of operations.
`
`
[0037] FIG. 10 is the Mindego framework's usage of the OSGi framework. FIG. 11 is the bridging of non-OSGi applications with the OSGi framework.

[0038] FIG. 12 is the Mindego framework extended to support existing application frameworks. Many such frameworks can run concurrently.

[0039] FIG. 13 is the Mindego framework supporting multiple textual description frameworks. Each description is handled by specific compositors, which in turn use shared (low-level) services packaged as OSGi bundles.

[0040] FIG. 14 is an application that may use multiple scene descriptions.

[0041] FIG. 15 and FIG. 16 show different ways of creating applications.

[0042] FIG. 17 is two applications with separate graphic contexts.

[0043] FIG. 18 is two applications sharing one graphic context.

[0044] FIG. 19 is an active renderer shared by two applications.

[0045] FIG. 20 is a media pipeline (data flow from left to right). Green ovals are OSGi bundles (or components). The blue oval is provided by the MDGlet application.

[0046] FIG. 21 shows buffers controlling interactions between active objects such as decoders and renderers.

[0047] FIG. 22 is a media API class diagram.

[0048] FIG. 23 is the Player and Controls in a terminal.

[0049] FIG. 24 is the Mindego controls.

[0050] FIG. 25 is an Advanced Audio API. In blue are high-level objects easier to use than the low-level OpenAL wrappers' AL and ALC interfaces.

[0051] FIG. 26 is the Java bindings to OpenGL implementation.

[0052] FIG. 27 is the command buffer structure. Each tag corresponds to a native command and params are arguments of this command.

[0053] FIG. 28 is the API architecture.

[0054] FIG. 29 is the sequence diagram for MPEGlet interaction with the Renderer.

[0055] FIG. 30 is the Scene and OGL APIs using OpenGL ES hardware, thereby allowing both APIs to be used at the same time.

[0056] FIG. 31 is the Scene API class diagram.

[0057] FIG. 32 shows the Joystick, which may have up to 32 buttons, 6 axes, and a point of view.
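The command buffer of FIG. 27 (a sequence of tags, each identifying a native command and followed by that command's arguments) can be sketched as follows. The tag identifiers and the tag/count/arguments layout are assumptions made for illustration; the actual encoding is not specified in this description.

```java
import java.nio.IntBuffer;

// Hedged sketch of a tag + params command buffer. A Java-side caller queues
// commands; a native layer would later drain the buffer and execute each one.
final class CommandBuffer {
    // Hypothetical tag ids; real tags would map to specific native calls.
    static final int GL_CLEAR_TAG = 1;
    static final int GL_VIEWPORT_TAG = 2;

    private final IntBuffer buf = IntBuffer.allocate(256);

    // Append one command: the tag, the argument count, then the arguments.
    void emit(int tag, int... params) {
        buf.put(tag).put(params.length);
        for (int p : params) buf.put(p);
    }

    // Number of ints queued so far (what the native side would consume).
    int size() { return buf.position(); }
}
```

Batching commands this way lets the Java side cross the JNI boundary once per frame instead of once per call, which is the usual motivation for such a structure.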
`
DETAILED DESCRIPTION

1 Architecture

1.1 High-Level Design

[0058] FIG. 1 depicts a terminal constructed in accordance with the invention. It will be referred to throughout this document as a Mindego Multimedia System (M3S) in an embedded device. It is composed of the following elements:

[0059] A multitasking operating system of the embedded device 100.

[0060] A JVM running on the device 100, configured at least to support the Connected Device Configuration and the Mobile Information Device Profile.

[0061] The Mindego Platform (which includes OSGi R3, but preferably R4).

[0062] Rendering hardware, such as:

[0063] an OpenGL 1.3 or 1.5 (see, for example, Silicon Graphics Inc., OpenGL 1.5, Oct. 30, 2003) or OpenGL ES 1.1 (see, for example, Khronos Group, OpenGL ES 1.1, http://www.khronos.org) compliant graphic chip;

[0064] at least audio stereo (preferably multichannel) output and SPDIF output;

[0065] S-VHS output; optionally component output and DVI output.

[0066] Basic multimedia components, such as:

[0067] AVI (see, for example, Microsoft, AVI file format, http://msdn.microsoft.com/library/default.asp?url=/library/en-us/directshow/htm/avifileformat.asp) and MP4 (see, for example, ISO/IEC 14496-14, Coding of audio-visual objects, Part 14: MP4 file format) demultiplexers;

[0068] H.261/3/4 (see, for example, ISO/IEC 11172-3, Coding of moving pictures and associated audio for digital storage media at up to about 1.5 Mbit/s, Part 3: Audio, 1993) and MPEG-4 Video (see, for example, Coding of moving pictures and associated audio for digital storage media at up to about 1.5 Mbit/s, Part 3: Audio, supra) support;

[0069] MP3 decoder (see, for example, Coding of moving pictures and associated audio for digital storage media at up to about 1.5 Mbit/s, Part 3: Audio, supra), AAC (see, for example, ISO/IEC 14496-3, Coding of audio-visual objects, Part 3: Audio), and WAV audio support;

[0070] XML support (see, for example, W3C, eXtensible Markup Language (XML)).

[0071] Ethernet adapter, such as for:

[0072] TCP (see, for example, RFC 1889, RTP: A transport protocol for real-time applications, January 1996)/IP (see, for example, RFC 2326, RTSP: Real Time Streaming Protocol, April 1998), UDP (see, for example, RFC 768, UDP: User Datagram Protocol, August 1980), and RTP (see, for example, RFC 1889, RTP: A transport protocol for real-time applications, January 1996, supra)/RTSP (see, for example, RFC 2326, RTSP: Real Time Streaming Protocol, April 1998) protocol support.

[0073] Flash memory for persistent storage of user preferences.

Optionally, the terminal may have:

[0074] MPEG-2 TS (e.g. TV tuner and/or DVD demux)

[0075] Audio/video encoders and multiplexers for video encoding and streaming
`
`
[0076] UPnP (see, for example, Universal Plug and Play (UPnP), http://www.upnp.org) support (for joysticks, mice, keyboards, network adapters, etc.)

[0077] USB 2 interface (see, for example, Universal Serial Bus (USB), http://www.usb.org) (to support mice, keyboards, joysticks, pads, hard disks, etc.)

[0078] Hard disk

[0079] DVD reader

[0080] Multi Flash card reader and smart card reader

[0081] The last three items may not be included, as USB support enables users to add these features to the terminal from third-party vendors.
[0082] FIG. 2 depicts the data flow in a typical player. The scene description is received in the form of a Document Object Model (DOM). Note that in computer graphics it is often called a scene and, with the advent of web pages and XML, the term DOM, once reserved to describe web pages, has been extended to encompass any tree-based representation. The DOM may be carried compressed or uncompressed, in XML or any other textual description language. For web pages, the language used is HTML; for MPEG-4 (see, for example, Coding of moving pictures and associated audio for digital storage media at up to about 1.5 Mbit/s, Part 3: Audio, supra) it is called BIFS (see, for example, Coding of moving pictures and associated audio for digital storage media at up to about 1.5 Mbit/s, Part 3: Audio, supra); for 3D descriptions, VRML (see, for example, ISO/IEC 14772, Virtual Reality Modeling Language (VRML), 1997, http://www.web3d.org/x3d/specifications/vrml/) or X3D (see, for example, ISO/IEC 19775, eXtensible 3D (X3D), 2004, http://www.web3d.org/x3d/specifications/x3d/specification.html) or Collada (see, for example, Collada, http://www.collada.org) or U3D may be used; for 2D descriptions, SVG (see, for example, World Wide Web Consortium (W3C), Scalar Vector Graphics (SVG)) may be used; and so on. The main characteristic of a DOM is to describe the assembly of various media objects onto the screen of the terminal. While the description is often visual and static, advanced DOMs may be dynamic (i.e. evolve over time) and may describe an audio environment. Dynamic DOMs enable animations of visual and audio objects. If media objects have interactive elements attached to their description (e.g. the user may click on them or roll over them), the content becomes user-driven instead of being purely data-driven, where the user has no control over what is presented (e.g. as is the case with TV-like contents). The architecture described in this document enables user-driven programmatic multimedia applications.
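A DOM in the generalized sense used here is simply a tree of media objects that a compositor traverses. A minimal, hypothetical sketch (class and field names are invented for illustration, not taken from any of the cited standards):

```java
import java.util.ArrayList;
import java.util.List;

// A node in a DOM-like scene tree: a named media object with children.
class MediaNode {
    final String name;                        // e.g. "group", "video", "audio"
    final List<MediaNode> children = new ArrayList<>();

    MediaNode(String name) { this.name = name; }

    MediaNode add(MediaNode child) {
        children.add(child);
        return this;                          // allow chained tree building
    }

    // Count the nodes in this subtree: the work a compositor would
    // traverse each frame when composing the scene.
    int count() {
        int n = 1;
        for (MediaNode c : children) n += c.count();
        return n;
    }
}
```

A dynamic DOM would additionally mutate this tree over time (animations) or in response to user events, which is what makes the content user-driven.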
[0083] The architecture depicted in FIG. 2 is made of the following elements:

[0084] Network or local storage 202: a multimedia application and all its media assets may be stored on the terminal's local storage or may be located on one or more servers. The transport mechanism used to exchange information between the terminal (the client) and servers is irrelevant. However, some transport mechanisms are better suited to some media than others.

[0085] Demultiplexer 204: while multiple network adapters may be used to connect to the network, terminals typically have only one network adapter. Therefore all media are multiplexed at the server and must be demultiplexed at the terminal. Likewise, packets from different media that must be presented at similar times are time multiplexed. Once demultiplexed, packets of each stream are sent to their respective decoders and must be decoded at the decoding time stamp.

[0086] Decoders: a decoder transforms data packets from a compressed representation to a decompressed representation. Some decoders may just be pass-through, as is often the case with web pages. Decoder output may be a byte array (e.g. in the case of audio and video data) or a structured list of objects (e.g. typically the case with synthetic data like vector graphics or a scene graph like a DOM). Decoders can include DOM 206, graphics 208, audio 210, and visual 212.

[0087] Compositor 214: from a DOM description, the compositor mixes multiple media together and issues rendering commands to a renderer.

[0088] Renderer: a visual renderer 216 draws objects onto the terminal's screen and an audio renderer 218 renders sound to speakers. Of course, other types of renderers can be used (printers, lasers, and so on) but screen 220 and speakers 222 are the most common output forms.

[0089] User: the user interacts with the system via the compositor to provide input commands.
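The demultiplexer-to-decoder stage of this data flow can be sketched as follows. The stream identifiers and the packet/decoder shapes are invented for illustration; a real terminal routes time-stamped packets from a multiplexed transport to decoder instances.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Sketch of Demultiplexer 204: routes each packet to the decoder attached
// to its elementary stream. A decoder here is just a function from a
// compressed packet to a decoded representation (modeled as a String).
final class Demultiplexer {
    private final Map<Integer, Function<byte[], String>> decoders = new HashMap<>();

    // Attach one decoder per elementary stream (e.g. 1 = video, 2 = audio).
    void attach(int streamId, Function<byte[], String> decoder) {
        decoders.put(streamId, decoder);
    }

    // Route one demultiplexed packet to its stream's decoder.
    String dispatch(int streamId, byte[] packet) {
        return decoders.get(streamId).apply(packet);
    }
}
```

The compositor would then consume the decoder outputs and issue rendering commands, completing the FIG. 2 pipeline.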
[0090] FIG. 2 depicts a typical playback architecture, but it doesn't describe how the application arrives and is executed on the terminal. There are essentially two ways:

[0091] Broadcast: the terminal listens to a particular channel and waits until a descriptor signals that an application is available in the stream. This application can be a simple video with multiple audio streams (e.g. a TV channel) or can be more complex with a DOM or with bytecode. Once the application is started, it connects to the streams that provide its necessary resources (e.g. audio and video streams). In the case of TV broadcasting, the network element can be replaced by an MPEG-2 TS demultiplexer (to choose the TV channel) and the Demux enables demultiplexing of audio-visual data for a particular channel.

[0092] Local or download: the terminal requests a server to send a file that describes an application. Once this application is downloaded, it may request the terminal to ask for resources on the same or on different servers, and different protocols may be used depending on the resilience and QoS needed on the streams. Once the connection is established between one or more servers and the terminal, the application behaves as in the broadcast case.
[0093] FIG. 3 shows an alternative representation of FIG. 2. In this figure, the network adapter behaves like a multiplexer, and media assets with synchronized streams (e.g. a movie) may use a multiplexed format. In this case, we say that a player manages such assets, and one could say that a multimedia application manages multiple players.
`
`
[0094] The architecture of FIG. 3 is often found in IP-based services such as web applications, and it should be clear that the network could also be a file on the local file system of the terminal. The architecture of FIG. 2 is typically found in broadcast scenarios. One of the advantages of FIG. 3 is that applications can request and use media from various servers, which is typically not possible in broadcast scenarios.

[0095] Instead of DOM descriptions, scripted logic may be used. FIG. 4 shows a terminal with pure scripted logic used for applications. By pure we mean that no DOM is used as the central application description, because otherwise using scripts simply modifies the DOM. In the case of purely scripted applications, the script communicates with the terminal via Application Programming Interfaces (APIs). The script defines its own way to compose media assets, to control players, and to render audio-visual objects on the terminal's screen and speakers. This approach is the most flexible and generic and is the one used in this document, since it also enables usage of any DOM by simply implementing DOM processors in the script, with the DOM description being one type of the script's data.
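A purely scripted application of this kind drives composition through terminal APIs rather than through a DOM. The lifecycle and API below are modeled loosely on the document's MDGlet terminology and are assumptions for illustration, not the actual Mindego API.

```java
// Hypothetical stand-in for the terminal-side rendering/player APIs that a
// scripted application would call.
interface Terminal {
    void draw(String command);
}

// Sketch of a purely scripted application: the application's own logic, not
// a DOM, decides what gets composed on each frame.
final class ScriptedApp {
    private final Terminal terminal;
    private boolean running;

    ScriptedApp(Terminal terminal) { this.terminal = terminal; }

    void start() { running = true; }    // lifecycle, MDGlet-style
    void stop()  { running = false; }

    // Called once per frame; only an active application issues commands.
    void renderFrame() {
        if (running) terminal.draw("composeFrame");
    }
}
```

A DOM-based application could still be built on top of this: the script would parse the DOM as ordinary data and translate it into terminal API calls, which is exactly the flexibility claimed above.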
1.2 Concepts

[0096] Following