ISCAS 2000 - IEEE International Symposium on Circuits and Systems, May 28-31, 2000, Geneva, Switzerland
Interactive Broadcast Digital Television
The OpenTV Platform versus the MPEG-4 Standard Framework

Frederic Bouilhaguet, Jean-Claude Dufourd, Souhila Boughoufalah, Christophe Havet *
ENST Paris, dept ComElec - 46 rue Barrault - 75013 Paris, France
* OpenTV Europe - 160 bis, rue de Paris - 92645 Boulogne Cedex, France
E-mail: {bouilhaguet, dufourd, souhila}@enst.fr, chavet@opentv.com
Abstract

Recent years have seen a vast increase in Interactive Broadcast Digital TV systems featuring proprietary platforms coupled with the MPEG-2 standard. OpenTV is an example of a proprietary platform present on the market. The study of the OpenTV multimedia delivery framework and our involvement in MPEG-4 standardisation work give us the opportunity to compare the OpenTV system architecture with the MPEG-4 one for addressing 2D interactivity in Broadcast Digital TV applications. We study their respective approaches to defining 2D user interfaces and to managing dynamic interactive audio-visual content.
1. Introduction
Changes in the nature, scope and extent of multimedia on-line applications are very rapid, owing to the major developments in computing and telecommunications technology. Broadcast Television has undergone important evolutions since it became digital. After broadcast digital channels carrying only audio and video (MPEG-2), the growing tendency is to develop technologies for interactive content based on push/pull scenarios. Starting from the audio-and-video-only model standardised by MPEG-2, the Broadcast Digital TV industry has provided new systems that use object description techniques as the basis for the development of interactive services. The transport structure of MPEG-2, called the Transport Stream (TS), allows any other type of digital stream, the so-called "private data", to be multiplexed with the MPEG audio and video signals. So the market did not wait for a new version of the MPEG standard to provide new system architectures for the development of interactive programs compatible with the MPEG-2 TS. Defining these system architectures consisted in specifying the structures of new streams, as shown in figure 1, and their corresponding codecs, to transport the spatio-temporal descriptions needed for interactivity.
The OpenTV platform provides one of these system architectures. It is based on a new opentv stream added to the MPEG-2 audio and video ones. The opentv stream transmits OpenTV applications, which are computer programs. OpenTV applications are currently developed in ANSI C and compiled with a special development kit compiler. The output of the compiler is called O-code and consists of a private byte code that is interpreted by the O-code interpreter and executed on the digital interactive decoder. OpenTV applications make O-code function calls to the OpenTV library. The library routines initiate operations or requests. The Application Programming Interface (API) implemented by this library determines the structure and composition rules of the visual and audio objects.
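As a rough illustration of this model, the fragment below sketches the skeleton of such an application: it builds a small gadget tree, designates its root for display, and then services the event queue. This is only a minimal sketch assembled from the library calls quoted later in this paper (O_gadget_new_from_resource, O_gadget_attach, O_ui_set_root, O_ui_get_event_wait, O_msg_class); the exact prototypes come from the OpenTV SDK headers, and the resource identifiers are invented for illustration.

/* Hypothetical skeleton of an OpenTV application (sketch only) */
static void build_ui(void)
{
    /* Create gadgets from resources and assemble the gadget tree */
    o_gadget screen = O_gadget_new_from_resource(SCREEN_RESOURCE); /* hypothetical resource ID */
    o_gadget button = O_gadget_new_from_resource(BUTTON_RESOURCE); /* hypothetical resource ID */
    O_gadget_attach(screen, button);
    /* Designate this sub-tree as the one displayed on the TV screen */
    O_ui_set_root(screen);
}

static void event_loop(void)
{
    o_message msg;
    for (;;) {
        /* Block until a message arrives in the events queue */
        O_ui_get_event_wait(&msg);
        switch (O_msg_class(&msg)) {
            /* ... dispatch on the message class ... */
        }
    }
}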
Today, Quality of Service (QoS) is starting to suffer from serious incompatibility problems between the multiple system architectures provided by different companies. End users must own several decoders to decode the interactive channels of broadcasters who chose different system architectures. Content producers must redevelop the same interactive audio-visual application for every platform.
With the new version of the MPEG-4 standard, MPEG aims at specifying an open system architecture adaptable to potential future platforms of Broadcast Digital TV, and at suppressing the previous sources of incompatibility. The first component of the MPEG-4 system architecture to fit these requirements is a pair of new elementary streams: the Binary Format for Scenes (BIFS) stream and the Object Descriptor (OD) stream. An MPEG-4 scene is a set of organised audio and visual objects with behaviours that constitutes part of a 2D interactive application. BIFS is the compressed format in which scenes are defined and modified. BIFS is composed of 2D and 3D profiles; our interest here is 2D only.
In this paper, we show how OpenTV and MPEG-4 differ in their common requirements for:
- composing interactive user interfaces from object description techniques (section 2);
- communicating events between objects (section 3);
- accessing multiple audio and video data sources (section 4).
Figure 1: typical transmission chain of broadcast interactive channels
2. Gadget tree versus scene graph
OpenTV and MPEG-4 describe objects and their behaviour in hierarchical models. MPEG-4 uses the concept of a scene graph with object nodes, while OpenTV prefers a gadget tree with gadgets.
2.1 Scene graph (MPEG-4)
The graph and its drawing order:
An MPEG-4 scene is constructed as a directed acyclic graph of nodes. The following types of nodes exist:
- Grouping nodes construct the scene structure;
- Children nodes are the children of grouping nodes and represent the multimedia objects in the scene;
- Interpolator nodes are a subtype of children nodes that carry interpolation data to perform key-frame animation; they generate a sequence of values as a function of time or of other input parameters;
- Sensor nodes sense user and environment changes for authoring interactive scenes.
BIFS scenes are composed of a collection of nodes arranged in a hierarchical tree. Each node represents, groups, or transforms an object in the scene and consists of a list of fields that define the particular behaviour of the node.
The root node of a 2D BIFS scene can be an OrderedGroup or a Layer2D node. OrderedGroup may be used to specify the drawing order of the elements of the scene, because this grouping node controls the visual layering order of its children.
Reusing objects:
BIFS has a mechanism for reusing nodes. For example, once a complex graphic object is defined as a collection of geometric primitive nodes collected inside a Group node, it is possible to reuse the object elsewhere in the scene, rather than copying it explicitly wherever it is to appear. Each reusable node has a binary ID, and wherever a node can appear in the scene description, the ID of a reusable node can be inserted.
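In the VRML-like BIFS textual format, this reuse mechanism is written with DEF and USE, the binary ID being assigned at encoding time. The following minimal sketch (node choices and coordinates are illustrative, not taken from the paper) defines a shape once under an OrderedGroup root and reuses it at a second position:

OrderedGroup {
  children [
    Transform2D {
      translation 50 50
      children [
        DEF STAR Shape {                 # defined once, given the name STAR
          geometry Rectangle { size 10 10 }
          appearance Appearance { material Material2D { filled TRUE } }
        } ]
    }
    Transform2D {
      translation 120 50
      children [ USE STAR ]              # reused by reference, not copied
    } ] }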
The prototypes:
BIFS provides ways to encode PROTOs and EXTERNPROTOs. These scene constructs enable the definition of new interfaces to user-constructed scene components. For example, a button PROTO can be constructed which accepts a string label as an input parameter. The body of the PROTO consists of a scene portion that draws a button (using, say, a Box node) and renders the string parameter on the button. The EXTERNPROTO is similar to the PROTO, except that its definition is not part of the scene. Instead, its definition is referenced using a URL. This enables the construction and use of on-line libraries of PROTOs.
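In the BIFS textual format, which mirrors VRML97 syntax, such a button prototype could be sketched as follows (a 2D Rectangle is used instead of a Box to stay within the 2D profile; sizes and node choices are illustrative assumptions):

PROTO Button [
  field MFString label [""]                       # the text shown on the button
] {
  Group {
    children [
      Shape { geometry Rectangle { size 40 12 } } # the button face
      Shape { geometry Text { string IS label } } # label bound to the interface via IS
    ] } }

# instantiation with a concrete label:
Button { label ["OK"] }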
2.2 Gadget Tree (OTV)
The tree and its drawing order:
OpenTV provides an object-oriented framework for defining classes of user interface elements called gadgets. A gadget class specifies the behaviour functions for all gadgets of the same class. Gadgets are created and combined by an OpenTV application to form its user interface. They are drawn on the screen in the predictable order of a tree structure, as shown in figure 2. The position of a gadget in the tree determines the order in which it is drawn. Drawing starts at the root, descends the left-most branch first, then traverses that branch, then moves on to the sibling just to the right.
At any given time, the interface manager of the OpenTV operating system recognises one gadget as the root of the sub-tree currently being displayed on the TV screen. This is set by calling the O_ui_set_root function. The root may change over the lifetime of the application if it is necessary to display a new screen. Gadgets must belong to the sub-tree owned by the designated root to be visible on the TV screen. There are functions for creating, activating (making visible), deactivating (making invisible) and deleting gadgets. The O_gadget_attach function attaches a new child gadget to a specified gadget.

Figure 2: OTV gadget tree drawing order
/* Create some button gadgets and attach them to 'this' group. */
while (nextItem != NULL) {
    o_gadget next_button =
        O_gadget_new_from_resource(*nextItem); /* instantiate from a resource */
    nextItem++;                                /* step to the next resource */
    O_gadget_attach(this, next_button);        /* attach below 'this' group */
}
Reusing gadgets:
The O_gadget_new function creates new gadgets, i.e. instances of a gadget class. A gadget can be reused within its application scope until it is removed with the O_gadget_delete function.
The prototypes:
There is nothing here equivalent to what MPEG-4 calls prototypes. Each gadget class is in fact prototyped: gadget class developers define a structure for passing initial values, and support functions to supplement the functionality of the gadget class.

3. Event communication
3.1 Sensors and routes (MPEG-4)
Node fields are labelled as being of type field, eventIn, eventOut, or exposedField. The field label is used for values that are set only when instantiating the node. Fields that can receive incoming events have the eventIn label, whereas fields that emit events are labelled eventOut. Finally, some fields may hold values but also receive or emit events, in which case they are labelled exposedField.
To describe the interactivity and behaviour of scene objects, the MPEG-4 event architecture defines sensors and routes. Sensor nodes generate events based on user interaction with a trigger node or on a change in the scene. An event can be routed from any sensor node eventOut field to interpolator or other nodes to change the attributes of these target nodes. If routed to an interpolator, a new parameter is interpolated according to the input value, and is finally routed to the target node's eventIn field. This target node processes the event. The scope of a sensor is delimited to the children of the grouping node that contains it. In other words, routes are used to propagate events between scene elements. They are connections that assign the value of one field to another field.
Figure 5 and the BIFS text of section 4.1 show an example of how to trigger an action as the cursor rolls over a button object.
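As a complement to that rollover example, the following minimal sketch (illustrative values, not taken from the paper) shows the interpolator case: a TimeSensor drives a PositionInterpolator2D, whose output is routed to the translation exposedField of a Transform2D, animating the object it groups:

DEF CLOCK TimeSensor { cycleInterval 2 loop TRUE }
DEF MOVE PositionInterpolator2D {
  key      [ 0, 1 ]
  keyValue [ 0 0, 100 0 ]   # slide 100 units to the right
}
DEF TR Transform2D {
  children [ Shape { geometry Rectangle { size 20 10 } } ]
}
# propagate events along the chain
ROUTE CLOCK.fraction_changed TO MOVE.set_fraction
ROUTE MOVE.value_changed TO TR.translation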

3.2 Message handlers (OpenTV)
To support input processing, OpenTV has the notion of focus. Only one gadget in the tree is designated as having the focus, and all input is directed to this gadget. The gadget is notified of user input by receipt of messages of the appropriate types. If a gadget ignores an input message, the message propagates up to that gadget's parent, and so on, so the processing of input is also affected by a gadget's placement in the gadget tree.

Figure 3: transmission of messages in the gadget tree

At the initialisation of a gadget class, the function that handles events, called message_handler, is passed as an argument. This message_handler function is called each time a message (MSG_TYPE_NEW / DELETE / ACTIVATE / DEACTIVATE) is sent to a gadget of this class. A key-pressed event is sent only to the gadget that has the focus. If the gadget does not use the key event, the event is automatically routed to its parents, recursively, until its associated message is consumed by calling the O_ui_msg_used function.
In the following example, navigation between buttons is done by pressing the left arrow key, and an action is performed when the enter ("OK") key is released on the button, as seen in figure 4. The ancestors of the button gadgets may handle other keys.

Figure 4: remote control
/* The events handler function of the button gadget */
static void button_message_handler(button* this, o_message* msg)
{
    switch (O_msg_type(msg)) {
    /* A key has been pressed */
    case MSG_TYPE_KEY_DOWN:
        button_key_down(this, msg);
        break;
    }
}

/* Handle key down events */
static void button_key_down(button* this, o_message* msg)
{
    switch (O_ui_msg_key(msg)) {
    /* Handle the selection key */
    case KEY_ENTER_CODE:
        /* Draw something to show that the button is pressed */
        ......
        /* Signal that we consumed the key event: */
        /* the message is not passed to its father */
        O_ui_msg_used(msg);
        break;

    /* Pass the focus to the button on its left (in the gadget tree) */
    case KEY_LEFT_ARROW_CODE:
    {
        o_gadget next_button = O_gadget_left_brother(this);
        /* Only change focus if a left button exists */
        if (next_button != NULL)
            O_gadget_set_focus(next_button);
        /* Signal that we consumed the key event: */
        /* the message is NOT passed to its father */
        O_ui_msg_used(msg);
        break;
    }
    }
}

4. Access to Audio/Video streams
4.1 Object Descriptor (MPEG-4)
MPEG-4 defines an ObjectDescriptor (OD) stream. It is coupled with the BIFS stream to identify and describe the elementary streams associated with the audio-visual scene description. An OD is a collection of one or more elementary stream (ES) descriptors and is assigned an identifier (the object descriptor ID). An ES descriptor includes information about the source of the stream data and the encoding format needed by the decoding process.
Here is an example of how we can switch with BIFS from a current audio source to a new one when the cursor rolls over a button. The BIFS audio node points to the audio data stream through the OD's ID value contained in its url field. The audio is switched by removing the current audio stream from the grouping node and appending the new one.
BIFS text:

DEF ID_100 Group {
  children [
    # **** original sound
    Sound2D {
      source AudioSource {
        url 5              # **** object descriptor ID
        startTime 0
        stopTime -1
      }
    }
  ]
}
DEF ID_211 Conditional {
  buffer {
    # **** remove the original sound
    DELETE ID_100.children[0]
    # **** switch to the new sound
    APPEND TO ID_100.children
    Sound2D {
      source AudioSource {
        url 6              # **** object descriptor ID
        startTime 0
        stopTime -1
      }
    }
  }
}
# **** hot spot area
Group {
  children [
    DEF ID_205 TouchSensor { }
    Transform2D {
      translation pos_x pos_y
      children [
        Shape {
          geometry Rectangle { size w h }
          appearance Appearance { ... }
        }
      ]
    }
  ]
}
# **** propagate events
ROUTE ID_205.isOver TO ID_211.activate
ObjectDescriptor of the audio stream:

ObjectDescriptor {
  objectDescriptorID 6
  esDescr [
    ES_Descriptor {
      ES_ID 3
      decConfigDescr DecoderConfigDescriptor {
        objectTypeIndication 0x40
        streamType 5
        upStream FALSE
        bufferSizeDB 8000
        maxBitrate 0
        avgBitrate 0
      }
      slConfigDescr SLConfigDescriptor {
        ...                # long sync layer parameter list
      }
    }
  ]
}

4.2 Station control (OTV)
OpenTV programs can switch between elementary streams via the O_station_control function. A call to this asynchronous function opens or closes elementary streams. The parameter of O_station_control is an action list, so several actions can be performed in a single call. When an action is completed, a message is posted in the events queue. The streams supported are video (O_VIDEO), audio (O_AUDIO), opentv (O_OPENTV), subtitle (O_SUBTITLE), and teletext (O_TELETEXT). Here is an example of switching audio tracks by pressing a button gadget.
Switch to a new audio stream in the C file:

/* actions list */
char switch_to_audio_1[20] = {
    C_STREAM_ON, 5,
    O_AUDIO, '0', '0', '1', 0,
    C_END, 0 };
......
/* Application main loop */
for (;;) {
    /* Get message from the events queue */
    O_ui_get_event_wait(&msg);
    switch (O_msg_class(&msg)) {
    case audio1_button_ID:
        O_station_control(switch_to_audio_1);
        break;
    ......
    }
}
Description of the audio stream in the transport stream configuration file:

elementary_stream {
    stream_type = audio_mpeg2
    elementary_stream_pid = 523
    descriptor {
        descriptor_tag = iso_639_language
        language {
            language = "001"
            audio_type = 0 } } }

5. Other aspects
5.1 MPEG-J
The BIFS stream offers a parametric scene representation, while OpenTV rather offers a programmatic environment. The MPEG-4 standard will also offer a programmatic environment, in addition to its parametric capability. Java APIs are defined, and a dedicated MPEG-J stream can be added to the BIFS and OD streams. Access to an underlying MPEG-4 engine can be provided to Java applets, called MPEG-lets. MPEG-J forms the basis for very sophisticated applications, opening up completely new ways for audio-visual content creators to augment the use of their content. MPEG-J will be available in MPEG-4 version 2.
5.2 Transparency and Composition
The composition approach in OpenTV differs from MPEG-4's. Through functions such as O_palette_set and O_palette_set_transparency, the current colour lookup table (palette) is set and transparency can be applied to some of the indexed colours. MPEG-4 has a transparency field in the Appearance node that can be associated with any visual object, even video or still pictures. So transparency is applied per object, and the colour space is not limited to a palette.
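On the MPEG-4 side, this per-object transparency can be sketched in BIFS text as follows (illustrative values; the transparency field sits in the Material2D node carried by Appearance):

Shape {
  geometry Rectangle { size 100 60 }
  appearance Appearance {
    material Material2D {
      emissiveColor 1 0 0
      filled TRUE
      transparency 0.5    # 50% transparent, applied to this object only
    }
  }
}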
6. Conclusion
We have briefly presented two systems for Interactive Broadcast Digital TV, OpenTV and MPEG-4. The OpenTV environment is far from the MPEG-4 delivery framework if we consider only the BIFS and OD streams, even though we saw some common approaches in the two ways of organising visual objects. If we consider MPEG-J, MPEG-4 moves toward the OpenTV programmatic approach. One perspective for the OpenTV framework could be to implement the MPEG-J APIs, with a Java VM, on top of the OpenTV engine, in order to provide a first MPEG-4 terminal compliant with MPEG-J, so that future MPEG-lets can execute on the OpenTV platform.
There are pros and cons to both: on paper, MPEG-4 is much more powerful, but OpenTV is here now.
