WO 00/43899
PCT/US00/01699
being played on CRT 81 as tracking is occurring, and is subsequently sent on as output stream 89 from renderer filter 77, which is analogous to video stream 53 of Fig. 8. An annotation manager 85 within renderer 77 converts annotation data, input during annotation processes, and the data relating to the tracked entities output from the tracking module, to metadata for more compact transmission in output stream 87. Stream 87 is a data stream containing information about the various annotations added by the author and the tracking coordinates of the tracked entities, and is analogous to the annotation stream 62b of Figure 8. Such metadata conversion-data tables for compact transmission in output stream 87 may be stored elsewhere accessible to the CPU powering authoring station 61. User interface 83 provides considerable options and capability for entering commands to add image icons, animated graphics (following tracked objects, or static, or moving independently in the video in a predefined manner), formatted text captions, and so on.
In one embodiment, user interface 83 may be pre-programmed by an author to supply the appropriate pre-selected annotations in a reactive fashion. That is, according to a specific time interval, a signal could initiate annotation inserts, and so on. In other embodiments, an author may physically enter an annotation by pressing a pre-defined key on a keyboard, and so on. There are many known methods for inserting annotations.
It will be apparent to one with skill in the art that other software module configurations may be used instead of those presented in this example without departing from the spirit and scope of the present invention. For example, similar functional modules may be provided to be compatible with alternate platforms such as UNIX or Macintosh.
It will also be apparent to one with skill in the art that the bulk of annotation in the form of inserted text, graphical icons, universal resource locators (URLs), interactive shapes, and so on will, in many embodiments, be at least partly associated with tracking coordinates of an image and therefore will depend on those frame-by-frame coordinates. For example, an interactive icon may follow a moving image entity and be visible by an end user, as in the case of advertisement logos for sponsors of
sportspersons in a sporting event. Text blocks and the like may take similar association. Hence, the specific content of annotations and insertion methods of such annotations may be pre-designed based on known facts about the video stream, such as what image is to be tracked for what advertiser who has what URLs, and so on. Execution of those annotations may be automatic according to a timed function as described above, or may be performed manually, perhaps using a macro or other designed input function.
In another embodiment, added functionality could be added to user interface 83 which allows an author to adequately identify an image entity to be tracked, so as to be enabled to place a tracking box such as box 29 of Fig. 5 over the entity at a maximally opportune instant during image motion. In this case, once the tracking box is activated, the software could be adapted to allow the author to manually track the object until such time that the tracking box is placed more or less at the center of the object in the video. A synchronization module could be added in authoring server 63 and adapted to synchronize separate annotation streams before combining them and synchronizing them with the output video stream, which is stream 53 in our example.
System for Synchronizing Data Streams Delivered Over Separate Networks
According to a preferred embodiment of the present invention, a unique synchronization system is provided and adapted to overcome unpredictable latency inherent in delivery of data streams that are delivered over separate delivery media to end users. The method and apparatus provided and taught herein for this unique purpose is two-fold. Firstly, a video/data stream signature operation is executed after coordinate tracking and annotation operations are performed in an authoring station such as was described above with respect to authoring station 61 of Fig. 9. The signature streams are then sent to their respective broadcast and/or data-transmission systems to be sent to an end user. Secondly, a video/annotation stream capture and
synchronization operation, executed via software on customer premises equipment (CPE), must be executed at the user's end before a single combined stream may be viewed by the user.
Fig. 10 is a block diagram illustrating a signature application apparatus at the authoring end according to an embodiment of the present invention. A signature application module 91 is provided in this embodiment in the form of a software application module resident in an authoring server such as server 63 of Fig. 8. Module 91 is initiated in server 63 after tracking and annotation have been performed. Separate data streams (video and annotation) are given frame-specific identification and marking so that they may later be synchronized by using inserted data corresponding to the frame-specific identification.
A video stream 93 is shown entering signature module 91. Video stream 93 is analogous to stream 53 of Fig. 8. An annotation stream 95 is similarly illustrated as entering signature module 91. Annotation stream 95 is analogous to stream 55 of Fig. 8. Streams 95 and 93 are synchronous as they enter module 91. Synchronization has been achieved after image tracking and authoring in authoring server 63 of Fig. 8, as described in detail above. Synchronization after separate broadcasting is much more complicated and is described in enabling detail below.
Referring back to Fig. 10, in this embodiment a frame reader/counter module 97 is adapted to read video stream 93 and annotation stream 95 for the purpose of recording an association of annotation data to video-frame data using a serial count of each frame. Because annotation stream 55 of Fig. 8 was generated at the time of tracking an entity within video stream 53 of Fig. 8, each stream comprises the same number of frames constituting an entire stream length. Therefore, it is possible to count and associate individual frames in serial fashion. A number/time marker-generator module 99 generates code to represent frames in annotation stream 95 and also to represent time markers in video stream 93. Further binary numbers are generated for use in a pixel signature method described more fully below.
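The serial-count association performed by the frame reader/counter can be sketched as follows. This is an illustrative model only (the names and data structures are assumptions, not the patent's implementation); it relies on the stated fact that both streams contain the same number of frames.

```python
# Hypothetical sketch of the frame reader/counter idea (module 97):
# because the video and annotation streams were generated together,
# they have the same frame count, so frames can be associated by a
# simple serial index. All names here are illustrative.

def associate_frames(video_frames, annotation_frames):
    """Pair each video frame with its annotation frame by serial count."""
    if len(video_frames) != len(annotation_frames):
        raise ValueError("streams must have the same number of frames")
    association = {}
    for serial, (vf, af) in enumerate(zip(video_frames, annotation_frames)):
        association[serial] = {"video": vf, "annotation": af}
    return association

table = associate_frames(["v0", "v1", "v2"], ["a0", "a1", "a2"])
print(table[1])  # {'video': 'v1', 'annotation': 'a1'}
```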
According to a preferred embodiment of the present invention, three separate signature methods, each method using one sequence of binary numbers described
above, are executed via signature module 91 in the course of its function. Using three separate signatures ensures that at least one of the applied signatures will successfully pass on to the end user's equipment. All three methods share a common goal, which is to record in one of two data streams to be later synchronized, at regular intervals, a marker, and information denoting which frame from the other of the two data streams should be displayed at the marker for the two streams to be properly synchronized.
In one of the three methods, a number denoting frames in one of the two data streams is inserted into video blanking intervals (VBIs) of the other data stream. Although it is possible to insert such a synchronizing number in each VBI of the carrying stream, it is not necessary to do so for synchronizing purposes. Typically the synchronizing number need be inserted only once in several frames, and the fact of such a number appearing in a VBI can serve also as a marker; that is, the appearance of the number in a VBI can be taken to mean that the associated frame from the companion stream is to be displayed with the "next" frame in the carrying stream. The convention could also be applied to any frame following the "next" frame.
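The VBI convention just described can be sketched in simplified form. The data model below is an assumption for illustration (real VBI insertion operates on analog scan lines or their digital equivalents, not dictionaries): every Nth carrying frame holds the companion frame's number, and the number's mere presence marks the "next" frame as its display point.

```python
# A minimal sketch (assumed data model, not the patent's actual VBI format):
# every Nth video frame carries, in a simulated VBI slot, the serial number
# of the companion annotation frame. The presence of the number acts as the
# marker: the named annotation frame goes with the *next* video frame.

INTERVAL = 10  # insert the synchronizing number once every 10 frames

def write_vbi_numbers(video_frames, interval=INTERVAL):
    """video_frames: list of dicts; adds 'vbi' payloads in place."""
    for i, frame in enumerate(video_frames):
        if i % interval == 0:
            # companion frame to display with the "next" carrying frame
            frame["vbi"] = i + 1
    return video_frames

def read_vbi_numbers(video_frames):
    """Recover (carrying-frame index -> companion annotation frame) pairs."""
    pairs = {}
    for i, frame in enumerate(video_frames):
        if "vbi" in frame:
            pairs[i + 1] = frame["vbi"]  # applies to the following frame
    return pairs

frames = write_vbi_numbers([{} for _ in range(30)])
print(read_vbi_numbers(frames))  # {1: 1, 11: 11, 21: 21}
```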
In a second method, the identifying number is inserted in one or another of the horizontal blanking intervals (HBIs) of a frame in the carrying stream. The particular HBI is known by convention, and more than one HBI may be used as a "belt-and-suspenders" approach. In this method the marker may also be by convention, such as the "next" frame, or some number of frames following the "next" frame.
A third method for synchronization signature according to an embodiment of the present invention involves altering pixel data in a manner to communicate a binary number to a system (described further below) at the user's end programmed to decode such a number from a carrying data stream. In this method, in the carrying data stream, the data values for an "agreed-upon" pixel are altered. For example, for one particular pixel in a frame, the R, G, and B values (or, in appropriate instances, the Y, U, and V values) may be arbitrarily set to zero to denote a zero bit in a binary signature, and in following frames the values for the same pixel may be set to maximum value (all 1's) to denote a binary 1 bit for the signature. In this manner, over
several frames, a binary number denoting a particular frame from the companion data stream may be inserted.
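The pixel-signature idea can be sketched as follows. The pixel position, bit width, and most-significant-bit-first ordering are illustrative assumptions; the patent only specifies zero values for a 0 bit and maximum values for a 1 bit at an agreed-upon pixel, spread over successive frames.

```python
# Sketch of the pixel-signature method under assumed conventions: one
# agreed-upon pixel carries one bit per frame (all-zero RGB = 0 bit,
# all-maximum RGB = 1 bit), so a frame number spreads over several frames.
# Pixel position, bit width, and bit order are illustrative assumptions.

PIXEL = (10, 10)   # agreed-upon pixel (x, y)
BITS = 16          # assumed width of the signature number

def encode_number(frames, number, pixel=PIXEL, bits=BITS):
    """Write `number`, most significant bit first, one bit per frame."""
    for i in range(bits):
        bit = (number >> (bits - 1 - i)) & 1
        value = (255, 255, 255) if bit else (0, 0, 0)
        frames[i]["pixels"][pixel] = value
    return frames

def decode_number(frames, pixel=PIXEL, bits=BITS):
    """Read the bits back from the agreed pixel of successive frames."""
    number = 0
    for i in range(bits):
        bit = 1 if frames[i]["pixels"][pixel] == (255, 255, 255) else 0
        number = (number << 1) | bit
    return number

frames = [{"pixels": {}} for _ in range(BITS)]
encode_number(frames, 1234)
print(decode_number(frames))  # 1234
```

The later passage noting that R, G, and B could be altered independently would raise the capacity to three bits per pixel per frame under the same scheme.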
In this pixel alteration method, a marker is also needed. Again, the marker can be by convention (preferred), such as the third frame after the end of a decoded signature, or the same sort of coding may be used to insert a binary marker signature.
In the pixel insertion method, any pixel may be used by convention, but some may serve better than others. For example, in some instances jitter problems may make pixel identification relatively difficult. In a preferred embodiment, wherein a logo is used to identify a data stream, such as a network logo seen in the lower right of frames for some networks, a particular pixel in the logo may be used, which would serve to alleviate the jitter problem.
It will be apparent to the skilled artisan, given the above teaching, that there will be a variety of ways pixel data may be altered to provide a coding system for a synchronization signature. For example, the R, G, and B values may be altered differently by convention, providing three signature bits per pixel, and more than one pixel may be used; so a coded number of virtually any binary length may be provided with the data for a single frame in a video data stream.
In a preferred embodiment of the present invention, all three methods of stream signature (VBI, HBI, and pixel alteration) are used. The reason for this is that it is possible that other systems downstream (toward broadcast, or in some rebroadcast) may use VBIs and HBIs to bear certain data, thus overwriting some or all data that may be inserted in blanking intervals via methods of the present invention. Similarly, a logo or other graphical alteration such as a commercial may be inserted into a video stream, thus overriding a planned pixel alteration in a significant section of the video. By using all three methods at the authoring end, survival of the synchronization information at the user's end is assured.
Referring back to Fig. 10, a frame writer and pixel command module 101, comprising sub-modules 101a and 101b, uses previously generated data to insert time markers and binary numbers into frame data of at least one of the data streams (93 and 95), as well as causing alteration to one or more pixels over a series of frames to create
a serial transmission or physical marker that may be associated with frame numbers assigned to matching frames within annotation stream 95.
It will be apparent to the skilled artisan that either data stream may be the carrying stream. As a convention, the primary video data stream is used as the carrying stream rather than the annotation stream.
In some embodiments, a natural screen-change convention may be used for markers. For example, known software may be provided and adapted to detect screen changes wherein a majority of pixel values show significant alteration. These screen changes will happen randomly throughout the video and typically are spaced over a number of frames.
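A screen-change detector of the kind described can be sketched simply. The thresholds below (a per-pixel difference of 48 and a majority fraction of 0.5) are assumptions for illustration; the patent states only that a majority of pixel values must show significant alteration.

```python
# Hedged sketch of the screen-change convention: a cut is declared when
# a majority of pixel values change significantly between consecutive
# frames. The 0.5 fraction and the per-pixel threshold are assumptions.

def is_screen_change(prev, curr, pixel_threshold=48, fraction=0.5):
    """prev, curr: equal-length sequences of 8-bit luma values."""
    changed = sum(1 for a, b in zip(prev, curr) if abs(a - b) > pixel_threshold)
    return changed > fraction * len(prev)

frame_a = [20] * 100
frame_b = [20] * 40 + [200] * 60   # 60% of pixels change drastically
print(is_screen_change(frame_a, frame_b))  # True
```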
It will be apparent to one with skill in the art that module 91 may be programmed according to pre-determined criteria without departing from the spirit and scope of the present invention. Such criteria may vary according to factors such as density of annotation data in a particular annotation stream, normal frame rate of the video, whether or not it is known if there will be any further annotation before broadcasting, and so on. For example, a timing marker may be taken every 5th frame instead of every 10th frame. Screen-change marking may or may not be used. There are many variables that may be considered before applying the innovative signature methods of the present invention. Presenting the combined signatures ensures that re-synchronization remains possible at the user's end as previously described.
Fig. 11 is a process flow chart illustrating logical steps for providing a synchronization signature at the authoring end according to an embodiment of the present invention. At step 103 the frames of the two streams are identified and monitored as necessary. The software may determine, for example, the scope (density) of annotation, the status of available VBI and HBI areas, the frequency of frames for time-marking intervals, and so on. This step also includes counting frames for the purpose of generating annotation frame numbers for signature association purposes. In step 105, serial binary numbers are generated in separate sequences that may be used for time marking, physical marking, and frame association.
In step 107, annotation frame numbers are written into VBI and HBI areas associated with video frames, as well as to the appropriate annotation frame headers. If a concerted pixel alteration method is pre-determined to be used as a marking scheme, then the pixel or pixels are selected, altered, and activated in step 109.
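Steps 103 through 109 can be sketched as one pass over the paired streams. The function and field names below are mine, chosen for illustration; the marker interval and the duplication of the number into both VBI and HBI slots (the "belt-and-suspenders" redundancy) follow the conventions described above.

```python
# An illustrative walk-through of steps 103-109 (names are assumptions, not
# the patent's): count frames, generate the serial number sequence, write
# the numbers into simulated VBI/HBI slots and annotation headers, and flag
# marked frames for the pixel alteration scheme.

def apply_signature(video, annotation, marker_interval=10):
    assert len(video) == len(annotation)          # step 103: count frames
    for n, (vf, af) in enumerate(zip(video, annotation)):
        af["header"] = n                          # steps 105/107: frame number
        if n % marker_interval == 0:              # step 107: marked frames
            vf["vbi"] = n
            vf["hbi"] = n                         # belt-and-suspenders copy
            vf["pixel_mark"] = True               # step 109: pixel scheme
    return video, annotation

v, a = apply_signature([{} for _ in range(20)], [{} for _ in range(20)])
print(v[10])  # {'vbi': 10, 'hbi': 10, 'pixel_mark': True}
```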
It will be apparent to one with skill in the art of video editing, including knowledge of video-frame structure and the techniques for writing data into such video frames, that there are many variations possible with regard to time marking and assigning identifying numbers to data frames wherein such numbers are also added to video frames. For example, differing frame intervals may be chosen as time markers, different bit structures may be used (such as 16-, 24-, or 32-bit resolutions), and so on.
With reference to the stated objective of the present invention as previously described above, it was mentioned that the method of the present invention involves a second phase wherein separate data streams, marked via the conventions above, arrive at a user location after being sent via alternate mediums, such as one via cable broadcast and one via wide area network (WAN) delivery, wherein, after receiving the streams, the user's equipment captures, re-synchronizes, and combines the streams to be displayed for viewing as one annotated video stream. Such a CPE apparatus and method is provided and taught below.
Fig. 12 is a block diagram illustrating a data capture and synchronization system at the user's end according to an embodiment of the present invention. System 115 is provided and adapted to receive broadcast data streams from varying sources and to combine and synchronize the streams so that the data from the two different streams may be integrally displayed as authored. System 115 has a central processing unit (CPU) 117 that has a cache memory and random access memory (RAM). System 115 may be integrated with a computer or components thereof, a WEB TV or components thereof, or another type of receiving station capable of capturing and displaying broadcast video.
System 115 further comprises a signal receiving module 119, illustrated as connected to CPU 117 via bus structure 121. Bus structure 121 is the assumed connection to other illustrated modules within device 115, although an element number
does not accompany the additional connections. Module 119 is shown divided into sub-modules, with each sub-module dedicated to capturing signals from a specific type of medium. In this case, there are six sub-modules that are labeled according to medium type. From top to bottom they are a modem, a satellite receiver, a TV receiver, a first optional input port (for plugging in a peripheral device), a second optional input port (for plugging in a peripheral device), and a cable receiver. The optional input ports may accept input from video cameras, DVDs, VCRs, and the like.
In this particular example, an annotation data stream 125 is illustrated as entering system 115 through a modem, as might be the case if an annotation data stream is sent to an end user via the Internet or other WAN. A video broadcast stream 127 is illustrated as entering system 115 through the sub-module comprising a cable receiver. Streams 125 and 127 are analogous to streams 95 and 93, respectively, as output from signature application module 91 of Fig. 10. Video stream 127 in this example is a live broadcast stream in digital form. Annotation stream 125 is delivered via a WAN, which in a preferred embodiment will be the Internet. As such, stream 125 arrives as data packets which must be sorted, as is well known in the art.
System 115 further comprises a pipeline module 129 adapted to accept both streams 125 and 127 for the purpose of synchronization. Pipeline 129 is illustrated as having a time-begin mark of 0 and a time-end mark of T. The span of time allowed for buffering purposes may be almost any increment of time within reason. The inventors have determined that a few seconds is adequate in most instances.
Video stream 127 flows through pipeline 129 via a controllable buffer 133. Similarly, annotation data stream 125 flows through pipeline 129 via controllable buffer 131. It is important to note here that either stream may arrive first at pipeline 129 and that neither stream has a predictable latency. The only constant factor between the two streams at this entry point is that they are both running at the same frame rate.
Innovative software is provided and adapted to read the time-marker and data-frame numbers in the carrying stream and to compare the indicated frame number for the opposite stream to the actual frame position relative to the carrying stream in the
pipeline. The system is adapted to adjust either data stream toward synchronization of the two streams. For example, the CPU, through executing the software, may repeat frames in a pattern in either data stream to slow that stream relative to the opposite stream. The software in a preferred embodiment performs this calculation for every detected time marker in stream 127.
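The frame-repeat adjustment just described can be sketched with an assumed in-memory buffer. This is illustrative only: when the marker in the carrying stream indicates that a given annotation frame should currently be displayed but the annotation buffer is running ahead, the head frame is repeated to slow that stream relative to the video.

```python
# Sketch of the frame-repeat adjustment described above, with an assumed
# in-memory pipeline model: when the carrying stream's marker says
# annotation frame `indicated` should be showing but the annotation stream
# is at `actual` (ahead), earlier frames are repeated to slow it down.

from collections import deque

def resync(annotation_buffer, indicated, actual):
    """Repeat the head frame `actual - indicated` times if running ahead."""
    ahead = actual - indicated
    for _ in range(max(0, ahead)):
        annotation_buffer.appendleft(annotation_buffer[0])  # repeat a frame
    return annotation_buffer

buf = deque([3, 4, 5])              # annotation is at frame 3 ...
resync(buf, indicated=1, actual=3)  # ... but the marker calls for frame 1
print(list(buf))  # [3, 3, 3, 4, 5]
```

Spreading the repeats over several frame intervals, rather than applying them at once, would give the "soft ramp" effect described next.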
Buffering alteration parameters will depend upon the frequency of time markers and the extent of error detected in timing between the two data streams. For example, it is desired to produce what is termed in the art a soft ramp effect, so that sudden movement or jumping of annotations related to video entities as viewed by a user does not noticeably occur. Similarly, latency factors are unpredictable regarding both streams during the entirety of their transmissions. Therefore, buffers 131 and 133 are utilized continually to synchronize streams 127 and 125 as they pass through pipeline 129. Synchronization error toward the end of pipeline 129 is small enough that the signals may be combined via a signal combining module 135 before they are sent on as one stream into, typically, a video RAM of a display module 139.
A single annotated video stream 137 is output from display module 139 to a suitable connected display monitor or screen. An input signal 141 represents user interaction with an entity in video stream 137 as it is displayed. Such a signal may trigger downloading of additional detailed information regarding the subject of interaction. Interaction signal 141 results from a mouse click or other input command such as may be initiated via a connected keyboard or the like.
It will be apparent to one with skill in the art that the architecture illustrated herein is but one example of a data stream capture and synchronization system or device that may be integrated with other equipment without departing from the spirit and scope of the present invention. In one embodiment, system 115 may be part of a computer station. In another embodiment, system 115 may be part of a set-top box used in conjunction with a TV. There are various possibilities. Moreover, there may be differing modular components installed in system 115. For example, instead of providing a dial-up modem, the WAN connection may be via satellite and the modem may be wireless.
In one embodiment, a broadcast video stream without audio narration may be synchronized to a separately received audio stream. Furthermore, a prerecorded and authored video feed from a source connected to an optional input module may be synchronized with a previously stored and annotated data stream from a source connected to a second optional input module, as long as the signature process was applied to both streams according to the embodiment of Fig. 10. Interaction with tracked entities and the like associated with the prerecorded streams may be sent to a participating Internet server or the like through the modem sub-module, provided the system is on-line during viewing.
Fig. 13 is a process flow chart illustrating logical steps for capturing and synchronizing separate video streams for user display and interaction according to an embodiment of the present invention. In step 143, separate data streams are captured and redirected into a synchronization pipeline such as pipeline 129 of Fig. 12. Time markers and, if applicable, screen-change markers are searched for and detected in step 145. In step 147, data-frame ID numbers are searched and compared to data-frame numbers inserted in marker frames of a video stream such as stream 127 of Fig. 12. The data may be inserted in VBI and HBI areas or as coded numbers added previously by pixel manipulation.
In step 149, a timing error is calculated with regard to data inserted in a marker frame in the video stream as matched to data in an annotation data-frame closest to the marker. The error will define an annotation frame as being some number n of frame intervals ahead of or behind the target marker frame. In step 151, the stream determined to be running n frames ahead is buffered to reduce the error. In step 153, the process repeats (steps 145-151) for each successive marker in the video stream.
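The error calculation at the heart of steps 145-153 can be sketched as follows. The representation of markers as (video position, indicated annotation frame) pairs is an assumption for illustration; the actual decoding from VBI, HBI, or pixel data is abstracted away here.

```python
# A compact sketch of the capture-and-resynchronize loop (steps 145-153),
# with markers modeled as (video_position, indicated_annotation_frame)
# pairs; decoding the numbers from VBI/HBI or pixels is abstracted away.

def timing_errors(markers, annotation_position_at):
    """For each marker, error n = actual annotation position - indicated.

    markers: list of (video_pos, indicated_annotation_frame)
    annotation_position_at: dict video_pos -> actual annotation frame
    Positive n means the annotation stream runs n frames ahead.
    """
    errors = []
    for video_pos, indicated in markers:
        actual = annotation_position_at[video_pos]
        errors.append((video_pos, actual - indicated))  # step 149
    return errors

errs = timing_errors([(10, 10), (20, 18)], {10: 10, 20: 21})
print(errs)  # [(10, 0), (20, 3)]
```

Each positive error would then drive the buffering of step 151 for the stream found to be running ahead.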
The process steps illustrated in this embodiment are intended to be exemplary only. The order and function of such process steps may vary according to differing embodiments. For example, in some embodiments wherein it may be known that no further annotation will be performed after signature operations, only time-marker intervals with VBI-inserted data may be used. In another such instance, it may be
determined that only screen-change marking and HBI-inserted data will be used, and so on.
In a preferred embodiment, the method and apparatus of the present invention are intended for a user or users that will receive the video data via broadcast and the annotation data via a WAN, preferably the Internet. This is so that additional data obtained by a user through interaction with a tracked entity in the video may be personalized and specific to the user. In a case such as this, a user would perhaps obtain a subscription to the service. In other embodiments, other broadcast and data delivery methods may be used.
Hypervideo and Scene Video Editor
In another aspect of the present invention, a video editor is provided for editing video streams and corresponding annotation streams and creating new video and synchronous annotation streams. The editor in a preferred embodiment comprises a software suite executable on a computer platform similar to the various platforms described above related to the coordinate tracking and annotating (authoring) systems of Fig. 1. The editor in some embodiments manipulates data streams in the well-known MPEG format, and in others in other formats. The format under which the editor performs is not limiting to the invention, and in various embodiments the system includes filters (translators) for converting data streams as needed to perform its functions.
The Editor is termed by the inventors the HoTV!Studio, but will be referred to in this specification simply as the Editor. The Editor in various embodiments of the present invention may operate on computer platforms of various different types, such as, for example, a high-end PC having a connected high-resolution video monitor. As such platforms are very familiar to the skilled artisan, no drawings of such a platform are provided. Instead, descriptive drawings of displays provided in a user interface are used for describing preferred embodiments of the invention. It may be assumed that the editor platform includes typical apparatus of such a platform, such as one or more
pointer devices and a keyboard for user data entry. Platforms used as editors in embodiments of the invention may also be assumed to include or be connected to adequate data storage capacity to store a plurality of video data streams, and one or more ports for inputting such data streams.
In one embodiment of the invention, when an operator invokes the Editor, a main window 185 appears, as shown in Fig. 14. This window is a control window providing menus and icons for invoking various functions related to the Editor. It will be apparent to the skilled artisan that this control window may take a variety of different forms, and the form shown is but one such form. Window 185 in this embodiment has iconic selectors 186 for minimizing, maximizing, and closing, as is well known in the art; drop-down menus 198 providing menu-selectable functions; and a task-bar 193 with iconic selectors for the functions of the drop-down menus.
The menu bar in this embodiment includes selections for File, Tools, Window, Options, and Help. Under File in this embodiment one can select to create a new video file, open an existing video file, or import a file while simultaneously transforming the file from any supported format, such as AVI, ASF, MPEG, and so on for video, or WAV and other formats for audio.
Under the Tools menu in this embodiment one may select functions for audio mixing, audio replacement, or multiplexing audio and video. Under Window one may select functions to determine how windows are presented, such as Tile Horizontally, Tile Vertically, Cascade, and Close Stream. Under Options one may select functions to set the image editor path, temporary volume path, temporary encoded files, and so forth. The Help menu provides functions for guiding a user through the functionality of the Editor application.
At the heart of the Editor is a display window with selectable functionality for playback and editing. Fig. 14 illustrates one such display window 187. When one creates a new file or selects a file from storage, a display window is opened for the file. An existing file will have a name and storage path, and for a new file the user will be provided a window (not shown) for naming the new file and designating the storage path. Such windows are well known in the art. In the file window the name and path
is displayed in a title bar 197, which also displays a frame number and/or time of the video, in a display area 195.
A slider 194 is provided in this embodiment to indicate to a user the approximate position in the overall file of the frame displayed. In some embodiments one may drag the slider to force the display to a new frame position, and in some embodiments the slider is display-only.
Play and editing functions are enabled in this embodiment by selectable icons in a taskbar 190. Included in the taskbar are functions for Play, Pause, Go-To-End, Go-To-Beginning, Fast Forward, and Fast Reverse. In addition, there are selectable functions for marking (blocking) portions of a file for editing functions such as Cut, Copy, and Paste. It will be apparent to the skilled artisan that other functions may be included by appropriate icons and menus.
In a preferred embodiment of the invention a user can open and create multiple data streams, each of which will be represented by an editing window such as window 187. The user may also select the arrangement for the display of the windows. Fig. 14 shows a tiled arrangement 187, 188 of multiple windows, which may be manipulated by a user to accomplish editing of one or more of the data streams represented. A user can, for example, select portions of one stream, Copy the portion, and Paste the portion into another data stream. The user may also Cut selected portions and Paste, and mark position in a receiving data stream for a Paste to be done.
In addition to the traditional editing functions, there are also special effects that may be accomplished. With a Special Effects window (not shown), which a user may invoke from any one of the editing windows, the user may accomplish special effects on any selected frame in a data stream. The Special Effects window provides a scrollable list of special effects that a user may choose for accomplishing the selected effect in a frame of a video stream. Among the special effects are Adding Text or a Bitmap to a frame, Merging Frames, Fade In or Out, and Alpha Merging frames. Other effects may be added to the list in other embodiments.
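Of the effects listed above, Alpha Merging and Fade have a simple per-pixel form: a weighted blend of two frames. The sketch below is illustrative only; the frame representation (nested lists of RGB tuples in 0–255) and the function names are assumptions, not part of the disclosure:

```python
def alpha_merge(frame_a, frame_b, alpha):
    """Blend two equal-sized frames: result = alpha*a + (1 - alpha)*b per pixel."""
    return [
        [tuple(int(alpha * ca + (1 - alpha) * cb) for ca, cb in zip(pa, pb))
         for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

def fade(frame, level):
    """Fade a frame toward black; level=1.0 is full brightness, 0.0 is black."""
    black = [[(0, 0, 0)] * len(row) for row in frame]
    return alpha_merge(frame, black, level)
```

A Fade In or Fade Out is then just a sequence of calls with `level` stepped from 0.0 to 1.0 (or the reverse) across consecutive frames.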
Given the editing functionality taught above, users are provided with a full-featured package with the Editor to create and edit video data streams and associated audio streams.

In yet another aspect of the present invention the video editor application is provided wherein visual thumbnails are widely used in editing functions. In HoTV!Studio window 185, stored video files may be opened in separate windows within the editor window. Two such file windows 187 and 188 are shown in editor window 185. File window 187 is, in this example, a video clip titled backstreet.mpg, and file window 188 is for a video clip titled trailer1.mpg. The functionality of each of the file windows shown, and other such file windows, is the same. Therefore only one file window will be described in detail here. It may be assumed that the others operate in a similar fashion.

File window 188 (and other such windows) has a display region 189 where the video file may be played and reviewed. A tool bar 190 has selectable icons (buttons) for controlling playback, and the button functions are intentionally crafted to resemble the familiar physical input keys on a video cassette recorder (VCR). There are, for example, from left to right, a button for Play, Pause, Stop, Rewind, and Fast Forward. The next button, shaped as an arrow, is presently undefined. There are then two buttons 191 and 192 for marking a place in the video stream. Button 19
