Rhodes et al.

US005432900A

[11] Patent Number: 5,432,900
[45] Date of Patent: Jul. 11, 1995

[54] INTEGRATED GRAPHICS AND VIDEO COMPUTER DISPLAY SYSTEM

[75] Inventors: Kenneth E. Rhodes, Portland; Robert T. Adams, Lake Oswego; Sherman Janes, Portland; Rohan G. F. Coelho, Hillsboro, all of Oreg.

[73] Assignee: Intel Corporation, Santa Clara, Calif.

[21] Appl. No.: 261,284

[22] Filed: Jun. 16, 1994

[51] Int. Cl.6 ........................ G06T 1/00
[52] U.S. Cl. ........................ 395/154; 345/118
[58] Field of Search ........................ 348/589; 345/118; 395/153, 154, 162, 163

[56] References Cited

U.S. PATENT DOCUMENTS

4,498,081  2/1985  Fukushima et al. ........ 358/240
4,530,009  7/1985  Mizokawa ................. 358/183
4,539,585  9/1985  Spackova et al. .......... 358/93
4,644,401  2/1987  Gaskins .................. 358/183
4,984,183  1/1991  Ohuchi ................... 364/521
5,012,342  4/1991  Olsen et al. ............. 358/181
5,027,212  6/1991  Marlton et al. ........... 358/183
5,119,080  6/1992  Kajimoto et al. .......... 340/723
5,161,102 11/1992  Griffin et al. ........... 364/488
5,195,177  3/1993  Kamiyama et al. .......... 395/162
5,220,312  6/1993  Lumelsky et al. .......... 340/721
5,251,301 10/1993  Cook ..................... 395/163
5,264,837 11/1993  Buehler .................. 395/153
5,271,097 12/1993  Barker et al. ............ 395/162
5,274,364 12/1993  Li et al. ................ 345/118
5,274,753 12/1993  Roskowski et al. ......... 395/135

FOREIGN PATENT DOCUMENTS

0454414 10/1991  European Pat. Off.

OTHER PUBLICATIONS

Worthington, "True Vision Enhances Its PC Video Graphics Card", InfoWorld, vol. 12 No. 32 (Aug. 6, 1990), p. 23. Abstract Only.

Quinnell, "Video Chip Set Handles Decompression, Special Effects, and Mixing in Many Formats", EDN, vol. 35 No. 25 (Dec. 6, 1990), pp. 72-73. Abstract Only.

Primary Examiner-Heather R. Herndon
Assistant Examiner-N. Kenneth Burraston
Attorney, Agent, or Firm-Blakely, Sokoloff, Taylor & Zafman

[57] ABSTRACT

Graphical, video, and audio data is integrated into a single processing environment. The present invention employs an integrated graphics/video controller (IVC) which interfaces with application software through a graphics API and a video API. The IVC receives graphics commands through the graphics API and video commands through the video API. A mask driver produces information from the graphics commands including clipping information, graphics information, and mask information. A blender uses the mask information, the graphics information, and the clipping information for combining or compositing graphics images with video images. The video commands of the video command stream provide functions for configuring the operation of the IVC. These functions include commands for loading software video decoders in a decoder block within the IVC. Video data transferred to the IVC via the video API may be encoded in a variety of different formats. The present invention provides a means for dynamically loading a plurality of different video decoders through a video command interface to a video decode block, both within the IVC. Each of the independent decoders within the decode block contains processing logic for decoding a particular type of video data to produce a uniform type of decoded video data which is provided to the blender. The blender receives the decoded video data and combines the video data with graphics data as defined by the mask information, the clipping information, and the graphics information.

26 Claims, 14 Drawing Sheets

Page 1 of 27

ZTE Exhibit 1028
`
`
`
FIGURE 1
[Drawing, Sheet 1 of 14: block diagram of the computer system hardware. CPU 102 and Memory 104 are coupled to the Memory Bus; a bridge couples the Memory Bus to Peripheral Bus 110, to which Disk Drive 114, Graphics Controller 116, Capture Controller 118, Audio Controller 120, and Network or Other I/O 140 are coupled. Graphics Controller 116 drives Video RAM (VRAM) 122, DAC 124, and CRT Display 126; Capture Controller 118 receives Video Source 128; Audio Controller 120 connects Speaker 130 and Microphone 132.]
`
`
`
U.S. Patent, July 11, 1995, Sheet 2 of 14, 5,432,900

FIGURE 2 (Prior Art)
[Drawing: block diagram of the prior art software. Application Software 210 interfaces through Graphics API 212, Video API 214, and Audio API 216 with Graphics Driver 218, Video Driver 222, and Audio Driver 226, which feed Graphics Frame Buffer 220, Video Frame Buffer 224, and Audio Output Buffer 228, respectively.]
`
`
`
FIGURE 3
[Drawing: block diagram of the software of the preferred embodiment. Application Software 310 interfaces through Graphics API 312 and Video API 314 with the Integrated Graphics/Video Controller (IVC) 320, which feeds Frame Buffer 322, and through Audio API 316 with Audio Driver 324. Display Timing Control 326 supplies timing to the IVC.]
`
`
`
FIGURE 4
[Drawing: block diagram of the Integrated Graphics/Video Controller (IVC) 320. Graphics API 312 enters through Graphics Command Interface 410; Video API 314 enters through Video Command Interface 412. Mask Driver 414 produces Masks 420, Graphics 422, and Clipping Information 424 for Blender 418. Display Driver 416 writes to Frame Buffer 322. Decoder block 430 holds MPEG Decode, JPEG Decode, RTV Decode, and Other Loadable Decoders; Video Controller Interface 440 leads to Video Controller Decoding 442. Audio Driver 324 and Display Timing Control 326 connect via Sync 450.]
`
`
`
FIGURE 5
[Flowchart: initialize decoder table (512); send initialize command to graphics hardware (514); on the Yes branch (520), send capability report command to graphics hardware/microcode (522); receive capability report from graphics hardware/microcode (524); store hardware/microcode capability information in the video command interface decoder table (526).]
`
`
`
FIGURE 6
[Decoder Table: entries 0 through n, each with Address, Length, ID Code, and Capability Code fields.]
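The decoder table of FIG. 6 and the matching rule of FIG. 9 can be pictured in code roughly as follows. This is an illustrative sketch only; the class and function names, and the use of integers and strings for the four fields, are assumptions, since the patent defines the table only by its columns and by the rule of taking the greatest-capability entry when identification codes collide (FIG. 9, step 914).

```python
from dataclasses import dataclass

@dataclass
class DecoderEntry:
    address: int     # where the loaded decoder image resides in the decode block
    length: int      # size of the decoder image
    id_code: str     # video format handled, e.g. "MPEG", "JPEG", "RTV"
    capability: int  # capability code; a larger value means a more capable decoder

def find_decoder(table, id_code):
    """Return the entry matching id_code; when several entries share the
    same identification code, take the one with the greatest capability."""
    matches = [e for e in table if e.id_code == id_code]
    return max(matches, key=lambda e: e.capability) if matches else None

table = [
    DecoderEntry(address=0x0000, length=0x2000, id_code="MPEG", capability=2),
    DecoderEntry(address=0x2000, length=0x1000, id_code="MPEG", capability=5),
    DecoderEntry(address=0x3000, length=0x1800, id_code="JPEG", capability=3),
]
```

With the sample table above, looking up "MPEG" yields the capability-5 entry rather than the capability-2 entry, and looking up an unloaded format yields nothing.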
`
`
`
FIGURE 7
[Flowchart: send initialize command to graphics hardware (714); No branch (718), Yes branch (720); send capability report command to graphics hardware (722); compare contents of decoder table with minimum capability table (728); Yes branch (732), No branch (734); send a report to the Application Program through the API to load the missing decoders (736).]
`
`
`
FIGURE 8
[Minimum Capability Table: entries 0 through n, each with ID Code and Minimum Capability fields.]
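The comparison between the decoder table and the minimum capability table (FIG. 7, steps 728 and 736) might look like the sketch below. The function name and the data shapes (a list of ID-code/capability pairs against a mapping of required minimums) are illustrative assumptions, not structures given in the patent.

```python
def missing_decoders(decoder_table, minimum_table):
    """For each ID code in the minimum capability table, check whether the
    best loaded decoder for that format meets the required minimum; report
    the formats that are absent or under-capable, so the application can be
    asked through the API to load the missing decoders."""
    missing = []
    for id_code, minimum in minimum_table.items():
        best = max((cap for fmt, cap in decoder_table if fmt == id_code),
                   default=None)
        if best is None or best < minimum:
            missing.append(id_code)
    return missing

loaded = [("MPEG", 5), ("JPEG", 1)]          # (ID code, capability) pairs
required = {"MPEG": 3, "JPEG": 2, "RTV": 1}  # minimum capability table
```

Here the MPEG decoder satisfies its minimum, the JPEG decoder is under-capable, and no RTV decoder is loaded at all, so the latter two would be reported.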
`
`
`
FIGURE 9
[Flowchart: receive a request to load a video decoder; the request includes an address of the decoder, length, identification code, and a capability code (912). Search the decoder table for an entry with the same identification code received in the request; if more than one entry with the same identification code exists in the decoder table, take the entry with the greatest capability (914). No branch (918), Yes branch (920): compare the capability code of the matched entry with the capability code received in the request (922). No branch (926), Yes branch (928): use the address and length in the request to transfer the decoder to a free area in the video decode block 430 (930); update the decoder table with the address, length, identification code, and capability code of the newly loaded decoder (932).]
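The load sequence of FIG. 9 reduces to a match, a capability comparison, a transfer, and a table update. The sketch below uses illustrative (address, length, ID code, capability) tuples, and its free-area policy (placing each decoder just past the previous one) is an invented stand-in, since the patent does not specify how free space in the decode block is managed.

```python
def load_decoder(table, decode_block, request):
    """Handle a request to load a video decoder (FIG. 9).
    request = (address, length, id_code, capability), where address and
    length locate the decoder image to be transferred (step 912)."""
    addr, length, id_code, capability = request
    # Step 914: find the matching entry; prefer the greatest capability.
    matches = [e for e in table if e[2] == id_code]
    best = max(matches, key=lambda e: e[3]) if matches else None
    # Steps 922-928: skip the load unless the request is more capable
    # than anything already loaded for this format.
    if best is not None and capability <= best[3]:
        return False
    # Step 930: transfer the decoder image to a free area of the decode
    # block (here: just past the last loaded decoder -- an assumed policy).
    free = max((e[0] + e[1] for e in table), default=0)
    decode_block[free] = (id_code, addr, length)
    # Step 932: record address, length, ID code, and capability.
    table.append((free, length, id_code, capability))
    return True

table, block = [], {}
```

A first MPEG decoder loads, a less capable one is refused, and a more capable one loads alongside the first.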
`
`
`
FIGURE 10
[Flowchart: search the decoder table for an entry corresponding to the type of video data received (1014). No branch (1018): return error code to Application Program (1022). Yes branch (1020): on the No branch (1026), transfer the video data to a capable hardware implemented decoder through Video Controller Interface 440 (1032); on the Yes branch (1028), transfer the video data to a software implemented decoder in Decoder Block 430 (1030). End (1034).]
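The dispatch decision of FIG. 10 is essentially a table lookup followed by a software-versus-hardware branch. In this sketch the table is a plain mapping from format to decoder kind, which is an assumed representation rather than the patent's actual decoder table layout.

```python
def dispatch_video_data(decoder_table, data_format, payload):
    """Route incoming video data (FIG. 10): search the decoder table for
    the data's format (step 1014); if none is found, return an error code
    to the application (step 1022); otherwise hand the data to a software
    decoder in decoder block 430 (step 1030) or to a hardware decoder
    through video controller interface 440 (step 1032)."""
    kind = decoder_table.get(data_format)
    if kind is None:
        return ("error", data_format)
    if kind == "software":
        return ("software_decoder", payload)
    return ("hardware_decoder", payload)

routes = {"MPEG": "software", "JPEG": "hardware"}
```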
`
`
`
FIGURE 11
[Flowchart: receive request for capability report (1112); retrieve information in Decoder Table (1114); send Decoder Table information to Application Program through API (1116).]
`
`
`
FIGURE 12
[Flowchart: receive a graphics command from the Application Program through the Graphics API (1212). Pass the graphics command through to the Display Driver 416 without having modified the destination portion of the command; the Display Driver performs the graphics operation on a display bitmap (1214). Receive command status from the Display Driver in response to the graphics operation performed on the display bitmap (1216). No branch (1220); on the Yes branch (1222), modify the graphics command to reference a mask bitmap maintained by the Display Driver (1224).]
`
`
`
FIGURE 13
[Flowchart: send the modified graphics command to the Display Driver; the Display Driver performs the same operation on the mask bitmap as performed earlier on the display bitmap (1310). Receive command status from the Display Driver in response to the graphics operation performed on the mask bitmap (1312). Send command status to the Application Program through the graphics API (1314).]
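The pass-through-then-mirror behavior of FIGS. 12 and 13 can be sketched as below. The single "set pixel" command stands in for the patent's general graphics commands, and the nested-list bitmaps are an assumed representation; the point is only the order of operations: draw on the display bitmap first, then replay the same operation on the mask bitmap.

```python
def handle_graphics_command(display, mask, command):
    """Apply a graphics command to the display bitmap unmodified
    (FIG. 12, step 1214), then re-issue the same operation against the
    mask bitmap maintained by the display driver (steps 1224 and 1310),
    so the mask records where graphics pixels were drawn."""
    op, x, y, value = command
    if op != "set":
        return "error"            # only one illustrative command here
    display[y][x] = value         # operation on the display bitmap
    mask[y][x] = 1                # same operation on the mask bitmap
    return "ok"                   # status returned through the API

display = [[0, 0], [0, 0]]
mask = [[0, 0], [0, 0]]
status = handle_graphics_command(display, mask, ("set", 1, 0, 7))
```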
`
`
`
FIGURE 14
[Flowchart: receive a graphics command from the Application Program through the Graphics API (1412). Interpret the graphics command to derive the location and dimensions of a visible region of a window (1414). Generate clipping information defining the location and dimensions of the visible portion of the window; this clipping information defines a clipping region (1416). Send the clipping information to the Blender 418, which blends video data with graphics data inside the clipping region and disables blending outside the clipping region (1418).]
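The clipping steps of FIG. 14 amount to deriving a rectangle from a window's visible region and testing pixel positions against it. The tuple layout (x, y, width, height in; corner coordinates out) is an illustrative assumption.

```python
def clipping_region(window):
    """Derive clipping information from a window's visible region
    (FIG. 14, steps 1414-1416): given its location and dimensions,
    return the bounding rectangle of the clipping region."""
    x, y, w, h = window
    return (x, y, x + w, y + h)

def blend_enabled(clip, px, py):
    """Blending of video with graphics is enabled inside the clipping
    region and disabled outside it (step 1418)."""
    x0, y0, x1, y1 = clip
    return x0 <= px < x1 and y0 <= py < y1

clip = clipping_region((2, 2, 4, 3))
```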
`
`
`
INTEGRATED GRAPHICS AND VIDEO COMPUTER DISPLAY SYSTEM

This is a continuation of application Ser. No. 07/901,280, filed Jun. 19, 1992, now abandoned.

BACKGROUND OF THE INVENTION

1. Field Of The Invention

The present invention relates to the field of computer display systems. Specifically, the present invention pertains to the manipulation and control of graphics and video data for display on a display medium. The present invention further relates to the control and manipulation of audio data that is output in synchronization with images on a display screen.

2. Prior Art

Many conventional computer graphics display systems exist in the prior art. These systems provide mechanisms for manipulating and displaying graphics objects on a display screen. These graphic objects include points, vectors, conical shapes, rectangles, polygons, arcs, alphanumeric information, shaded or color-filled regions, and other forms of graphic imagery, typically represented as objects in a two dimensional or three dimensional virtual space. A graphics computer program in these conventional systems provides mechanisms by which a computer user may manipulate graphic objects using functions provided by the graphics software. This graphics software typically includes a graphics driver software module that operates in conjunction with a graphics controller to load a frame buffer with digital information that is subsequently converted to a form displayable on a display screen. The use of graphics driver software, graphics controllers, and frame buffers is well known to those of ordinary skill in the art.

Other prior art computer display systems provide a means for displaying video images on a display screen. These video images comprise streams of digital video data encoded in several well known formats such as MPEG, JPEG, and RTV format. Unlike graphical data, this video data includes no representations of individual objects within the video image. To the computer display system, the video image is a homogeneous video data stream encoded in a particular way. In prior art systems, this video data is received by processing logic that includes a particular mechanism for decoding the video data and transferring the decoded video data into a frame buffer for display on the display screen. Because of the different nature of graphical data as compared with video data, prior art systems tend to handle graphical and video data in separate subsystems within the computer display system. Thus, graphical data and video data typically take parallel and independent paths through prior art systems as the data is processed for display. In some cases, two separate frame buffers are used, one for graphical data and one for video data. In other systems, a single frame buffer is used; however, graphical data occupies one distinct region of the frame buffer and video data occupies a different portion of the frame buffer.

A number of problems exist in prior art systems that process graphical and video data independently. First, these prior art systems cannot efficiently combine graphical and video images together into a composite form that is easily manipulated. For example, scrolling a graphical image across a video background typically requires additional processing to prevent the image from being destroyed. Secondly, synchronizing graphics and video images in prior art systems is typically very difficult. Synchronization problems in prior art systems result in video images that appear torn or disjoint. Moreover, aligning graphics images at the proper location and time in a video image is difficult using prior art techniques. Thirdly, the video data decoding schemes used in prior art systems are typically limited to a single decoding scheme. For example, less expensive graphics programs may provide software implemented video data decoders. Although these systems provide an inexpensive solution, they tend to run slowly and often provide decoded video data of a low resolution. Other more expensive prior art systems provide graphics hardware that may be used to decode a video data stream. These systems are fast and provide high resolution decoded video data; however, they are also substantially more expensive. Moreover, users wanting to upgrade from a less expensive system, such as a software implemented decoder, to a more expensive system, such as a graphics hardware implemented system, must first reconfigure or reinstall their graphics application program in order to take advantage of a different video decoding technique. This additional impact at the applications program level further increases the upgrade cost to the computer user.

Thus, a better means for integrating graphical and video information in a single computer display system is needed.

SUMMARY OF THE INVENTION

The present invention integrates graphical, video, and audio data into a single processing environment. Therefore, the present invention includes a processor (CPU), a graphics controller, a capture controller, and an audio controller. The graphics controller is a specialized graphics control processor coupled to a peripheral bus. The graphics controller, operating under the software control of the CPU, performs processing and compositing of the graphics and video data to be displayed on a CRT display. This software control methodology is the subject of the present invention and is described in detail herein.

Application software interfaces with system software of the present invention through a graphics application program interface (API), a video API, and an audio API. These APIs provide a high level functional interface for the manipulation of graphics, video, and audio information. Unlike the prior art, however, the present invention employs an integrated graphics/video controller (IVC). The IVC interfaces with application software through the graphics API and the video API. The output data generated by the IVC is a composite graphics and video image which is stored for output to a display device in a frame buffer. The IVC of the present invention also interfaces with an audio driver. The IVC provides the audio driver with synchronization information used to synchronize a video image stream with an audio data stream. The IVC receives timing information from a display timing control circuit. The timing information provided by the display timing control circuit is used by the IVC for synchronizing the loading of the frame buffer. The display timing control circuit provides a means for optimizing the operation of the IVC for the timing characteristics of a particular display system that is used with the present invention.
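The compositing role of the blender described above can be pictured with the sketch below. The mask polarity here is an assumption consistent with the mask driver's behavior (a set mask bit marks a graphics pixel that video may not overwrite); the patent states only that masks specify whether video may overwrite graphics within a window.

```python
def blend(graphics, video, mask, clip):
    """Composite a video frame into a graphics frame: inside the clipping
    region, video shows through wherever the mask does not protect a
    graphics pixel; outside the clipping region blending is disabled and
    the graphics value is preserved."""
    x0, y0, x1, y1 = clip
    out = [row[:] for row in graphics]
    for y in range(len(graphics)):
        for x in range(len(graphics[0])):
            if x0 <= x < x1 and y0 <= y < y1 and not mask[y][x]:
                out[y][x] = video[y][x]
    return out

graphics = [[1, 1, 1], [1, 1, 1]]
video = [[9, 9, 9], [9, 9, 9]]
mask = [[0, 0, 1], [0, 1, 1]]   # 1 = graphics pixel present, protected
composited = blend(graphics, video, mask, clip=(1, 0, 3, 2))
```

Video appears only at positions that are both inside the clipping rectangle and unprotected by the mask.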
`
`
`
The IVC receives graphics instructions through the graphics API and video instructions through the video API. The graphics API provides a means by which an application program may interface with a display driver for displaying graphics images on a display screen. A graphics command interface and a mask driver derive information from the graphics commands provided by an application program prior to transferring the graphics information to a display driver. Three types of information derived from the graphics command stream are produced by the mask driver. First, clipping information is derived by the mask driver. Clipping information is used to enable the display of graphics information within a specified window on the display screen. A window is a rectangular portion of the display screen having particular attributes that are well known in the art. Clipping information is used to enable the display of graphics information within the window and to suppress the display of information outside the boundary of a window. The second type of information derived by the mask driver from the graphics command stream is the actual graphic content of a specified window. The third type of information derived by the mask driver is mask information. A blender uses the mask information, the graphics information, and the clipping information for combining or compositing graphics information with video information.

The IVC also receives video image information through the video API. The video API provides a means by which an applications program may interface with the IVC for the purpose of manipulating video images for display on a display screen. A video command stream is received by a video command interface through the video API. The video command stream comprises video commands for configuring the operation of the IVC and video data which comprises a combination of video and audio data with synchronization information for synchronizing the display of frames of video images.

The video commands of the video command stream provide functions for configuring the operation of the IVC. These functions include commands for loading software video decoders in a decoder block within the IVC. Video data transferred to the IVC via the video API may be encoded in a variety of different formats. The present invention provides a means for dynamically loading a plurality of different video decoders through a video command interface to a video decode block, both within the IVC. Each of the independent decoders within the decode block contains processing logic for decoding a particular type of video data to produce a uniform type of decoded video data which is provided to the blender. The blender receives the decoded video data and combines the video data with graphics data as defined by the mask information, the graphics information, and the clipping information. The combined or composited image is then transferred to a display driver and output to a display through a frame buffer.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of the computer system hardware used in the preferred embodiment.

FIG. 2 is a block diagram of the prior art software used in a system for processing graphics, video, and audio data.

FIG. 3 is a block diagram of the software used in the preferred embodiment of the present invention.

FIG. 4 is a block diagram of the integrated graphics/video controller (IVC).

FIGS. 5-11 are flowcharts illustrating the processing logic of the video command interface.

FIGS. 12-14 are flowcharts illustrating the processing logic of the mask driver.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

The present invention is an integrated graphics and video display system for manipulating and viewing both graphical and video data in a single integrated system. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be apparent to one of ordinary skill in the art that these specific details need not be used to practice the present invention. In other instances, well known structures, circuits, and interfaces have not been shown in detail in order not to obscure unnecessarily the present invention.

Referring now to FIG. 1, a block diagram of the computer system hardware used in the present embodiment is illustrated. The computer system used in the preferred embodiment comprises a memory bus 100 for communicating information, a processor (CPU) 102 coupled with memory bus 100 for processing information, and a memory 104 coupled with bus 100 for storing information and instructions for CPU 102 and other components of the computer system. In the preferred embodiment, memory bus 100 is a high speed bus which is well known to those of ordinary skill in the art. In the preferred embodiment, CPU 102 is an i486® brand microprocessor manufactured by the assignee of the present invention. i486® is a registered trademark of Intel Corporation, Santa Clara, Calif. Memory 104 in the preferred embodiment comprises dynamic random access memory (DRAM) and read only memory (ROM). Coupling a CPU 102 and a memory 104 together on a memory bus 100 as shown in FIG. 1 is well known in the art.

FIG. 1 also illustrates a peripheral bus 110 coupled to memory bus 100 via a bridge 112. The use of a bridge between two computer buses is well known in the art. Similarly, the use of a peripheral bus such as the peripheral component interface (PCI) is described in co-pending patent application Ser. No. 07/886,992.

A disk drive 114 and a network interface or other input/output device 140 may also optionally be coupled to peripheral bus 110. Again, disk drive and network interface apparatus and their connection to a peripheral bus 110 are well known in the art.

The present invention integrates graphical, video, and audio data into a single processing environment. Therefore, the present invention centers around the interaction between CPU 102 and graphics controller 116, capture controller 118, and audio controller 120. Graphics controller 116 is a specialized graphics control processor coupled to peripheral bus 110. Graphics controller 116, operating under the software control of CPU 102, performs processing and compositing of the graphics and video data to be displayed on CRT display 126. This software control methodology is the subject of the present invention and is described in detail below. The hardware platform in which the present invention operates is described in patent application Ser. No. 07/901,519, now U.S. Pat. No. 5,243,447, filed concurrently with the present patent application. In general, this hardware platform comprises graphics controller
`
`
`
116 to which a video random access memory (VRAM) 122 is coupled. Video RAM 122 provides storage for a frame buffer which represents a binary encoded representation of the picture elements (pixels) that are displayed on CRT display 126. Video RAM 122 is used for storage of both graphical information and video information. Using the processing techniques described herein and the hardware platform described in the above-referenced co-pending patent application, both graphical and video data may be composited (i.e. combined) into a single frame buffer within video RAM 122. This composite graphical and video image is converted to analog form by digital to analog converter (DAC) 124 and output to CRT display 126. In this manner, applications programs executing within CPU 102 can use the processing logic of the present invention to display composited graphical/video images in arbitrary locations on CRT display 126.

Capture controller 118 is coupled to peripheral bus 110. Capture controller 118 is a hardware system for receiving and processing video display data from video source 128. Capture controller 118 receives video images from video source 128 and generates video image data packets for transmission on peripheral bus 110. The video image data packets comprise still video images or a series of motion video images which include synchronization information. The video data may be moved on peripheral bus 110 into graphics controller 116 for compositing with graphics data or moved to CPU 102 or memory 104 via bridge 112 and memory bus 100. Video source 128 is a video camera, a video recording device, or other source for video image data. The structure and detailed operation of capture controller 118 is beyond the scope of the present invention.

Audio controller 120 is also coupled to peripheral bus 110. Audio controller 120 provides a means for receiving and processing audio data from a microphone 132 or other audio input device. Audio controller 120 generates a series of audio data packets for transfer on peripheral bus 110. The audio data packets represent a series of sounds retrieved from microphone 132 or other audio input device. Audio controller 120 also provides synchronization information with each audio data packet. Audio controller 120 also provides a means for decoding audio data received via peripheral bus 110 and outputting audio signals to speaker 130 or other audio output device. The structure and detailed operation of audio controller 120 is beyond the scope of the present invention.

Referring now to FIG. 2, a block diagram illustrates the prior art method for processing graphics, video, and audio data. Application software 210, residing in memory 104 and executed by CPU 102, comprises user level programming tools for manipulating and processing graphics, video, and audio data. Several application programs of this type exist in the prior art. An example of one such application program is AutoDesk Animator™ for Microsoft Windows™ developed by AutoDesk™, Inc. Application software 210 interfaces with system software through several application program interfaces (API). APIs provide a means by which an application program may request service from system software through the functions provided in an API. In the prior art, three APIs are generally provided: 1) a graphics API, 2) a video API, and 3) an audio API. As illustrated in FIG. 2, graphics API 212 provides an interface between application software 210 and graphics driver 218. Graphics API 212 provides a list of public functions and corresponding calling parameters with which application software 210 may create and manipulate graphics objects that are stored for display in graphics frame buffer 220. Graphics objects typically include points, vectors, conical shapes, polygons, rectangles, blocks of text, arcs, and other shapes that may be represented in a two or three dimensional virtual space.

In a similar manner, video API 214 provides an interface between application software 210 and video driver 222. Video API 214 provides a set of functions for manipulating video image data that is processed by video driver 222 and eventually stored for output to a display screen in video frame buffer 224. Because prior art systems typically used graphic display hardware which was independent of video display hardware, the processing of graphic and video data in prior art systems was typically done using independent and distinct hardware and software processes. This prior art design is therefore considered a loosely coupled or low integration system, because the graphic and video data is not integrated as it travels from the application software 210 to a display screen (not shown). Rather, in the prior art systems, graphic and video data travel two parallel paths through the processing architecture.

Audio API 216 provides an interface between application software 210 and audio driver 226. Audio API 216 provides a set of interfaces for manipulating audio information that is processed by audio driver 226 and stored in audio output buffer 228 for eventual output to an audio emission system (not shown). The audio API 216 provides a set of audio related functions with which application software 210 may manipulate audio information. Again, the loosely coupled or low integration approach used in the prior art is apparent as audio information travels an independent path from application software 210 to audio output buffer 228. The independent path traveled by audio data in the prior art systems is less of a performance penalty, however, because audio information is destined for a different output device than graphics and video data. Both graphics and video data, on the other hand, are typically displayed on the same display device.

Several problems exist with the low integration approach used in the prior art. Because graphics and video information is processed using separate processes, prior art systems do not provide an efficient means for combining graphics and video images together in a composite form. Secondly, because certain items of hardware in the prior art design are dedicated to processing either graphics or video data, some items of the prior art design must be duplicated for processing both graphics and video data. Some of these duplicated hardware elements include frame buffer memory 220 and 224. Because of the independent paths traveled by graphics and video data in the prior art system, other hardware logic is required to handle the parallel paths. This additional logic includes multiplexing logic which determines which frame buffer's data is to be displayed for each pixel location. Moreover, additional synchronization logic is required to synchronize the output of graphics and video data as the information is merged just before output to the display device. Thus, the prior art design suffers from low graphics/video integration functionality and increased hardware cost due to the low integration design.

Referring now to FIG. 3, the structure of the system software used in the present invention is illustrated.
`
`
`
Again, application software 310 interfaces with system software through graphics API 312, video API 314, and audio API 316. In a manner similar to the application program interfaces illustrated in FIG. 2, APIs 312, 314, and 316 provide a high level functional interface for the manipulation of graphics, video, and audio information. Unlike the prior art, however, the present invention employs an integrated graphics/video contr

both graphics and video data, masks are used to specify whether or not video information may overwrite graphics information within a particular window. The mask information as derived by mask driver 414 is stored in area 420. Blender 418 uses mask information 420, graphics information 422, and clipping information 424 for combining or compositing graphics information with video information as will be described below.