`US005432900A
`
[11] Patent Number:
`
`5,432,900
`
`[45] Date of Patent:
`
`Jul. 11, 1995
`
Special Effects, and Mixing in Many Formats", EDN,
vol. 35, No. 25 (Dec. 6, 1990), pp. 32-23, Abstract Only.
`
Primary Examiner—Heather R. Herndon
Assistant Examiner—N. Kenneth Burraston
Attorney, Agent, or Firm—Blakely, Sokoloff, Taylor &
Zafman
`
[57]
`
`ABSTRACT
`
Graphical, video, and audio data is integrated into a
single processing environment. The present invention
employs an integrated graphics/video controller (IVC)
which interfaces with application software through a
graphics API and a video API. The IVC receives
graphics commands through the graphics API and
video commands through the video API. A mask driver
produces information from the graphics commands
including clipping information, graphics information,
and mask information. A blender uses the mask informa-
tion, the graphics information, and the clipping infor-
mation for combining or compositing graphics images
with video images. The video commands of the video
command stream provide functions for configuring the
operation of the IVC. These functions include com-
mands for loading software video decoders in a decoder
block within the IVC. Video data transferred to the
IVC via the video API may be encoded in a variety of
different formats. The present invention provides a
means for dynamically loading a plurality of different
video decoders through a video command interface to a
video decode block within the IVC. Each of the
independent decoders within the decode block contains
processing logic for decoding a particular type of video
data to produce a uniform type of decoded video data
which is provided to the blender. The blender receives
the decoded video data and combines the video data
with graphics data as defined by the mask information,
the clipping information, and the graphics information.
`
`26 Claims, 14 Drawing Sheets
`
FOREIGN PATENT DOCUMENTS

0454414 10/1991 European Pat. Off.
`OTHER PUBLICATIONS
`
Worthington, "Truevision Enhances Its PC Video
Graphics Card", InfoWorld, vol. 12 No. 32 (Aug. 6,
1990), p. 23, Abstract Only.
`Quinnell, “Video Chip Set Handles Decompression
`
`1°-7
`
`
`
`
`
`Page 1 of 27
`
`
United States Patent [19]
Rhodes et al.

[54] INTEGRATED GRAPHICS AND VIDEO
COMPUTER DISPLAY SYSTEM

[75] Inventors: Kenneth E. Rhodes, Portland; Robert
T. Adams, Lake Oswego; Sherman
Janes, Portland; Rohan G. F. Coelho,
Hillsboro, all of Oreg.
`
[73] Assignee: Intel Corporation, Santa Clara, Calif.
`
[21] Appl. No.: 261,284
`
[22] Filed: Jun. 16, 1994
`
[51] Int. Cl.6 .................................... G06T 1/00
[52] U.S. Cl. ............................ 395/154; 345/118
[58] Field of Search ........... 343/539; 345/118; 395/153,
                                        154, 162, 163

[56] References Cited

U.S. PATENT DOCUMENTS

4,493,031   2/1985  Fukushima et al. ........ 358/240
4,530,009   7/1985  Mizokawa ................ 358/183
4,539,585   9/1985  Spackova et al. ......... 358/93
4,644,401   2/1987  Gaskins ................. 358/183
4,934,183   1/1991  Ohuchi .................. 364/521
5,012,342   4/1991  Olsen et al. ............ 358/181
5,027,212   6/1991  Marlton et al. .......... 358/183
5,119,030   6/1992  Kajimoto et al. ......... 340/723
5,161,102  11/1992  Griffin et al. .......... 364/488
5,195,177   3/1993  Kamiyama et al. ......... 395/162
5,220,312   6/1993  Lumelsky et al. ......... 340/721
5,251,301  10/1993  Cook .................... 395/163
5,264,310  11/1993  Buehler ................. 395/153
5,271,091  12/1993  Barker et al. ........... 395/162
5,273,364  12/1993  Li et al. ............... 345/118
5,274,753  12/1993  Roskowski et al. ........ 395/135
`
`
`
`Samsung Exhibit 1028
`
`
`
U.S. Patent

July 11, 1995

Sheet 1 of 14

5,432,900

FIGURE 1

[FIG. 1: block diagram of the computer system hardware; the rotated figure labels are not legible in this scan.]
`
`
`
`
`
U.S. Patent
`
`July 11, 1995
`
`Sheet 2 of 14
`
`5,432,900
`
`FIGURE 2
`
`(Prior Art)
`
`Application Software
`210
`
`Graphics
`API
`212
`
`Video
`API
`214
`
`Graphics
`Driver
`218
`
`Video
`Driver
`222
`
`Audio
`API
`216
`
Audio
Driver
226
`
`Graphics
`Frame Buffer
`220
`
`Video
`Frame Buffer
`224
`
`Audio
`Output Buffer
`228
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
`Sheet 3 of 14
`
`5,432,900
`
`FIGURE 3
`
Application Software
`
`
`310
`
`Video
`API
`314
`
`Audio
`API
`316
`
`
`
`Audio
`Driver
`324
`
`
`
`
`
`
`Graphics
`API
`312
`
`Integrated Graphics/Video
`Controller (IVC)
`320
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
`Sheet 4 of 14
`
`5,432,900
`
`FIGURE 4
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
`Sheet 5 of 14
`
`5,432,900
`
`FIGURE 5
`
`514
`
` graphics hardware
`
`
`
capability information in video
command interface decoder table
`526
`
`
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
Sheet 6 of 14
`
`5,432,900
`
`FIGURE 6
`
`Decoder Table
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
`Sheet 7 of 14
`
`5,432,900
`
FIGURE 7
`
Yes - 720

Send capability report command to graphics hardware

Receive capability report from graphics hardware/microcode

Store hardware/microcode capability information in video
command interface decoder table

Compare contents of decoder table with minimum capability table
728

Are all elements of minimum capability
table present in decoder table with equal or
greater capability?

Yes - 732

No - 734

Send request to Application Program through API to load missing
decoders
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
Sheet 8 of 14
`
`5,432,900
`
`FIGURE 8
`
`Minimum Capability Table
`
`
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
`Sheet 9 of 14
`
`5,432,900
`
FIGURE 9

Receive a request to load a video decoder. The request includes an address of the
decoder, a length, an identification code, and a capability code.

Search the decoder table for an entry with the same identification code received in the
request. If more than one entry with the same identification code exists in the
decoder table, take the entry with the greatest capability.

Compare capability code of the matched entry with the
capability code received in the request.

Use address and length in the request to transfer the decoder
to a free area in the video decode block 430
930

Update the decoder table with the address, length,
identification code, and capability code of the newly loaded
decoder
932
`
`
`
`
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
`Sheet 10 of 14
`
`5,432,900
`
`FIGURE 10
`
`
`
Search the decoder table for an entry corresponding
to the type of video data received
1014

Transfer video data to software selected decoder in
Decode Block (430)
1030
`
`
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
`Sheet 11 of 14
`
`5,432,900
`
`FIGURE 11
`
Receive request for decoder capability report
1112

Retrieve information in Decoder Table
1114

Send Decoder Table information to
Application Program through API
1116
`
`
`
`
U.S. Patent
`
`July 11, 1995
`
`Sheet 12 of 14
`
`5,432,900
`
`FIGURE 12
`
Receive a Graphics command from the Application
Program through Graphics API.
1212

Pass the Graphics command
through to the Display Driver
416 without having modified the
destination portion of the command.

Receive command status from the
Display Driver in response to the
graphics operation performed on
the Display Bitmap.
1216

Was the Graphics
command successfully
processed by the Display
Driver?
1218

Yes - 1222
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
`Sheet 13 of 14
`
`5,432,900
`
`FIGURE 13
`
`
`
`
`
`
`
`
Send the modified Graphics
command to the Display Driver.
The Display Driver performs the
same operation on the mask
bitmap as performed earlier on the
display bitmap.

Receive command status from the
Display Driver in response to the
graphics operation performed on
the mask bitmap.

Send command status to the
Application Program through the
graphics API.
1314
`
`
`
`
`U.S. Patent
`
`July 11, 1995
`
`Sheet 14 of 14
`
`5,432,900
`
`FIGURE 14
`
`
`
`Mask Driver Processing
`(Clipping Information Generation)
`1410
`
`
`
`
`
Receive a graphics command from the Application Program
through the Graphics API
1412
`
`Interpret the graphics command to derive the location and
`dimensions of a visible region of a window.
`1414
`
Generate clipping information defining the location and
dimensions of the visible portion of the window. This
clipping information defines a clipping region.
1416
`
`
Send the clipping information to the Blender 418 which
blends video data with graphics data inside the clipping
region and disables blending outside the clipping region.
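The clipping steps of FIGURE 14 reduce to deriving a rectangle from the visible region of a window and enabling blending only inside it. An illustrative sketch, assuming an (x, y, width, height) window representation that the patent does not itself specify:

```python
def clipping_region(window):
    """Derive clipping information from a window's visible region:
    its location and dimensions define a rectangle (the clipping
    region) inside which blending is enabled."""
    x, y, w, h = window
    return {"x0": x, "y0": y, "x1": x + w, "y1": y + h}


def blending_enabled(region, px, py):
    """Blending of video with graphics is enabled inside the clipping
    region and disabled outside it."""
    return (region["x0"] <= px < region["x1"]
            and region["y0"] <= py < region["y1"])
```

The half-open bounds are a conventional choice for pixel rectangles, not something stated in the text.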
`
`
`
`
`
`
`1
`
`5,432,900
`
INTEGRATED GRAPHICS AND VIDEO
COMPUTER DISPLAY SYSTEM
`
This is a continuation of application Ser. No.
07/901,280, filed Jun. 19, 1992, now abandoned.
`
`5
`
`BACKGROUND OF THE INVENTION
`1. Field Of The Invention
`
The present invention relates to the field of computer
`display systems. Specifically, the present invention per-
`tains to the manipulation and control of graphics and
`video data for display on a display medium. The present
`invention further relates to the control and manipula-
`tion of audio data that is output in synchronization with
`images on a display screen.
`2. Prior Art
`
`
`Many conventional computer graphics display sys-
`tems exist in the prior art. These systems provide mech-
`anisms for manipulating and displaying graphics objects
on a display screen. These graphic objects include
points, vectors, conical shapes, rectangles, polygons,
arcs, alphanumeric information, shaded or color-filled
regions, and other forms of graphic imagery, typically
`represented as objects in a two dimensional or three
`dimensional virtual space. A graphics computer pro-
`gram in these conventional systems provides mecha-
`nisms by which a computer user may manipulate
`graphic objects using functions provided by the graph-
ics software. This graphics software typically includes a
graphics driver software module that operates in con-
junction with a graphics controller to load a frame
buffer with digital information that is subsequently con-
verted to a form displayable on a display screen. The use
of graphics driver software, graphics controllers, and
frame buffers is well known to those of ordinary skill in
the art.
`
`25
`
`35
`
`Other prior art computer display systems provide a
`means for displaying video images on a display screen.
`These video images comprise streams of digital video
data encoded in several well known formats such as
MPEG, JPEG, and RTV format. Unlike graphical data,
`this video data includes no representations of individual
`objects within the video image. To the computer dis-
`play system, the video image is a homogeneous video
`data stream encoded in a particular way. In prior art
`systems, this video data is received by processing logic
`that includes a particular mechanism for decoding the
`video data and transferring the decoded video data into
`a frame buffer for display on the display screen. Because
`of the different nature of graphical data as compared
`with video data, prior art systems tend to handle graphi-
`cal and video data in separate subsystems within the
`computer display system. Thus, graphical data and
`video data typically take parallel and independent paths
`through prior art systems as the data is processed for
`display. In some cases, two separate frame buffers are
`used, one for graphical data and one for video data. In
`other systems, a single frame buffer is used; however,
`graphical data occupies one distinct region of the frame
buffer and video data occupies a different portion of the
`frame buffer.
`
`-15
`
`55
`
`A number of problems exist in prior art systems that
`process graphical and video data independently. First,
`these prior art systems cannot efficiently combine
`graphical and video images together into a composite
form that is easily manipulated. For example, scrolling a
`
`65
`
`Page 16 of 27
`Page 16 of 27
`
`2
graphical image across a video background typically
requires additional processing to prevent the image from
being destroyed. Secondly, synchronizing graphics and
video images in prior art systems is typically very diffi-
cult. Synchronization problems in prior art systems
result in video images that appear torn or disjoint. More-
over, aligning graphics images at the proper location
and time in a video image is difficult using prior art
techniques. Thirdly, the video data decoding schemes
`used in prior art systems are typically limited to a single
`decoding scheme. For example, less expensive graphics
`programs may provide software implemented video
`data decoders. Although these systems provide an inex-
`pensive solution,
`they tend to run slowly and often
`provide decoded video data of a low resolution. Other
`more expensive prior art systems provide graphics
`hardware that may be used to decode a video data
`stream. These systems are fast and provide high resolu-
`tion decoded video data; however, they are also sub-
`stantially more expensive. Moreover, users wanting to
`upgrade from a less expensive system. such as a soft-
`ware implemented decoder, to a more expensive sys-
tem, such as a graphics hardware implemented system,
`must first reconfigure or reinstall their graphics applica-
`tion program in order to take advantage of a different
video decoding technique. This additional impact at the
applications program level further increases the up-
grade cost to the computer user.
`Thus, a better means for integrating graphical and
`video information in a single computer display system is
`needed.
`
`SUMMARY OF THE INVENTION
`
The present invention integrates graphical, video,
and audio data into a single processing environment.
`Therefore, the present invention includes a processor
(CPU), a graphics controller, a capture controller, and
`an audio controller. The graphics controller is a special-
`ized graphics control processor coupled to a peripheral
bus. The graphics controller, operating under the soft-
ware control of the CPU, performs processing and com-
`positing of the graphics and video data to be displayed
`on a CRT display. This software control methodology
`is the subject of the present invention and is described in
`detail herein.
`
`Application software interfaces with system software
`of the present invention through a graphics application
program interface (API), a video API, and an audio
`API. These APIs provide a high level functional inter-
`face for the manipulation of graphics, video, and audio
`information. Unlike the prior art, however, the present
`invention employs an integrated graphics/video con-
troller (IVC). The IVC interfaces with application soft-
`ware through the graphics API and the video API. The
`output data generated by the IVC is a composite graph-
`ics and video image which is stored for output to a
`display device in a frame buffer. The IVC of the present
`invention also interfaces with an audio driver. The IVC
`provides the audio driver with synchronization infor-
`mation used to synchronize a video image stream with
`an audio data stream. The IVC receives timing informa-
`tion from a display timing control circuit. The timing
`information provided by the display timing control
circuit is used by the IVC for synchronizing the loading
`of the frame buffer. The display timing control circuit
provides a means for optimizing the operation of the
`IVC for the timing characteristics of a particular dis-
`play system that is used with the present invention.
`
`
`
`3
`The IVC receives graphics instructions through the
graphics API and video instructions through the video
`API. The graphics API interface provides a means by
`which an application program may interface with a
`display driver for displaying graphics images on a dis-
`play screen. A graphics command interface and a mask
`driver derive information from the graphics commands
`provided by an application program prior to transfer-
`ring the graphics information to a display driver. Three
`types of information derived from the graphics com-
`mand stream are produced by the mask driver. First,
`clipping information is derived by the mask driver.
`Clipping information is used to enable the display of
graphics information within a specified window on the
`display screen. A window is a rectangular portion of the
`display screen having particular attributes that are well
`known in the art. Clipping information is used to enable
`the display of graphics information within the window
`and to suppress the display of information outside the
`boundary of a window. The second type of information
`derived by the mask driver from the graphics command
`stream is the actual graphic content of a specified win-
`dow. The third type of information derived by the mask
`driver is mask information. A blender uses the mask
`information, the graphics information, and the clipping
`information for combining or compositing graphics
`information with video information.
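One plausible per-pixel reading of this blending rule can be sketched as follows. The one-bit mask-selects-source interpretation, and the scalar pixel values, are assumptions for illustration; the patent does not disclose the blender's internal rule at this level of detail.

```python
def blend_pixel(graphics, video, mask, clip):
    """Per-pixel compositing (sketched): inside the clipping region
    the mask bit selects whether the graphics value or the decoded
    video value appears; outside the clipping region blending is
    disabled and only graphics is shown."""
    if not clip:
        return graphics
    return video if mask == 0 else graphics
```

A full blender would apply this over every pixel of the composite image before it is written to the frame buffer.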
`The IVC also receives video image information
`through the video API. The video API provides a
`means by which an applications program may interface
with the IVC for the purpose of manipulating video
`images for display on a display screen. A video com-
`mand stream is received by a video command interface
`through the video API. The video command stream
`comprises video commands for configuring the opera-
`tion of the IVC and video data which comprises a com-
`bination of video and audio data with synchronization
`information for synchronizing the display of frames of
`video images.
`The video commands of the video command stream
`provide functions for configuring the operation of the
`IVC. These functions include commands for loading
`software video decoders in a decoder block within the
`IVC. Video data transferred to the IVC via the video
API may be encoded in a variety of different formats.
`The present invention provides a means for dynami-
`cally loading a plurality of different video decoders
`through a video command interface to a video decode
block within the IVC. Each of the independent
decoders within the decode block contains processing
`logic for decoding a particular type of video data to
`produce a uniform type of decoded video data which is
provided to the blender. The blender receives the de-
coded video data and combines the video data with
`graphics data as defined by the mask information, the
`graphics information, and the clipping information. The
`combined or cornposited image is then transferred to a
`display driver and output to a display through a frame
`buffer.
`
`
BRIEF DESCRIPTION OF THE DRAWINGS
`
`FIG. 1 is a block diagram of the computer system
`hardware used in the preferred embodiment.
`FIG. 2 is a block diagram of the prior art software
used in a system for processing graphics, video, and
`audio data.
`FIG. 3 is a block diagram of the software used in the
`preferred embodiment of the present invention.
`
`FIG. 4 is a block diagram of the integrated gra-
`phics/video controller (IVC).
`FIGS. 5-11 are flowcharts illustrating the processing
`logic of the video command interface.
`FIGS. 12-14 are flowcharts illustrating the process-
`ing logic of the mask driver.
`DETAILED DESCRIPTION OF THE
`PREFERRED EMBODIMENT
`
`The present invention is an integrated graphics and
`video display system for manipulating and viewing both
`graphical and video data in a single integrated system.
`In the following description, numerous specific details
are set forth in order to provide a thorough understand-
ing of the invention. However, it will be apparent to one
`of ordinary skill in the art that these specific details need
`not be used to practice the present invention. In other
instances, well known structures, circuits, and inter-
`faces have not been shown in detail in order not to
`obscure unnecessarily the present invention.
`Referring now to FIG. 1, a block diagram of the
`computer system hardware used in the present embodi-
`ment is illustrated. The computer system used in the
`preferred embodiment comprises a memory bus 100 for
`communicating information, a processor (CPU) 102
coupled with memory bus 100 for processing informa-
`tion, and a memory 104 coupled with bus 100 for storing
`information and instructions for CPU 102 and other
`components of the computer system. In the preferred
embodiment, memory bus 100 is a high speed bus which
`is well known to those of ordinary skill in the art. In the
`preferred embodiment. CPU 102 is an i486® brand
`microprocessor manufactured by the assignee of the
`present invention. i486® is a registered trademark of
Intel Corporation, Santa Clara, Calif. Memory 104 in
`the preferred embodiment comprises dynamic random
`access memory (DRAM) and read only memory
`(ROM). Coupling a CPU 102 and a memory 104 to-
`gether on a memory bus 100 as shown in FIG. 1 is well
`known in the art.
`FIG. 1 also illustrates a peripheral bus 110 coupled to
memory bus 100 via a bridge 112. The use of a bridge
`between two computer buses is well known in the art.
`Similarly, the use of a peripheral bus such as peripheral
component interface (PCI) is described in co-pending
`patent application Ser. No. 07/836,992.
`A disc drive 114 and a network interface or other
`input/output device 140 may also optionally be coupled
`to peripheral bus 110. Again, disc drive and network
`interface apparatus and their connection to a peripheral
bus 110 is well known in the art.
The present invention integrates graphical, video,
and audio data into a single processing environment.
Therefore, the present invention centers around the
interaction between CPU 102 and graphics controller
116, capture controller 118, and audio controller 120.
`Graphics controller 116 is a specialized graphics control
processor coupled to peripheral bus 110. Graphics con-
troller 116, operating under the software control of CPU
102, performs processing and compositing of the graph-
`ics and video data to be displayed on CRT display 126.
`This software control methodology is the subject of the
`present invention and is described in detail below. The
hardware platform in which the present invention oper-
ates is described in patent application Ser. No.
07/901,519, now U.S. Pat. No. 5,243,447, filed concur-
`rently with the present patent application. In general,
`this hardware platform comprises graphics controller
`
`
`
`5,432,900
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`5
`116 to which a video random access memory (VRAM)
`122 is coupled. Video RAM 122 provides storage for a
`frame buffer which represents a binary encoded repre-
`sentation of the picture elements (pixels) that are dis-
`played on CRT display 126. Video RAM 122 is used for
`storage of both graphical information and video infor-
`mation. Using the processing techniques described
`herein and the hardware platform described in the
`above-referenced co-pending patent application, both
`graphical and video data may be composited (i.e. com-
`bined) into a single frame buffer within video RAM 122.
This composite graphical and video image is converted
`to analog form by digital to analog converter (DAC)
124 and output to CRT display 126. In this manner,
`applications programs executing within CPU 102 can
use the processing logic of the present invention to
`display composited graphical/video images in arbitrary
`locations on CRT display 126.
`Capture controller 118 is coupled to peripheral bus
110. Capture controller 118 is a hardware system for
`receiving and processing video display data from video
source 128. Capture controller 118 receives video im-
`ages from video source 128 and generates video image
`data packets for transmission on peripheral bus 110. The
`video image data packets comprise still video images or
`a series of motion video images which include synchro-
`nization information. The video data may be moved on
`peripheral bus 110 into graphics controller 116 for com-
`positing with graphics data or moved to CPU 102 or
`memory 104 via bridge 112 and memory bus 100. Video
source 128 is a video camera, a video recording device,
`or other source for video image data. The structure and
`detailed operation of capture controller 118 is beyond
`the scope of the present invention.
Audio controller 120 is also coupled to peripheral bus
`110. Audio controller 120 provides a means for receiv-
`ing and processing audio data from a microphone 132 or
`other audio input device. Audio controller 120 gener-
`ates a series of audio data packets for transfer on periph-
`eral bus 110. The audio data packets represent a series of
`sounds retrieved from microphone 132 or other audio
`input device. Audio controller 120 also provides syn-
`chronization information with each audio data packet.
`Audio controller 120 also provides a means for decod-
`ing audio data received via peripheral bus 110 and out-
`putting audio signals to speaker 130 or other audio out-
`put device. The structure and detailed operation of
`audio controller 120 is beyond the scope of the present
`invention.
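The synchronization information carried with the video and audio data packets is what lets the two streams be matched downstream. As a hedged illustration only (the concrete form of the synchronization information is not specified here, so a per-packet timestamp is assumed), packets can be paired by comparing timestamps within a tolerance:

```python
def pair_av_packets(video_packets, audio_packets, tolerance):
    """Pair each video packet with the first audio packet whose
    synchronization timestamp lies within `tolerance` of its own.
    The 'sync' field and the pairing policy are illustrative."""
    pairs = []
    for v in video_packets:
        for a in audio_packets:
            if abs(v["sync"] - a["sync"]) <= tolerance:
                pairs.append((v["sync"], a["sync"]))
                break
    return pairs
```

A real controller would also account for packet ordering and dropped frames; this sketch only shows the matching idea.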
`
`Referring now to FIG. 2, a block diagram illustrates
`the prior art method for processing graphics, video, and
`audio data. Application software 210, residing in mem-
ory 104 and executed by CPU 102, comprises user level
`programming tools for manipulating and processing
`graphics, video, and audio data. Several application
`programs of this type exist in the prior art. An example
of one such application program is Autodesk Anima-
tor TM for Microsoft Windows TM developed by Auto-
desk TM, Inc. Application software 210 interfaces with
`system software through several application program
interfaces (API). APIs provide a means by which an
`application program may request service from system
software through the functions provided in an API. In
the prior art, three APIs are generally provided: 1) a
graphics API, 2) a video API, and 3) an audio API. As
illustrated in FIG. 2, graphics API 212 provides an
`interface between application software 210 and graph-
`ics driver 218. Graphics API 212 provides a list of pub-
`
`55
`
`65
`
`Page 18 of 27
`Page 18 of 27
`
`6
`lic functions and corresponding calling parameters with
`which application software 210 may create and manipu-
`late graphics objects that are stored for display in
`graphics frame buffer 220. Graphics objects typically
`include points, vectors, conical shapes, polygons, rec-
`tangles, blocks of text, arcs, and other shapes that may
`be represented in a two or three dimensional virtual
`space.
`In a similar manner, video API 214 provides an inter-
face between application software 210 and video driver
222. Video API 214 provides a set of functions for ma-
`nipulating video image data that is processed by video
`driver 222 and eventually stored for output to a display
screen in video frame buffer 224. Because prior art
systems typically used graphic display hardware which
was independent of video display hardware, the pro-
cessing of graphic and video data in prior art systems
`was typically done using independent and distinct hard-
`ware and software processes. This prior art design is
therefore considered a loosely coupled or low integra-
tion system because the graphic and video data is not
integrated as it travels from the application software
210 to a display screen (not shown). Rather, in the prior
art systems, graphic and video data travels two parallel
`paths through the processing architecture.
`Audio API 216 provides an interface between appli-
cation software 210 and audio driver 226. Audio API
`216 provides a set of interfaces for manipulating audio
information that is processed by audio driver 226 and
`stored in audio output buffer 228 for eventual output to
an audio emission system (not shown). The audio API
`216 provides a set of audio related functions with which
`application software 210 may manipulate audio infor-
`mation. Again, the loosely coupled or low integration
`approach used in the prior art is apparent as audio infor-
`mation travels an independent path from application
software 210 to audio output buffer 228. The indepen-
dent path traveled by audio data in the prior art systems
is less of a performance penalty, however, because
audio information is destined for a different output de-
`vice than graphics and video data. Both graphics and
`video data, on the other hand, are typically displayed on
`the same display device.
`Several problems exist with the low integration ap-
`proach used in the prior art. Because graphics and video
information is processed using separate processes, prior
`art systems do not provide an efficient means for com-
`bining graphics and video images together in a compos-
`ite form. Secondly, because certain items of hardware in
`the prior art design are dedicated to processing either
`graphics or video data, some items of the prior art de-
`sign must be duplicated for processing both graphics
`and video data. Some of these duplicated hardware
`elements include frame buffer memory 220 and 224.
`Because of the independent paths traveled by graphics
`and video data in the prior art system, other hardware
`logic is required to handle the parallel paths. This addi-
tional logic includes multiplexing logic which deter-
mines which frame buffer's data is to be displayed for
each pixel location. Moreover, additional synchroniza-
`tion logic is required to synchronize the output of
`graphics and video data as the information is merged
`just before output to the display device. Thus, the prior
art design suffers from low graphics/video integration
`functionality and increased hardware cost due to the
`low integration design.
`Referring now to FIG. 3, the structure of the system
`software used in the present invention is illustrated.
`
`
`
`7
`Again, application software 310 interfaces with system
`software through graphics API 312, video API 314, and
audio API 316. In a manner similar to the application
program interfaces illustrated in FIG. 2, APIs 312, 314,
`and 316 provide a high level functional interface for the
`manipulation of graphics, video, and audio information.
Unlike the prior art, however, the present invention
employs an integrated graphics/video controller (IVC)
320. IVC 320 interfaces with application software 310
through graphics API 312 and video API 314. The
`structure and operation of IVC 320 is illustrated in
`detail in connection with FIG. 4. The output data gen-
erated by IVC 320 is a composite graphics and video
`image which is stored for output to a display device in
frame buffer 322. IVC 320 also interfaces with audio
`driver 324. IVC 320 provides audio driver 324 with
`synchronization information used to synchronize a
video image stream with an audio data stream. IVC 320
`receives timing information from a display timing con-
`trol circuit 326. The timing information provided by
`display timing control 326 is used by IVC 320 for syn-
`chronizing the combination of graphics and video data
`and for controlling when the combined data is loaded
into the frame buffer 322. Display timing control circuit
`326 provides a means for optimizing the operation of
`IVC 320 for the timing characteristics of a particular
`display system that is used with the present invention.
`Referring now to FIG. 4, a detailed block diagram of
`the structure of integrated graphics/video controller
`(IVC) 320 is illustrated. IVC 320 receives graphics
instructions through graphics API 312 and video in-
structions through video API 314. In the preferred em-
bodiment, graphics API 312 is a GDI interface devel-
`oped by Microsoft Corporation of Redmond, Wash.
`The graphics API interface 312 provides a means by
which an application program may interface with a
`driver for displaying graphics images on a display
`screen. In the present invention, the IVC 320 receives
`graphics commands via graphics API 312 through a
graphics command interface 410. Graphics command
`interface 410 and mask driver 414 derive information
`from the graphics commands provided by an applica-
`tion program prior to transferring the graphics informa-
tion to display driver 416. Three types of information
are derived from the graphics command stream by
mask driver 414. First, clipping information is de-
`rived by mask driver 414. Clipping information is used
`to enable the display of graphics information within a
`specified window on the display screen. A window is a
`rectangular portion of the display screen having partic-
`ular attributes that are well known in the art. Clipping
`information is used to enable the display of graphics
`information within the window and to suppress the
display of information outside the boundary of a window.
`The clipping information derived by mask driver 414 is
stored in area 424 as illustrated in FIG. 4. The second
type of information derived by mask driver 414 from
`the graphics command stream is the actual graphic
`content of a specified window. The graphic contents of
`a window is specified in terms of a pixel bitmap. Each
`pixel within the window is represented by one or more
`binary bits of memory in the bitmap. This graphics
information as derived by mask driver 414 is stored in
`graphics area 422. The graphics information is also
`