United States Patent [19]
Rhodes et al.

US005432900A

[11] Patent Number: 5,432,900
[45] Date of Patent: Jul. 11, 1995
[54] INTEGRATED GRAPHICS AND VIDEO COMPUTER DISPLAY SYSTEM

[75] Inventors: Kenneth E. Rhodes, Portland; Robert T. Adams, Lake Oswego; Sherman Jones, Portland; Ruben G. F. Coelho, Hillsboro, all of Oreg.

[73] Assignee: Intel Corporation, Santa Clara, Calif.

[21] Appl. No.: 261,284

[22] Filed: Jun. 16, 1994
[51] Int. Cl.6 ............................ G06F 1/00
[52] U.S. Cl. ..................... 395/154; 345/118
[58] Field of Search ............. 395/153, 154, 162, 163; 345/118
[56] References Cited

U.S. PATENT DOCUMENTS

4,493,031  2/1985  Fukushima et al. ...... 358/240
4,530,009  7/1985  Mizokawa ............... 358/183
4,539,585  9/1985  Spackova et al. ........ 358/93
4,644,401  2/1987  Gaskins ................ 358/183
4,934,183  1/1991  Ohuchi ................. 364/521
5,012,342  4/1991  Olsen et al. ........... 358/181
5,027,212  6/1991  Marlton et al. ......... 358/183
5,119,080  6/1992  Kajimoto et al. ........ 340/723
5,161,102 11/1992  Griffin et al. ......... 364/488
5,195,177  3/1993  Kamiyama et al. ........ 340/721
5,220,312  6/1993  Lumelsky et al. ........ 395/162
5,251,301 10/1993  Cook ................... 395/163
5,264,837 11/1993  Buehler ................ 395/153
5,271,091 12/1993  Barker et al. .......... 345/118
5,274,364 12/1993  Li et al. .............. 395/162
5,274,753 12/1993  Roskowski et al. ....... 395/155
FOREIGN PATENT DOCUMENTS

0454414 10/1991 European Pat. Off.

OTHER PUBLICATIONS

Worthington, "True Vision Enhances Its PC Video Graphics Card", InfoWorld, vol. 12, No. 32 (Aug. 6, 1990), p. 23. Abstract Only.
Quinnell, "Video Chip Set Handles Decompression, Special Effects, and Mixing in Many Formats", EDN, vol. 35, No. 25 (Dec. 6, 1990), pp. 72-73. Abstract Only.
Primary Examiner—Heather R. Herndon
Assistant Examiner—N. Kenneth Burraston
Attorney, Agent, or Firm—Blakely, Sokoloff, Taylor & Zafman
[57] ABSTRACT
Graphical, video, and audio data is integrated into a single processing environment. The present invention employs an integrated graphics/video controller (IVC) which interfaces with application software through a graphics API and a video API. The IVC receives graphics commands through the graphics API and video commands through the video API. A mask driver produces information from the graphics commands including clipping information, graphics information and mask information. A blender uses the mask information, the graphics information, and the clipping information for combining or compositing graphics images with video images. The video commands of the video command stream provide functions for configuring the operation of the IVC. These functions include commands for loading software video decoders in a decoder block within the IVC. Video data transferred to the IVC via the video API may be encoded in a variety of different formats. The present invention provides a means for dynamically loading a plurality of different video decoders through a video command interface to a video decode block, both within the IVC. Each of the independent decoders within the decode block contains processing logic for decoding a particular type of video data to produce a uniform type of decoded video data which is provided to the blender. The blender receives the decoded video data and combines the video data with graphics data as defined by the mask information, the clipping information, and the graphics information.

26 Claims, 14 Drawing Sheets
Page 1 of 27

HTC-LG-SAMSUNG EXHIBIT 1028
U.S. Patent    July 11, 1995    Sheet 1 of 14    5,432,900

FIGURE 1

[FIG. 1: block diagram of the computer system hardware; the diagram's labels (memory bus, CPU, memory, bridge, peripheral bus, graphics controller, capture controller, audio controller) are not legible in this extraction.]
U.S. Patent    July 11, 1995    Sheet 2 of 14    5,432,900

FIGURE 2
(Prior Art)

Application Software 210

Graphics API 212    Video API 214    Audio API 216

Graphics Driver 218    Video Driver 222    Audio Driver 226

Graphics Frame Buffer 220    Video Frame Buffer 224    Audio Output Buffer 228
U.S. Patent    July 11, 1995    Sheet 3 of 14    5,432,900

FIGURE 3

Application Software 310

Graphics API 312    Video API 314    Audio API 316

Integrated Graphics/Video Controller (IVC) 320    Audio Driver 324
U.S. Patent    July 11, 1995    Sheet 4 of 14    5,432,900

FIGURE 4

[FIG. 4: block diagram of the integrated graphics/video controller (IVC); diagram content not reproduced in this extraction.]
U.S. Patent    July 11, 1995    Sheet 5 of 14    5,432,900

FIGURE 5

[Flowchart, partially recoverable: step 514 involves the graphics hardware; step 526 stores capability information in the video command interface decoder table. Other steps are not legible in this extraction.]
U.S. Patent    July 11, 1995    Sheet 6 of 14    5,432,900

FIGURE 6

Decoder Table

[Table contents not reproduced in this extraction.]
U.S. Patent    July 11, 1995    Sheet 7 of 14    5,432,900

FIGURE 7

[Flowchart, reconstructed from a degraded scan; illegible steps omitted:]
- (Yes — 720)
- Send capability report command to graphics hardware
- Receive capability report from graphics hardware
- Store hardware capability information in video command interface decoder table
- Compare against minimum capability table
- Decision: Are all elements of minimum capability table present in decoder table with equal or greater capability? (Yes — 732 / No — 734)
- If No: Send request to Application Program through API to load missing decoders (736)
U.S. Patent    July 11, 1995    Sheet 8 of 14    5,432,900

FIGURE 8

Minimum Capability Table

[Table contents not reproduced in this extraction.]
U.S. Patent    July 11, 1995    Sheet 9 of 14    5,432,900

FIGURE 9

[Flowchart, reconstructed from a degraded scan:]
- Receive a request to load a video decoder. The request includes an address of the decoder, a length, an identification code and a capability code.
- Search the decoder table for an entry with the identification code matching the one in the request. If more than one entry with the same identification code exists in the decoder table, take the entry with the greatest capability. (914)
- Decision: Is matched entry found in decoder table? (916)
- Compare capability code of the matched entry with capability code received in the request.
- Use address and length in the request to copy the decoder into a free area in the video decode block 430. (930)
- Update the decoder table with the address, length, identification code and capability code of the newly loaded decoder. (932)
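The decoder-loading flow shown in FIGURE 9 can be sketched in Python. This is an illustrative sketch only; the `DecoderEntry` structure and `load_decoder` function are hypothetical names, not the patent's implementation.

```python
# Sketch of the FIG. 9 decoder-table update (names are hypothetical).
from dataclasses import dataclass

@dataclass
class DecoderEntry:
    ident: int        # identification code (type of video data handled)
    capability: int   # capability code (higher = more capable)
    address: int      # location of the decoder in the decode block
    length: int

def load_decoder(table, request):
    """Load a decoder only if no equal-or-better one is already present."""
    # Entries with a matching identification code; if several exist,
    # take the one with the greatest capability (step 914).
    matches = [e for e in table if e.ident == request.ident]
    best = max(matches, key=lambda e: e.capability, default=None)
    if best is not None and best.capability >= request.capability:
        return False  # existing decoder is at least as capable; skip the load
    # Copy the decoder into a free area of the decode block (step 930)
    # and record it in the decoder table (step 932).
    table.append(request)
    return True
```

Under this sketch, a request to load a decoder that is weaker than one already resident is simply rejected, which matches the capability comparison of steps 916 and following.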
U.S. Patent    July 11, 1995    Sheet 10 of 14    5,432,900

FIGURE 10

[Flowchart, partially recoverable:]
- Search the decoder table for an entry corresponding to the type of video data received (1014)
- Transfer video data to the software implemented decoder in the Decode Block (430) (1030)
U.S. Patent    July 11, 1995    Sheet 11 of 14    5,432,900

FIGURE 11

- Receive request for capability report (1112)
- Retrieve information in the Decoder Table (1114)
- Send Decoder Table information to Application Program through API (1116)
U.S. Patent    July 11, 1995    Sheet 12 of 14    5,432,900

FIGURE 12

[Flowchart, mostly illegible in this extraction. Recoverable fragments: a graphics command is received through the Graphics API and passed to Display Driver 416, in some cases without having been modified by the mask driver; a decision asks whether the Graphics command was successfully processed by the Display Driver (1218; Yes — 1222).]
U.S. Patent    July 11, 1995    Sheet 13 of 14    5,432,900

FIGURE 13

[Flowchart, reconstructed from a degraded scan:]
- Send the modified Graphics command to the Display Driver. The Display Driver performs the same operation on the mask bitmap as performed earlier on the display bitmap.
- Receive command status from the Display Driver in response to the graphics operation performed on the mask bitmap.
- Send command status to the Application Program through the graphics API. (1314)
U.S. Patent    July 11, 1995    Sheet 14 of 14    5,432,900

FIGURE 14

Mask Driver Processing
(Clipping Information Generation)
1410

- Receive a graphics command from the Application Program through the Graphics API. (1412)
- Interpret the graphics command to derive the location and dimensions of a visible region of a window. (1414)
- Generate clipping information defining the location and dimensions of the visible portion of the window. This clipping information defines a clipping region. (1416)
- Send the clipping information to the Blender 418, which blends video data with graphics data inside the clipping region and disables blending outside the clipping region. (1418)
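The clipping-information generation of FIGURE 14 amounts to intersecting a window's rectangle with the visible screen area. A minimal sketch, with a hypothetical `clipping_region` helper (not from the patent):

```python
# Sketch of steps 1412-1418: derive a window's visible region and emit it
# as a clipping rectangle. All names here are illustrative assumptions.
def clipping_region(window, screen_w, screen_h):
    """window = (x, y, width, height); returns the visible portion,
    clamped to the screen, as (left, top, right, bottom)."""
    x, y, w, h = window
    left, top = max(0, x), max(0, y)
    right = min(screen_w, x + w)
    bottom = min(screen_h, y + h)
    return (left, top, right, bottom)
```

A window partially off the left edge of a 640x480 screen, for example, yields a clipping region that starts at column 0, so the blender never composites into pixels that are not visible.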
INTEGRATED GRAPHICS AND VIDEO COMPUTER DISPLAY SYSTEM
`
This is a continuation of application Ser. No. 07/901,280, filed Jun. 19, 1992, now abandoned.
BACKGROUND OF THE INVENTION

1. Field Of The Invention

The present invention relates to the field of computer display systems. Specifically, the present invention pertains to the manipulation and control of graphics and video data for display on a display medium. The present invention further relates to the control and manipulation of audio data that is output in synchronization with images on a display screen.

2. Prior Art
Many conventional computer graphics display systems exist in the prior art. These systems provide mechanisms for manipulating and displaying graphics objects on a display screen. These graphic objects include points, vectors, conical shapes, rectangles, polygons, arcs, alphanumeric information, shaded or color-filled regions, and other forms of graphic imagery, typically represented as objects in a two dimensional or three dimensional virtual space. A graphics computer program in these conventional systems provides mechanisms by which a computer user may manipulate graphic objects using functions provided by the graphics software. This graphics software typically includes a graphics driver software module that operates in conjunction with a graphics controller to load a frame buffer with digital information that is subsequently converted to a form displayable on a display screen. The use of graphics driver software, graphics controllers, and frame buffers is well known to those of ordinary skill in the art.
Other prior art computer display systems provide a means for displaying video images on a display screen. These video images comprise streams of digital video data encoded in several well known formats such as MPEG, JPEG, and RTV format. Unlike graphical data, this video data includes no representations of individual objects within the video image. To the computer display system, the video image is a homogeneous video data stream encoded in a particular way. In prior art systems, this video data is received by processing logic that includes a particular mechanism for decoding the video data and transferring the decoded video data into a frame buffer for display on the display screen. Because of the different nature of graphical data as compared with video data, prior art systems tend to handle graphical and video data in separate subsystems within the computer display system. Thus, graphical data and video data typically take parallel and independent paths through prior art systems as the data is processed for display. In some cases, two separate frame buffers are used, one for graphical data and one for video data. In other systems, a single frame buffer is used; however, graphical data occupies one distinct region of the frame buffer and video data occupies a different portion of the frame buffer.
A number of problems exist in prior art systems that process graphical and video data independently. First, these prior art systems cannot efficiently combine graphical and video images together into a composite form that is easily manipulated. For example, scrolling a graphical image across a video background typically requires additional processing to prevent the image from being destroyed. Secondly, synchronizing graphics and video images in prior art systems is typically very difficult. Synchronization problems in prior art systems result in video images that appear torn or disjoint. Moreover, aligning graphics images at the proper location and time in a video image is difficult using prior art techniques. Thirdly, the video data decoding schemes used in prior art systems are typically limited to a single decoding scheme. For example, less expensive graphics programs may provide software implemented video data decoders. Although these systems provide an inexpensive solution, they tend to run slowly and often provide decoded video data of a low resolution. Other more expensive prior art systems provide graphics hardware that may be used to decode a video data stream. These systems are fast and provide high resolution decoded video data; however, they are also substantially more expensive. Moreover, users wanting to upgrade from a less expensive system, such as a software implemented decoder, to a more expensive system, such as a graphics hardware implemented system, must first reconfigure or reinstall their graphics application program in order to take advantage of a different video decoding technique. This additional impact at the applications program level further increases the upgrade cost to the computer user.

Thus, a better means for integrating graphical and video information in a single computer display system is needed.
SUMMARY OF THE INVENTION

The present invention integrates graphical, video, and audio data into a single processing environment. Therefore, the present invention includes a processor (CPU), a graphics controller, a capture controller, and an audio controller. The graphics controller is a specialized graphics control processor coupled to a peripheral bus. The graphics controller, operating under the software control of the CPU, performs processing and compositing of the graphics and video data to be displayed on a CRT display. This software control methodology is the subject of the present invention and is described in detail herein.
Application software interfaces with system software of the present invention through a graphics application program interface (API), a video API, and an audio API. These APIs provide a high level functional interface for the manipulation of graphics, video, and audio information. Unlike the prior art, however, the present invention employs an integrated graphics/video controller (IVC). The IVC interfaces with application software through the graphics API and the video API. The output data generated by the IVC is a composite graphics and video image which is stored for output to a display device in a frame buffer. The IVC of the present invention also interfaces with an audio driver. The IVC provides the audio driver with synchronization information used to synchronize a video image stream with an audio data stream. The IVC receives timing information from a display timing control circuit. The timing information provided by the display timing control circuit is used by the IVC for synchronizing the loading of the frame buffer. The display timing control circuit provides a means for optimizing the operation of the IVC for the timing characteristics of a particular display system that is used with the present invention.
`
`

`

The IVC receives graphics instructions through the graphics API and video instructions through the video API. The graphics API interface provides a means by which an application program may interface with a display driver for displaying graphics images on a display screen. A graphics command interface and a mask driver derive information from the graphics commands provided by an application program prior to transferring the graphics information to a display driver. Three types of information derived from the graphics command stream are produced by the mask driver. First, clipping information is derived by the mask driver. Clipping information is used to enable the display of graphics information within a specified window on the display screen. A window is a rectangular portion of the display screen having particular attributes that are well known in the art. Clipping information is used to enable the display of graphics information within the window and to suppress the display of information outside the boundary of a window. The second type of information derived by the mask driver from the graphics command stream is the actual graphic content of a specified window. The third type of information derived by the mask driver is mask information. A blender uses the mask information, the graphics information, and the clipping information for combining or compositing graphics information with video information.
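A per-pixel sketch of the blender's decision, assuming a boolean mask and a rectangular clipping region; the function names and pixel representation are illustrative assumptions, not the patent's implementation:

```python
# Sketch of the blender: video shows through where the mask allows it,
# but only inside the clipping region. Names are hypothetical.
def in_clip_region(x, y, clip):
    """clip = (left, top, width, height) of the window's visible portion."""
    left, top, w, h = clip
    return left <= x < left + w and top <= y < top + h

def composite_pixel(x, y, graphics, video, mask, clip):
    """Blend video with graphics inside the clip region, per the mask bit."""
    if in_clip_region(x, y, clip) and mask:
        return video      # mask bit set inside the window: video pixel
    return graphics       # mask clear, or outside the clip region
```

Outside the clipping region blending is disabled entirely, so graphics (or whatever lies under the window) is left untouched regardless of the mask.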
The IVC also receives video image information through the video API. The video API provides a means by which an applications program may interface with the IVC for the purpose of manipulating video images for display on a display screen. A video command stream is received by a video command interface through the video API. The video command stream comprises video commands for configuring the operation of the IVC and video data which comprises a combination of video and audio data with synchronization information for synchronizing the display of frames of video images.
The video commands of the video command stream provide functions for configuring the operation of the IVC. These functions include commands for loading software video decoders in a decoder block within the IVC. Video data transferred to the IVC via the video API may be encoded in a variety of different formats. The present invention provides a means for dynamically loading a plurality of different video decoders through a video command interface to a video decode block, both within the IVC. Each of the independent decoders within the decode block contains processing logic for decoding a particular type of video data to produce a uniform type of decoded video data which is provided to the blender. The blender receives the decoded video data and combines the video data with graphics data as defined by the mask information, the graphics information, and the clipping information. The combined or composited image is then transferred to a display driver and output to a display through a frame buffer.
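The dynamically loaded decoder block described above can be sketched as a table of format-specific decode routines that all yield one uniform representation. The format names and functions below are toy stand-ins, not the patent's decoders:

```python
# Sketch of a dynamically loaded decode block: each decoder accepts its own
# encoded format and returns a uniform decoded form (here, a list of pixel
# rows) for the blender. All names are hypothetical.
decoder_block = {}

def register_decoder(fmt, decode_fn):
    """The video command interface 'loads' a decoder into the decode block."""
    decoder_block[fmt] = decode_fn

def decode(fmt, payload):
    """Route encoded video data to the matching decoder."""
    if fmt not in decoder_block:
        raise KeyError(f"no decoder loaded for format {fmt!r}")
    return decoder_block[fmt](payload)

# Two toy 'formats' standing in for streams such as MPEG or RTV:
register_decoder("raw", lambda data: [list(data)])
register_decoder("rle", lambda runs: [[v for v, n in runs for _ in range(n)]])
```

Because every decoder emits the same uniform representation, the blender needs no knowledge of the encoding that the data arrived in.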
`
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of the computer system hardware used in the preferred embodiment.
FIG. 2 is a block diagram of the prior art software used in a system for processing graphics, video, and audio data.
FIG. 3 is a block diagram of the software used in the preferred embodiment of the present invention.
FIG. 4 is a block diagram of the integrated graphics/video controller (IVC).
FIGS. 5-11 are flowcharts illustrating the processing logic of the video command interface.
FIGS. 12-14 are flowcharts illustrating the processing logic of the mask driver.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

The present invention is an integrated graphics and video display system for manipulating and viewing both graphical and video data in a single integrated system. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be apparent to one of ordinary skill in the art that these specific details need not be used to practice the present invention. In other instances, well known structures, circuits, and interfaces have not been shown in detail in order not to obscure unnecessarily the present invention.
Referring now to FIG. 1, a block diagram of the computer system hardware used in the present embodiment is illustrated. The computer system used in the preferred embodiment comprises a memory bus 100 for communicating information, a processor (CPU) 102 coupled with memory bus 100 for processing information, and a memory 104 coupled with bus 100 for storing information and instructions for CPU 102 and other components of the computer system. In the preferred embodiment, memory bus 100 is a high speed bus which is well known to those of ordinary skill in the art. In the preferred embodiment, CPU 102 is an i486® brand microprocessor manufactured by the assignee of the present invention. i486® is a registered trademark of Intel Corporation, Santa Clara, Calif. Memory 104 in the preferred embodiment comprises dynamic random access memory (DRAM) and read only memory (ROM). Coupling a CPU 102 and a memory 104 together on a memory bus 100 as shown in FIG. 1 is well known in the art.
FIG. 1 also illustrates a peripheral bus 110 coupled to memory bus 100 via a bridge 112. The use of a bridge between two computer buses is well known in the art. Similarly, the use of a peripheral bus such as the peripheral component interface (PCI) is described in co-pending patent application Ser. No. 07/836,992. A disc drive 114 and a network interface or other input/output device 140 may also optionally be coupled to peripheral bus 110. Again, disc drive and network interface apparatus and their connection to a peripheral bus 110 are well known in the art.
The present invention integrates graphical, video, and audio data into a single processing environment. Therefore, the present invention centers around the interaction between CPU 102 and graphics controller 116, capture controller 118, and audio controller 120. Graphics controller 116 is a specialized graphics control processor coupled to peripheral bus 110. Graphics controller 116, operating under the software control of CPU 102, performs processing and compositing of the graphics and video data to be displayed on CRT display 126. This software control methodology is the subject of the present invention and is described in detail below. The hardware platform in which the present invention operates is described in patent application Ser. No. 07/901,519, now U.S. Pat. No. 5,243,447, filed concurrently with the present patent application. In general, this hardware platform comprises graphics controller
116 to which a video random access memory (VRAM) 122 is coupled. Video RAM 122 provides storage for a frame buffer which represents a binary encoded representation of the picture elements (pixels) that are displayed on CRT display 126. Video RAM 122 is used for storage of both graphical information and video information. Using the processing techniques described herein and the hardware platform described in the above-referenced co-pending patent application, both graphical and video data may be composited (i.e., combined) into a single frame buffer within video RAM 122. This composite graphical and video image is converted to analog form by a digital to analog converter (DAC) and output to CRT display 126. In this manner, applications programs executing within CPU 102 can use the processing logic of the present invention to display composited graphical/video images in arbitrary locations on CRT display 126.
Capture controller 118 is coupled to peripheral bus 110. Capture controller 118 is a hardware system for receiving and processing video display data from video source 128. Capture controller 118 receives video images from video source 128 and generates video image data packets for transmission on peripheral bus 110. The video image data packets comprise still video images or a series of motion video images which include synchronization information. The video data may be moved on peripheral bus 110 into graphics controller 116 for compositing with graphics data or moved to CPU 102 or memory 104 via bridge 112 and memory bus 100. Video source 128 is a video camera, a video recording device, or other source for video image data. The structure and detailed operation of capture controller 118 is beyond the scope of the present invention.
Audio controller 120 is also coupled to peripheral bus 110. Audio controller 120 provides a means for receiving and processing audio data from a microphone 132 or other audio input device. Audio controller 120 generates a series of audio data packets for transfer on peripheral bus 110. The audio data packets represent a series of sounds retrieved from microphone 132 or other audio input device. Audio controller 120 also provides synchronization information with each audio data packet. Audio controller 120 also provides a means for decoding audio data received via peripheral bus 110 and outputting audio signals to speaker 130 or other audio output device. The structure and detailed operation of audio controller 120 is beyond the scope of the present invention.
Referring now to FIG. 2, a block diagram illustrates the prior art method for processing graphics, video, and audio data. Application software 210, residing in memory 104 and executed by CPU 102, comprises user level programming tools for manipulating and processing graphics, video, and audio data. Several application programs of this type exist in the prior art. An example of one such application program is AutoDesk Animator™ for Microsoft Windows™, developed by AutoDesk™, Inc. Application software 210 interfaces with system software through several application program interfaces (API). APIs provide a means by which an application program may request service from system software through the functions provided in an API. In the prior art, three APIs are generally provided: 1) a graphics API, 2) a video API, and 3) an audio API. As
illustrated in FIG. 2, graphics API 212 provides an interface between application software 210 and graphics driver 218. Graphics API 212 provides a list of public functions and corresponding calling parameters with which application software 210 may create and manipulate graphics objects that are stored for display in graphics frame buffer 220. Graphics objects typically include points, vectors, conical shapes, polygons, rectangles, blocks of text, arcs, and other shapes that may be represented in a two or three dimensional virtual space.
In a similar manner, video API 214 provides an interface between application software 210 and video driver 222. Video API 214 provides a set of functions for manipulating video image data that is processed by video driver 222 and eventually stored for output to a display screen in video frame buffer 224. Because prior art systems typically used graphic display hardware which was independent of video display hardware, the processing of graphic and video data in prior art systems was typically done using independent and distinct hardware and software processes. This prior art design is therefore considered a loosely coupled or low integration system because the graphic and video data is not integrated as it travels from the application software 210 to a display screen (not shown). Rather, in the prior art systems, graphic and video data travels two parallel paths through the processing architecture.
Audio API 216 provides an interface between application software 210 and audio driver 226. Audio API 216 provides a set of interfaces for manipulating audio information that is processed by audio driver 226 and stored in audio output buffer 228 for eventual output to an audio emission system (not shown). The audio API 216 provides a set of audio related functions with which application software 210 may manipulate audio information. Again, the loosely coupled or low integration approach used in the prior art is apparent as audio information travels an independent path from application software 210 to audio output buffer 228. The independent path traveled by audio data in the prior art systems is less of a performance penalty, however, because audio information is destined for a different output device than graphics and video data. Both graphics and video data, on the other hand, are typically displayed on the same display device.
Several problems exist with the low integration approach used in the prior art. Because graphics and video information is processed using separate processes, prior art systems do not provide an efficient means for combining graphics and video images together in a composite form. Secondly, because certain items of hardware in the prior art design are dedicated to processing either graphics or video data, some items of the prior art design must be duplicated for processing both graphics and video data. Some of these duplicated hardware elements include frame buffer memories 220 and 224. Because of the independent paths traveled by graphics and video data in the prior art system, other hardware logic is required to handle the parallel paths. This additional logic includes multiplexing logic which determines which frame buffer's data is to be displayed for each pixel location. Moreover, additional synchronization logic is required to synchronize the output of graphics and video data as the information is merged just before output to the display device. Thus, the prior art design suffers from low graphics/video integration functionality and increased hardware cost due to the low integration design.
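A minimal sketch of the prior-art per-pixel multiplexing just described, assuming two full frame buffers and a per-pixel select plane; all names are hypothetical, and real systems implemented this in hardware rather than software:

```python
# Sketch of prior-art duplication: two frame buffers plus multiplexing
# logic choosing, for each pixel location, which buffer drives the display.
def mux_frame(graphics_fb, video_fb, select):
    """select[y][x] True -> take the video buffer's pixel, else graphics."""
    return [
        [v if s else g for g, v, s in zip(grow, vrow, srow)]
        for grow, vrow, srow in zip(graphics_fb, video_fb, select)
    ]
```

The cost the patent criticizes is visible even in the sketch: every frame requires both buffers to be fully populated, plus a third plane of select state, before any pixel reaches the display.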
Referring now to FIG. 3, the structure of the system software used in the present invention is illustrated. Again, application software 310 interfaces with system software through graphics API 312, video API 314, and audio API 316. In a manner similar to the application program interfaces illustrated in FIG. 2, APIs 312, 314, and 316 provide a high level functional interface for the manipulation of graphics, video, and audio information. Unlike the prior art, however, the present invention employs an integrated graphics/video controller (IVC) 320. IVC 320 interfaces with application software 310 through graphics API 312 and video API 314. The structure and operation of IVC 320 is illustrated in detail in connection with FIG. 4. The output data generated by IVC 320 is a composite graphics and video image which is stored for output to a display device in frame buffer 322. IVC 320 also interfaces with audio driver 324. IVC 320 provides audio driver 324 with synchronization information used to synchronize a video image stream with an audio data stream. IVC 320 receives timing information from a display timing control circuit 326. The timing information provided by display timing control 326 is used by IVC 320 for synchronizing the combination of graphics and video data and for controlling when the combined data is loaded into the frame buffer 322. Display timing control circuit 326 provides a means for optimizing the operation of IVC 320 for the timing characteristics of a particular display system that is used with the present invention.
Referring now to FIG. 4, a detailed block diagram of the structure of integrated graphics/video controller (IVC) 320 is illustrated. IVC 320 receives graphics instructions through graphics API 312 and video instructions through video API 314. In the preferred embodiment, graphics API 312 is a GDI interface developed by Microsoft Corporation of Redmond, Wash. The graphics API interface 312 provides a means by which an application program may interface with a display driver for displaying graphics images on a display screen. In the present invention, the IVC 320 receives graphics commands via graphics API 312 through a graphics command interface 410. Graphics command interface 410 and mask driver 414 derive information from the graphics commands provided by an application program prior to transferring the graphics information to display driver 416. Three types of information are derived from the graphics command stream by mask driver 414. First, clipping information is derived by mask driver 414. Clipping information is used to enable the display of graphics information within a specified window on the display screen. A window is a rectangular portion of the display screen having particular attributes that are well known in the art. Clipping information is used to enable the display of graphics information within the window and to suppress the display of information outside the boundary of a window. The clipping information derived by mask driver 414 is stored in area 424 as illustrated in FIG. 4. The second type of information derived by mask driver 414 from the graphics command stream is the actual graphic content of a specified window. The graphic contents of a window is specified in terms of a pixel bitmap. Each pixel within the window is represented by one or more binary bits of memory in the bitmap. This graphics information as derived by mask driver 414 is stored in graphics area 422. The graphics information is also transferred from mask driver 414 to display driver 416 for subsequent transfer to frame buffer 322. The third type of information derived by mask driver 414 is mask information. Because a particular window may contain
`5
`
`10
`
`l5
`
`20
`
`25
`
`30
`
`35
`
`45
`
`55
`
`60
`
`5,432,900
`
`8
`both graphics and video data, masks are used to specify
`whether or not vid
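The three derivations attributed to mask driver 414 above, clipping a window rectangle to the screen, capturing the window's pixel bitmap, and building a per-pixel mask that selects graphics versus video, can be sketched as follows. This is a hedged illustration under stated assumptions: the function names are invented, and the convention that a mask bit of 1 selects graphics while 0 lets video through is an assumption, not a detail taken from the patent:

```python
# Hypothetical sketch of the three outputs of mask driver 414:
# (1) clipping info for a rectangular window, (2) the window's pixel
# bitmap, and (3) a per-pixel mask choosing between graphics and video.
# Assumed convention: mask bit 1 = show graphics, 0 = show video.

def derive_clipping(window, screen_w, screen_h):
    """Clip a window rectangle (x, y, w, h) to the screen bounds."""
    x, y, w, h = window
    x0, y0 = max(x, 0), max(y, 0)
    x1, y1 = min(x + w, screen_w), min(y + h, screen_h)
    return (x0, y0, max(x1 - x0, 0), max(y1 - y0, 0))

def derive_mask(bitmap, video_key=None):
    """One mask bit per pixel: 1 where the bitmap holds a graphics pixel,
    0 where the pixel equals the 'video' key value (video shows through)."""
    return [[0 if px == video_key else 1 for px in row] for row in bitmap]

def composite(bitmap, mask, video):
    """Select the graphics pixel where the mask bit is 1, video elsewhere."""
    return [
        [g if m else v for g, m, v in zip(g_row, m_row, v_row)]
        for g_row, m_row, v_row in zip(bitmap, mask, video)
    ]

# Demo: a 2x2 window where None marks pixels reserved for video.
bmp = [[7, None], [None, 7]]
mask = derive_mask(bmp)                       # [[1, 0], [0, 1]]
out = composite(bmp, mask, [[5, 5], [5, 5]])  # [[7, 5], [5, 7]]
```

Storing the mask separately from the bitmap, as the patent does with areas 422 and 424, means the graphics/video selection can be re-evaluated each frame without re-deriving the window contents.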
