United States Patent [19]
Rhodes et al.

US005432900A

[11] Patent Number: 5,432,900
[45] Date of Patent: Jul. 11, 1995
[54] INTEGRATED GRAPHICS AND VIDEO COMPUTER DISPLAY SYSTEM

[75] Inventors: Kenneth E. Rhodes, Portland; Robert T. Adams, Lake Oswego; Sherman Janes, Portland; Rohan G. F. Coelho, Hillsboro, all of Oreg.

[73] Assignee: Intel Corporation, Santa Clara, Calif.

[21] Appl. No.: 261,284

[22] Filed: Jun. 16, 1994

[51] Int. Cl.6 .................................... G06T 1/00
[52] U.S. Cl. ..................................... 395/154; 345/118
[58] Field of Search ........................ 348/589; 345/118; 395/153, 154, 162, 163

[56] References Cited

U.S. PATENT DOCUMENTS
4,498,081  2/1985  Fukushima et al. ............ 358/240
4,530,009  7/1985  Mizokawa ................... 358/183
4,539,585  9/1985  Spackova et al. ............ 358/93
4,644,401  2/1987  Gaskins .................... 358/183
4,984,183  1/1991  Ohuchi ..................... 364/521
5,012,342  4/1991  Olsen et al. ............... 358/181
5,027,212  6/1991  Marlton et al. ............. 358/183
5,119,080  6/1992  Kajimoto et al. ............ 340/723
5,161,102 11/1992  Griffin et al. ............. 364/488
5,195,177  3/1993  Kamiyama et al. ............ 395/162
5,220,312  6/1993  Lumelsky et al. ............ 340/721
5,251,301 10/1993  Cook ....................... 395/163
5,264,837 11/1993  Buehler .................... 395/153
5,271,097 12/1993  Barker et al. .............. 395/162
5,274,364 12/1993  Li et al. .................. 345/118
5,274,753 12/1993  Roskowski et al. ........... 395/135
FOREIGN PATENT DOCUMENTS

0454414 10/1991 European Pat. Off.
OTHER PUBLICATIONS

Worthington, "True Vision Enhances Its PC Video Graphics Card", InfoWorld, vol. 12, No. 32 (Aug. 6, 1990), p. 23, Abstract Only.
Quinnell, "Video Chip Set Handles Decompression, Special Effects, and Mixing in Many Formats", EDN, vol. 35, No. 25 (Dec. 6, 1990), pp. 72-73, Abstract Only.
Primary Examiner-Heather R. Herndon
Assistant Examiner-N. Kenneth Burraston
Attorney, Agent, or Firm-Blakely, Sokoloff, Taylor & Zafman
[57] ABSTRACT
Graphical, video, and audio data is integrated into a single processing environment. The present invention employs an integrated graphics/video controller (IVC) which interfaces with application software through a graphics API and a video API. The IVC receives graphics commands through the graphics API and video commands through the video API. A mask driver produces information from the graphics commands including clipping information, graphics information, and mask information. A blender uses the mask information, the graphics information, and the clipping information for combining or compositing graphics images with video images. The video commands of the video command stream provide functions for configuring the operation of the IVC. These functions include commands for loading software video decoders in a decoder block within the IVC. Video data transferred to the IVC via the video API may be encoded in a variety of different formats. The present invention provides a means for dynamically loading a plurality of different video decoders through a video command interface to a video decode block, both within the IVC. Each of the independent decoders within the decode block contains processing logic for decoding a particular type of video data to produce a uniform type of decoded video data which is provided to the blender. The blender receives the decoded video data and combines the video data with graphics data as defined by the mask information, the clipping information, and the graphics information.

26 Claims, 14 Drawing Sheets
FIGURE 1 (Sheet 1 of 14). Block diagram of the computer system hardware. Labels recoverable from the drawing: CPU 102; Memory 104; memory bus; peripheral bus 110; Disk Drive 114; Graphics Controller 116; Capture Controller 118; Audio Controller 120; Video RAM (VRAM) 122; DAC 124; CRT Display 126; Video Source 128; Speaker 130; Microphone 132; Network or Other I/O 140.
FIGURE 2 (Prior Art) (Sheet 2 of 14). Block diagram of the prior art software: Application Software 210; Graphics API 212; Video API 214; Audio API 216; Graphics Driver 218; Video Driver 222; Audio Driver 226; Graphics Frame Buffer 220; Video Frame Buffer 224; Audio Output Buffer 228.
FIGURE 3 (Sheet 3 of 14). Block diagram of the software of the preferred embodiment: Application Software 310; Graphics API 312; Video API 314; Audio API 316; Integrated Graphics/Video Controller (IVC) 320; Frame Buffer 322; Audio Driver 324; Display Timing Control 326.
FIGURE 4 (Sheet 4 of 14). Block diagram of the integrated graphics/video controller (IVC). Labels recoverable from the drawing: Graphics API 312; Video API 314; Graphics Command Interface 410; Video Command Interface 412; Mask Driver 414; Display Driver 416; Blender 418; Masks 420; Graphics 422; Clipping Information 424; Decoder Block 430 containing MPEG Decode 432, JPEG Decode 434, RTV Decode 436, and Other Loadable Decoders 438; Video Controller Interface 440; Video Controller Decoding 442; Sync 450; Audio Driver 324; Display Timing Control 326; Frame Buffer 322.
FIGURE 5 (Sheet 5 of 14). Flowchart of video command interface initialization: initialize decoder table (512); send initialize command to graphics hardware (514); on the Yes branch (520), send capability report command to graphics hardware/microcode (522); receive capability report from graphics hardware/microcode (524); store hardware/microcode capability information in the video command interface decoder table (526).
FIGURE 6 (Sheet 6 of 14). Decoder Table: rows entry 0, entry 1, and so on, each holding an Address, a Length, an ID Code, and a Capability Code.
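To make the structure of FIG. 6 concrete, the following C sketch shows one plausible in-memory layout for the decoder table; the type names, field names, and table size are illustrative assumptions only and are not specified by the patent.

    #include <stddef.h>
    #include <stdint.h>

    /* One decoder table entry (FIG. 6): where the loaded decoder resides,
     * its size, the video format it handles (ID code), and its capability
     * code. All names here are hypothetical. */
    typedef struct {
        void     *address;     /* load address of the decoder image       */
        size_t    length;      /* size of the decoder image in bytes      */
        uint32_t  id_code;     /* identification code (e.g. MPEG, JPEG)   */
        uint32_t  capability;  /* capability code; larger means more able */
        int       in_use;      /* nonzero if this entry is populated      */
    } DecoderEntry;

    #define MAX_DECODERS 16    /* assumed fixed table size */

    typedef struct {
        DecoderEntry entry[MAX_DECODERS];
    } DecoderTable;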
FIGURE 7 (Sheet 7 of 14). Flowchart: send initialize command to graphics hardware (714); branch No (718) / Yes (720); send capability report command to graphics hardware (722); receive capability report from graphics hardware/microcode (724); store hardware/microcode capability information in the video command interface decoder table (726); compare contents of decoder table with minimum capability table (728); branch Yes (732) / No (734); send request to Application Program through API to load missing decoders (736).
FIGURE 8 (Sheet 8 of 14). Minimum Capability Table: rows through entry n, each holding an ID Code and a Minimum Capability.
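FIGS. 7 and 8 together describe checking the decoder table against a minimum capability table and asking the application program, through the API, to load any decoders that are missing or fall short (steps 728 and 736). The C sketch below is one hedged reading of that comparison; it assumes the hypothetical DecoderTable types sketched after FIG. 6, and the request callback is a placeholder standing in for the API round trip.

    #include <stdint.h>

    /* One minimum capability table entry (FIG. 8). Names are hypothetical. */
    typedef struct {
        uint32_t id_code;          /* video format identifier             */
        uint32_t min_capability;   /* smallest acceptable capability code */
    } MinCapEntry;

    /* Assumed callback: asks the application program, via the video API,
     * to load a decoder for the given format (FIG. 7, step 736). */
    typedef void (*RequestDecoderFn)(uint32_t id_code);

    /* Compare the decoder table with the minimum capability table
     * (FIG. 7, step 728) and request whatever is missing or under-capable. */
    static void check_minimum_capabilities(const DecoderTable *table,
                                           const MinCapEntry *min,
                                           int min_count,
                                           RequestDecoderFn request_decoder)
    {
        for (int m = 0; m < min_count; m++) {
            uint32_t best = 0;
            int found = 0;

            for (int i = 0; i < MAX_DECODERS; i++) {
                const DecoderEntry *e = &table->entry[i];
                if (e->in_use && e->id_code == min[m].id_code) {
                    found = 1;
                    if (e->capability > best)
                        best = e->capability;
                }
            }

            if (!found || best < min[m].min_capability)
                request_decoder(min[m].id_code);
        }
    }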
FIGURE 9 (Sheet 9 of 14). Flowchart for loading a video decoder: receive a request to load a video decoder; the request includes an address of the decoder, a length, an identification code, and a capability code (912); search the decoder table for an entry with the same identification code received in the request; if more than one entry with the same identification code exists in the decoder table, take the entry with the greatest capability (914); branch No (918) / Yes (920); compare the capability code of the matched entry with the capability code received in the request (922); branch No (926) / Yes (928); use the address and length in the request to transfer the decoder to a free area in the video decode block 430 (930); update the decoder table with the address, length, identification code, and capability code of the newly loaded decoder (932).
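FIG. 9 describes servicing a request to load a video decoder: the request carries the decoder's address, length, identification code, and capability code; the table is searched for an existing entry with the same identification code (taking the most capable one if several exist); the new decoder is transferred into a free area of the decode block only if it improves on what is already resident, and the table is then updated. The C sketch below is a hedged rendering of that flow, again assuming the DecoderTable types sketched after FIG. 6; the allocator for the decode block's free area is a placeholder.

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Assumed placeholder: reserves 'length' bytes in a free area of the
     * video decode block (step 930) and returns its address, or NULL. */
    void *decode_block_alloc(size_t length);

    /* Handle a load-decoder request (FIG. 9). Returns 0 on success,
     * -1 if the request is rejected or no table slot is available. */
    static int load_decoder(DecoderTable *table, const void *image,
                            size_t length, uint32_t id_code, uint32_t capability)
    {
        /* Step 914: find the most capable existing entry with this ID code. */
        const DecoderEntry *match = NULL;
        for (int i = 0; i < MAX_DECODERS; i++) {
            const DecoderEntry *e = &table->entry[i];
            if (e->in_use && e->id_code == id_code &&
                (match == NULL || e->capability > match->capability))
                match = e;
        }

        /* Steps 922-928: keep the resident decoder if it is at least as capable. */
        if (match != NULL && match->capability >= capability)
            return -1;

        /* Step 930: transfer the decoder to a free area of the decode block. */
        void *dest = decode_block_alloc(length);
        if (dest == NULL)
            return -1;
        memcpy(dest, image, length);

        /* Step 932: record address, length, ID code, and capability code. */
        for (int i = 0; i < MAX_DECODERS; i++) {
            DecoderEntry *e = &table->entry[i];
            if (!e->in_use) {
                e->address    = dest;
                e->length     = length;
                e->id_code    = id_code;
                e->capability = capability;
                e->in_use     = 1;
                return 0;
            }
        }
        return -1;
    }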
FIGURE 10 (Sheet 10 of 14). Flowchart for handling received video data: search the decoder table for an entry corresponding to the type of video data received (1014); branch No (1018) / Yes (1020); return error code to Application Program (1022); branch No (1026): transfer video data to a compatible hardware-implemented decoder through Video Controller Interface (440) (1032); branch Yes (1028): transfer video data to a software-implemented decoder in Decoder Block (430) (1030); End (1034).
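FIG. 10 shows how incoming video data is routed: the decoder table is searched for an entry matching the data's format; if none is found an error code is returned to the application program, and otherwise the data goes either to a software-implemented decoder in the decoder block or to a compatible hardware-implemented decoder through the video controller interface. The C sketch below is one hedged interpretation of that dispatch, assuming the DecoderTable types sketched after FIG. 6; the software/hardware predicate and the two decode paths are placeholders.

    #include <stddef.h>
    #include <stdint.h>

    /* Assumed placeholders for the software/hardware distinction and the
     * two decode paths of FIG. 10. */
    int entry_is_software(const DecoderEntry *e);
    int run_software_decoder(const DecoderEntry *e, const void *data, size_t len);
    int send_to_video_controller(uint32_t id_code, const void *data, size_t len);

    /* Route one buffer of encoded video data (FIG. 10). Returns 0 on
     * success or -1, the error code reported to the application program. */
    static int dispatch_video_data(const DecoderTable *table, uint32_t id_code,
                                   const void *data, size_t len)
    {
        /* Step 1014: look for a decoder table entry matching this format. */
        const DecoderEntry *match = NULL;
        for (int i = 0; i < MAX_DECODERS; i++) {
            const DecoderEntry *e = &table->entry[i];
            if (e->in_use && e->id_code == id_code) {
                match = e;
                break;
            }
        }

        if (match == NULL)
            return -1;                                       /* step 1022 */

        if (entry_is_software(match))                        /* steps 1028, 1030 */
            return run_software_decoder(match, data, len);

        return send_to_video_controller(id_code, data, len); /* step 1032 */
    }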
FIGURE 11 (Sheet 11 of 14). Flowchart for a capability report: receive request for capability report (1112); retrieve information in Decoder Table (1114); send Decoder Table information to Application Program through API (1116).
FIGURE 12 (Sheet 12 of 14). Mask driver flowchart: receive graphics command from the Application Program through the Graphics API (1212); pass the graphics command through to the Display Driver 416 without having modified the destination portion of the command; the Display Driver performs the graphics operation on a display bitmap (1214); receive command status from the Display Driver in response to the graphics operation performed on the display bitmap (1216); branch No (1220): send error message to application program (1226); branch Yes (1222): modify the graphics command to reference a mask bitmap maintained by the Display Driver (1224).
FIGURE 13 (Sheet 13 of 14). Mask driver flowchart (continued): send the modified graphics command to the Display Driver; the Display Driver performs the same operation on the mask bitmap as performed earlier on the display bitmap (1310); receive command status from the Display Driver in response to the graphics operation performed on the mask bitmap (1312); send command status to the Application Program through the graphics API (1314).
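FIGS. 12 and 13 describe the mask driver's double dispatch of a graphics command: the command is first passed through unmodified so the display driver performs it on the display bitmap; if that succeeds, the command's destination is rewritten to reference a mask bitmap maintained by the display driver, the same operation is repeated there, and the resulting status is returned to the application program. The C sketch below illustrates this sequence; the command structure and the display driver entry point are illustrative assumptions.

    /* Hypothetical targets a graphics command can be applied to. */
    typedef enum { TARGET_DISPLAY_BITMAP, TARGET_MASK_BITMAP } Target;

    /* Hypothetical graphics command; only the destination field matters here. */
    typedef struct {
        Target destination;    /* bitmap the operation is applied to */
        /* ... opcode, coordinates, operands ... */
    } GraphicsCommand;

    /* Assumed entry point of display driver 416; returns 0 on success. */
    int display_driver_execute(GraphicsCommand *cmd);

    /* Mask driver handling of one graphics command (FIGS. 12 and 13).
     * The command arrives already targeting the display bitmap. */
    static int mask_driver_process(GraphicsCommand *cmd)
    {
        /* FIG. 12, step 1214: pass the command through unmodified; the
         * display driver performs the operation on the display bitmap. */
        int status = display_driver_execute(cmd);
        if (status != 0)
            return status;               /* step 1226: report the error */

        /* FIG. 12, step 1224 and FIG. 13, step 1310: reference the mask
         * bitmap and repeat the same operation there. */
        cmd->destination = TARGET_MASK_BITMAP;
        status = display_driver_execute(cmd);

        /* FIG. 13, step 1314: command status goes back to the application. */
        return status;
    }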
FIGURE 14 (Sheet 14 of 14). Clipping flowchart: receive a graphics command from the Application Program through the Graphics API (1412); interpret the graphics command to derive the location and dimensions of a visible region of a window (1414); generate clipping information defining the location and dimensions of the visible portion of the window; this clipping information defines a clipping region (1416); send the clipping information to the Blender 418, which blends video data with graphics data inside the clipping region and disables blending outside the clipping region (1418).
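FIG. 14 describes how window-related graphics commands become clipping information: the command is interpreted to locate the visible region of the window, a clipping region is built from that location and size, and the region is handed to blender 418, which blends video with graphics inside the region and disables blending outside it. The C sketch below illustrates the hand-off once the visible rectangle has been derived (step 1414 is assumed to have already happened); the region type and blender entry point are illustrative assumptions.

    /* Hypothetical clipping region: location and dimensions of the
     * visible portion of a window, in screen coordinates (step 1416). */
    typedef struct {
        int x, y;            /* upper-left corner of the visible region */
        int width, height;   /* dimensions of the visible region        */
    } ClipRegion;

    /* Assumed entry point of blender 418: blending of video with graphics
     * is enabled inside the region and disabled outside it (step 1418). */
    void blender_set_clip_region(const ClipRegion *region);

    /* Build a clipping region from an already-derived visible rectangle
     * and forward it to the blender. */
    static void update_clip_region(int vis_x, int vis_y, int vis_w, int vis_h)
    {
        ClipRegion region = { vis_x, vis_y, vis_w, vis_h };
        blender_set_clip_region(&region);
    }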
INTEGRATED GRAPHICS AND VIDEO COMPUTER DISPLAY SYSTEM
This is a continuation of application Ser. No. 07/901,280, filed Jun. 19, 1992, now abandoned.
BACKGROUND OF THE INVENTION

1. Field Of The Invention

The present invention relates to the field of computer display systems. Specifically, the present invention pertains to the manipulation and control of graphics and video data for display on a display medium. The present invention further relates to the control and manipulation of audio data that is output in synchronization with images on a display screen.

2. Prior Art

Many conventional computer graphics display systems exist in the prior art. These systems provide mechanisms for manipulating and displaying graphics objects on a display screen. These graphic objects include points, vectors, conical shapes, rectangles, polygons, arcs, alphanumeric information, shaded or color-filled regions, and other forms of graphic imagery, typically represented as objects in a two dimensional or three dimensional virtual space. A graphics computer program in these conventional systems provides mechanisms by which a computer user may manipulate graphic objects using functions provided by the graphics software. This graphics software typically includes a graphics driver software module that operates in conjunction with a graphics controller to load a frame buffer with digital information that is subsequently converted to a form displayable on a display screen. The use of graphics driver software, graphics controllers, and frame buffers is well known to those of ordinary skill in the art.

Other prior art computer display systems provide a means for displaying video images on a display screen. These video images comprise streams of digital video data encoded in several well known formats such as MPEG, JPEG, and RTV format. Unlike graphical data, this video data includes no representations of individual objects within the video image. To the computer display system, the video image is a homogeneous video data stream encoded in a particular way. In prior art systems, this video data is received by processing logic that includes a particular mechanism for decoding the video data and transferring the decoded video data into a frame buffer for display on the display screen. Because of the different nature of graphical data as compared with video data, prior art systems tend to handle graphical and video data in separate subsystems within the computer display system. Thus, graphical data and video data typically take parallel and independent paths through prior art systems as the data is processed for display. In some cases, two separate frame buffers are used, one for graphical data and one for video data. In other systems, a single frame buffer is used; however, graphical data occupies one distinct region of the frame buffer and video data occupies a different portion of the frame buffer.

A number of problems exist in prior art systems that process graphical and video data independently. First, these prior art systems cannot efficiently combine graphical and video images together into a composite form that is easily manipulated. For example, scrolling a graphical image across a video background typically requires additional processing to prevent the image from being destroyed. Secondly, synchronizing graphics and video images in prior art systems is typically very difficult. Synchronization problems in prior art systems result in video images that appear torn or disjoint. Moreover, aligning graphics images at the proper location and time in a video image is difficult using prior art techniques. Thirdly, the video data decoding schemes used in prior art systems are typically limited to a single decoding scheme. For example, less expensive graphics programs may provide software implemented video data decoders. Although these systems provide an inexpensive solution, they tend to run slowly and often provide decoded video data of a low resolution. Other more expensive prior art systems provide graphics hardware that may be used to decode a video data stream. These systems are fast and provide high resolution decoded video data; however, they are also substantially more expensive. Moreover, users wanting to upgrade from a less expensive system, such as a software implemented decoder, to a more expensive system, such as a graphics hardware implemented system, must first reconfigure or reinstall their graphics application program in order to take advantage of a different video decoding technique. This additional impact at the applications program level further increases the upgrade cost to the computer user.

Thus, a better means for integrating graphical and video information in a single computer display system is needed.
SUMMARY OF THE INVENTION

The present invention integrates graphical, video, and audio data into a single processing environment. Therefore, the present invention includes a processor (CPU), a graphics controller, a capture controller, and an audio controller. The graphics controller is a specialized graphics control processor coupled to a peripheral bus. The graphics controller, operating under the software control of the CPU, performs processing and compositing of the graphics and video data to be displayed on a CRT display. This software control methodology is the subject of the present invention and is described in detail herein.

Application software interfaces with system software of the present invention through a graphics application program interface (API), a video API, and an audio API. These APIs provide a high level functional interface for the manipulation of graphics, video, and audio information. Unlike the prior art, however, the present invention employs an integrated graphics/video controller (IVC). The IVC interfaces with application software through the graphics API and the video API. The output data generated by the IVC is a composite graphics and video image which is stored for output to a display device in a frame buffer. The IVC of the present invention also interfaces with an audio driver. The IVC provides the audio driver with synchronization information used to synchronize a video image stream with an audio data stream. The IVC receives timing information from a display timing control circuit. The timing information provided by the display timing control circuit is used by the IVC for synchronizing the loading of the frame buffer. The display timing control circuit provides a means for optimizing the operation of the IVC for the timing characteristics of a particular display system that is used with the present invention.
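As a rough illustration of the data paths described in this summary, the C sketch below models the IVC's external connections: command entry points corresponding to the graphics API and video API, a frame buffer that receives the composite image, synchronization information handed to the audio driver, and timing received from the display timing control circuit. Every name and signature here is an illustrative assumption, not an interface defined by the patent.

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical handles for the blocks the IVC communicates with. */
    typedef struct FrameBuffer   FrameBuffer;    /* composite output    */
    typedef struct AudioDriver   AudioDriver;    /* receives sync info  */
    typedef struct DisplayTiming DisplayTiming;  /* supplies timing     */

    typedef struct {
        FrameBuffer   *frame_buffer;
        AudioDriver   *audio_driver;
        DisplayTiming *display_timing;
    } IVC;

    /* Command entry points corresponding to the graphics API and video API. */
    int ivc_graphics_command(IVC *ivc, const void *cmd, size_t len);
    int ivc_video_command(IVC *ivc, const void *cmd, size_t len);

    /* Synchronization information passed to the audio driver so that the
     * audio stream tracks the displayed video frames. */
    void ivc_report_sync(IVC *ivc, uint32_t frame_number, uint32_t timestamp);

    /* Timing notification from the display timing control circuit, used
     * to pace loading of the frame buffer. */
    void ivc_display_timing_event(IVC *ivc, uint32_t scanline);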
The IVC receives graphics instructions through the graphics API and video instructions through the video API. The graphics API interface provides a means by which an application program may interface with a display driver for displaying graphics images on a display screen. A graphics command interface and a mask driver derive information from the graphics commands provided by an application program prior to transferring the graphics information to a display driver. Three types of information derived from the graphics command stream are produced by the mask driver. First, clipping information is derived by the mask driver. Clipping information is used to enable the display of graphics information within a specified window on the display screen. A window is a rectangular portion of the display screen having particular attributes that are well known in the art. Clipping information is used to enable the display of graphics information within the window and to suppress the display of information outside the boundary of a window. The second type of information derived by the mask driver from the graphics command stream is the actual graphic content of a specified window. The third type of information derived by the mask driver is mask information. A blender uses the mask information, the graphics information, and the clipping information for combining or compositing graphics information with video information.

The IVC also receives video image information through the video API. The video API provides a means by which an applications program may interface with the IVC for the purpose of manipulating video images for display on a display screen. A video command stream is received by a video command interface through the video API. The video command stream comprises video commands for configuring the operation of the IVC and video data which comprises a combination of video and audio data with synchronization information for synchronizing the display of frames of video images.

The video commands of the video command stream provide functions for configuring the operation of the IVC. These functions include commands for loading software video decoders in a decoder block within the IVC. Video data transferred to the IVC via the video API may be encoded in a variety of different formats. The present invention provides a means for dynamically loading a plurality of different video decoders through a video command interface to a video decode block, both within the IVC. Each of the independent decoders within the decode block contains processing logic for decoding a particular type of video data to produce a uniform type of decoded video data which is provided to the blender. The blender receives the decoded video data and combines the video data with graphics data as defined by the mask information, the graphics information, and the clipping information. The combined or composited image is then transferred to a display driver and output to a display through a frame buffer.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of the computer system hardware used in the preferred embodiment.
FIG. 2 is a block diagram of the prior art software used in a system for processing graphics, video, and audio data.
FIG. 3 is a block diagram of the software used in the preferred embodiment of the present invention.
FIG. 4 is a block diagram of the integrated graphics/video controller (IVC).
FIGS. 5-11 are flowcharts illustrating the processing logic of the video command interface.
FIGS. 12-14 are flowcharts illustrating the processing logic of the mask driver.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

The present invention is an integrated graphics and video display system for manipulating and viewing both graphical and video data in a single integrated system. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be apparent to one of ordinary skill in the art that these specific details need not be used to practice the present invention. In other instances, well known structures, circuits, and interfaces have not been shown in detail in order not to obscure unnecessarily the present invention.

Referring now to FIG. 1, a block diagram of the computer system hardware used in the present embodiment is illustrated. The computer system used in the preferred embodiment comprises a memory bus 100 for communicating information, a processor (CPU) 102 coupled with memory bus 100 for processing information, and a memory 104 coupled with bus 100 for storing information and instructions for CPU 102 and other components of the computer system. In the preferred embodiment, memory bus 100 is a high speed bus which is well known to those of ordinary skill in the art. In the preferred embodiment, CPU 102 is an i486® brand microprocessor manufactured by the assignee of the present invention. i486® is a registered trademark of Intel Corporation, Santa Clara, Calif. Memory 104 in the preferred embodiment comprises dynamic random access memory (DRAM) and read only memory (ROM). Coupling a CPU 102 and a memory 104 together on a memory bus 100 as shown in FIG. 1 is well known in the art.

FIG. 1 also illustrates a peripheral bus 110 coupled to memory bus 100 via a bridge 112. The use of a bridge between two computer buses is well known in the art. Similarly, the use of a peripheral bus such as the peripheral component interface (PCI) is described in co-pending patent application Ser. No. 07/886,992.

A disc drive 114 and a network interface or other input/output device 140 may also optionally be coupled to peripheral bus 110. Again, disc drive and network interface apparatus and their connection to a peripheral bus 110 are well known in the art.

The present invention integrates graphical, video, and audio data into a single processing environment. Therefore, the present invention centers around the interaction between CPU 102 and graphics controller 116, capture controller 118, and audio controller 120. Graphics controller 116 is a specialized graphics control processor coupled to peripheral bus 110. Graphics controller 116, operating under the software control of CPU 102, performs processing and compositing of the graphics and video data to be displayed on CRT display 126. This software control methodology is the subject of the present invention and is described in detail below. The hardware platform in which the present invention operates is described in patent application Ser. No. 07/901,519, now U.S. Pat. No. 5,243,447, filed concurrently with the present patent application. In general, this hardware platform comprises graphics controller
116 to which a video random access memory (VRAM) 122 is coupled. Video RAM 122 provides storage for a frame buffer which represents a binary encoded representation of the picture elements (pixels) that are displayed on CRT display 126. Video RAM 122 is used for storage of both graphical information and video information. Using the processing techniques described herein and the hardware platform described in the above-referenced co-pending patent application, both graphical and video data may be composited (i.e. combined) into a single frame buffer within video RAM 122. This composite graphical and video image is converted to analog form by digital to analog converter (DAC) 124 and output to CRT display 126. In this manner, applications programs executing within CPU 102 can use the processing logic of the present invention to display composited graphical/video images in arbitrary locations on CRT display 126.
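Because graphics and video are composited into the single frame buffer in video RAM 122, the per-pixel choice reduces to: inside the clipping region, a mask bit selects between the decoded video pixel and the graphics pixel; outside the clipping region, the graphics pixel is used unchanged. The C sketch below illustrates that selection for one scanline; the 8-bit pixel format, byte-per-pixel mask, and function name are simplifying assumptions, not details from the patent.

    #include <stdint.h>

    /* Composite one scanline into the frame buffer: where the mask bit is
     * set and the pixel lies inside the clipping region, the decoded video
     * pixel is written; everywhere else the graphics pixel is written. */
    static void composite_scanline(uint8_t *frame, const uint8_t *graphics,
                                   const uint8_t *video, const uint8_t *mask,
                                   int width, int clip_left, int clip_right)
    {
        for (int x = 0; x < width; x++) {
            int inside_clip = (x >= clip_left && x < clip_right);
            frame[x] = (inside_clip && mask[x]) ? video[x] : graphics[x];
        }
    }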
Capture controller 118 is coupled to peripheral bus 110. Capture controller 118 is a hardware system for receiving and processing video display data from video source 128. Capture controller 118 receives video images from video source 128 and generates video image data packets for transmission on peripheral bus 110. The video image data packets comprise still video images or a series of motion video images which include synchronization information. The video data may be moved on peripheral bus 110 into graphics controller 116 for compositing with graphics data or moved to CPU 102 or memory 104 via bridge 112 and memory bus 100. Video source 128 is a video camera, a video recording device, or other source for video image data. The structure and detailed operation of capture controller 118 is beyond the scope of the present invention.

Audio controller 120 is also coupled to peripheral bus 110. Audio controller 120 provides a means for receiving and processing audio data from a microphone 132 or other audio input device. Audio controller 120 generates a series of audio data packets for transfer on peripheral bus 110. The audio data packets represent a series of sounds retrieved from microphone 132 or other audio input device. Audio controller 120 also provides synchronization information with each audio data packet. Audio controller 120 also provides a means for decoding audio data received via peripheral bus 110 and outputting audio signals to speaker 130 or other audio output device. The structure and detailed operation of audio controller 120 is beyond the scope of the present invention.

Referring now to FIG. 2, a block diagram illustrates the prior art method for processing graphics, video, and audio data. Application software 210, residing in memory 104 and executed by CPU 102, comprises user level programming tools for manipulating and processing graphics, video, and audio data. Several application programs of this type exist in the prior art. An example of one such application program is AutoDesk Animator TM for Microsoft Windows TM developed by AutoDesk TM, Inc. Application software 210 interfaces with system software through several application program interfaces (API). APIs provide a means by which an application program may request service from system software through the functions provided in an API. In the prior art, three APIs are generally provided: 1) a graphics API, 2) a video API, and 3) an audio API. As illustrated in FIG. 2, graphics API 212 provides an interface between application software 210 and graphics driver 218. Graphics API 212 provides a list of public functions and corresponding calling parameters with which application software 210 may create and manipulate graphics objects that are stored for display in graphics frame buffer 220. Graphics objects typically include points, vectors, conical shapes, polygons, rectangles, blocks of text, arcs, and other shapes that may be represented in a two or three dimensional virtual space.

In a similar manner, video API 214 provides an interface between application software 210 and video driver 222. Video API 214 provides a set of functions for manipulating video image data that is processed by video driver 222 and eventually stored for output to a display screen in video frame buffer 224. Because prior art systems typically used graphic display hardware which was independent of video display hardware, the processing of graphic and video data in prior art systems was typically done using independent and distinct hardware and software processes. This prior art design is therefore considered a loosely coupled or low integration system, because the graphic and video data is not integrated as it travels from the application software 210 to a display screen (not shown). Rather, in the prior art systems, graphic and video data travels two parallel paths through the processing architecture.

Audio API 216 provides an interface between application software 210 and audio driver 226. Audio API 216 provides a set of interfaces for manipulating audio information that is processed by audio driver 226 and stored in audio output buffer 228 for eventual output to an audio emission system (not shown). The audio API 216 provides a set of audio related functions with which application software 210 may manipulate audio information. Again, the loosely coupled or low integration approach used in the prior art is apparent as audio information travels an independent path from application software 210 to audio output buffer 228. The independent path traveled by audio data in the prior art systems is less of a performance penalty, however, because audio information is destined for a different output device than graphics and video data. Both graphics and video data, on the other hand, are typically displayed on the same display device.

Several problems exist with the low integration approach used in the prior art. Because graphics and video information is processed using separate processes, prior art systems do not provide an efficient means for combining graphics and video images together in a composite form. Secondly, because certain items of hardware in the prior art design are dedicated to processing either graphics or video data, some items of the prior art design must be duplicated for processing both graphics and video data. Some of these duplicated hardware elements include frame buffer memories 220 and 224. Because of the independent paths traveled by graphics and video data in the prior art system, other hardware logic is required to handle the parallel paths. This additional logic includes multiplexing logic which determines which frame buffer's data is to be displayed for each pixel location. Moreover, additional synchronization logic is required to synchronize the output of graphics and video data as the information is merged just before output to the display device. Thus, the prior art design suffers from low graphics/video integration functionality and increased hardware cost due to the low integration design.

Referring now to FIG. 3, the structure of the system software used in the present invention is illustrated.
Again, application software 310 interfaces with system software through graphics API 312, video API 314, and audio [...]

[...] both graphics and video data, masks are used to specify whether or not video information may overwrite graphics information within a particular window. The [...]
