US005432900A

United States Patent [19]
Rhodes et al.

[11] Patent Number: 5,432,900
[45] Date of Patent: Jul. 11, 1995

[54] INTEGRATED GRAPHICS AND VIDEO COMPUTER DISPLAY SYSTEM
`
[75] Inventors: Kenneth E. Rhodes, Portland; Robert T. Adams, Lake Oswego; Sherman Janes, Portland; Rohan G. F. Coelho, Hillsboro, all of Oreg.

Primary Examiner—Heather R. Herndon
Assistant Examiner—N. Kenneth Burraston
Attorney, Agent, or Firm—Blakely, Sokoloff, Taylor & Zafman
`
`[73] Assignee:
`
`Intel Corporation, Santa Clara, Calif.
`
[21] Appl. No.: 261,284
`
`[22] Filed:
`
`Jun. 16, 1994
`
[51] Int. Cl.6 .............................................. G06T 1/00
[52] U.S. Cl. .................................. 395/154; 345/118
[58] Field of Search ...................... 348/589; 345/118; 395/153, 154, 162, 163
`
[56] References Cited

U.S. PATENT DOCUMENTS

4,498,081  2/1985  Fukushima et al. ......... 358/240
4,530,009  7/1985  Mizokawa ................. 358/183
4,539,585  9/1985  Spackova et al. .......... 358/93
4,644,401  2/1987  Gaskins .................. 358/183
4,984,183  1/1991  Ohuchi ................... 364/521
5,012,342  4/1991  Olsen et al. ............. 358/181
5,027,212  6/1991  Marlton et al. ........... 358/183
5,119,080  6/1992  Kajimoto et al. .......... 340/723
5,161,102 11/1992  Griffin et al. ........... 364/488
5,195,177  3/1993  Kamiyama et al. .......... 395/162
5,220,312  6/1993  Lumelsky et al. .......... 340/721
5,251,301 10/1993  Cook ..................... 395/163
5,264,837 11/1993  Buehler .................. 395/153
5,271,097 12/1993  Barker et al. ............ 395/162
5,274,364 12/1993  Li et al. ................ 345/118
5,274,753 12/1993  Roskowski et al. ......... 395/135

FOREIGN PATENT DOCUMENTS

0454414 10/1991  European Pat. Off.
`
OTHER PUBLICATIONS

Worthington, "True Vision Enhances Its PC Video Graphics Card", InfoWorld, vol. 12, No. 32 (Aug. 6, 1990), p. 23, Abstract Only.
Quinnell, "Video Chip Set Handles Decompression, Special Effects, and Mixing in Many Formats", EDN, vol. 35, No. 25 (Dec. 6, 1990), pp. 72-73, Abstract Only.
`[57]
`
`ABSTRACT
`
`Graphical, video, and audio data is integrated into a
`single processing environment. The present invention
`employs an integrated graphics/video controller (IVC)
`which interfaces with application software through a
`graphics API and a video API. The IVC receives
`graphics commands through the graphics API and
`video commands through the video API. A mask driver
`produces information from the graphics commands
`including clipping information, graphics information
`and mask information. A blender uses the mask informa-
`
`tion, the graphics information, and the clipping infor-
`mation for combining or compositing graphics images
`with video images. The video commands of the video
`command stream provide functions for configuring the
`operation of the IVC. These functions include com-
`mands for loading software video decoders in a decoder
`block within the IVC. Video data transferred to the
IVC via the video API may be encoded in a variety of
different formats. The present invention provides a
means for dynamically loading a plurality of different
video decoders through a video command interface to a
video decode block within the IVC. Each of the
independent decoders within the decode block contains
processing logic for decoding a particular type of video
data to produce a uniform type of decoded video data
`which is provided to the blender. The blender receives
`the decoded video data and combines the video data
`with graphics data as defined by the mask information,
`the clipping information, and the graphics information.
`
`26 Claims, 14 Drawing Sheets
`
`
`
Petitioners HTC and LG - Exhibit 1028, p. 1
HTC and LG v. PUMA, IPR2015-01501
`
`
`
U.S. Patent          July 11, 1995          Sheet 1 of 14          5,432,900
`
FIGURE 1

[Drawing: block diagram of the computer system hardware of the preferred embodiment, including the CPU, memory, bridge, memory bus, peripheral bus, graphics controller, capture controller, audio controller, video RAM, DAC, CRT display, video source, microphone, and speaker. The drawing labels are not recoverable from the scan.]
`
`
`
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 2 of 14          5,432,900
`
`FIGURE 2
`
`(Prior Art)
`
`Application Software
`210
`
`Graphics
`API
`212
`
`Video
`API
`214
`
`Graphics
`Driver
`218
`
`Video
`Driver
`222
`
`Audio
`API
`216
`
`Audio
`Driver
`226
`
`Graphics
`Frame Buffer
`
`220
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 3 of 14          5,432,900
`
`FIGURE 3
`
Application Software
`310
`
`
`
`Integrated Graphics/Video
`Controller (IVC)
`320
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 4 of 14          5,432,900
`
`FIGURE 4
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 5 of 14          5,432,900
`
FIGURE 5

Send command to graphics hardware
514

Send capability report command to graphics hardware/microcode

Receive capability report from graphics hardware/microcode
524

Store hardware/microcode capability information in video command interface decoder table
526
`
`
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 6 of 14          5,432,900
`
`FIGURE 6
`
`Decoder Table
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 7 of 14          5,432,900
`
FIGURE 7

Send command to graphics hardware
714

Yes-720    No-718

Send capability report command to graphics hardware
722

Receive capability report from graphics hardware/microcode
724

Store hardware/microcode capability information in video command interface decoder table
726

Compare contents of decoder table with minimum capability table
728

Are all elements of minimum capability table present in decoder table with equal or greater capability?
730

Yes-732    No-734

Send request to Application Program through API to load missing decoder
736
`
`
`
`
`
`
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 8 of 14          5,432,900
`
`FIGURE 8
`
`Minimum Capability Table
`
`Minimum Capability
`
`
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 9 of 14          5,432,900
`
FIGURE 9

Receive a request to load a video decoder. The request includes an address of the decoder, length, identification code and a capability code
912

Search decoder table for an entry with the same identification code received in the request. If more than one entry with the same identification code exists in the decoder table, take the entry with the greatest capability.
914

Is matched entry found in decoder table?
916

Compare capability code of the matched entry with capability code received in the request.
922

Use address and length in the request to transfer the decoder to a free area in the video decode block 430
930

Update the decoder table with the address, length, identification code, and capability code of the newly loaded decoder
932
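The load decision above can be modeled as follows. This Python sketch is illustrative only; the branch outcomes of the capability comparison (step 922) are not fully legible in the scan, so the policy used here (load only when the requested decoder offers strictly greater capability than a matched entry) is an assumption.

```python
def load_decoder(decoder_table, request):
    """Model of FIGURE 9: decide whether to load the decoder described
    by `request` and, if so, record it in the decoder table."""
    # Step 914: find entries with the same identification code; if more
    # than one exists, take the entry with the greatest capability.
    matches = [e for e in decoder_table if e["id"] == request["id"]]
    matched = max(matches, key=lambda e: e["capability"]) if matches else None
    # Steps 916-922 (assumed policy): skip the load when a matched entry
    # already offers equal or greater capability.
    if matched is not None and request["capability"] <= matched["capability"]:
        return False
    # Steps 930-932: transfer the decoder (modeled here as appending) and
    # record its address, length, identification code, and capability.
    decoder_table.append(dict(request))
    return True
```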
`
`
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 10 of 14          5,432,900
`
FIGURE 10

Receive video data from API
1012

Search the decoder table for an entry corresponding to the type of video data received
1014

Is a compatible decoder found?
1016

Is compatible decoder software implemented?
1024

Transfer video data to software implemented decoder in Decoder Block (430)
1030
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 11 of 14          5,432,900
`
FIGURE 11

Receive request for capability report
1112

Retrieve information in Decoder Table
1114

Send Decoder Table information to Application Program through API
1116
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 12 of 14          5,432,900
`
FIGURE 12

Mask Driver Processing
(Mask Bitmap Generation)

Receive Graphics command from the Application Program through Graphics API.
1212

Pass the Graphics command through to the Display Driver 416 without having modified the destination portion of the command. The Display Driver performs the graphics operation on a Display Bitmap.
1214

Receive command status from Display Driver in response to the graphics operation performed on the Display Bitmap.
1216

Was the Graphics command successfully processed by the Display Driver?
1218

Yes - 1222

Modify the graphics command to reference a mask bitmap maintained by the Display Driver.
1224

Send error message to application program.
1226
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 13 of 14          5,432,900
`
FIGURE 13

Send the modified Graphics command to the Display Driver. The Display Driver performs the same operation on the mask bitmap as performed earlier on the display bitmap.
1310

Receive command status from the Display Driver in response to the graphics operation performed on the mask bitmap.
1312

Send command status to the Application Program through the graphics API.
1314
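The mask-driver flow of FIGURES 12 and 13 can be sketched in Python. This is an illustrative model, not the patent's code; `display_driver` is a hypothetical stand-in for Display Driver 416, assumed to expose an `execute(command, target)` method returning a success flag.

```python
def process_graphics_command(command, display_driver):
    """Sketch of the mask driver flow of FIGURES 12 and 13."""
    # FIG. 12: pass the command through unmodified; the driver performs
    # the operation on the display bitmap (steps 1212-1214).
    ok = display_driver.execute(command, target="display")
    if not ok:
        # Step 1226: report the failure to the application program.
        return "error"
    # Step 1224 and FIG. 13 step 1310: modify the command to reference
    # the mask bitmap and repeat the same operation there.
    status = display_driver.execute(command, target="mask")
    # Step 1314: return the command status through the graphics API.
    return "ok" if status else "error"
```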
`
`
`
`
U.S. Patent          July 11, 1995          Sheet 14 of 14          5,432,900
`
FIGURE 14

Mask Driver Processing
(Clipping Region Generation)

Receive a graphics command from the Application Program through the Graphics API
1412

Interpret the graphics command to derive the location and dimensions of a visible region of a window.
1414

Generate clipping information defining the location and dimensions of the visible portion of the window. This clipping information defines a clipping region.
1416

Send the clipping information to the Blender 418 which blends video data with graphics data inside the clipping region and disables blending outside the clipping region.
1418
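The clipping behavior above can be sketched as follows. This Python model is illustrative only; the rectangle representation of the clipping region is an assumption, since the patent does not fix a data format at this level.

```python
def make_clipping_region(x, y, width, height):
    """Clipping information per step 1416: the location and dimensions
    of the visible portion of a window, here as an (x0, y0, x1, y1) box."""
    return (x, y, x + width, y + height)

def blend_enabled(clip, px, py):
    """Step 1418: blending of video with graphics is enabled inside the
    clipping region and disabled outside it."""
    x0, y0, x1, y1 = clip
    return x0 <= px < x1 and y0 <= py < y1
```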
`
`
`
`
`
`
`
INTEGRATED GRAPHICS AND VIDEO COMPUTER DISPLAY SYSTEM

This is a continuation of application Ser. No. 07/901,280, filed Jun. 19, 1992, now abandoned.
`
BACKGROUND OF THE INVENTION

1. Field Of The Invention

The present invention relates to the field of computer display systems. Specifically, the present invention pertains to the manipulation and control of graphics and video data for display on a display medium. The present invention further relates to the control and manipulation of audio data that is output in synchronization with images on a display screen.

2. Prior Art

Many conventional computer graphics display systems exist in the prior art. These systems provide mechanisms for manipulating and displaying graphics objects on a display screen. These graphic objects include points, vectors, conical shapes, rectangles, polygons, arcs, alphanumeric information, shaded or color-filled regions, and other forms of graphic imagery, typically represented as objects in a two dimensional or three dimensional virtual space. A graphics computer program in these conventional systems provides mechanisms by which a computer user may manipulate graphic objects using functions provided by the graphics software. This graphics software typically includes a graphics driver software module that operates in conjunction with a graphics controller to load a frame buffer with digital information that is subsequently converted to a form displayable on a display screen. The use of graphics driver software, graphics controllers, and frame buffers is well known to those of ordinary skill in the art.

Other prior art computer display systems provide a means for displaying video images on a display screen. These video images comprise streams of digital video data encoded in several well known formats such as MPEG, JPEG, and RTV format. Unlike graphical data, this video data includes no representations of individual objects within the video image. To the computer display system, the video image is a homogeneous video data stream encoded in a particular way. In prior art systems, this video data is received by processing logic that includes a particular mechanism for decoding the video data and transferring the decoded video data into a frame buffer for display on the display screen. Because of the different nature of graphical data as compared with video data, prior art systems tend to handle graphical and video data in separate subsystems within the computer display system. Thus, graphical data and video data typically take parallel and independent paths through prior art systems as the data is processed for display. In some cases, two separate frame buffers are used, one for graphical data and one for video data. In other systems, a single frame buffer is used; however, graphical data occupies one distinct region of the frame buffer and video data occupies a different portion of the frame buffer.

A number of problems exist in prior art systems that process graphical and video data independently. First, these prior art systems cannot efficiently combine graphical and video images together into a composite form that is easily manipulated. For example, scrolling a graphical image across a video background typically requires additional processing to prevent the image from being destroyed. Secondly, synchronizing graphics and video images in prior art systems is typically very difficult. Synchronization problems in prior art systems result in video images that appear torn or disjoint. Moreover, aligning graphics images at the proper location and time in a video image is difficult using prior art techniques. Thirdly, the video data decoding schemes used in prior art systems are typically limited to a single decoding scheme. For example, less expensive graphics programs may provide software implemented video data decoders. Although these systems provide an inexpensive solution, they tend to run slowly and often provide decoded video data of a low resolution. Other more expensive prior art systems provide graphics hardware that may be used to decode a video data stream. These systems are fast and provide high resolution decoded video data; however, they are also substantially more expensive. Moreover, users wanting to upgrade from a less expensive system, such as a software implemented decoder, to a more expensive system, such as a graphics hardware implemented system, must first reconfigure or reinstall their graphics application program in order to take advantage of a different video decoding technique. This additional impact at the applications program level further increases the upgrade cost to the computer user.

Thus, a better means for integrating graphical and video information in a single computer display system is needed.
`
SUMMARY OF THE INVENTION

The present invention integrates graphical, video, and audio data into a single processing environment. Therefore, the present invention includes a processor (CPU), a graphics controller, a capture controller, and an audio controller. The graphics controller is a specialized graphics control processor coupled to a peripheral bus. The graphics controller, operating under the software control of the CPU, performs processing and compositing of the graphics and video data to be displayed on a CRT display. This software control methodology is the subject of the present invention and is described in detail herein.

Application software interfaces with system software of the present invention through a graphics application program interface (API), a video API, and an audio API. These APIs provide a high level functional interface for the manipulation of graphics, video, and audio information. Unlike the prior art, however, the present invention employs an integrated graphics/video controller (IVC). The IVC interfaces with application software through the graphics API and the video API. The output data generated by the IVC is a composite graphics and video image which is stored for output to a display device in a frame buffer. The IVC of the present invention also interfaces with an audio driver. The IVC provides the audio driver with synchronization information used to synchronize a video image stream with an audio data stream. The IVC receives timing information from a display timing control circuit. The timing information provided by the display timing control circuit is used by the IVC for synchronizing the loading of the frame buffer. The display timing control circuit provides a means for optimizing the operation of the IVC for the timing characteristics of a particular display system that is used with the present invention.
`
`
`
`
The IVC receives graphics instructions through the graphics API and video instructions through the video API. The graphics API interface provides a means by which an application program may interface with a display driver for displaying graphics images on a display screen. A graphics command interface and a mask driver derive information from the graphics commands provided by an application program prior to transferring the graphics information to a display driver. Three types of information derived from the graphics command stream are produced by the mask driver. First, clipping information is derived by the mask driver. Clipping information is used to enable the display of graphics information within a specified window on the display screen. A window is a rectangular portion of the display screen having particular attributes that are well known in the art. Clipping information is used to enable the display of graphics information within the window and to suppress the display of information outside the boundary of the window. The second type of information derived by the mask driver from the graphics command stream is the actual graphic content of a specified window. The third type of information derived by the mask driver is mask information. A blender uses the mask information, the graphics information, and the clipping information for combining or compositing graphics information with video information.
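One per-pixel reading of the blender's rule can be sketched in Python. The exact blend rule is not spelled out at this level of the text, so the selection policy below (within the clipping region the mask chooses video over graphics; outside the region only graphics is shown) is an assumption for illustration.

```python
def composite_pixel(graphics_pixel, video_pixel, mask_bit, inside_clip):
    """Select the source for one output pixel using the mask information
    and the clipping information, per the assumed policy above."""
    if inside_clip and mask_bit:
        return video_pixel
    return graphics_pixel
```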
The IVC also receives video image information through the video API. The video API provides a means by which an applications program may interface with the IVC for the purpose of manipulating video images for display on a display screen. A video command stream is received by a video command interface through the video API. The video command stream comprises video commands for configuring the operation of the IVC and video data which comprises a combination of video and audio data with synchronization information for synchronizing the display of frames of video images.

The video commands of the video command stream provide functions for configuring the operation of the IVC. These functions include commands for loading software video decoders in a decoder block within the IVC. Video data transferred to the IVC via the video API may be encoded in a variety of different formats. The present invention provides a means for dynamically loading a plurality of different video decoders through a video command interface to a video decode block within the IVC. Each of the independent decoders within the decode block contains processing logic for decoding a particular type of video data to produce a uniform type of decoded video data which is provided to the blender. The blender receives the decoded video data and combines the video data with graphics data as defined by the mask information, the graphics information, and the clipping information. The combined or composited image is then transferred to a display driver and output to a display through a frame buffer.
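The dynamic-loading arrangement described above can be sketched as a small registry. This Python model is illustrative only and not the patent's implementation; the class and method names are assumptions.

```python
class DecodeBlock:
    """Sketch of the IVC's decode block: decoders are loaded dynamically
    and keyed by the format they handle; each produces the same uniform
    decoded form for the blender."""

    def __init__(self):
        self._decoders = {}

    def load(self, fmt, decoder):
        # Analogue of loading a software decoder through the video
        # command interface.
        self._decoders[fmt] = decoder

    def decode(self, fmt, encoded):
        # Dispatch the encoded data to the decoder for its format.
        if fmt not in self._decoders:
            raise KeyError("no decoder loaded for format: " + fmt)
        return self._decoders[fmt](encoded)
```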
`
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`FIG. 1 is a block diagram of the computer system
`hardware used in the preferred embodiment.
`FIG. 2 is a block diagram of the prior art software
`used in a system for processing graphics, video, and
`audio data.
`FIG. 3 is a block diagram of the software used in the
`preferred embodiment of the present invention.
`
`
`FIG. 4 is a block diagram of the integrated gra-
`phics/video controller (IVC).
`FIGS. 5-11 are flowcharts illustrating the processing
`logic of the video command interface.
`FIGS. 12-14 are flowcharts illustrating the process-
`ing logic of the mask driver.
`DETAILED DESCRIPTION OF THE
`PREFERRED EMBODIMENT
`
`The present invention is an integrated graphics and
`video display system for manipulating and viewing both
`graphical and video data in a single integrated system.
`In the following description, numerous specific details
`are set forth in order to provide a thorough understand-
`ing of the invention. However, it will be apparent to one
`of ordinary skill in the art that these specific details need
`not be used to practice the present invention. In other
`instances, well known structures, circuits, and inter-
`faces have not been shown in detail in order not to
`obscure unnecessarily the present invention.
`Referring now to FIG. 1, a block diagram of the
`computer system hardware used in the present embodi-
`ment is illustrated. The computer system used in the
`preferred embodiment comprises a memory bus 100 for
`communicating information, a processor (CPU) 102
`coupled with memory bus 100 for processing informa-
`tion, and a memory 104 coupled with bus 100 for storing
`information and instructions for CPU 102 and other
`components of the computer system. In the preferred
`embodiment, memory bus 100 is a high speed bus which
`is well known to those of ordinary skill in the art. In the
`preferred embodiment, CPU 102 is an i486® brand
`microprocessor manufactured by the assignee of the
`present invention. i486® is a registered trademark of
`Intel Corporation, Santa Clara, Calif. Memory 104 in
`the preferred embodiment comprises dynamic random
`access memory (DRAM) and read only memory
`(ROM). Coupling a CPU 102 and a memory 104 to-
`gether on a memory bus 100 as shown in FIG. 1 is well
`known in the art.
`FIG. 1 also illustrates a peripheral bus 110 coupled to
`memory bus 100 via a bridge 112. The use of a bridge
`between two computer buses is well known in the art.
`Similarly, the use of a peripheral bus such as peripheral
`component interface (PCI) is described in co-pending
`patent application Ser. No. 07/886,992.
`A disc drive 114 and a network interface or other
`input/output device 140 may also optionally be coupled
`to peripheral bus 110. Again, disc drive and network
`interface apparatus and their connection to a peripheral
`bus 110 is well known in the art.
`
The present invention integrates graphical, video, and audio data into a single processing environment. Therefore, the present invention centers around the interaction between CPU 102 and graphics controller 116, capture controller 118, and audio controller 120. Graphics controller 116 is a specialized graphics control processor coupled to peripheral bus 110. Graphics controller 116, operating under the software control of CPU 102, performs processing and compositing of the graphics and video data to be displayed on CRT display 126. This software control methodology is the subject of the present invention and is described in detail below. The hardware platform in which the present invention operates is described in patent application Ser. No. 07/901,519, now U.S. Pat. No. 5,243,447, filed concurrently with the present patent application. In general, this hardware platform comprises graphics controller
`
`
`
`
`116 to which a video random access memory (VRAM)
`122 is coupled. Video RAM 122 provides storage for a
`frame buffer which represents a binary encoded repre-
`sentation of the picture elements (pixels) that are dis-
`played on CRT display 126. Video RAM 122 is used for
`storage of both graphical information and video infor-
`mation. Using the processing techniques described
`herein and the hardware platform described in the
`above-referenced co-pending patent application, both
`graphical and video data may be composited (i.e. com-
`bined) into a single frame buffer within video RAM 122.
`This composite graphical and video image is converted
`to analog form by digital to analog converter (DAC)
`124 and output to CRT display 126. In this manner,
`applications programs executing within CPU 102 can
`use the processing logic of the present
`invention to
`display composited graphical/video images in arbitrary
`locations on CRT display 126.
`Capture controller 118 is coupled to peripheral bus
`110. Capture controller 118 is a hardware system for
`receiving and processing video display data from video
`source 128. Capture controller 118 receives video im-
`ages from video source 128 and generates video image
`data packets for transmission on peripheral bus 110. The
`video image data packets comprise still video images or
`a series of motion video images which include synchro-
`nization information. The video data may be moved on
`peripheral bus 110 into graphics controller 116 for com-
`positing with graphics data or moved to CPU 102 or
`memory 104 via bridge 112 and memory bus 100. Video
`source 128 is a video camera, a video recording device,
`or other source for video image data. The structure and
`detailed operation of capture controller 118 is beyond
`the scope of the present invention.
`Audio controller 120 is also coupled to peripheral bus
`110. Audio controller 120 provides a means for receiv-
`ing and processing audio data from a microphone 132 or
`other audio input device. Audio controller 120 gener-
`ates a series of audio data packets for transfer on periph-
`eral bus 110. The audio data packets represent a series of
`sounds retrieved from microphone 132 or other audio
`input device. Audio controller 120 also provides syn-
`chronization information with each audio data packet.
`Audio controller 120 also provides a means for decod-
`ing audio data received via peripheral bus 110 and out-
`putting audio signals to speaker 130 or other audio out-
`put device. The structure and detailed operation of
`audio controller 120 is beyond the scope of the present
`invention.
`
Referring now to FIG. 2, a block diagram illustrates the prior art method for processing graphics, video, and audio data. Application software 210, residing in memory 104 and executed by CPU 102, comprises user level programming tools for manipulating and processing graphics, video, and audio data. Several application programs of this type exist in the prior art. An example of one such application program is AutoDesk Animator TM for Microsoft Windows TM developed by AutoDesk TM, Inc. Application software 210 interfaces with system software through several application program interfaces (API). APIs provide a means by which an application program may request service from system software through the functions provided in an API. In the prior art, three APIs are generally provided: 1) a graphics API, 2) a video API, and 3) an audio API. As illustrated in FIG. 2, graphics API 212 provides an interface between application software 210 and graphics driver 218. Graphics API 212 provides a list of public functions and corresponding calling parameters with which application software 210 may create and manipulate graphics objects that are stored for display in graphics frame buffer 220. Graphics objects typically include points, vectors, conical shapes, polygons, rectangles, blocks of text, arcs, and other shapes that may be represented in a two or three dimensional virtual space.

In a similar manner, video API 214 provides an interface between application software 210 and video driver 222. Video API 214 provides a set of functions for manipulating video image data that is processed by video driver 222 and eventually stored for output to a display screen in video frame buffer 224. Because prior art systems typically used graphic display hardware which was independent of video display hardware, the processing of graphic and video data in prior art systems was typically done using independent and distinct hardware and software processes. This prior art design is therefore considered a loosely coupled or low integration system, because the graphic and video data is not integrated as it travels from the application software 210 to a display screen (not shown). Rather, in the prior art systems, graphic and video data travels two parallel paths through the processing architecture.

Audio API 216 provides an interface between application software 210 and audio driver 226. Audio API 216 provides a set of interfaces for manipulating audio information that is processed by audio driver 226 and stored in audio output buffer 228 for eventual output to an audio emission system (not shown). The audio API 216 provides a set of audio related functions with which application software 210 may manipulate audio information. Again, the loosely coupled or low integration approach used in the prior art is apparent as audio information travels an independent path from application software 210 to audio output buffer 228. The independent path traveled by audio data in the prior art systems is less of a performance penalty, however, because audio information is destined for a different output device than graphics and video data. Both graphics and video data, on the other hand, are typically displayed on the same display device.

Several problems exist with the low integration approach used in the prior art. Because graphics and video information is processed using separate processes, prior art systems do not provide an efficient means for combining graphics and video images together in a composite form. Secondly, because certain items of hardware in the prior art design are dedicated to processing either graphics or video data, some items of the prior art design must be duplicated for processing both graphics and video data. Some of these duplicated hardware elements include frame buffer memory 220 and 224. Because of the independent paths traveled by graphics and video data in the prior art system, other hardware logic is required to handle the parallel paths. This additional logic includes multiplexing logic which determines which frame buffer's data is to be displayed for each pixel location. Moreover, additional synchronization logic is required to synchronize the output of graphics and video data as the information is merged just before output to the display device. Thus, the prior art design suffers from low graphics/video integration functionality and increased hardware cost due to the low integration design.

Referring now to FIG. 3, the structure of the system software used in the present invention is illustrated.
`