United States Patent [19]

[11] Patent Number: 5,710,895

Gerber et al.

[45] Date of Patent: Jan. 20, 1998

US005710895A

[54] METHOD AND APPARATUS FOR
CAPTURING AND COMPRESSING VIDEO
DATA IN REAL TIME
5,471,577 11/1995 Lightbody et al. .............. 395/340
5,512,503  3/1996 Koz ........................... 348/552
5,508,940  4/1996 Rossmere et al. ........... 395/806 X
OTHER PUBLICATIONS

Roberts, "Putting Pictures in PCs", Tech PC User, Apr. 7, 1989.
Leonard, "Compression Chip Handles Real-Time Video and Audio", Electronic Design, Dec. 1990.
Floyd, "Data Compression and the AVK", Dr. Dobb's Journal, 1992.
Green, "Capturing Digital Video Using DVI; Multimedia and the i750 Video Processor", Dr. Dobb's Journal, Jul. 1992.
Primary Examiner—Raymond J. Bayerl
Attorney, Agent, or Firm—Blakely, Sokoloff, Taylor & Zafman
[57] ABSTRACT

Operating in cooperation with a shared software interface, a driver controls the operations of a video hardware device that captures and compresses video data in real time. This driver controls capture and compression of the video data in response to a capture message call through control of a handshaking scheme between a host processor within the computer system and an auxiliary processor located in a video hardware device.

15 Claims, 6 Drawing Sheets
[75] Inventors: Richard Gerber, Hillsboro; Joshua Hemphill, Beaverton, both of Oreg.

[73] Assignee: Intel Corporation, Santa Clara, Calif.

[21] Appl. No.: 742,925

[22] Filed: Oct. 1, 1996
Related U.S. Application Data

[63] Continuation of Ser. No. 215,918, Mar. 22, 1994, abandoned.

[51] Int. Cl.6 ................................. G06T 1/20; G06F 3/14
[52] U.S. Cl. .......................... 395/327; 395/972; 395/503; 395/504; 395/520; 348/552
[58] Field of Search ..................................... 395/327, 972, 395/328, 340, 806, 807, 503, 504, 520, 511, 514, 526; 348/552
[56] References Cited

U.S. PATENT DOCUMENTS

4,698,770 10/1987 Rattan et al. .............. 395/392
5,191,645  3/1993 Carlucci et al. ........... 395/328
5,203,745  5/1993 Quentin et al. .......... 395/806 X
5,355,450 10/1994 Gannon et al. ............ 395/501
(Front-page drawing: block diagram of the video hardware 17, showing the host processor, the I/O device 18, the digitizer, the auxiliary processor 37 with DRAM 41 containing the first buffer 50, the second buffer 51, the compression register 52 and the temporary storage element 53, and the VRAM 38 with compression storage elements 40a and 40b.)
U.S. Patent          Jan. 20, 1998          Sheet 1 of 6          5,710,895

(FIG. 1, Prior Art: flowchart of the VFW-based capture application. Step 101: initialize the capture driver. Step 102: set the format for the captured video data. Step 103: customize capture format? Step 104: transmit the message call concerning the capture format to the capture driver. Step 105: capture video data? Step 106: signal the capture driver to unload information therein. Step 107: power down the video hardware, then END. Step 109: transmit capture frame message call and predetermined parameters. Step 110: transmit stream message call and predetermined parameters. Step 111: start stream. Step 112: stop stream?)
U.S. Patent          Jan. 20, 1998          Sheet 2 of 6          5,710,895

(FIG. 2: block diagram of the interactive video computer system, showing the main memory of the computer system and the attached video hardware.)
U.S. Patent          Jan. 20, 1998          Sheet 3 of 6          5,710,895

(FIG. 3: communication path from the conventional VFW-based capture application 31 through to the video hardware 17, including the digitizer and the I/O device.)
U.S. Patent          Jan. 20, 1998          Sheet 4 of 6          5,710,895

(FIG. 4: the layered structure 45, 46, 47 of the capture and compression driver. FIRST LAYER: capture messages. SECOND LAYER: capture control and interrupt service routine. THIRD LAYER: hardware communication.)
U.S. Patent          Jan. 20, 1998          Sheet 5 of 6          5,710,895

(FIG. 5: block diagram of the host and auxiliary processors undergoing the handshaking operations controlled by the capture and compression driver, showing the digitizer, the compression register, the first and second buffers, the temporary storage element and the compression storage elements within the video hardware.)
U.S. Patent          Jan. 20, 1998          Sheet 6 of 6          5,710,895

(FIG. 6: flowchart of the interrupt service routine. START. Step 200: save register state. Step 201: clear the PIC. Step 202: capture interrupt? Step 203: clear the interrupting condition and reassert hardware interrupts. Step 204: capture even or odd fields? Step 205: update stream time. Step 206: time to encode next frame? Step 207: update compression. Step 208: hardware performance too slow. Step 210: update stream time. Step 211: microcode? Step 213: save state of interrupt flag. Step 214: enable interrupts. Step 215: compressed video data is read by the host processor. Step 216: restore interrupt flag. Step 217: return.)
METHOD AND APPARATUS FOR CAPTURING AND COMPRESSING VIDEO DATA IN REAL TIME

This is a continuation of Application Ser. No. 08/215,918, filed Mar. 22, 1994, now abandoned.
BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a method and apparatus for capturing and compressing video data. More particularly, the present invention relates to a driver operating in cooperation with VIDEO FOR WINDOWS, a software interface manufactured by Microsoft® Corporation, to drive specifically configured hardware to capture and compress video data in real time.

2. Background of the Field

It is commonly known that video equipment (e.g., video camcorders, video recorders, laser disks, etc.) has enabled persons to record or "capture" video data (i.e., audio and visual data) onto a storage medium such as videotape. The storage medium is commonly "played back" by the video equipment, which transmits the captured video data into a cathode ray tube for viewing purposes. This captured video data is affixed to the storage medium, thereby preventing any alteration of the captured video data except, of course, by erasing it completely from the storage medium.
Due to the growing popularity of video equipment and multi-media products over the last few years, businesses have begun to realize the commercial advantages in establishing an efficient interactive relationship between video equipment and computers. Recently, Microsoft Corporation has developed a software interface entitled VIDEO FOR WINDOWS (the "VFW interface") which enables video equipment to operate in conjunction with computers using a WINDOWS™ operating system. Thus, a software capture application using the VFW interface (hereinafter referred to as a "VFW-based capture application") generates and transmits a capture message call to the VFW interface which, in turn, is usually routed to one of a plurality of drivers, namely a capture driver operating under software control, to drive video hardware specifically configured to capture video data.
As shown in FIG. 1, operational steps undergone by the VFW-based capture application are illustrated, wherein certain VFW capture message calls associated with these operational steps are listed in Table 1 of the Appendix. Upon activation by the user, the VFW-based capture application initializes the capture driver through pre-selected initialization capture message calls (Step 101), including but not limited to those listed in Table 1. Next, the format for the captured video data (i.e., the capture format) is configured according to predetermined default values (Step 102). Thereafter, the VFW-based capture application performs one of four user-selected alternatives; namely, customizing the capture format (Step 104), terminating the video capture application (Steps 106-107), capturing video data video frame-by-video frame (Step 109) or capturing video data in a stream of video frames (Steps 110-112).

To customize the capture format, the VFW-based capture application generates and transmits a capture message call (e.g., a "DVM_DIALOG" message call per Table 1) to the capture driver in order to open a dialog box (Step 104). The dialog box allows user-modification of video quality, key frame rate and other custom video options.

The second alternative is to exit the VFW-based application (Step 105). This causes a message to be sent to the
capture driver informing the capture driver to unload information contained therein (Step 106) and power down the video hardware coupled to the computer system (Step 107).

The VFW-based capture application offers two more alternatives associated with capturing video data. These alternatives demonstrate two different modes of capturing video data; namely, a stream mode where multiple video frames are being captured sequentially and a frame mode where only one video frame of video data is captured at a time. A video frame has a maximum height of 240 scan lines according to NTSC standards. In Step 108, the frame mode is commenced upon receipt of a frame capture message call (a "DVM_FRAME" capture message call per Table 1) and parameters included with the frame capture message call which provide the capture driver information relevant to its operations, such as storage location information (Step 109). In Steps 110-112, the stream mode is commenced upon receipt of a stream capture message call (a "DVM_STREAM_INIT" capture message call per Table 1) and, similar to the frame mode, pre-selected parameters necessary to perform proper stream capture operations are transmitted therewith.
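The frame- and stream-mode message flows described above can be modeled with the short sketch below. The `CaptureDriver` object and its `send` method are hypothetical stand-ins for the VFW message path; only the message names ("DVM_FRAME", "DVM_STREAM_INIT") come from Table 1.

```python
# Behavioral sketch of the two capture alternatives described above.
# The CaptureDriver class and its send() method are hypothetical
# stand-ins for the VFW message path; only the message names
# ("DVM_FRAME", "DVM_STREAM_INIT") come from Table 1.

class CaptureDriver:
    """Records each capture message call it receives."""
    def __init__(self):
        self.log = []

    def send(self, message, **params):
        self.log.append((message, params))

def capture_one_frame(driver, storage_location):
    # Frame mode (Steps 108-109): one DVM_FRAME call per video frame,
    # carrying parameters such as the storage location.
    driver.send("DVM_FRAME", storage=storage_location)

def capture_stream(driver, frame_rate):
    # Stream mode (Steps 110-112): DVM_STREAM_INIT carries the
    # pre-selected parameters needed for stream capture.
    driver.send("DVM_STREAM_INIT", frame_rate=frame_rate)
```

In both modes the application supplies the parameters with the message call itself, which is why the driver needs no separate configuration step before capturing.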
`
`interactive video computer systems,
`In conventional
`video data is captured when the VFW-based capture appli-
`cation transmits a capture message call having a number of
`parameters through the VFW interface and into a conven-
`tional capture driver. These parameters include information
`necessary to properly capture video data including, but not
`limited to, its capture mode (i.e., frame or stream), the video
`frame size and a storage location for the captured video data.
`Upon receipt of the capture message call, the conventional
`capture driver initiates control signals to the specifically
`configured hardware to capture the video data. Thereafter,
`the VFW-based capture application writes the captured
`video data to disk. Subsequently,
`the same or perhaps
`another VFW-based application requests the captured video
`data to be retrieved from disk, uses a compression driver to
`encode (i.e., compress) the captured video data and then
`writes the compressed video data back to the disk. This
`conventional interactive video computer system has many
`disadvantages.
`A first disadvantage associated with the conventional
`interactive video computer system is that video data is not
`captured in real-time, but rather includes extraneous steps in
`writing and retrieving the captured video data to and from
`disk. As a result, this conventional interactive video com-
`puter system experiences enormous time delays in the range
`of several minutes to several hours caused by these extra-
`neous steps.
Another disadvantage is that the conventional interactive video computer system requires a hard disk with a large memory space, generally hundreds of megabytes, in order to store the captured video data in uncompressed form. Due to this large memory requirement, the overall costs associated with the conventional interactive video computer systems are quite substantial, making the systems unaffordable to a large section of the public.

In an effort to overcome these disadvantages, prior to the subject application, Intel Corporation of Santa Clara, Calif. had designed and developed a driver for the VFW interface which combined capture and compression operations (hereinafter referred to as the "AVK driver"). However, the AVK driver also had many disadvantages associated therewith. One such disadvantage was that the AVK driver is extremely complex and unreliable, requiring at least fifty installation files to be installed in the interactive video
computer system before functioning. Another disadvantage was that it was extremely costly to implement since it required a relatively large memory space, approximately two megabytes of main memory and two megabytes of video random access memory ("VRAM"). This large memory space was necessary to store a large number of micro-code files therein. As a result, the AVK driver was incapable of operating in real time. In fact, the AVK driver was extremely slow, requiring up to five seconds before the first picture was displayed because the loading of installation files into the interactive video system was extremely time consuming. Besides requiring a large memory space of main memory and VRAM, the AVK driver also required EMS memory space, which is a scarce resource in typical microcomputers.
BRIEF SUMMARY OF THE INVENTION

In light of the foregoing, it is appreciated that a simple driver capturing and compressing data in real time and requiring minimal memory would be advantageous in the marketplace by allowing any computer user to immediately capture data without purchasing additional memory or a computer system having large memory capabilities. Therefore, the present invention provides a driver under control by the VFW-based capture application to control the capture of video data transmitted into video hardware by an I/O device and to compress the video data in real time (a "capture and compression driver"). One embodiment of the present invention comprises a driver which controls video hardware including an auxiliary processor to capture and compress video data based on particular capture message calls transmitted therein by a VFW-based capture application. Such capture and compression of video data is accomplished through a handshaking scheme controlled by the driver between a host processor within the computer system and the auxiliary processor within the video hardware.
BRIEF DESCRIPTION OF THE DRAWINGS

The features and advantages of the present invention will become apparent from the following detailed description of the present invention in which:

FIG. 1 is a flowchart of the operations of a VFW-based capture application.

FIG. 2 is a block diagram of the interactive video computer system employing the capture and compression driver.

FIG. 3 is a detailed block diagram of a communication path of the capture message call being transmitted within the interactive video computer system employing the capture and compression driver.

FIG. 4 is a block diagram of the structural organization of the capture and compression driver.

FIG. 5 is a block diagram of the auxiliary and host processors undergoing handshaking operations being controlled by the capture and compression driver.

FIG. 6 is a flowchart of the interrupt service routine supporting the handshaking operations illustrated in FIG. 5.

DETAILED DESCRIPTION OF THE INVENTION
The present invention describes a driver, operating in cooperation with the VFW interface, which captures and compresses video data in real time to enhance system performance. The detailed description which follows is presented largely in terms of tables and algorithms which are the means used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art. An algorithm is generally conceived to be a series of steps leading to a desired result. These steps require physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared or otherwise manipulated. It further should be noted that there exist some instances where well-known steps are not set forth in detail in order to avoid unnecessarily obscuring the present invention.
Referring to FIG. 2, an embodiment of an interactive video computer system 10 utilizing the present invention is illustrated, in which the present invention may be best described as an ISA plug-in adapter for an IBM compatible personal computer with an INTEL® 80X86 processor. The computer system 10 generally comprises a system bus 11 including address, data and control buses for communicating information between a plurality of bus agents, including at least a host processor 12. The host processor 12 is an Intel® 80x86-compatible processor; however, the present invention may be utilized in any type of processor. Although only the host processor 12 is illustrated in this embodiment, it is contemplated that multiple processors could be employed within the computer system 10.

As further shown in FIG. 2, the system bus 11 provides access to a memory subsystem 13, a video subsystem 14 and an input/output ("I/O") subsystem 15. The memory subsystem 13 includes a memory element 16 such as a dynamic random access memory ("DRAM") 16 in FIG. 2, but such memory element may include any type of memory such as read only memory ("ROM"), static random access memory ("SRAM") and the like. The memory element 16 stores programs executed by the host processor 12 such as the VFW-based capture application. It is contemplated that a memory controller could be used as an interface between the system bus 11 and a variety of memory elements to control access thereto.
The video subsystem 14, shown coupled to the system bus 11 but which may be coupled to an I/O bus 20, includes a video hardware device 17 coupled to both the host processor 12 and a video input device 18 which enables video data to be inputted into the computer system 10. Such video input device 18 may include a video camcorder, laser disk player, video recorder and any other similar video devices. A preferred embodiment of the video hardware device 17 is illustrated in FIG. 3 discussed below.

The I/O subsystem 15 may include an I/O bridge 19 as an interface between the I/O bus 20 and the system bus 11 which provides a communication path (i.e., gateway) for the computer system 10 to transfer information to peripheral devices on the I/O bus 20 including, but not limited to, a display device 21 (e.g., cathode ray tube, liquid crystal display, etc.) for displaying images; an alphanumeric input device 22 (e.g., an alphanumeric keyboard, etc.) for communicating information and command selections to the host processor 12; a cursor control device 23 (e.g., a mouse, track ball, etc.) for controlling WINDOWS™ placement; a mass data storage device 24 (e.g., magnetic tapes, hard disk drive, floppy disk drive, etc.) for storing information and instructions; and a hard copy device 25 (e.g., plotter, printer, etc.) for providing a tangible, visual representation of the information. It is contemplated that the computer system shown in FIG. 2 may employ some or all of these components or different components than those illustrated.

Referring to FIG. 3, the propagation of a capture message call through the interactive video computer system 10 of the
present invention is illustrated in detail. In order to capture video data, a VFW-based capture application 31 (e.g., a published application entitled VIDCAP™ by MICROSOFT® Corporation) is executed by the host processor to generate a capture message call. The capture message call is transmitted to a library "MSVIDEO™" 32 and then through the VFW interface 33 to a capture and compression driver (hereinafter referred to as the "CC driver") 34.

Monitoring the VFW interface 33 in a periodic or continuous manner, the CC driver 34 receives the capture message call and outputs an appropriate control signal along a bi-directional communication path 35 to the video hardware 17 (e.g., an ACTIONMEDIA™ II card or an INTEL® SMART VIDEO RECORDER™), being coupled to the video input device 18. This communication path 35 is bi-directional to allow the video hardware 17 to return result codes such as error codes, acknowledgment signals and the like to the VFW-based capture application 31.
As further shown in FIG. 3, the video hardware 17 comprises a digitizer 36 (e.g., an IBM® CS2™), an auxiliary processor 37 such as an INTEL® pixel processor (i750PB™) having dynamic random access memory 41, and video random access memory ("VRAM") 38, coupled together through a video bus 42. These components are provided within the ACTIONMEDIA™ II card or the INTEL® SMART VIDEO RECORDER™ (ISVR™) currently being sold by Intel Corporation. In response to a control signal by the CC driver 34, the digitizer 36 captures the video data from the video input device 18 and stores the captured video data in a first storage element 39a of a plurality of linked-list capture storage elements 39a-39d (e.g., bitmaps, buffers, etc.) located within the VRAM 38. In this embodiment, there exist four (4) linked-list capture storage elements 39a-39d, although any number of capture storage elements may be used, depending on a particular interactive video computer system design. After the first capture storage element 39a is full, the digitizer 36 stores the captured video data in a successive capture storage element 39b, for example, and proceeds to store the captured video in another capture storage element 39c when the successive capture storage element 39b is full, and so on. The digitizer 36 reverts back to the first capture storage element 39a after the last of the capture storage elements 39d is full. Such storage is generally continuous and repetitious in nature.
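A minimal model of this round-robin storage pattern, with the four linked-list elements represented simply by their reference labels (the class and method names are illustrative only):

```python
# Model of the digitizer cycling through the linked-list capture
# storage elements 39a-39d in VRAM: each new frame goes into the
# next element, and the cycle reverts to the first element after
# the last one is filled.

class CaptureRing:
    def __init__(self, elements=("39a", "39b", "39c", "39d")):
        self.elements = list(elements)
        self.index = 0

    def store(self, frame):
        """Return the label of the element that receives this frame."""
        element = self.elements[self.index]
        # Advance to the successive element, wrapping around.
        self.index = (self.index + 1) % len(self.elements)
        return element
```

The wrap-around makes the storage "continuous and repetitious", so the host and auxiliary processors must drain elements before the digitizer comes back around to them.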
The auxiliary processor 37, and in fact the video hardware 17 in general, is controlled by the CC driver 34 in its allocation of storage elements for captured video data as well as compression of the video data through handshaking with the host processor 12. Briefly, in the preferred embodiment, the auxiliary processor 37 compresses the captured video data stored in the plurality of capture storage elements 39a-39d within VRAM 38 upon signaling by the host processor 12 after detection of a capture interrupt as shown in FIG. 6. The auxiliary processor 37 places the resulting compressed video data in one of at least two compression storage elements 40a and 40b in the VRAM 38. These compression storage elements 40a and 40b are also accessible by the host processor 12.
Referring now to FIG. 4, a more detailed illustration of the CC driver 34 is provided. The CC driver 34 is a combination of software programs forming at least three separate layers. A first layer 45 comprises a program responsive to numerous capture message calls from the VFW-based capture application 31, including opening a dialog box (i.e., a control panel, video effects, etc.). Basically, this program is a switch statement initiating procedural calls to various procedures within the second and third layers 46 and 47 of the CC driver 34 and for maintaining and filling the linked-list of capture storage elements 39a-39d used in capturing video data.
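The switch-statement behavior of the first layer can be sketched as a dispatch table; the handler functions and their return values below are illustrative placeholders, not part of the patent, while the message names come from Table 1.

```python
# Sketch of the first layer's switch statement: each capture message
# call is routed to a procedure in the lower layers. The handler
# functions and their return values are hypothetical placeholders.

def open_dialog(params):
    return "dialog"            # e.g., control panel, video effects

def start_frame_capture(params):
    return "frame"

def init_stream_capture(params):
    return "stream"

DISPATCH = {
    "DVM_DIALOG": open_dialog,
    "DVM_FRAME": start_frame_capture,
    "DVM_STREAM_INIT": init_stream_capture,
}

def first_layer(message, params=None):
    # A message with no entry falls through, as a switch default would.
    handler = DISPATCH.get(message)
    return handler(params or {}) if handler else "unsupported"
```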
A second layer 46 is used to store capture control programs including, but not limited to, an interrupt service routine which is discussed below in more detail. Moreover, the second layer 46 further includes control structures which maintain the capture storage elements 39a-39d where the captured video data is being saved.

A third layer 47 of the CC driver 34 includes those programs needed for the CC driver 34 to directly communicate with the video hardware 17 through control signals corresponding to the capture message calls received from the video capture application 31. These control signals signal the video hardware 17 where to place the captured video data in VRAM 38 and when the auxiliary processor 37 should begin compressing the captured video data.
In the preferred embodiment of the present invention, the CC driver 34 provides stream and frame capture modes under two control methods, namely host processor control having subsampled compression only and auxiliary processor control which uses micro-code for additional compression purposes, as illustrated in Table 2 immediately below. In determining which control method to employ, the CC driver 34 needs to ascertain from the capture message call whether the video data (in video image form) to be captured has a height greater than 240 scan lines, equivalent to one video field defined by NTSC standards.
TABLE 2

         AUXILIARY PROCESSOR           HOST PROCESSOR
         CONTROLLED                    CONTROLLED

STREAM   • Digitize every frame        • Digitize the video data
           of video data and             as fast as possible.
           store in VRAM.              • Transfer multiple frames
         • Compress the frames           of video data from VRAM
           of video data at a            at the frame rate.
           frame rate set by the
           user.

FRAME    • Digitize the video data     • Digitize the video data
           as fast as possible.          as fast as possible.
         • Compress the frame          • Transfer the video frame
           of video data at a            only after a capture
           frame rate of 30 fps.         frame message has been
                                         received.

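The decision between the two control methods reduces to a one-line test against the NTSC field height; the function name below is illustrative:

```python
# Sketch of the control-method decision: a video image whose height
# exceeds one NTSC video field (240 scan lines) is handled under host
# processor control (subsampled compression only); otherwise the
# auxiliary processor performs the compression with micro-code.

NTSC_FIELD_HEIGHT = 240  # scan lines in one video field

def select_control_method(image_height):
    return "host" if image_height > NTSC_FIELD_HEIGHT else "auxiliary"
```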
Referring back to FIG. 3 with reference to Table 2, if the CC driver 34 determines that a capture message call is directed toward video data being a video image less than one video field in size, the capture and compression process is under auxiliary processor control. In this event, if a single video frame is selected to be captured, indicating that the VFW-based capture application 31 generated a capture frame message (e.g., "DVM_FRAME"), the CC driver 34 digitizes the video data comprising the single video frame and stores the digitized video data into one of the plurality of capture storage elements 39a-39d. The digitized video data within the capture storage elements 39a-39d is compressed by the auxiliary processor 37 in accordance with a conventional compression algorithm at a frame rate of approximately thirty (30) video frames per second ("fps"), although any frame rate supported by the auxiliary processor 37 may be chosen. On the other hand, if the VFW-based capture application 31 initiates a capture stream message
(e.g., "DVM_STREAM_INIT") indicating that streaming mode is desired, the CC driver 34 controls the video hardware so that every video frame is captured and stored in the VRAM 38 upon receipt, so that compression is accomplished at a frame rate set by the user.
If the CC driver 34 determines that the video data to be captured involves a video image larger than a single video field, the video data is captured in the plurality of capture storage elements 39a-39d and then is transmitted directly to main memory for use by the host processor 12. More specifically, upon detection of the frame capture message from the VFW-based capture application 31, the CC driver 34 signals the digitizer 36 to capture a single video frame of video data. This single video frame is digitized as fast as possible according to the limitations of the digitizer 36 and is stored in one of the plurality of capture storage elements 39a-39d in VRAM 38 to be transmitted to main memory upon indication by the host processor 12. The latency involved is greater than in the auxiliary processor controlled operations because the time delay in transmitting the video frame to the host processor greatly increases when no compression is used. In regard to the streaming method, video frames are copied from VRAM 38 at the frame rate selected by the user, which is, of course, limited by the capability of the digitizer 36.
Referring to FIG. 5, in its preferred embodiment, the CC driver 34 controls capture and compression of video data under auxiliary processor control by employing a handshaking mechanism between the host and auxiliary processors 12 and 37 utilizing a plurality of shared memory locations. In its preferred embodiment, at least three shared handshaking memory locations are used; namely, a first and a second buffer 50 and 51 and a compression register 52, which are stored in DRAM 41 located within the auxiliary processor 37. The compression register 52 is used to signal the auxiliary processor 37 to begin compression of a particular video frame being stored in one of the plurality of capture storage elements 39a-39d according to a predetermined compression algorithm. The first and second buffers 50 and 51 are used as semaphores indicating whether their corresponding compression storage elements 40a and 40b are full or empty. Once compression is complete, the auxiliary processor 37 sets the appropriate buffer 50 or 51, signaling to the host processor 12 that compressed data is ready to be copied to the main memory from VRAM 38.
The CC driver 34 controls the host processor 12 and the auxiliary processor 37 with a handshaking scheme as illustrated by the pseudo code in Table 3 located in the Appendix. The first step in the handshaking scheme is for the auxiliary processor 37 to read a particular value stored in the compression register 52 and to store the particular value within a temporary storage element 53 (e.g., register, buffer, etc.) located within DRAM 41 as shown, or any other memory source accessible by the auxiliary processor 37. Thereafter, the auxiliary processor 37 periodically compares the compression register 52 with this temporary storage element 53 to determine if any changes have been made to the compression register by the host processor 12.
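The auxiliary processor's side of this scheme amounts to change detection against a shadowed copy of the register. The sketch below models DRAM 41 as a plain dictionary; the class and method names are illustrative, not from the patent's Table 3 pseudo code.

```python
# Model of the auxiliary processor's polling loop: the compression
# register 52 is shadowed into temporary storage element 53, and each
# poll compares the two to detect a change made by the host processor.

class AuxiliaryPoller:
    def __init__(self, dram):
        self.dram = dram                      # models DRAM 41
        # First handshaking step: copy the register into temp storage.
        self.temp = dram["compression_register"]

    def poll(self):
        """Return True once per change made by the host processor."""
        current = self.dram["compression_register"]
        if current != self.temp:
            self.temp = current               # re-shadow for the next poll
            return True
        return False
```

Re-shadowing on every detected change is what makes the scheme level-insensitive: only transitions of the register, not its absolute value, trigger compression.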
Operating concurrently with the auxiliary processor 37, the host processor 12, after being interrupted by the digitizer 36, alters the compression register 52 to signal that the captured video data needs to be compressed. After altering the compression register 52, the host processor 12 periodically and alternately checks the first and second buffers 50 and 51 to determine which buffer, if any, is active (performs a "callback routine"). An active first buffer 50 indicates that compressed video data unread by the host processor 12 is being stored within the first compression storage element 40a, and an active second buffer 51 indicates that compressed video data is within the second compression storage element 40b.
Once the compression register 52 is changed by the host processor 12, the CC driver 34 generates a first control signal to the digitizer 36 to continue sequentially storing video data in any of the remaining plurality of capture storage elements in VRAM 38, for example capture storage elements 39b, 39c or 39d in the event that video data was being captured and stored in the first capture storage element 39a. In addition, the CC driver 34 generates a second control signal to the auxiliary processor 37 to begin compression of the captured video data within the first capture storage element 39a and store the result in a first of the plurality of compression storage elements 40a. Thereafter, the CC driver 34 signals the auxiliary processor 37 to attach a frame header to the compressed video data in order to provide information regarding its video frame size, and the CC driver 34 also sets the first buffer 50 to indicate that the first compression storage element 40a is full. After the first buffer 50 is set, the host processor 12 will then read the compressed video data from the first compression storage element 40a and clear the first buffer 50 after completing the read operation. Besides a one-to-one relationship between the first capture and first compression storage elements, it is contemplated that identical handshaking operations as above could occur for any ratio of capture and compression storage elements.
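The full sequence (host request, auxiliary compression into element 40a or 40b, semaphore set, host read and clear) can be simulated as follows; all names and data shapes are stand-ins for the shared structures in DRAM 41 and VRAM 38, and the frame header is modeled as a simple tuple prefix.

```python
# Simulation of the complete handshake. The two buffer semaphores
# (50 and 51) guard the two compression storage elements (40a and
# 40b); all structures are plain Python stand-ins.

class Handshake:
    def __init__(self):
        self.compression_register = 0
        self.temp = 0                       # shadow copy (element 53)
        self.semaphores = [False, False]    # buffers 50 and 51
        self.elements = [None, None]        # elements 40a and 40b

    def host_request(self):
        # Host alters the compression register after a capture interrupt.
        self.compression_register += 1

    def aux_service(self, frame):
        # Compress only when the register differs from the shadow copy.
        if self.compression_register == self.temp:
            return False
        self.temp = self.compression_register
        for i in (0, 1):
            if not self.semaphores[i]:      # find an empty element
                self.elements[i] = ("header", frame)
                self.semaphores[i] = True   # signal: element is full
                return True
        return False                        # both elements still unread

    def host_read(self):
        # Callback routine: alternately check the two semaphores.
        for i in (0, 1):
            if self.semaphores[i]:
                data = self.elements[i]
                self.semaphores[i] = False  # clear after the read
                return data
        return None
```

Because the semaphores are cleared only after the host finishes reading, the auxiliary processor can never overwrite compressed data the host has not yet copied to main memory.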
Referring to FIG. 6, a detailed flowchart of the interrupt service routine of the capture and compression driver is illustrated. The interrupt service routine within the CC driver 34 enables the host and auxiliary processors 12 and 37 to operate in a handshaking relationship to capture and compress video data in real time for a streaming mode. In Step 200, the interrupt service routine saves the contents of the internal host processor registers ("register values") in order to temporarily pause the host processor for servicing the interrupt. In Step 201, a programmable interrupt controller is cleared for reasons known by persons skilled in the art.

In Step 202, the interrupt service routine determines whether the interrupt was a result of a capture interrupt by the digitizer in storing captured video data into VRAM. If the interrupt is not a capture interrupt, then the register values are "popped" (i.e., retrieved from the top of stack memory) and copied back into the internal host processor registers, and the interrupt service routine continues monitoring for another interrupt (Step 217). If the interrupt is a capture interrupt, then the interrupt is cleared and the hardware interrupts are reasserted so as to allow the next interrupt to be received (Step 203).
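Steps 200 through 203 and the early-exit path can be outlined as follows; hardware operations are recorded as strings since the actual register saves and PIC accesses are machine-specific, and the later steps of FIG. 6 are elided.

```python
# Skeleton of the interrupt service routine entry (Steps 200-203 and
# the Step 217 exit path). Hardware actions are recorded as strings;
# the remaining steps (204-216) are elided.

def interrupt_service(is_capture_interrupt, actions):
    actions.append("save register values")            # Step 200
    actions.append("clear PIC")                       # Step 201
    if not is_capture_interrupt:                      # Step 202
        actions.append("restore register values")     # Step 217
        return False
    actions.append("clear and reassert interrupts")   # Step 203
    # ... Steps 204-216: field selection, stream-time update,
    # compression handshaking, host read of compressed data ...
    actions.append("restore register values")         # Step 217
    return True
```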
In Step 204, the interrupt service routine then determines whether even or odd fields of the video data are to be captured, because the video data comprises sixty interlaced video fields and thirty video fields still provide good picture quality. As a result, a divide-by-