`
9.3 ActiveX: Handling Events
`
The ActiveX control provides an abstracted list of events to handle. These events do not map directly to the events generated by the filter or the filter graph manager, but they are important events at such a high-level interface. For example, you receive events when the state of the movie changes or when the position of the movie changes.
`
To handle such events, you can use the Microsoft Visual C++ class wizard to add a handler for each event. Notice that you have to select the DirectShow ActiveX control ID that you specified in the resource editor in order to display its events. Figure 9-7 shows how to add a handler for the StateChange event.
`
[Figure 9-7 shows the class wizard dialog for the ActiveX control, listing control IDs such as IDC_Pause, IDC_Play, and IDC_Stop and events such as PositionChange, DisplayModeChange, Timer, ReadyStateChange, and StateChange.]
FIGURE 9-7 Handling DirectShow ActiveX control events using the Microsoft Visual C++ class wizard.
`
`
`
`
WHAT HAVE YOU LEARNED?
`
By the end of the chapter, you should have learned how to build DirectShow filter graphs from within your application in one of three ways: with the ActiveX controls, the automatic COM interface, or the manual COM interface.
`
You should also know how to access your filter's custom interface and its property pages. You should be able to handle filter events and control the running state of the DirectShow filter.
`
Lastly, you should be able to enumerate all the registered DirectShow filters, loaded filters, and pins in a filter.
`
`
`
`
`
`
CHAPTER 10

Mixing Sprites, Backgrounds, and Videos
`
WHY READ THIS CHAPTER?
`
It was certainly nice to play a video clip with DirectShow, but wouldn't it be even better if you could use video as part of your game or application? Surely, video is not the only thing moving on the screen; you probably have some moving sprites, backgrounds, and animation bouncing around the screen at the same time.

In this chapter we'll show you how to

■ mix multiple objects together and how to place them relative to each other, in the front, middle, or the back of the viewing area, and
■ use RDX to mix video, backgrounds, and sprites.
`
10.1 Introduction to Mixing
`
You've seen how mixing works throughout Part II when we showed you how to superimpose a static sprite on top of a static background using GDI, DirectDraw, and RDX. That was nice, but it can get boring fast.
`
Typically, multimedia applications have multiple sprites, backgrounds, animation, and video clips moving around on the screen all at the same time, sometimes with music playing in the background. For example, you could play a video clip with animation moving in the front and a moving background.
`
`
10.1.1 Mixing Sprites with Video

First let's review how we draw a static sprite on top of a static background. As you recall, some of the pixels in the sprite are transparent and should not be drawn on top of the background. For optimal performance, we typically mix the two objects in system memory and then write the mixed result to the screen.
`
As you can see in Figure 10-1, you can first copy the background to the mixing buffer, then overlay the sprite on top of it. Actually, you can overlay many sprites on top of the background at this stage. Finally, you can write the mixed result to video memory to make it visible on the screen. Notice that you only have to update the display whenever the sprite or the background moves around the screen.
`
With motion video, you display so many frames per second (fps) to give the illusion of motion. Now, if you treat every frame in the video as if it were a static background, you can apply the same technique we just discussed, mixing sprites with a background, for mixing sprites over video. In this case, however, you need to update the display screen whenever the sprite or the video moves on the screen and whenever a new video frame is displayed.
`
`
`
`
`10.1.2 Mixing Animation with Video
`
Suppose you want to mix an animation sequence on top of video. An animation clip is a sequence of sprites with transparent pixels, which gives the effect of a moving picture. Similar to motion video, animation clips are displayed at a specific rate measured in frames per second. To mix an animation on top of a video, you can use the same concept as when you mix a sprite over video.
[Figure 10-1 shows the background bitmap and the sprite mixed into a buffer ("sprite over background"), with transparent pixels skipped, before the result is copied on-screen.]
`
FIGURE 10-1 Mixing a static sprite with a static background.
`
`
At any given moment, you only need to deal with one sprite from the animation and one frame from the video. This is exactly the case when we displayed a sprite over a static background. In this case, however, you need to update the display whenever a new frame is displayed from either the animation or the video and whenever the animation or the video hops around the screen.
`
Of course, you can apply this same technique for mixing a video on top of another video. The same technique is used on TV shows and in the movies when there is a scene inside a car and the back window shows some video clip giving the illusion that the car is moving. To do that, you typically film the car in front of a blue background (blue is your transparency color). Then you mix this video clip, with the blue background, with another video clip exactly the same way you mixed animation over video.
`
10.2 Mixing with RDX
`
In Part II you've learned how to use RDX for mixing static sprites on top of static backgrounds. Here we'll show you how to use RDX to mix a static sprite over video. You can use the same technique to superimpose video over video or animation over video.
`
`
Before we go into that, let's first review some of the techniques RDX uses to perform object mixing. RDX uses a draw order to decide which object should be rendered first on the screen. For example, if you want to give the illusion that a background is "behind" a sprite, you would assign the background a higher draw order number than the sprite. RDX in turn paints the background first, then overlays the sprite on top of it (Figure 10-2).
`
[Figure 10-2 shows the background video at the high end of the draw order and objects with lower draw orders in front of it.]

FIGURE 10-2 RDX draw order. Higher order objects are displayed behind lower order objects.
`
`
10.2.1 Playing Video with the RDX DirectShow Interface
`
First let's show you how to play a video clip using RDX. RDX supports multiple architectures for video playback, such as DirectShow and VFW. Since we've been discussing DirectShow, we will show you how to play a video clip using that interface. In this example, we'll use an MPEG file as the video clip.
`
To display an MPEG file, we first ask RDX to create a filter graph object and associate the MPEG file with it. RDX in turn creates a DirectShow filter graph for the input file and returns a handle to an RDX video object that represents that file.
`
fgCreate(&m_hAM);
fgAssociateFile(m_hAM, "blastoff.mpg");
fgGetVideoObject(m_hAM, 0, &m_hVid);

Create a DirectShow filter graph "fg" object and set the MPEG file as the source. If successful, get a pointer to the RDX video object for later use.
`
Now you can instruct DirectShow to place the final output to the RDX surface, hSurf (refer to Chapter 6 to learn how to create an RDX surface). You can then set the draw order for the video such that it would be drawn behind the sprite. As an example, we use 100 for the sprite and 150 for the video clip. Finally, we declare that this object is visible.
`
objSetDestination(m_hAM, hSurf);
objSetDrawOrder(m_hAM, 150);
objSetVisibility(m_hAM, TRUE);
`
When DirectShow renders the MPEG file, it writes the final image not to the screen but to the RDX surface, which is typically in system memory or offscreen video memory. To display each frame to the screen, you must call the srfDraw() function, which copies the contents of the RDX surface to the appropriate location on the screen.
`
Alternatively, you can request that RDX automatically call the srfDraw() function to render each frame to the screen. To do that, you can use RDX's timers and events to schedule a draw event every frame. A timer is an object that counts user-defined time periods. An event is an object that defines an activity that you want to perform on an episodic or periodic basis.
`
`
To create a timer, you must call the timerCreate() function with a handle to the video object and the timer sampling rate. You can then activate the timer with the timerStart() function. Even though the timer is generating so many ticks per second, the timer tick does not generate any callback or event. So what use is this timer anyway?
`
`
fgGetVidInfo(m_hAM, m_hVid, FG_INFO_SAMPLERATE, &dwFPS);
timerCreate((WORD)dwFPS, (HTIMER *)phTimer);
timerStart(*phTimer);
m_hTimer = *phTimer;
`
CAUTION: Make sure to stop phTimer before you destroy the RDX objects. Use timerStop() or timerDestroy().
`
To make it worthwhile, you must associate the timer with a draw event; a draw event informs RDX to call the srfDraw() function. When the timer ticks, it sends RDX a draw event advising it that it is time to render the frame. To create a draw event, you must call the eventCreate() function with EVENT_DRAW as a parameter. You must then associate the event with the timer that will raise that event.
`
`
if (bAutoDraw)
{
    eventCreate(m_hObj, EVENT_DRAW, 0, 0, &m_hDrawEvent);
    eventSchedule(m_hTimer, m_hDrawEvent, 1, RELATIVE_TIME, 1, 0xffff);
}

The third parameter, wPeriod, in eventSchedule allows you to specify the number of timer ticks per event period. For example, if the timer generates 30 ticks/sec and the event wPeriod is 3, then an event is generated every 3 timer ticks.
`
`
`
Even though DirectShow decodes the video clip according to its frame rate, the number of frames rendered to the screen depends on the timer's sampling rate and the event's period.
`
`
Now that you have everything set up, you can call the fgPlay() function to put the DirectShow filter graph in run state.
`
`
`
objPrepare(m_hAM);
fgPlay(m_hAM, PLAYMODE_REPEAT, 0, 0);
`
`
At this stage, DirectShow decodes every frame into an RDX surface, the timer generates a draw event on every decoded frame, and RDX calls the srfDraw() function to copy the image from the RDX surface to the screen.
`
Why don't you fire up the sample application on the CD and select the option for this chapter from the menu? You should see a video clip playing on the screen.
`
10.2.2 Mixing a Sprite on Top of Video
`
Now that you have the video playing, let's see how we can overlay a sprite on top of it. As in Chapter 6, you must first load the sprite bitmap into memory and create an RDX sprite object. Once the sprite object is created, you can associate it with the RDX surface, hSurf.
`
BITMAP bm;
bitmap.GetBitmap(&bm);
m_dwWidth = bm.bmWidth;
m_dwHeight = bm.bmHeight;
m_byTransp = byKeyColor;

Create and set up an hBmp (Source Data Object).

BMPHEADER bmpHeader;
hbmpCreate(m_dwWidth, m_dwHeight, RGB_CLUT8, &m_hBmp);
BYTE *pData;
hbmpGetLockedBuffer(m_hBmp, &pData, &bmpHeader);
bitmap.GetBitmapBits(m_dwWidth*m_dwHeight, pData);
hbmpReleaseBuffer(m_hBmp);
hbmpSetTransparencyColor(m_hBmp, (DWORD)byKeyColor);

Create sprite; associate data to it; associate sprite to surface.

sprCreate(&m_hSpr);
sprSetData(m_hSpr, m_hBmp);
objSetDestination(m_hSpr, hSurf);
`
Finally, you need to set the draw order and visibility of the sprite. Notice that we set the draw order to be lower than that of the video clip so that the sprite is drawn in front of the video clip.
`
objSetDrawOrder(m_hSpr, 100);
objSetVisibility(m_hSpr, TRUE);
`
`
10.2.3 Mixing Video on Video
`
As we've mentioned earlier, you should be able to mix an animation or a video on top of another video. Let's see how you can overlay a video clip with a transparency color on top of another video.
`
You can actually use the same code from section 10.2.1 to start the video clip in the foreground, with a couple of modifications. First, you must inform RDX about the transparent color of the video. To do that, you must call the fgVidSetTransparencyColor() function.
`
`
`
fgVidSetTransparencyColor(m_hAM2, (DWORD)byKeyColor);
`
As with the sprite, you should set the draw order of the video clip to be in front of the background video clip. Notice that in our example we positioned this video clip between the background video clip (150) and the sprite (100).
`
`
objSetDrawOrder(m_hAM2, 120);
`
At this stage, you'll have two video clips playing, one on top of another, with a sprite in front of both of them. Notice that since the two video clips could have different frame rates, you need to use the higher frame rate when you create the timer for the draw event. This way, you're drawing at the rate of the faster video clip.
`
WHAT HAVE YOU LEARNED?

By now you should be familiar with mixing different objects on top of each other. In this chapter you learned

■ about mixing sprites, video, and animation together,
■ about draw order and how to position objects relative to each other,
■ about RDX timers and events and how to create them,
■ how to use RDX to mix a sprite on top of video, and
■ how to mix video or animation on top of another video.
`
`
`
`
`
`
`
CHAPTER 11

Streaming Down the Superhighway with RealMedia
`
WHY READ THIS CHAPTER?
`
`
The Internet! You must have heard of it by now. Yes, and while cruising the Net you must have been struck by all of these RealAudio icons: "To Listen, Click Here." RealAudio has become THE audio streaming solution on the Internet.

With its success, RealNetworks released a similar technology for video on the Internet: RealVideo. In 1997 the company is building on its success with streaming on the Internet and is releasing a new streaming architecture, which allows for installable media types to be streamed on the Internet. This technology is called RealMedia.
`
In this chapter, you will

■ get an overview of RealMedia and learn about its plug-in model,
■ be introduced to the concept of a RealMedia plug-in and how to build File-Format and Rendering plug-ins,
■ learn about Audio Services and how to use them within a plug-in, and
■ learn about metafiles and how to use them.
`
In the past few years, the number of people connected to the Internet has grown astronomically. Similar to television, radio, and newspaper, the Internet has become the information medium of choice for millions of people. The Internet, however, offers an additional quality that does not exist in any of the earlier mediums: interactivity. Televisions and radios allow you to select between a preset number of local or cable channels; the Internet, on
`
the other hand, opens the gate to millions of information servers, games, and music archives throughout the world.

RealNetworks (RN) seized the opportunity and established itself as THE audio streaming technology on the Internet. Its RealAudio technology is specifically designed for real-time audio streaming on the Internet. With real-time audio streaming, you don't have to download an audio file first and then play it back; rather, you play the data as you retrieve it from the Internet server. Building on their success with RealAudio, RN introduced a similar streaming technology for video called RealVideo, and then RealMedia.
`
RealMedia is a real-time streaming technology specifically designed for the Internet. RealMedia includes both the RealVideo and RealAudio technologies as part of its core. With the plug-in mechanism that it provides, you can stream and synchronize the playback of any data type, in real time, over the Internet. For example, you can stream a new file format like MPEG, text, animation, MIDI, financial data, weather information, industrial information, or VRML.1
`
In our effort to present only technologies of the future, we wrote this chapter while the RealMedia SDK was still in its late beta cycle. Therefore some of the APIs might have changed slightly by the time this book is published. Nonetheless, the material in this chapter should be relevant and reflect the RealMedia architecture accurately. Use this chapter for the concept, but use the RealMedia SDK for the actual API definitions.
`
11.1 Overview of RealMedia
`
RealMedia is an open, cross-platform technology for streaming multimedia presentations over the Internet, or networks in general. (See Figure 11-1.) It uses the Real Time Streaming Protocol (RTSP) for communicating over the Internet2 and the Real Time Session Language (RTSL) to define presentations. What does all of this mean?
`
1. VRML: Virtual Reality Modeling Language.
2. RTSP supports multicasting, unicasting, and RTP protocols.
`
`
`
`
[Figure 11-1 shows a RealMedia server delivering a metafile (RTSL protocol) to a RealMedia client:]

<RTSL>
  <SEQUENCE>
    <TRACK SRC="rtsp://server.com:554/RAudio.ra">
    <TRACK SRC="rtsp://server.com:554/RMedia.rm">
  </SEQUENCE>
</RTSL>

FIGURE 11-1 The roles of the RTSP protocol and RTSL session language.
`
RealMedia uses the RTSP protocol to transport data across networks, both the Internet and intranets. RTSP defines an application interface for controlling the delivery of real-time data. It allows for delivery of multiple streams simultaneously, such as video and audio and time-stamped data packets. For a reliable delivery, the RealMedia client uses the RTSP protocol to acknowledge the server when a packet is received; otherwise, the client resends another request for the packet or decides to throw it away. This decision depends on the quality setting of the application.
`
RTSL is a presentation language that is similar in form to the HyperText Markup Language (HTML). HTML is used to create Web documents on the Internet. RTSL allows you to define a presentation sequence that consists of multiple audio, video, and other data streams. With RTSL, the RealMedia server and client can negotiate the type of content delivered based on the information in the RTSL file (a.k.a. a metafile) and the settings of the player. For example, in the metafile, you can specify different media files (audio, video) depending on the bandwidth of the Internet pipe (28.8K, ISDN, and so forth) and on the language (English, French, and so forth). For a 28.8K pipe you can deliver a file with low quality and a low rate of data; for ISDN, you can deliver a better quality file with a higher rate of data. (See the RTSL definition in the RealMedia SDK for more details.)
`
We won't go into all the details of RTSP and RTSL in this book. Since RealMedia handles all the communication between the client and the server internally, you never have to deal directly with the RTSP protocol. However, as a content developer (someone who designs metafiles), you will need to learn more about the RTSL protocol and how to use it. Refer to the RealMedia SDK for more details.
`
11.2 The RealMedia Plug-in Architecture

RealMedia is a simple plug-in architecture for adding custom data types. Figure 11-2 shows three RealMedia plug-in interfaces: File-System, File-Format, and Rendering plug-ins.
`
`
`
`
[Figure 11-2 shows the RealMedia server loading the File-System and File-Format plug-ins and the RealMedia client loading the Rendering plug-in, with supported data types including Text, AVI (audio/video), MOV (QuickTime), and WAV, SND, AIFF, and AU audio.]

FIGURE 11-2 RealMedia plug-in architecture.
`
The File-System plug-in is only responsible for reading "raw" data from a source. The source could be a prerecorded audio/video file, a satellite feed, or a database server. This plug-in is typically loaded by the RealMedia server. The File-System plug-in does not know, or care, how the data will be parsed; it only knows how to read, write, and seek through a file. Since the RealMedia binaries come with a slew of File-System plug-ins, you typically don't have to implement a File-System plug-in to stream custom data types.
`
The File-Format plug-in is responsible for parsing the data, splitting it into multiple streams, and breaking it into smaller packets for delivery over the Internet. This plug-in is typically loaded by the RealMedia server. The File-Format plug-in does not know how to read the data from the source, and it does not know how to send the data over the Internet. Currently, RealMedia supports AVI, WAV, AU, SND, AIFF, RealAudio, RealVideo, RealMedia, and RealText file formats.
`
The Rendering plug-in understands the contents of the data and knows how to render it to its final destination: screen, audio device, and so forth. This plug-in is typically loaded by the RealMedia player or client.
`
In Figure 11-2, we show that the File-System and File-Format plug-ins are loaded by the RealMedia server. If you're playing a RealMedia file on a local machine, the RealMedia player loads the File-System, File-Format, and the Rendering plug-ins on the same PC.
`
Notice that none of the plug-ins we've discussed so far deal with data delivery over the Internet. They only worry about reading the data, breaking it into smaller packets, and rendering the final result. The RealMedia server and client handle all the necessary communication over the Internet. RealMedia allows for streaming any data type and synchronizing the playback of multiple data types.
`
So what do you really need to do to stream your own custom data type? Typically, you only need to implement a File-Format plug-in and a Rendering
`
plug-in, since they both have to understand the new data type. The File-System plug-in, on the other hand, is only required if you have to read data from a source not supported by the RealMedia binaries, for example, from a database server.
`
In this chapter, you'll learn how to build a File-Format and a Rendering plug-in. You'll also learn about RealMedia metafiles and how to use them to configure the Web server.
`
Let's go over some of the basic RealMedia concepts and interfaces. First we'll describe the data flow model between the server and client. Then we'll glance over some of the basic RealMedia interfaces that are used in the sample code in this chapter.
`
11.3 Data Flows: Server to Client
`
For the purposes of this discussion, we're assuming that you know how to use a Web browser such as Internet Explorer or Netscape Navigator. When you select a hot link in the browser, it takes you to a new Web page or downloads a file to your local drive. If the hot link points to an audio or a video file, the browser first downloads the file to your local machine and then launches the media player to play it. Web browsers allow you to associate any file extension with an application that will be launched when such a file is downloaded. For example, *.doc is associated with launching WinWord.
`
To perform real-time streaming, RealMedia adds another step to this process. Instead of pointing the hot link to the RealMedia file on the server, you point it to a metafile. Metafiles hold configuration information that allows the client (RealPlayer) to communicate directly with the server. They also hold the list of media files to play when the metafile hot link is selected. We'll discuss metafiles in more detail later in this chapter.
`
So what really happens when you select a metafile hot link? Since the metafile file extension *.rts is associated with the RealMedia player, the player is launched when the metafile hot link is activated. The player parses the metafile to find the streams that it should request from the RealMedia server. Notice that once the metafile is downloaded, the player makes the connection directly to the RealMedia server and bypasses both the Web browser and the Web server.
`
On the server side, the server loads the appropriate plug-ins and starts delivering data packets to the player (Figure 11-2). The RealMedia server
`
`
loads a File-System plug-in to read the raw data from a file. It then loads the appropriate File-Format plug-in based on the file extension of the media file. The File-Format plug-in parses the media file and determines the MIME type of each stream in the file.3 The server sends the MIME type of each stream to the client, over the Internet, and the RealMedia client, in turn, loads the appropriate Rendering plug-in for that MIME type.
`
Once the plug-ins are loaded, the RealMedia server requests a data packet from the File-Format plug-in. The File-System plug-in reads the raw data from the file, and the File-Format plug-in parses it and breaks it into smaller packets. The RealMedia server sends the packet over the Internet to the client where it is rendered by the Rendering plug-in.
`
In a nutshell, File-Format plug-ins make the packets, Rendering plug-ins receive the packets and play them, and the RealMedia engine handles all the underlying communication and timing of shuttling the packets from the server to the player.
`
11.4 Data Management Objects

RealMedia defines a set of data objects to transport the data from the server to the client. These objects include dynamic memory allocation, indexed lists, and data packet objects.
`
Although all the RealMedia objects are COM interfaces, they are not specific to the Windows environment. Even though COM was defined for Windows, the COM architecture does not require Windows.
11.4.1 IRMABuffer: Dynamic Memory Allocation Object

The IRMA4 Buffer object allows you to allocate a memory buffer at runtime. Typically, the buffer is used to transport data over the Internet. To allocate a memory buffer in RealMedia, you need to create an instance of the IRMABuffer object, set the size of the buffer, and then request a pointer to it. And you thought malloc() was hard to use!
`
3. A MIME type specifies the type of data in the message. MIME, or Multipurpose Internet Mail Extension, allows for transporting mail messages with binary data and many parts such as attachments and such.
4. IRMA: Interface RealMedia Architecture.
`
`
To create an instance of the IRMABuffer object, you need to call the IRMACommonClassFactory::CreateInstance() function using CLSID_IRMABuffer as a parameter. You'll soon learn how to request a pointer to an IRMACommonClassFactory object. To set the size of the buffer, you must call the IRMABuffer::SetSize() member function; the actual memory allocation happens here. If successful, you can then call the IRMABuffer::GetBuffer() function to obtain a pointer to the data buffer. When you're done with the buffer, you should release the object to avoid any memory leaks.
`
`
m_pClassFactory->CreateInstance(CLSID_IRMABuffer, (void**)&pTitle);
pTitle->SetSize(INFO_SIZE+1);
pTitleData = (char*)pTitle->GetBuffer();
strncpy(pTitleData, pBufferData, INFO_SIZE);
pTitle->Release();

IRMABuffer functions:
Get()
Set()
SetSize()
GetSize()
GetBuffer()
`
11.4.2 IRMAValues: Indexed List Object

The IRMAValues object allows you to build an indexed list at runtime and send it off to other plug-ins over the Internet. The index is an ASCII string that specifies some special property. The value is either an IRMABuffer object or an unsigned long. For instance, you could use the IRMAValues object to build the following indexed list.
`
Index:  "Title"    "Author"      "Copyright"    "Count"
Value:  "Carrots"  "Bugs Bunny"  "BigEars Inc"  3
`
As with the IRMABuffer object, you must first create an instance of the IRMAValues object with the IRMACommonClassFactory::CreateInstance() function. You can then call the member function SetPropertyULONG32() or SetPropertyBuffer() to add an unsigned long or an IRMABuffer object to the list, respectively. Notice that the string index, for example, "Title," is specified in the first parameter.
`
`
m_pClassFactory->CreateInstance(CLSID_IRMAValues, (void**)&pHeader);
pHeader->SetPropertyBuffer("Title", pTitle);
pHeader->SetPropertyULONG32("StreamCount", 1);

pHeader->GetPropertyBuffer("Title", pTitle);
pTitleData = (char*)pTitle->GetBuffer();
pTitle->Release();
pHeader->Release();

IRMAValues functions:
SetPropertyULONG32()
GetPropertyULONG32()
GetFirstPropertyULONG32()
GetNextPropertyULONG32()
SetPropertyBuffer()
GetPropertyBuffer()
GetFirstPropertyBuffer()
GetNextPropertyBuffer()
`
To retrieve an item from the list, you can call the GetPropertyBuffer() or the GetPropertyULONG32() function for an IRMABuffer object or an unsigned long, respectively. In addition, you can enumerate the entire indexed list with the GetFirstXyz() and GetNextXyz() member functions. Refer to the include files in the RealMedia SDK for prototypes of these functions.
`
The following rules describe when to use the AddRef() and Release() functions with RealMedia objects:

■ If an object is passed to your code as a parameter of a function call, you must use AddRef() to reference the object. When your code is finished with the object, you must use Release() to release the object.

void Function(IRMAObject *pObject)
{
    pObject->AddRef();
    // ...use object here...
    pObject->Release();
}

■ For objects returned by functions, use AddRef() to reference the object inside the function. You must use Release() to release the object when you're done with it. The following functions use the AddRef() function to increment the reference count of the objects before returning them:

RMACreateInstance()
IRMACommonClassFactory::CreateInstance()
IRMAFileSystem::CreateFile()
IRMAFileSystem::CreateDir()
IUnknown::QueryInterface()

■ If your code creates an object using the C++ new operator, your code must use the AddRef() function to reference the object. When your code is finished with the object, it must use Release() to release the object.
`
`
11.4.3 IRMAPacket: Packet Transport Object

The IRMAPacket object is used to transport data packets from the File-Format plug-in on the server side to the Rendering plug-in on the client side.

Again, you must first call the IRMACommonClassFactory::CreateInstance() function to create an instance of the IRMAPacket object. You can then call the Set() member function to specify the IRMABuffer object that holds the data of each packet. With the Set() function you can also set a time stamp for the packet and a priority flag indicating the importance of the packet: can it be dropped or not?
`
`
m_pClassFactory->CreateInstance(CLSID_IRMAPacket, (void**)&pPacket);
pPacket->Set(pBuffer, m_ulCurrentTime, 0, 0, PN_RELIABLE_NORMAL);
pPacket->Release();
`
11.5 RealMedia Asynchronous Interfaces
`
File-Format plug-ins are responsible for parsing the data and splitting it into multiple streams. They're also responsible for breaking the data in each stream into smaller packets before sending it over the Internet.
`
In addition to the IRMAPlugin interface, File-Format plug-ins implement both the IRMAFileFormatObject and IRMAFileResponse interfaces. The IRMAFileFormatObject interface defines the functionality of the DLL as a File-Format plug-in. The RealMedia server uses this interface to retrieve header information from the source file and the header for each stream. It also uses this interface to request data packets to send out over the Internet.
`
IRMAFileResponse is a callback interface used to notify the plug-in when an asynchronous operation is complete. As you recall, the File-Format plug-in uses the services of the File-System plug-in to read raw data from the input source. Since all the File-System plug-in's operations are asynchronous, the File-Format plug-in exposes the IRMAFileResponse interface in order to receive notification when these operations are complete.
`
`RealMedia defines nonblocking interfaces for the File-Format and the File-
`Syst