8/7/24, 1:09 PM
Get started with WebRTC | Articles | web.dev
https://web.dev/articles/webrtc-basics

Real-time communication with WebRTC: Google I/O 2013
Get started with WebRTC

Sam Dutton (https://twitter.com/sw12) (https://github.com/samdutton) (https://glitch.com/@samdutton) (https://techhub.social/@samdutton) (https://samdutton.com)
"WebRTC is a new front in the long war for an open and unencumbered web."

Brendan Eich, inventor of JavaScript

Real-time communication without plugins
Imagine a world where your phone, TV, and computer could communicate on a common platform. Imagine it was easy to add video chat and peer-to-peer data sharing to your web app. That's the vision of WebRTC.
Want to try it out? WebRTC is available on desktop and mobile in Google Chrome, Safari, Firefox, and Opera. A good place to start is the simple video chat app at appr.tc (https://appr.tc):

1. Open appr.tc (https://appr.tc) in your browser.
2. Click Join to join a chat room and let the app use your webcam.
3. Open the URL displayed at the end of the page in a new tab or, better still, on a different computer.
Quick start

Haven't got time to read this article or only want code?

To get an overview of WebRTC, watch the following Google I/O video or view these slides (https://io13webrtc.appspot.com/):
If you haven't used the getUserMedia API, see Capture audio and video in HTML5 (https://www.html5rocks.com/en/tutorials/getusermedia/intro) and simpl.info getUserMedia (https://www.simpl.info/getusermedia).

To learn about the RTCPeerConnection API, see the following example and simpl.info RTCPeerConnection (https://simpl.info/rtcpeerconnection).

To learn how WebRTC uses servers for signaling, and firewall and NAT traversal, see the code and console logs from appr.tc (https://appr.tc).

Can't wait and just want to try WebRTC right now? Try some of the more than 20 demos (https://webrtc.github.io/samples) that exercise the WebRTC JavaScript APIs.

Having trouble with your machine and WebRTC? Visit the WebRTC Troubleshooter (https://test.webrtc.org).

Alternatively, jump straight into the WebRTC codelab (https://codelabs.developers.google.com/codelabs/webrtc-web/), a step-by-step guide that explains how to build a complete video chat app, including a simple signaling server.
A very short history of WebRTC
One of the last major challenges for the web is to enable human communication through voice and video: real-time communication, or RTC for short. RTC should be as natural in a web app as entering text in a text input. Without it, you're limited in your ability to innovate and develop new ways for people to interact.

Historically, RTC has been corporate and complex, requiring expensive audio and video technologies to be licensed or developed in house. Integrating RTC technology with existing content, data, and services has been difficult and time-consuming, particularly on the web.

Gmail video chat became popular in 2008 and, in 2011, Google introduced Hangouts, which uses Talk (as did Gmail). Google bought GIPS, a company that developed many components required for RTC, such as codecs and echo-cancellation techniques. Google open sourced the technologies developed by GIPS and engaged with relevant standards bodies at the Internet Engineering Task Force (IETF) and World Wide Web Consortium (W3C) to ensure industry consensus. In May 2011, Ericsson built the first implementation of WebRTC (https://labs.ericsson.com/developer-community/blog/beyond-html5-peer-peer-conversational-video).

WebRTC implemented open standards for real-time, plugin-free video, audio, and data communication. The need was real:

- Many web services used RTC, but needed downloads, native apps, or plugins. These included Skype, Facebook, and Hangouts.
- Downloading, installing, and updating plugins is complex, error prone, and annoying.
- Plugins are difficult to deploy, debug, troubleshoot, test, and maintain - and may require licensing and integration with complex, expensive technology. It's often difficult to persuade people to install plugins in the first place!

The guiding principles of the WebRTC project are that its APIs should be open source, free, standardized, built into web browsers, and more efficient than existing technologies.
Where are we now?
WebRTC is used in various apps, such as Google Meet. WebRTC has also been integrated with WebKitGTK+ (https://labs.ericsson.com/developer-community/blog/beyond-html5-conversational-voice-and-video-implemented-webkit-gtk) and Qt native apps.

WebRTC implements these three APIs:

- MediaStream (also known as getUserMedia)
- RTCPeerConnection
- RTCDataChannel

The APIs are defined in these two specs:

- WebRTC (https://w3c.github.io/webrtc-pc/)
- getUserMedia (https://www.w3.org/TR/mediacapture-streams)

All three APIs are supported on mobile and desktop by Chrome, Safari, Firefox, Edge, and Opera.
getUserMedia: For demos and code, see WebRTC samples (https://webrtc.github.io/samples) or try Chris Wilson's amazing examples (https://webaudiodemos.appspot.com) that use getUserMedia as input for web audio.

RTCPeerConnection: For a simple demo and a fully functional video-chat app, see WebRTC samples Peer connection (https://webrtc.github.io/samples/src/content/peerconnection/pc1/) and appr.tc (https://appr.tc), respectively. This app uses adapter.js (https://github.com/webrtc/adapter), a JavaScript shim maintained by Google with help from the WebRTC community (https://github.com/webrtc/adapter/graphs/contributors), to abstract away browser differences and spec changes.

RTCDataChannel: To see this in action, visit WebRTC samples (https://webrtc.github.io/samples/) and check out one of the data-channel demos.

The WebRTC codelab (https://codelabs.developers.google.com/codelabs/webrtc-web/#0) shows how to use all three APIs to build a simple app for video chat and file sharing.
Your first WebRTC app
WebRTC apps need to do several things:

- Get streaming audio, video, or other data.
- Get network information, such as IP addresses and ports, and exchange it with other WebRTC clients (known as peers) to enable connection, even through NATs (https://en.wikipedia.org/wiki/NAT_traversal) and firewalls.
- Coordinate signaling communication to report errors and initiate or close sessions.
- Exchange information about media and client capability, such as resolution and codecs.
- Communicate streaming audio, video, or data.

To acquire and communicate streaming data, WebRTC implements the following APIs:

- MediaStream (https://dvcs.w3.org/hg/audio/raw-file/tip/streams/StreamProcessing.html) gets access to data streams, such as from the user's camera and microphone.
- RTCPeerConnection (https://dev.w3.org/2011/webrtc/editor/webrtc.html#rtcpeerconnection-interface) enables audio or video calling with facilities for encryption and bandwidth management.
- RTCDataChannel (https://dev.w3.org/2011/webrtc/editor/webrtc.html#rtcdatachannel) enables peer-to-peer communication of generic data.

(There is detailed discussion of the network and signaling aspects of WebRTC later.)
MediaStream API (also known as getUserMedia API)
The MediaStream API (https://dev.w3.org/2011/webrtc/editor/getusermedia.html) represents synchronized streams of media. For example, a stream taken from camera and microphone input has synchronized video and audio tracks. (Don't confuse MediaStreamTrack with the <track> element, which is something entirely different (https://www.html5rocks.com/en/tutorials/track/basics/).)
Probably the easiest way to understand the MediaStream API is to look at it in the wild:

1. In your browser, navigate to WebRTC samples getUserMedia (https://webrtc.github.io/samples/src/content/getusermedia/gum/).
2. Open the console.
3. Inspect the stream variable, which is in global scope.

Each MediaStream has an input, which might be a MediaStream generated by getUserMedia(), and an output, which might be passed to a video element or an RTCPeerConnection.
The getUserMedia() method takes a MediaStreamConstraints object parameter and returns a Promise that resolves to a MediaStream object.

Each MediaStream has a label, such as 'Xk7EuLhsuHKbnjLWkW4yYGNJJ8ONsgwHBvLQ'. An array of MediaStreamTracks is returned by the getAudioTracks() and getVideoTracks() methods.

For the getUserMedia (https://webrtc.github.io/samples/src/content/getusermedia/gum/) example, stream.getAudioTracks() returns an empty array (because there's no audio) and, assuming a working webcam is connected, stream.getVideoTracks() returns an array of one MediaStreamTrack representing the stream from the webcam. Each MediaStreamTrack has a kind ('video' or 'audio') and a label (something like 'FaceTime HD Camera (Built-in)'), and represents one or more channels of either audio or video. In this case, there is only one video track and no audio, but it is easy to imagine use cases where there are more, such as a chat app that gets streams from the front camera, rear camera, and microphone, and an app sharing its screen.
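To make the track inspection concrete, here is a small helper sketch. The device labels in the test data are examples only; real labels depend on the user's hardware.

```js
// Group a stream's tracks by kind ('audio' or 'video') so an app can
// check what getUserMedia actually delivered.
function summarizeTracks(tracks) {
  const summary = {audio: [], video: []};
  for (const track of tracks) {
    summary[track.kind].push(track.label);
  }
  return summary;
}

// Usage in a browser (requires a secure context, such as HTTPS):
// const stream = await navigator.mediaDevices.getUserMedia({audio: true, video: true});
// console.log(summarizeTracks(stream.getTracks()));
```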
A MediaStream can be attached to a video element by setting the srcObject attribute (https://developer.mozilla.org/docs/Web/API/HTMLMediaElement/srcObject). Previously, this was done by setting the src attribute to an object URL created with URL.createObjectURL(), but this has been deprecated (https://developer.mozilla.org/docs/Web/API/URL/createObjectURL).

Note: A MediaStreamTrack is actively using the camera, which takes resources and keeps the camera open and the camera light on. When you are no longer using a track, make sure to call track.stop() so that the camera can be closed.
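A minimal sketch of both steps, attaching a stream with srcObject and releasing it with track.stop(). The helper names are illustrative, not part of any API:

```js
// Attach a MediaStream to a <video> element.
// srcObject replaces the deprecated URL.createObjectURL() approach.
function attachStream(videoElement, stream) {
  videoElement.srcObject = stream;
}

// Release the camera and microphone when the app is done with them.
function stopStream(videoElement) {
  const stream = videoElement.srcObject;
  if (!stream) return 0;
  let stopped = 0;
  for (const track of stream.getTracks()) {
    track.stop(); // releases the device and turns the camera light off
    stopped++;
  }
  videoElement.srcObject = null;
  return stopped;
}
```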
getUserMedia can also be used as an input node for the Web Audio API (https://developer.chrome.com/blog/live-web-audio-input-enabled):
```js
// Cope with browser differences.
let audioContext;
if (typeof AudioContext === 'function') {
  audioContext = new AudioContext();
} else if (typeof webkitAudioContext === 'function') {
  audioContext = new webkitAudioContext(); // eslint-disable-line new-cap
} else {
  console.log('Sorry! Web Audio not supported.');
}

// Create a filter node.
// See https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#BiquadFilterNode-section
const filterNode = audioContext.createBiquadFilter();
filterNode.type = 'highpass';
// Cutoff frequency. For highpass, audio is attenuated below this frequency.
filterNode.frequency.value = 10000;

// Create a gain node to change audio volume.
const gainNode = audioContext.createGain();
// Default is 1 (no change). Less than 1 means audio is attenuated
// and vice versa.
gainNode.gain.value = 0.5;

navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
  // Create an AudioNode from the stream.
  const mediaStreamSource = audioContext.createMediaStreamSource(stream);
  mediaStreamSource.connect(filterNode);
  filterNode.connect(gainNode);
  // Connect the gain node to the destination. For example, play the sound.
  gainNode.connect(audioContext.destination);
});
```

Chromium-based apps and extensions can also incorporate getUserMedia. Adding audioCapture and/or videoCapture permissions (https://developer.chrome.com/extensions/permission_warnings) to the manifest enables permission to be requested and granted only once upon installation. Thereafter, the user is not asked for permission for camera or microphone access.

Permission only has to be granted once for getUserMedia(). The first time around, an Allow button is displayed in the browser's infobar (https://dev.chromium.org/user-experience/infobars). HTTP access for getUserMedia() was deprecated by Chrome at the end of 2015 due to it being classified as a powerful feature (https://sites.google.com/a/chromium.org/dev/Home/chromium-security/deprecating-powerful-features-on-insecure-origins).
The intention is potentially to enable a MediaStream for any streaming data source, not only a camera or microphone. This would enable streaming from stored data or arbitrary data sources, such as sensors or other inputs.

getUserMedia() really comes to life in combination with other JavaScript APIs and libraries:

- Webcam Toy (https://webcamtoy.com/app/) is a photobooth app that uses WebGL to add weird and wonderful effects to photos that can be shared or saved locally.
- FaceKat (https://www.auduno.com/2012/06/15/head-tracking-with-webrtc/) is a face-tracking game built with headtrackr.js (https://github.com/auduno/headtrackr).
- ASCII Camera (https://idevelop.ro/ascii-camera/) uses the Canvas API to generate ASCII images.
gUM ASCII art!
Constraints

Constraints (https://tools.ietf.org/html/draft-alvestrand-constraints-resolution-00#page-4) can be used to set values for video resolution for getUserMedia(). This also allows support for other constraints (https://w3c.github.io/mediacapture-main/getusermedia.html#the-model-sources-sinks-constraints-and-settings), such as aspect ratio; facing mode (front or back camera); frame rate, height, and width; and an applyConstraints() (https://w3c.github.io/mediacapture-main/getusermedia.html#dom-mediastreamtrack-applyconstraints) method.
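As an illustrative sketch (the specific resolutions and facingMode value are assumptions for the example, not requirements of the API):

```js
// Request 720p video from the front-facing camera. The browser treats
// 'ideal' values as preferences, not hard requirements.
const constraints = {
  audio: true,
  video: {
    width: {ideal: 1280},
    height: {ideal: 720},
    facingMode: 'user', // front camera on mobile devices
    frameRate: {ideal: 30},
  },
};

// Later, a track's constraints can be changed without reopening the device.
async function switchTo480p(videoTrack) {
  await videoTrack.applyConstraints({width: 640, height: 480});
}

// Usage in a browser (secure context):
// const stream = await navigator.mediaDevices.getUserMedia(constraints);
// await switchTo480p(stream.getVideoTracks()[0]);
```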
For an example, see WebRTC samples getUserMedia: select resolution (https://webrtc.github.io/samples/src/content/getusermedia/resolution).

Gotcha: getUserMedia constraints may affect the available configurations of a shared resource. For example, if a camera was opened in 640 x 480 mode by one tab, another tab will not be able to use constraints to open it in a higher-resolution mode because it can only be opened in one mode. Note that this is an implementation detail. It would be possible to let the second tab reopen the camera in a higher-resolution mode and use video processing to downscale the video track to 640 x 480 for the first tab, but this has not been implemented.

Setting a disallowed constraint value gives a DOMException or an OverconstrainedError if, for example, a requested resolution is not available. To see this in action, see WebRTC samples getUserMedia: select resolution (https://webrtc.github.io/samples/src/content/getusermedia/resolution/) for a demo.

Screen and tab capture
Chrome apps also make it possible to share a live video of a single browser tab or the entire desktop through the chrome.tabCapture (https://developer.chrome.com/dev/extensions/tabCapture) and chrome.desktopCapture (https://developer.chrome.com/extensions/desktopCapture) APIs. (For a demo and more information, see Screensharing with WebRTC (https://developer.chrome.com/blog/screensharing-with-webrtc). The article is a few years old, but it's still interesting.)

It's also possible to use screen capture as a MediaStream source in Chrome using the experimental chromeMediaSource constraint. Note that screen capture requires HTTPS and should only be used for development due to it being enabled through a command-line flag, as explained in this post (https://groups.google.com/forum/#!msg/discuss-webrtc/TPQVKZnsF5g/Hlpy8kqaLnEJ).
Signaling: Session control, network, and media information

WebRTC uses RTCPeerConnection to communicate streaming data between browsers (also known as peers), but also needs a mechanism to coordinate communication and to send control messages, a process known as signaling. Signaling methods and protocols are not specified by WebRTC. Signaling is not part of the RTCPeerConnection API.

Instead, WebRTC app developers can choose whatever messaging protocol they prefer, such as SIP or XMPP, and any appropriate duplex (two-way) communication channel. The appr.tc (https://appr.tc) example uses XHR and the Channel API as the signaling mechanism. The codelab (https://codelabs.developers.google.com/codelabs/webrtc-web/#0) uses Socket.io (https://socket.io) running on a Node server (https://nodejs.org/).

Signaling is used to exchange three types of information:

- Session control messages: to initialize or close communication and report errors.
- Network configuration: to the outside world, what's your computer's IP address and port?
- Media capabilities: what codecs and resolutions can be handled by your browser and the browser it wants to communicate with?

The exchange of information through signaling must have completed successfully before peer-to-peer streaming can begin.
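Because WebRTC does not define the signaling transport, the SignalingChannel used in the W3C example that follows must be supplied by the app. One common approach, relaying JSON messages over a WebSocket, might be sketched like this (the server URL is hypothetical; the server is assumed to simply forward each message to the other peer in the room):

```js
// A minimal signaling channel over WebSocket.
class SignalingChannel {
  constructor(url) {
    this.ws = new WebSocket(url);
    this.onmessage = null;
    this.ws.onmessage = (event) => {
      if (this.onmessage) this.onmessage(JSON.parse(event.data));
    };
  }
  // Handles JSON.stringify for the caller.
  send(message) {
    this.ws.send(JSON.stringify(message));
  }
}

// Usage:
// const signaling = new SignalingChannel('wss://signaling.example.org/room42');
// signaling.onmessage = ({desc, candidate}) => { /* apply to RTCPeerConnection */ };
// signaling.send({candidate});
```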
For example, imagine Alice wants to communicate with Bob. Here's a code sample from the W3C WebRTC spec (https://w3c.github.io/webrtc-pc/#simple-peer-to-peer-example), which shows the signaling process in action. The code assumes the existence of some signaling mechanism created in the createSignalingChannel() method. Also note that on Chrome and Opera, RTCPeerConnection is currently prefixed.
```js
// handles JSON.stringify/parse
const signaling = new SignalingChannel();
const constraints = {audio: true, video: true};
const configuration = {iceServers: [{urls: 'stun:stun.example.org'}]};
const pc = new RTCPeerConnection(configuration);

// Send any ice candidates to the other peer.
pc.onicecandidate = ({candidate}) => signaling.send({candidate});

// Let the "negotiationneeded" event trigger offer generation.
pc.onnegotiationneeded = async () => {
  try {
    await pc.setLocalDescription(await pc.createOffer());
    // Send the offer to the other peer.
    signaling.send({desc: pc.localDescription});
  } catch (err) {
    console.error(err);
  }
};

// Once remote track media arrives, show it in remote video element.
pc.ontrack = (event) => {
  // Don't set srcObject again if it is already set.
  if (remoteView.srcObject) return;
  remoteView.srcObject = event.streams[0];
};

// Call start() to initiate.
async function start() {
  try {
    // Get local stream, show it in self-view, and add it to be sent.
    const stream = await navigator.mediaDevices.getUserMedia(constraints);
    stream.getTracks().forEach((track) => pc.addTrack(track, stream));
    selfView.srcObject = stream;
  } catch (err) {
    console.error(err);
  }
}

signaling.onmessage = async ({desc, candidate}) => {
  try {
    if (desc) {
      // If you get an offer, you need to reply with an answer.
      if (desc.type === 'offer') {
        await pc.setRemoteDescription(desc);
        const stream = await navigator.mediaDevices.getUserMedia(constraints);
        stream.getTracks().forEach((track) => pc.addTrack(track, stream));
        await pc.setLocalDescription(await pc.createAnswer());
        signaling.send({desc: pc.localDescription});
      } else if (desc.type === 'answer') {
        await pc.setRemoteDescription(desc);
      } else {
        console.log('Unsupported SDP type.');
      }
    } else if (candidate) {
      await pc.addIceCandidate(candidate);
    }
  } catch (err) {
    console.error(err);
  }
};
```

First, Alice and Bob exchange network information. (The expression finding candidates refers to the process of finding network interfaces and ports using the ICE framework.)

- Alice creates an RTCPeerConnection object with an onicecandidate handler, which runs when network candidates become available.
- Alice sends serialized candidate data to Bob through whatever signaling channel they are using, such as WebSocket or some other mechanism.
When Bob gets a candidate message from Alice, he calls addIceCandidate to add the candidate to the remote peer description.

WebRTC clients (also known as peers, or Alice and Bob in this example) also need to ascertain and exchange local and remote audio and video media information, such as resolution and codec capabilities. Signaling to exchange media configuration information proceeds by exchanging an offer and an answer using the Session Description Protocol (SDP):

1. Alice runs the RTCPeerConnection createOffer() method. The return from this is passed an RTCSessionDescription: Alice's local session description.
2. In the callback, Alice sets the local description using setLocalDescription() and then sends this session description to Bob through their signaling channel. Note that RTCPeerConnection won't start gathering candidates until setLocalDescription() is called. This is codified in the JSEP IETF draft (https://tools.ietf.org/html/draft-ietf-rtcweb-jsep-03#section-4.2.4).
3. Bob sets the description Alice sent him as the remote description using setRemoteDescription().
4. Bob runs the RTCPeerConnection createAnswer() method, passing it the remote description he got from Alice so a local session can be generated that is compatible with hers. The createAnswer() callback is passed an RTCSessionDescription. Bob sets that as the local description and sends it to Alice.
5. When Alice gets Bob's session description, she sets that as the remote description with setRemoteDescription().

Ping!
Note: Make sure to allow the RTCPeerConnection to be garbage collected by calling close() when it's no longer needed. Otherwise, threads and connections are kept alive. It's possible to leak heavy resources in WebRTC!
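Following that note, a teardown routine might look like this sketch (the function and variable names are illustrative):

```js
// Tear down a call: stop local capture so the camera light goes off,
// then close the RTCPeerConnection so its threads and network resources
// can be released and the object garbage collected.
function hangup(pc, localStream) {
  if (localStream) {
    localStream.getTracks().forEach((track) => track.stop());
  }
  if (pc) {
    pc.close();
  }
  return null; // callers typically do: pc = hangup(pc, localStream);
}
```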
RTCSessionDescription objects are blobs that conform to the Session Description Protocol (https://en.wikipedia.org/wiki/Session_Description_Protocol), SDP. Serialized, an SDP object looks like this:

```
v=0
o=- 3883943731 1 IN IP4 127.0.0.1
s=
t=0 0
a=group:BUNDLE audio video
m=audio 1 RTP/SAVPF 103 104 0 8 106 105 13 126
// ...
a=ssrc:2223794119 label:H4fjnMzxy3dPIgQ7HxuCTLb4wLLLeRHnFxh810
```
JSEP architecture
The acquisition and exchange of network and media information can be done simultaneously, but both processes must have completed before audio and video streaming between peers can begin.

The offer/answer architecture previously described is called JavaScript Session Establishment Protocol (https://rtcweb-wg.github.io/jsep/), or JSEP. (There's an excellent animation explaining the process of signaling and streaming in Ericsson's demo video (https://www.ericsson.com/research-blog/context-aware-communication/beyond-html5-peer-peer-conversational-video/) for its first WebRTC implementation.)

Once the signaling process has completed successfully, data can be streamed directly peer to peer, between the caller and callee - or, if that fails, through an intermediary relay server (more about that later). Streaming is the job of RTCPeerConnection.

RTCPeerConnection

RTCPeerConnection is the WebRTC component that handles stable and efficient communication of streaming data between peers.
WebRTC architecture (from webrtc.org (https://webrtc.github.io/webrtc-org/architecture/))
The following is a WebRTC architecture diagram showing the role of RTCPeerConnection. As you will notice, the green parts are complex!

From a JavaScript perspective, the main thing to understand from this diagram is that RTCPeerConnection shields web developers from the myriad complexities that lurk beneath. The codecs and protocols used by WebRTC do a huge amount of work to make real-time communication possible, even over unreliable networks:

- Packet-loss concealment
- Echo cancellation
- Bandwidth adaptivity
- Dynamic jitter buffering
- Automatic gain control
- Noise reduction and suppression
- Image cleaning
The previous W3C code shows a simplified example of WebRTC from a signaling perspective. The following are walkthroughs of two working WebRTC apps. The first is a simple example to demonstrate RTCPeerConnection and the second is a fully operational video chat client.

RTCPeerConnection without servers

The following code is taken from WebRTC samples Peer connection (https://webrtc.github.io/samples/src/content/peerconnection/pc1/), which has local and remote RTCPeerConnection (and local and remote video) on one web page. This doesn't constitute anything very useful - caller and callee are on the same page - but it does make the workings of the RTCPeerConnection API a little clearer because the RTCPeerConnection objects on the page can exchange data and messages directly without having to use intermediary signaling mechanisms.

In this example, pc1 represents the local peer (caller) and pc2 represents the remote peer (callee).
Caller

Create a new RTCPeerConnection and add the stream from getUserMedia():

```js
// Servers is an optional configuration file. (See TURN and STUN discussion later.)
pc1 = new RTCPeerConnection(servers);
// ...
localStream.getTracks().forEach((track) => {
  pc1.addTrack(track, localStream);
});
```

Create an offer and set it as the local description for pc1 and as the remote description for pc2. This can be done directly in the code without using signaling because both caller and callee are on the same page:

```js
pc1.setLocalDescription(desc).then(() => {
  onSetLocalSuccess(pc1);
}, onSetSessionDescriptionError);
trace('pc2 setRemoteDescription start');
pc2.setRemoteDescription(desc).then(() => {
  onSetRemoteSuccess(pc2);
}, onSetSessionDescriptionError);
```

Callee

Create pc2 and, when the stream from pc1 is added, display it in a video element:

```js
pc2 = new RTCPeerConnection(servers);
pc2.ontrack = gotRemoteStream;
// ...
function gotRemoteStream(e) {
  vid2.srcObject = e.streams[0];
}
```
RTCPeerConnection API plus servers

In the real world, WebRTC needs servers, however simple, so the following can happen:

- Users discover each other and exchange real-world details, such as names.
- WebRTC client apps (peers) exchange network information.
- Peers exchange data about media, such as video format and resolution.
- WebRTC client apps traverse NAT gateways (https://en.wikipedia.org/wiki/NAT_traversal) and firewalls.

In other words, WebRTC needs four types of server-side functionality:

- User discovery and communication
- Signaling
- NAT/firewall traversal
- Relay servers in case peer-to-peer communication fails

NAT traversal, peer-to-peer networking, and the requirements for building a server app for user discovery and signaling are beyond the scope of this article. Suffice it to say that the STUN (https://en.wikipedia.org/wiki/STUN) protocol and its extension, TURN (https://en.wikipedia.org/wiki/Traversal_Using_Relay_NAT), are used by the ICE (https://en.wikipedia.org/wiki/Interactive_Connectivity_Establishment) framework to enable RTCPeerConnection to cope with NAT traversal and other network vagaries.

ICE is a framework for connecting peers, such as two video chat clients. Initially, ICE tries to connect peers directly with the lowest possible latency through UDP. In this process, STUN servers have a single task: to enable a peer behind a NAT to find out its public address and port. (For more information about STUN and TURN, see Build the backend services needed for a WebRTC app (https://www.html5rocks.com/tutorials/webrtc/infrastructure/).)
Finding connection candidates
If UDP fails, ICE tries TCP. If direct connection fails - in particular because of enterprise NAT traversal and firewalls - ICE uses an intermediary (relay) TURN server. In other words, ICE first uses STUN with UDP to directly connect peers and, if that fails, falls back to a TURN relay server. The expression finding candidates refers to the process of finding network interfaces and ports.
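In code, this behavior is configured through the iceServers passed to RTCPeerConnection. A sketch (the STUN/TURN URLs and credentials are placeholders, not real services):

```js
// Configure an RTCPeerConnection with both a STUN and a TURN server.
// ICE tries direct (STUN-assisted) candidates first and uses relayed
// TURN candidates only if direct connection fails.
const configuration = {
  iceServers: [
    {urls: 'stun:stun.example.org:19302'},
    {
      urls: 'turn:turn.example.org:3478',
      username: 'alice',    // placeholder credentials
      credential: 'secret',
    },
  ],
};

// Usage in a browser:
// const pc = new RTCPeerConnection(configuration);
// pc.onicecandidate = ({candidate}) => {
//   // candidate.type is 'host', 'srflx' (via STUN), or 'relay' (via TURN).
//   if (candidate) console.log(candidate.type, candidate.candidate);
// };
```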
WebRTC data pathways

WebRTC engineer Justin Uberti provides more information about ICE, STUN, and TURN in the 2013 Google I/O WebRTC presentation (https://www.youtube.com/watch?v=p2HzZkd2A40&t=21m12s). (The presentation slides (https://io13webrtc.appspot.com/#52) give examples of TURN and STUN server implementations.)

A simple video-chat client

A good place to try WebRTC, complete with signaling and NAT/firewall traversal using a STUN server, is the video-chat demo at appr.tc (https://appr.tc). This app uses adapter.js (https://github.com/webrtc/adapter), a shim to insulate apps from spec changes and prefix differences.

The code is deliberately verbose in its logging. Check the console to understand the order of events. The following is a detailed walkthrough of the code.
Note: If you find this somewhat baffling, you may prefer the WebRTC codelab (https://codelabs.developers.google.com/codelabs/webrtc-web/). This step-by-step guide explains how to build a complete video-chat app, including a simple signaling server running on a Node server (https://nodejs.org/).

Network topologies

WebRTC, as currently implemented, only supports one-to-one communication, but it could be used in more complex network scenarios, such as with multiple peers each communicating with each other directly or through a Multipoint Control Unit (https://en.wikipedia.org/wiki/Multipoint_control_unit) (MCU), a server that can handle large numbers of participants and do selective stream forwarding, and mixing or recording of audio and video.

Multipoint Control Unit topology example
Tethr/Tropo: Disaster communications in a briefcase
Many existing WebRTC apps only demonstrate communication between web browsers, but gateway servers can enable a WebRTC app running on a browser to interact with devices, such as telephones (https://en.wikipedia.org/wiki/Public_switched_telephone_network) (also known as PSTN) and with VOIP (https://en.wikipedia.org/wiki/Voice_over_IP) systems. In May 2012, Doubango Telecom open sourced the sipml5 SIP client (https://sipml5.org/) built with WebRTC and WebSocket, which (among other potential uses) enables video calls between browsers and apps running on iOS and Android. At Google I/O, Tethr and Tropo demonstrated a framework for disaster communications (https://tethr.tumblr.com/) in a briefcase, using an OpenBTS cell (https://en.wikipedia.org/wiki/OpenBTS) to enable communications between feature phones and computers through WebRTC. Telephone communication without a carrier!

RTCDataChannel API
As well as audio and video, WebRTC supports real-time communication for other types of data.

The RTCDataChannel API enables peer-to-peer exchange of arbitrary data with low latency and high throughput. For single-page demos and to learn how to build a simple file-transfer app, see WebRTC samples (https://webrtc.github.io/samples/#datachannel) and the WebRTC codelab (https://codelabs.developers.google.com/codelabs/webrtc-web/#0), respectively.

There are many potential use cases for the API, including:

- Gaming
- Remote desktop apps
- Real-time text chat
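A minimal sketch of the API for a text-chat case like the last one, assuming a signaling-connected pair of peers (the 'chat' label and handler names are arbitrary):

```js
// Caller side: create a data channel and wire up basic handlers.
function createChatChannel(pc) {
  const channel = pc.createDataChannel('chat', {ordered: true});
  channel.onopen = () => channel.send('hello');
  channel.onmessage = (event) => console.log('received:', event.data);
  return channel;
}

// Callee side: the channel arrives through the ondatachannel event.
function acceptChatChannel(pc, onMessage) {
  pc.ondatachannel = (event) => {
    event.channel.onmessage = (e) => onMessage(e.data);
  };
}
```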