Get started with WebRTC | Articles | web.dev
https://web.dev/articles/webrtc-basics

Real-time communication with WebRTC: Google I/O 2013

Genius Sports Ex. 1038, p. 1
Get started with WebRTC
Sam Dutton
(https://twitter.com/sw12)
(https://github.com/samdutton)
(https://glitch.com/@samdutton)
(https://techhub.social/@samdutton)
(https://samdutton.com)
`WebRTC is a new front in the long war for an open and unencumbered web.
`Brendan Eich, inventor of JavaScript
Real-time communication without plugins

Imagine a world where your phone, TV, and computer could communicate on a common platform. Imagine it was easy to add video chat and peer-to-peer data sharing to your web app. That's the vision of WebRTC.

Want to try it out? WebRTC is available on desktop and mobile in Google Chrome, Safari, Firefox, and Opera. A good place to start is the simple video chat app at appr.tc (https://appr.tc):

1. Open appr.tc (https://appr.tc) in your browser.
2. Click Join to join a chat room and let the app use your webcam.
3. Open the URL displayed at the end of the page in a new tab or, better still, on a different computer.
`Quick start
Haven't got time to read this article or only want code?

To get an overview of WebRTC, watch the following Google I/O video or view these slides (https://io13webrtc.appspot.com/):
`
`
If you haven't used the getUserMedia API, see Capture audio and video in HTML5 (https://www.html5rocks.com/en/tutorials/getusermedia/intro) and simpl.info getUserMedia (https://www.simpl.info/getusermedia).

To learn about the RTCPeerConnection API, see the following example and 'simpl.info RTCPeerConnection' (https://simpl.info/rtcpeerconnection).

To learn how WebRTC uses servers for signaling, and firewall and NAT traversal, see the code and console logs from appr.tc (https://appr.tc).

Can't wait and just want to try WebRTC right now? Try some of the more than 20 demos (https://webrtc.github.io/samples) that exercise the WebRTC JavaScript APIs.

Having trouble with your machine and WebRTC? Visit the WebRTC Troubleshooter (https://test.webrtc.org).

Alternatively, jump straight into the WebRTC codelab (https://codelabs.developers.google.com/codelabs/webrtc-web/), a step-by-step guide that explains how to build a complete video chat app, including a simple signaling server.
`A very short history of WebRTC
One of the last major challenges for the web is to enable human communication through voice and video: real-time communication or RTC for short. RTC should be as natural in a web app as entering text in a text input. Without it, you're limited in your ability to innovate and develop new ways for people to interact.

Historically, RTC has been corporate and complex, requiring expensive audio and video technologies to be licensed or developed in house. Integrating RTC technology with existing content, data, and services has been difficult and time-consuming, particularly on the web.

Gmail video chat became popular in 2008 and, in 2011, Google introduced Hangouts, which uses Talk (as did Gmail). Google bought GIPS, a company that developed many components required for RTC, such as codecs and echo cancellation techniques. Google open sourced the technologies developed by GIPS and engaged with relevant standards bodies at the Internet Engineering Task Force (IETF) and World Wide Web Consortium (W3C) to ensure industry consensus. In May 2011, Ericsson built the first implementation of WebRTC (https://labs.ericsson.com/developer-community/blog/beyond-html5-peer-peer-conversational-video).

WebRTC implemented open standards for real-time, plugin-free video, audio, and data communication. The need was real:

- Many web services used RTC, but needed downloads, native apps, or plugins. These included Skype, Facebook, and Hangouts.
- Downloading, installing, and updating plugins is complex, error prone, and annoying.
- Plugins are difficult to deploy, debug, troubleshoot, test, and maintain - and may require licensing and integration with complex, expensive technology. It's often difficult to persuade people to install plugins in the first place!

The guiding principles of the WebRTC project are that its APIs should be open source, free, standardized, built into web browsers, and more efficient than existing technologies.
`Where are we now?
WebRTC is used in various apps, such as Google Meet. WebRTC has also been integrated with WebKitGTK+ (https://labs.ericsson.com/developer-community/blog/beyond-html5-conversational-voice-and-video-implemented-webkit-gtk) and Qt native apps.

WebRTC implements these three APIs:

- MediaStream (also known as getUserMedia)
- RTCPeerConnection
- RTCDataChannel

The APIs are defined in these two specs:

- WebRTC (https://w3c.github.io/webrtc-pc/)
- getUserMedia (https://www.w3.org/TR/mediacapture-streams)

All three APIs are supported on mobile and desktop by Chrome, Safari, Firefox, Edge, and Opera.
getUserMedia: For demos and code, see WebRTC samples (https://webrtc.github.io/samples) or try Chris Wilson's amazing examples (https://webaudiodemos.appspot.com) that use getUserMedia as input for web audio.

RTCPeerConnection: For a simple demo and a fully functional video-chat app, see WebRTC samples Peer connection (https://webrtc.github.io/samples/src/content/peerconnection/pc1/) and appr.tc (https://appr.tc), respectively. This app uses adapter.js (https://github.com/webrtc/adapter), a JavaScript shim maintained by Google with help from the WebRTC community (https://github.com/webrtc/adapter/graphs/contributors), to abstract away browser differences and spec changes.

RTCDataChannel: To see this in action, check out one of the data-channel demos at WebRTC samples (https://webrtc.github.io/samples/).

The WebRTC codelab (https://codelabs.developers.google.com/codelabs/webrtc-web/#0) shows how to use all three APIs to build a simple app for video chat and file sharing.
Your first WebRTC app
WebRTC apps need to do several things:

- Get streaming audio, video, or other data.
- Get network information, such as IP addresses and ports, and exchange it with other WebRTC clients (known as peers) to enable connection, even through NATs (https://en.wikipedia.org/wiki/NAT_traversal) and firewalls.
- Coordinate signaling communication to report errors and initiate or close sessions.
- Exchange information about media and client capability, such as resolution and codecs.
- Communicate streaming audio, video, or data.

To acquire and communicate streaming data, WebRTC implements the following APIs:

- MediaStream (https://dvcs.w3.org/hg/audio/raw-file/tip/streams/StreamProcessing.html) gets access to data streams, such as from the user's camera and microphone.
- RTCPeerConnection (https://dev.w3.org/2011/webrtc/editor/webrtc.html#rtcpeerconnection-interface) enables audio or video calling with facilities for encryption and bandwidth management.
- RTCDataChannel (https://dev.w3.org/2011/webrtc/editor/webrtc.html#rtcdatachannel) enables peer-to-peer communication of generic data.

(There is detailed discussion of the network and signaling aspects of WebRTC later.)
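As a quick orientation, the presence of all three APIs can be feature-detected. A minimal sketch, not article code; the `env` parameter stands in for the browser's global object so the check can be exercised outside a browser (in a page you would pass `window`):

```js
// Detect the three WebRTC APIs on a given global object.
function webrtcSupport(env) {
  const hasPC = typeof env.RTCPeerConnection === 'function';
  return {
    mediaStream: !!(env.navigator &&
                    env.navigator.mediaDevices &&
                    typeof env.navigator.mediaDevices.getUserMedia === 'function'),
    peerConnection: hasPC,
    dataChannel: hasPC &&
        typeof env.RTCPeerConnection.prototype.createDataChannel === 'function',
  };
}
```

In a current browser, `webrtcSupport(window)` reports all three as true.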
`MediaStream API (also known as getUserMedia API)
The MediaStream API (https://dev.w3.org/2011/webrtc/editor/getusermedia.html) represents synchronized streams of media. For example, a stream taken from camera and microphone input has synchronized video and audio tracks. (Don't confuse MediaStreamTrack with the <track> element, which is something entirely different (https://www.html5rocks.com/en/tutorials/track/basics/).)

Probably the easiest way to understand the MediaStream API is to look at it in the wild:

1. In your browser, navigate to WebRTC samples getUserMedia (https://webrtc.github.io/samples/src/content/getusermedia/gum/).
2. Open the console.
3. Inspect the stream variable, which is in global scope.

Each MediaStream has an input, which might be a MediaStream generated by getUserMedia(), and an output, which might be passed to a video element or an RTCPeerConnection.

The getUserMedia() method takes a MediaStreamConstraints object parameter and returns a Promise that resolves to a MediaStream object.

Each MediaStream has a label, such as 'Xk7EuLhsuHKbnjLWkW4yYGNJJ8ONsgwHBvLQ'. An array of MediaStreamTracks is returned by the getAudioTracks() and getVideoTracks() methods.

For the getUserMedia (https://webrtc.github.io/samples/src/content/getusermedia/gum/) example, stream.getAudioTracks() returns an empty array (because there's no audio) and, assuming a working webcam is connected, stream.getVideoTracks() returns an array of one MediaStreamTrack representing the stream from the webcam. Each MediaStreamTrack has a kind ('video' or 'audio') and a label (something like 'FaceTime HD Camera (Built-in)'), and represents one or more channels of either audio or video. In this case, there is only one video track and no audio, but it is easy to imagine use cases where there are more, such as a chat app that gets streams from the front camera, rear camera, and microphone, and an app sharing its screen.

A MediaStream can be attached to a video element by setting the srcObject attribute (https://developer.mozilla.org/docs/Web/API/HTMLMediaElement/srcObject). Previously, this was done by setting the src attribute to an object URL created with URL.createObjectURL(), but this has been deprecated (https://developer.mozilla.org/docs/Web/API/URL/createObjectURL).

Note: The MediaStreamTrack is actively using the camera, which takes resources, and keeps the camera open and camera light on. When you are no longer using a track, make sure to call track.stop() so that the camera can be closed.
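The acquire-attach-release cycle above can be wrapped in a small helper. This is a sketch rather than article code; `startCamera`, `stopCamera`, and the constraints are illustrative names, and `mediaDevices` is a parameter only so the helper can be exercised with a stub:

```js
// Acquire a camera stream and attach it to a <video> element.
async function startCamera(videoElement, mediaDevices = navigator.mediaDevices) {
  const stream = await mediaDevices.getUserMedia({audio: false, video: true});
  videoElement.srcObject = stream; // modern replacement for createObjectURL()
  return stream;
}

// Release the camera; it stays open (light on) while any track is live.
function stopCamera(videoElement) {
  const stream = videoElement.srcObject;
  if (stream) {
    stream.getTracks().forEach((track) => track.stop());
    videoElement.srcObject = null;
  }
}
```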
getUserMedia can also be used as an input node for the Web Audio API (https://developer.chrome.com/blog/live-web-audio-input-enabled):
```js
// Cope with browser differences.
let audioContext;
if (typeof AudioContext === 'function') {
  audioContext = new AudioContext();
} else if (typeof webkitAudioContext === 'function') {
  audioContext = new webkitAudioContext(); // eslint-disable-line new-cap
} else {
  console.log('Sorry! Web Audio not supported.');
}

// Create a filter node.
const filterNode = audioContext.createBiquadFilter();
// See https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#BiquadFilterNode-section
filterNode.type = 'highpass';
// Cutoff frequency. For highpass, audio is attenuated below this frequency.
filterNode.frequency.value = 10000;

// Create a gain node to change audio volume.
const gainNode = audioContext.createGain();
// Default is 1 (no change). Less than 1 means audio is attenuated
// and vice versa.
gainNode.gain.value = 0.5;

// getUserMedia() returns a promise, so use then() rather than a callback.
navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
  // Create an AudioNode from the stream.
  const mediaStreamSource = audioContext.createMediaStreamSource(stream);
  mediaStreamSource.connect(filterNode);
  filterNode.connect(gainNode);
  // Connect the gain node to the destination. For example, play the sound.
  gainNode.connect(audioContext.destination);
});
```

Chromium-based apps and extensions can also incorporate getUserMedia. Adding audioCapture and/or videoCapture permissions (https://developer.chrome.com/extensions/permission_warnings) to the manifest enables permission to be requested and granted only once upon installation. Thereafter, the user is not asked for permission for camera or microphone access.

Permission only has to be granted once for getUserMedia(). First time around, an Allow button is displayed in the browser's infobar (https://dev.chromium.org/user-experience/infobars). HTTP access for getUserMedia() was deprecated by Chrome at the end of 2015 due to it being classified as a Powerful feature (https://sites.google.com/a/chromium.org/dev/Home/chromium-security/deprecating-powerful-features-on-insecure-origins).
The intention is potentially to enable a MediaStream for any streaming data source, not only a camera or microphone. This would enable streaming from stored data or arbitrary data sources, such as sensors or other inputs.

getUserMedia() really comes to life in combination with other JavaScript APIs and libraries:

- Webcam Toy (https://webcamtoy.com/app/) is a photobooth app that uses WebGL to add weird and wonderful effects to photos that can be shared or saved locally.
- FaceKat (https://www.auduno.com/2012/06/15/head-tracking-with-webrtc/) is a face-tracking game built with headtrackr.js (https://github.com/auduno/headtrackr).
- ASCII Camera (https://idevelop.ro/ascii-camera/) uses the Canvas API to generate ASCII images.
gUM ASCII art!
`Constraints
Constraints (https://tools.ietf.org/html/draft-alvestrand-constraints-resolution-00#page-4) can be used to set values for video resolution for getUserMedia(). This also allows support for other constraints (https://w3c.github.io/mediacapture-main/getusermedia.html#the-model-sources-sinks-constraints-and-settings), such as aspect ratio; facing mode (front or back camera); frame rate, height, and width; and an applyConstraints() (https://w3c.github.io/mediacapture-main/getusermedia.html#dom-mediastreamtrack-applyconstraints) method.
For an example, see WebRTC samples getUserMedia: select resolution (https://webrtc.github.io/samples/src/content/getusermedia/resolution).

Gotcha: getUserMedia constraints may affect the available configurations of a shared resource. For example, if a camera was opened in 640 x 480 mode by one tab, another tab will not be able to use constraints to open it in a higher-resolution mode because it can only be opened in one mode. Note that this is an implementation detail. It would be possible to let the second tab reopen the camera in a higher resolution mode and use video processing to downscale the video track to 640 x 480 for the first tab, but this has not been implemented.

Setting a disallowed constraint value gives a DOMException or an OverconstrainedError if, for example, a requested resolution is not available. To see this in action, see WebRTC samples getUserMedia: select resolution (https://webrtc.github.io/samples/src/content/getusermedia/resolution/) for a demo.

Screen and tab capture

Chrome apps also make it possible to share a live video of a single browser tab or the entire desktop through the chrome.tabCapture (https://developer.chrome.com/dev/extensions/tabCapture) and chrome.desktopCapture (https://developer.chrome.com/extensions/desktopCapture) APIs. (For a demo and more information, see Screensharing with WebRTC (https://developer.chrome.com/blog/screensharing-with-webrtc). The article is a few years old, but it's still interesting.)

It's also possible to use screen capture as a MediaStream source in Chrome using the experimental chromeMediaSource constraint. Note that screen capture requires HTTPS and should only be used for development due to it being enabled through a command-line flag as explained in this post (https://groups.google.com/forum/#!msg/discuss-webrtc/TPQVKZnsF5g/Hlpy8kqaLnEJ).
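To make the constraint model concrete, here is a sketch of a constraints object plus a later applyConstraints() call. The values and the `openWithConstraints` helper are illustrative assumptions, not article code; `mediaDevices` is a parameter only so the flow can be exercised with a stub:

```js
const constraints = {
  audio: true,
  video: {
    width: {min: 640, ideal: 1280},
    height: {min: 480, ideal: 720},
    facingMode: 'user', // front camera on mobile
  },
};

async function openWithConstraints(mediaDevices = navigator.mediaDevices) {
  try {
    const stream = await mediaDevices.getUserMedia(constraints);
    // Tighten the frame rate later without reopening the camera.
    const [videoTrack] = stream.getVideoTracks();
    await videoTrack.applyConstraints({frameRate: {max: 24}});
    return stream;
  } catch (err) {
    if (err.name === 'OverconstrainedError') {
      // err.constraint names the constraint that could not be satisfied.
      console.log(`Cannot satisfy constraint: ${err.constraint}`);
    }
    throw err;
  }
}
```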
`Signaling: Session control, network, and media information
WebRTC uses RTCPeerConnection to communicate streaming data between browsers (also known as peers), but also needs a mechanism to coordinate communication and to send control messages, a process known as signaling. Signaling methods and protocols are not specified by WebRTC. Signaling is not part of the RTCPeerConnection API.
Instead, WebRTC app developers can choose whatever messaging protocol they prefer, such as SIP or XMPP, and any appropriate duplex (two-way) communication channel. The appr.tc (https://appr.tc) example uses XHR and the Channel API as the signaling mechanism. The codelab (https://codelabs.developers.google.com/codelabs/webrtc-web/#0) uses Socket.io (https://socket.io) running on a Node server (https://nodejs.org/).
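Whichever transport is chosen, the signaling channel reduces to a small duplex message pipe: a send() method and an onmessage callback. A minimal in-memory sketch (this SignalingChannel class is hypothetical, for illustration; a real app would back send() with WebSocket, XHR, or Socket.io):

```js
class SignalingChannel {
  constructor() {
    this.peer = null;
    this.onmessage = null;
  }

  // Wire two channels together, standing in for a relay server.
  static pair(a, b) {
    a.peer = b;
    b.peer = a;
  }

  send(message) {
    // Round-trip through JSON, as a real network channel would.
    const data = JSON.parse(JSON.stringify(message));
    if (this.peer && this.peer.onmessage) {
      this.peer.onmessage(data);
    }
  }
}
```

With `SignalingChannel.pair(alice, bob)`, a message passed to `alice.send()` arrives at `bob.onmessage` - the contract assumed by the spec sample code shown later in this article.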
`Signaling is used to exchange three types of information:
- Session control messages: to initialize or close communication and report errors.
- Network configuration: to the outside world, what's your computer's IP address and port?
- Media capabilities: what codecs and resolutions can be handled by your browser and the browser it wants to communicate with?

The exchange of information through signaling must have completed successfully before peer-to-peer streaming can begin.

For example, imagine Alice wants to communicate with Bob. Here's a code sample from the W3C WebRTC spec (https://w3c.github.io/webrtc-pc/#simple-peer-to-peer-example), which shows the signaling process in action. The code assumes the existence of some signaling mechanism created in the createSignalingChannel() method. Also note that on Chrome and Opera, RTCPeerConnection is currently prefixed.
```js
// handles JSON.stringify/parse
const signaling = new SignalingChannel();
const constraints = {audio: true, video: true};
const configuration = {iceServers: [{urls: 'stun:stun.example.org'}]};
const pc = new RTCPeerConnection(configuration);

// Send any ice candidates to the other peer.
pc.onicecandidate = ({candidate}) => signaling.send({candidate});

// Let the "negotiationneeded" event trigger offer generation.
pc.onnegotiationneeded = async () => {
  try {
    await pc.setLocalDescription(await pc.createOffer());
    // Send the offer to the other peer.
    signaling.send({desc: pc.localDescription});
  } catch (err) {
    console.error(err);
  }
};

// Once remote track media arrives, show it in remote video element.
pc.ontrack = (event) => {
  // Don't set srcObject again if it is already set.
  if (remoteView.srcObject) return;
  remoteView.srcObject = event.streams[0];
};

// Call start() to initiate.
async function start() {
  try {
    // Get local stream, show it in self-view, and add it to be sent.
    const stream = await navigator.mediaDevices.getUserMedia(constraints);
    stream.getTracks().forEach((track) => pc.addTrack(track, stream));
    selfView.srcObject = stream;
  } catch (err) {
    console.error(err);
  }
}

signaling.onmessage = async ({desc, candidate}) => {
  try {
    if (desc) {
      // If you get an offer, you need to reply with an answer.
      if (desc.type === 'offer') {
        await pc.setRemoteDescription(desc);
        const stream = await navigator.mediaDevices.getUserMedia(constraints);
        stream.getTracks().forEach((track) => pc.addTrack(track, stream));
        await pc.setLocalDescription(await pc.createAnswer());
        signaling.send({desc: pc.localDescription});
      } else if (desc.type === 'answer') {
        await pc.setRemoteDescription(desc);
      } else {
        console.log('Unsupported SDP type.');
      }
    } else if (candidate) {
      await pc.addIceCandidate(candidate);
    }
  } catch (err) {
    console.error(err);
  }
};
```

First, Alice and Bob exchange network information. (The expression finding candidates refers to the process of finding network interfaces and ports using the ICE framework.)

1. Alice creates an RTCPeerConnection object with an onicecandidate handler, which runs when network candidates become available.
2. Alice sends serialized candidate data to Bob through whatever signaling channel they are using, such as WebSocket or some other mechanism.
When Bob gets a candidate message from Alice, he calls addIceCandidate to add the candidate to the remote peer description.

WebRTC clients (also known as peers, or Alice and Bob in this example) also need to ascertain and exchange local and remote audio and video media information, such as resolution and codec capabilities. Signaling to exchange media configuration information proceeds by exchanging an offer and an answer using the Session Description Protocol (SDP):

1. Alice runs the RTCPeerConnection createOffer() method. The return from this is passed an RTCSessionDescription - Alice's local session description.
2. In the callback, Alice sets the local description using setLocalDescription() and then sends this session description to Bob through their signaling channel. Note that RTCPeerConnection won't start gathering candidates until setLocalDescription() is called. This is codified in the JSEP IETF draft (https://tools.ietf.org/html/draft-ietf-rtcweb-jsep-03#section-4.2.4).
3. Bob sets the description Alice sent him as the remote description using setRemoteDescription().
4. Bob runs the RTCPeerConnection createAnswer() method, passing it the remote description he got from Alice so a local session can be generated that is compatible with hers. The createAnswer() callback is passed an RTCSessionDescription. Bob sets that as the local description and sends it to Alice.
5. When Alice gets Bob's session description, she sets that as the remote description with setRemoteDescription().
6. Ping!

Note: Make sure to allow the RTCPeerConnection to be garbage collected by calling close() when it's no longer needed. Otherwise, threads and connections are kept alive. It's possible to leak heavy resources in WebRTC!
RTCSessionDescription objects are blobs that conform to the Session Description Protocol (https://en.wikipedia.org/wiki/Session_Description_Protocol), SDP. Serialized, an SDP object looks like this:
`v=0
`o=- 3883943731 1 IN IP4 127.0.0.1
`s=
`t=0 0
`a=group:BUNDLE audio video
`m=audio 1 RTP/SAVPF 103 104 0 8 106 105 13 126
`// ...
`a=ssrc:2223794119 label:H4fjnMzxy3dPIgQ7HxuCTLb4wLLLeRHnFxh810
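Each m= line in the blob announces a media section. A quick way to inspect those sections when reading logs (an illustrative helper, not part of WebRTC or the article):

```js
// List the media sections declared in an SDP blob.
// Example m= line: "m=audio 1 RTP/SAVPF 103 104 0 8 106 105 13 126"
function listMediaSections(sdp) {
  return sdp
    .split('\n')
    .filter((line) => line.startsWith('m='))
    .map((line) => {
      const [media, port, profile, ...payloadTypes] = line.trim().split(' ');
      return {
        kind: media.slice(2), // 'audio' or 'video'
        port: Number(port),
        profile,              // e.g. 'RTP/SAVPF'
        payloadTypes,         // RTP payload type numbers, as strings
      };
    });
}
```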
JSEP architecture
The acquisition and exchange of network and media information can be done simultaneously, but both processes must have completed before audio and video streaming between peers can begin.

The offer/answer architecture previously described is called JavaScript Session Establishment Protocol (https://rtcweb-wg.github.io/jsep/), or JSEP. (There's an excellent animation explaining the process of signaling and streaming in Ericsson's demo video (https://www.ericsson.com/research-blog/context-aware-communication/beyond-html5-peer-peer-conversational-video/) for its first WebRTC implementation.)

Once the signaling process has completed successfully, data can be streamed directly peer to peer, between the caller and callee - or, if that fails, through an intermediary relay server (more about that later). Streaming is the job of RTCPeerConnection.
`RTCPeerConnection
RTCPeerConnection is the WebRTC component that handles stable and efficient communication of streaming data between peers.
WebRTC architecture (from webrtc.org (https://webrtc.github.io/webrtc-org/architecture/))
The following is a WebRTC architecture diagram showing the role of RTCPeerConnection. As you will notice, the green parts are complex!

From a JavaScript perspective, the main thing to understand from this diagram is that RTCPeerConnection shields web developers from the myriad complexities that lurk beneath. The codecs and protocols used by WebRTC do a huge amount of work to make real-time communication possible, even over unreliable networks:

- Packet-loss concealment
- Echo cancellation
- Bandwidth adaptivity
- Dynamic jitter buffering
- Automatic gain control
- Noise reduction and suppression
- Image cleaning
The previous W3C code shows a simplified example of WebRTC from a signaling perspective. The following are walkthroughs of two working WebRTC apps. The first is a simple example to demonstrate RTCPeerConnection and the second is a fully operational video chat client.

RTCPeerConnection without servers

The following code is taken from WebRTC samples Peer connection (https://webrtc.github.io/samples/src/content/peerconnection/pc1/), which has local and remote RTCPeerConnection (and local and remote video) on one web page. This doesn't constitute anything very useful - caller and callee are on the same page - but it does make the workings of the RTCPeerConnection API a little clearer because the RTCPeerConnection objects on the page can exchange data and messages directly without having to use intermediary signaling mechanisms.

In this example, pc1 represents the local peer (caller) and pc2 represents the remote peer (callee).

Caller

1. Create a new RTCPeerConnection and add the stream from getUserMedia():

```js
// Servers is an optional configuration file. (See TURN and STUN discussion later.)
pc1 = new RTCPeerConnection(servers);
// ...
localStream.getTracks().forEach((track) => {
  pc1.addTrack(track, localStream);
});
```

2. Create an offer and set it as the local description for pc1 and as the remote description for pc2. This can be done directly in the code without using signaling because both caller and callee are on the same page:

```js
pc1.setLocalDescription(desc).then(() => {
  onSetLocalSuccess(pc1);
}, onSetSessionDescriptionError);
trace('pc2 setRemoteDescription start');
pc2.setRemoteDescription(desc).then(() => {
  onSetRemoteSuccess(pc2);
}, onSetSessionDescriptionError);
```

Callee

1. Create pc2 and, when the stream from pc1 is added, display it in a video element:

```js
pc2 = new RTCPeerConnection(servers);
pc2.ontrack = gotRemoteStream;
//...
function gotRemoteStream(e) {
  // The ontrack event delivers an array of streams; use the first.
  vid2.srcObject = e.streams[0];
}
```
`RTCPeerConnection API plus servers
`In the real world, WebRTC needs servers, however simple, so the following can happen:
`Users discover each other and exchange real-world details, such as names.
`WebRTC client apps (peers) exchange network information.
`Peers exchange data about media, such as video format and resolution.
WebRTC client apps traverse NAT gateways (https://en.wikipedia.org/wiki/NAT_traversal) and firewalls.

In other words, WebRTC needs four types of server-side functionality:

- User discovery and communication
- Signaling
- NAT/firewall traversal
- Relay servers in case peer-to-peer communication fails

NAT traversal, peer-to-peer networking, and the requirements for building a server app for user discovery and signaling are beyond the scope of this article. Suffice to say that the STUN (https://en.wikipedia.org/wiki/STUN) protocol and its extension, TURN (https://en.wikipedia.org/wiki/Traversal_Using_Relay_NAT), are used by the ICE (https://en.wikipedia.org/wiki/Interactive_Connectivity_Establishment) framework to enable RTCPeerConnection to cope with NAT traversal and other network vagaries.

ICE is a framework for connecting peers, such as two video chat clients. Initially, ICE tries to connect peers directly with the lowest possible latency through UDP. In this process, STUN servers have a single task: to enable a peer behind a NAT to find out its public address and port. (For more information about STUN and TURN, see Build the backend services needed for a WebRTC app (https://www.html5rocks.com/tutorials/webrtc/infrastructure/).)
Finding connection candidates
If UDP fails, ICE tries TCP. If direct connection fails - in particular because of enterprise NAT traversal and firewalls - ICE uses an intermediary (relay) TURN server. In other words, ICE first uses STUN with UDP to directly connect peers and, if that fails, falls back to a TURN relay server. The expression finding candidates refers to the process of finding network interfaces and ports.
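The candidate type appears in each ICE candidate string after the typ token: host for a local interface, srflx for a STUN-discovered public address, and relay for a TURN fallback. A sketch for inspecting candidates in logs, plus an example configuration giving ICE both STUN and TURN servers to work with (the URLs, credentials, and `candidateType` helper are illustrative assumptions):

```js
// Read the candidate type out of an ICE candidate string.
function candidateType(candidateString) {
  const match = candidateString.match(/ typ (\w+)/);
  return match ? match[1] : 'unknown';
}

// Example configuration with both a STUN and a TURN server.
const configuration = {
  iceServers: [
    {urls: 'stun:stun.example.org'},
    {urls: 'turn:turn.example.org', username: 'user', credential: 'secret'},
  ],
};
// In a browser: new RTCPeerConnection(configuration), then log
// candidateType(event.candidate.candidate) from the onicecandidate handler.
```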
WebRTC data pathways

WebRTC engineer Justin Uberti provides more information about ICE, STUN, and TURN in the 2013 Google I/O WebRTC presentation (https://www.youtube.com/watch?v=p2HzZkd2A40&t=21m12s). (The presentation slides (https://io13webrtc.appspot.com/#52) give examples of TURN and STUN server implementations.)

A simple video-chat client

A good place to try WebRTC, complete with signaling and NAT/firewall traversal using a STUN server, is the video-chat demo at appr.tc (https://appr.tc). This app uses adapter.js (https://github.com/webrtc/adapter), a shim to insulate apps from spec changes and prefix differences.

The code is deliberately verbose in its logging. Check the console to understand the order of events. The following is a detailed walkthrough of the code.
Note: If you find this somewhat baffling, you may prefer the WebRTC codelab (https://codelabs.developers.google.com/codelabs/webrtc-web/). This step-by-step guide explains how to build a complete video-chat app, including a simple signaling server running on a Node server (https://nodejs.org/).

Multipoint Control Unit topology example
`Network topologies
WebRTC, as currently implemented, only supports one-to-one communication, but could be used in more complex network scenarios, such as with multiple peers each communicating with each other directly or through a Multipoint Control Unit (https://en.wikipedia.org/wiki/Multipoint_control_unit) (MCU), a server that can handle large numbers of participants and do selective stream forwarding, and mixing or recording of audio and video.
Tethr/Tropo: Disaster communications in a briefcase
Many existing WebRTC apps only demonstrate communication between web browsers, but gateway servers can enable a WebRTC app running on a browser to interact with devices, such as telephones (also known as PSTN (https://en.wikipedia.org/wiki/Public_switched_telephone_network)) and with VOIP (https://en.wikipedia.org/wiki/Voice_over_IP) systems. In May 2012, Doubango Telecom open sourced the sipml5 SIP client (https://sipml5.org/), built with WebRTC and WebSocket, which (among other potential uses) enables video calls between browsers and apps running on iOS and Android. At Google I/O, Tethr and Tropo demonstrated a framework for disaster communications (https://tethr.tumblr.com/) in a briefcase, using an OpenBTS cell (https://en.wikipedia.org/wiki/OpenBTS) to enable communications between feature phones and computers through WebRTC. Telephone communication without a carrier!
RTCDataChannel API
As well as audio and video, WebRTC supports real-time communication for other types of data.

The RTCDataChannel API enables peer-to-peer exchange of arbitrary data with low latency and high throughput. For single-page demos and to learn how to build a simple file-transfer app, see WebRTC samples (https://webrtc.github.io/samples/#datachannel) and the WebRTC codelab (https://codelabs.developers.google.com/codelabs/webrtc-web/#0), respectively.
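Data channel messages are size-limited in practice, so file-transfer apps send a file as a sequence of chunks followed by an end marker. A minimal sketch (the 16 KiB chunk size and the 'EOF' marker are assumptions for illustration, not part of the API):

```js
const CHUNK_SIZE = 16 * 1024; // a commonly used, conservative chunk size

// Split an ArrayBuffer into chunks small enough to send individually.
function chunkify(buffer, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let offset = 0; offset < buffer.byteLength; offset += chunkSize) {
    chunks.push(buffer.slice(offset, offset + chunkSize));
  }
  return chunks;
}

// Sending side: push each chunk through the channel, then signal completion.
function sendFile(dataChannel, buffer) {
  for (const chunk of chunkify(buffer)) {
    dataChannel.send(chunk);
  }
  dataChannel.send('EOF'); // hypothetical end-of-file marker
}
```

In a browser, dataChannel would come from pc.createDataChannel('file'), and the receiver would reassemble the chunks until it sees the end marker.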
`There are many potential use cases for the API, including:
`Gaming
`Remote desktop apps
`Real-time text chat
`File transfer
`Decentralized networks
`The API has s