Two major trends
Trend 1:
Computing means Mobile.

53% of adults media multi-task while watching TV

Trend 2:
Video is HUGE.

Video is forecast to be 80-90% of Internet traffic by 2017.

Ye olde Flashe vid

<object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" width="425" height="344">
  <param name="allowFullScreen" value="true" />
  <param name="allowscriptaccess" value="always" />
  <param name="src" value="http://www.eurgh.com/v/oHg5SJYRHA0&hl=en&fs=1&" />
  <embed type="application/x-shockwave-flash" width="425" height="344"
    src="http://www.eurgh.com/v/oHg5SJYRHA0&hl=en&fs=1&"
    allowscriptaccess="always" allowfullscreen="true" />
</object>

...versus the modern way:

<video src='chrome.webm' />

Codecs for the modern Web

VP8 and VP9: Open codecs for the web

  <source src="chrome.webm" />
  <source src="chrome.mp4" />
  <source src="chrome.webm" 
      type="video/webm" />
  <source src="chrome.mp4" 
      type="video/mp4" />
<video poster="images/poster.jpg">
  <source src="chrome.webm" 
      type="video/webm" />
  <source src="chrome.mp4" 
      type="video/mp4" />
</video>
<video poster="images/poster.jpg"
  autoplay preload="metadata">
  <source src="chrome.webm" 
      type="video/webm" />
  <source src="chrome.mp4" 
      type="video/mp4" />
</video>
Steve Souders' preload test

Advanced video features

<video poster="images/poster.jpg"
  autoplay preload="metadata">
  <source src="chrome.webm" type="video/webm" />
  <source src="chrome.mp4" type="video/mp4" />
  <track src="track.vtt" />
  <p>Video element not supported.</p>
</video>
In-band WebVTT: track data
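For reference, a WebVTT file such as the track.vtt used above is just timed cues in plain text; a made-up sample:

```
WEBVTT

00:00.000 --> 00:04.000
Welcome to the demo.

00:04.000 --> 00:08.500
Captions are defined as timed cues like these.
```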
Media Fragments
<video src='chrome.webm#t=5,10' />
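Fragment URLs like the one above are easy to build from script; a small sketch (the helper name is mine, not part of any API):

```javascript
// Build a media-fragment URL. Per the Media Fragments URI syntax,
// #t=start,end selects a time range; #t=start plays from start onward.
function clipUrl(src, start, end) {
  return src + "#t=" + start + (end !== undefined ? "," + end : "");
}

console.log(clipUrl("chrome.webm", 5, 10)); // "chrome.webm#t=5,10"
console.log(clipUrl("chrome.webm", 5));     // "chrome.webm#t=5"
```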
Media Source Extensions
(generating streams from JavaScript)
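A minimal sketch of the MSE flow, assuming a WebM segment served at a made-up URL and the prefix-free MediaSource constructor:

```javascript
// Feed bytes into a <video> element from JavaScript via Media Source Extensions.
var video = document.querySelector("video");
var mediaSource = new MediaSource();
video.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", function() {
  var sourceBuffer =
      mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "segment.webm");   // assumed segment URL
  xhr.responseType = "arraybuffer";
  xhr.onload = function() {
    // Append the downloaded bytes; repeat per segment for streaming.
    sourceBuffer.appendBuffer(new Uint8Array(xhr.response));
  };
  xhr.send();
});
```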


Local media input


It's pretty simple.

var constraints = {video: true};

function successCallback(stream) {
  var video = document.querySelector("video");
  video.src = window.URL.createObjectURL(stream);
}

function errorCallback(error) {
  console.log("navigator.getUserMedia error: ", error);
}

navigator.getUserMedia(constraints, successCallback, errorCallback);

gUM screencapture!


WebRTC across platforms

WebRTC endpoints

What do we need for RTC?

Four main tasks

  • Acquiring audio and video
  • Establishing a connection between peers (signaling)
  • Communicating audio and video
  • Communicating arbitrary data

Three main JavaScript APIs

  • MediaStreams (aka getUserMedia)
  • RTCPeerConnection
  • RTCDataChannel

Communicate Media Streams

WebRTC video chat: caller
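A sketch of the caller's side, assuming a signaling helper sendToPeer() (not a WebRTC API, you supply it) and a localStream already acquired via getUserMedia:

```javascript
var pc = new webkitRTCPeerConnection(null);  // prefixed in Chrome of this era
pc.addStream(localStream);                   // send our camera/mic to the peer
pc.onicecandidate = function(e) {
  if (e.candidate) sendToPeer("candidate", e.candidate);  // via your signaling channel
};
pc.createOffer(function(offer) {
  pc.setLocalDescription(offer);
  sendToPeer("offer", offer);                // callee replies with its answer SDP
});
```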

WebRTC video chat: callee
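A sketch of the callee's side, assuming the same made-up sendToPeer() signaling helper and an offer received from the caller:

```javascript
var pc = new webkitRTCPeerConnection(null);
pc.onaddstream = function(e) {
  // Render the caller's stream in a <video> element.
  remoteVideo.src = window.URL.createObjectURL(e.stream);
};
pc.setRemoteDescription(new RTCSessionDescription(offer));  // offer from caller
pc.createAnswer(function(answer) {
  pc.setLocalDescription(answer);
  sendToPeer("answer", answer);
});
```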

WebRTC architecture

WebRTC architecture diagram


Bidirectional communication of arbitrary data between peers

Communicate arbitrary data

Game: caller

onreceivemessage = handle(data);
...
var myData = [ { id: "ship1", x: 24, y: 11, velocity: 7 }, ... ];
send(myData);

Game: callee

onreceivemessage = handle(data);
...
var myData = [ { id: "ship7", x: 19, y: 4, velocity: 18 }, ... ];
send(myData);


  • Same API as WebSockets
  • Ultra-low latency
  • Configurable delivery: reliable (TCP-like) or unreliable (UDP-like)
  • Secure
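Game state like the ship objects above crosses the wire as text; a small sketch of JSON framing for channel.send() (helper names are mine):

```javascript
// Encode game state for an RTCDataChannel, which carries strings as-is.
function encodeState(objects) {
  return JSON.stringify(objects);
}

// Decode a received message back into game objects.
function decodeState(message) {
  return JSON.parse(message);
}

var myData = [{ id: "ship1", x: 24, y: 11, velocity: 7 }];
var wire = encodeState(myData);       // what you pass to channel.send()
var received = decodeState(wire);     // what the peer reconstructs
console.log(received[0].id);          // "ship1"
```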

Audio in the Web Platform

Why Web Audio when we have <audio>?

  • Precise timing of multiple overlapping sounds
  • Audio pipeline/routing for effects and filters
  • Visualize and manipulate audio data

Web Audio can do a LOT...

  • Oscillators
  • Sequences/rhythms/loops
  • Fade-ins/fade-outs/sweeps
  • Time-based event scheduling
  • Frequency and waveform analysis
  • Acoustic environments: reverb, etc.
  • Waveshaping (non-linear distortion)
  • Dynamics processing (compression)
  • Filtering effects: radio, telephone, etc.
  • Distance attenuation and sound directionality
  • Doppler shift: changing pitch for moving sources
  • 3D spatialization: positioning sound at a particular place
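A tiny sketch of the first few bullets (an oscillator with a scheduled fade-out), using the prefixed AudioContext of this era:

```javascript
// Play a 440 Hz tone for half a second with a linear fade-out.
var context = new webkitAudioContext();
var osc = context.createOscillator();
var gain = context.createGain();

osc.frequency.value = 440;
osc.connect(gain);                    // oscillator -> gain -> speakers
gain.connect(context.destination);

gain.gain.setValueAtTime(1, context.currentTime);
gain.gain.linearRampToValueAtTime(0, context.currentTime + 0.5);  // fade-out

osc.start(0);
osc.stop(context.currentTime + 0.5);  // sample-accurate scheduling
```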

Web Audio status

  • Chrome desktop and Android — including gUM input
  • Safari 6.0+ and iOS6+
  • Firefox 25 desktop and Android
  • Mic to speaker latency as low as 5ms
More information? Web Audio talk demos

getUserMedia ☞ Web Audio

// Success callback when requesting audio input stream
function gotStream(stream) {
    var audioContext = new webkitAudioContext();

    // Create an AudioNode from the stream
    var mediaStreamSource = audioContext.createMediaStreamSource(stream);

    // Connect it to the destination or any other node for processing!
    mediaStreamSource.connect(audioContext.destination);
}

navigator.getUserMedia({audio: true}, gotStream, function(error) {
    console.log("navigator.getUserMedia error: ", error);
});

gUM ☞ Web Audio ☞ RTCPeerConnection

Capture microphone input and stream it to a peer with processing applied:

navigator.getUserMedia({audio: true}, gotAudio, errorCallback);

function gotAudio(stream) {
  var microphone = context.createMediaStreamSource(stream);
  var filter = context.createBiquadFilter();
  var peer = context.createMediaStreamDestination();
  microphone.connect(filter);
  filter.connect(peer);
}
More Media Stream integration examples


Web MIDI

  • Newly proposed standard
  • Standard MIDI files: not just cheesy background music!
  • Connect controllers, synthesizers and more
  • Implemented in Chrome behind a flag on Mac, Windows, Linux, Chrome OS and Android!
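A minimal sketch of listening for MIDI input with the proposed API (promise-based requestMIDIAccess, as implemented behind the flag):

```javascript
// Request access to MIDI devices and log incoming messages.
navigator.requestMIDIAccess().then(function(midi) {
  midi.inputs.forEach(function(input) {
    input.onmidimessage = function(e) {
      console.log("MIDI message:", e.data);  // Uint8Array: [status, data1, data2]
    };
  });
});
```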