Making The Web Rock
Web Audio
Google Chrome Developer Advocate
Why another audio API?
We have <audio> already!
<audio> hides the steps of loading, decoding and playing
<audio controls src="mysound.ogg"></audio>
Sometimes that's the right thing!
Web Audio provides:
2) An audio pipeline/routing system for effects and filters
Web Audio provides:
3) Hooks to analyze and visualize audio data on the fly
DEMO
(Analysis, Filtering, Visualization)
What is audio useful for, anyway?
- Gaming
- Application UI feedback
- Musical applications
- Audio education
- Audio processing
Building Simple App/Game Audio is easy
- Load audio files with XHR
- Tell Web Audio to decode them into buffers
- Create source node, point at buffer, connect it
- Call start()!
Loading and Playing a Sound
var myBuffer = null,
    context = new AudioContext();

function loadDogSound( url ) {
  var request = new XMLHttpRequest();
  request.open( "GET", url, true );
  request.responseType = "arraybuffer";
  request.onload = function() {
    context.decodeAudioData( request.response,
      function( buffer ) { myBuffer = buffer; } );
  };
  request.send();
}

function playSound( buffer ) {
  var sourceNode = context.createBufferSource();
  sourceNode.buffer = buffer;
  sourceNode.connect( context.destination );
  sourceNode.start( 0 );
}
Web Audio API is based on a graph
Web Audio minimizes glitching
Web Audio runs in a separate thread,
so audio and graphics don't compete as much.
You schedule Web Audio events in the future,
and the system takes care of them.
Scheduling Sound Playback
function playEverySecondForTenSeconds( myBuffer ) {
  var now = context.currentTime;
  for (var i = 0; i < 10; i++) {
    var sourceNode = context.createBufferSource();
    sourceNode.buffer = myBuffer;
    sourceNode.connect( context.destination );
    sourceNode.start( now + i );
  }
}
Scheduling in a complex world
For dynamic rhythms, you need to combine Web Audio scheduling and system timing. See the article.
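The usual pattern combines a setTimeout loop (flexible, but imprecise) with the AudioContext clock (precise, but write-ahead only): the loop wakes up often and schedules every beat that falls within a short lookahead window. A minimal sketch; the function and variable names here are illustrative, not from any spec:

```javascript
// Pure core of a lookahead scheduler: given the current audio-clock time,
// the next unscheduled beat time, the lookahead window (seconds) and the
// seconds per beat, return the beat times to schedule now plus the
// advanced nextNoteTime.
function collectDueBeats( currentTime, nextNoteTime, lookahead, secondsPerBeat ) {
  var due = [];
  while (nextNoteTime < currentTime + lookahead) {
    due.push( nextNoteTime );
    nextNoteTime += secondsPerBeat;
  }
  return { due: due, nextNoteTime: nextNoteTime };
}

// In a browser it would be driven something like this:
// function tick() {
//   var r = collectDueBeats( context.currentTime, nextNoteTime, 0.1, 60 / tempo );
//   r.due.forEach( function( t ) { playSoundAt( t ); } );  // sourceNode.start( t )
//   nextNoteTime = r.nextNoteTime;
//   setTimeout( tick, 25 );
// }
```

Because beats are scheduled slightly ahead on the audio clock, a late setTimeout callback doesn't cause a glitch; it only has to arrive before the window runs dry.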
Scheduling in Web Audio
Not just about start( time )!
ANY AudioParam can be scheduled -
frequency, gain, detune, delayTime...
Scheduling on AudioParams
interface AudioParam {
  attribute float value;

  // Parameter automation
  void setValueAtTime( value, time );
  void linearRampToValueAtTime( value, time );
  void exponentialRampToValueAtTime( value, time );
  void setTargetAtTime( target, time, timeConstant );
  void setValueCurveAtTime( values, time, duration );
  void cancelScheduledValues( startTime );
}
Gain Fade Example
var envelope = context.createGain();
mySoundNode.connect( envelope );
envelope.connect( context.destination );
var now = context.currentTime;
envelope.gain.setValueAtTime( 0, now );
envelope.gain.linearRampToValueAtTime( 1.0, now + 2.0 );
envelope.gain.linearRampToValueAtTime( 0.0, now + 4.0 );
mySoundNode.start(0);
Effects in Web Audio
- Biquad filtering - lowpass, highpass, etc.
- Delays and delay effects
- Waveform synthesis: oscillators
- Envelopes
- Offline processing
- Compression
- Convolution
- Waveshaping
- Positioning/Panning/Doppler
- Custom JavaScript processing
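All of these effects are just nodes, so composing them is a matter of connecting outputs to inputs. A hedged sketch of chaining several nodes in series; the `chain` helper is illustrative (not a Web Audio API), though the factory methods in the usage comment are real:

```javascript
// Connect an array of audio nodes in series and return the endpoints,
// so a whole effect chain can be patched in with two connect() calls.
function chain( nodes ) {
  for (var i = 0; i < nodes.length - 1; i++)
    nodes[i].connect( nodes[i + 1] );
  return { input: nodes[0], output: nodes[nodes.length - 1] };
}

// Typical browser usage (assumes an AudioContext named context):
// var filter = context.createBiquadFilter();  // "lowpass" by default
// var delay = context.createDelay();
// var gain = context.createGain();
// var fx = chain( [ filter, delay, gain ] );
// sourceNode.connect( fx.input );
// fx.output.connect( context.destination );
```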
Audio for Music Applications
Most "musical effects" are more complex than
just a single filter or delay.
AudioParams can also be driven by audio-rate signals -
a chorus effect is just an oscillator changing delayTime!
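A sketch of that chorus idea, assuming a Web Audio context is passed in: a low-frequency oscillator, scaled by a depth gain, drives the delayTime AudioParam of a DelayNode. The rate/depth numbers are illustrative starting points, not canonical values:

```javascript
// Build a chorus voice: LFO -> depth gain -> delay.delayTime.
function makeChorus( context, rate, depth, baseDelay ) {
  var delay = context.createDelay();
  delay.delayTime.value = baseDelay;      // e.g. 0.03 (30 ms)

  var lfo = context.createOscillator();   // the modulating oscillator
  lfo.frequency.value = rate;             // e.g. 3.5 Hz

  var lfoGain = context.createGain();
  lfoGain.gain.value = depth;             // e.g. 0.005 (±5 ms swing)

  lfo.connect( lfoGain );
  lfoGain.connect( delay.delayTime );     // audio-rate param modulation
  lfo.start( 0 );
  return delay;                           // mix dry + this for chorus
}
```

Mixing the returned delay with the unprocessed signal gives the classic chorus shimmer; widen `depth` and slow `rate` for a flanger-ish sweep.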
DEMO
(you may need headphones for this one, sorry...)
<audio> Integration
Web Audio can also process <audio> streams
(and WebRTC, too!)
Audio Input
Access to audio input devices too!
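getUserMedia hands you a MediaStream, and feeding that into the graph is one call. A sketch assuming the stream has already been obtained; routing it into an AnalyserNode rather than the speakers avoids feedback:

```javascript
// Wire a microphone MediaStream into the audio graph for analysis.
function connectStream( context, stream ) {
  var micSource = context.createMediaStreamSource( stream );
  var analyser = context.createAnalyser();
  micSource.connect( analyser );   // analyze, don't blast the speakers
  return analyser;
}

// Typical browser usage:
// navigator.mediaDevices.getUserMedia( { audio: true } )
//   .then( function( stream ) {
//     var analyser = connectStream( context, stream );
//     // poll analyser.getByteFrequencyData(...) to visualize the mic
//   } );
```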
Web MIDI
This is not cheesy background music!
That's "Standard MIDI files."
MIDI lets you connect controllers, synthesizers and more to your computer.
Asking for MIDI devices
window.addEventListener('load', function() {
navigator.requestMIDIAccess().then(
onMIDIInit,
onMIDISystemError );
});
Enumerating MIDI output devices
function onMIDIInit( midi ) {
  for (var output of midi.outputs.values())
    output.send( [0x90, 3, 32] );  // note on, channel 0, note 3, velocity 32
}
Don't forget hot-plugging MIDI devices!
midiAccess.onstatechange =
function midiConnectionStateChange( e ) {
populateMIDIInSelect();
};
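The populateMIDIInSelect above rebuilds the UI's device list on every state change; a sketch of its core, with illustrative names (only `midiAccess.inputs` and the `id`/`name` properties come from the Web MIDI API):

```javascript
// Collect the currently-connected MIDI inputs for display in a <select>.
function listMIDIInputs( midiAccess ) {
  var devices = [];
  for (var input of midiAccess.inputs.values())
    devices.push( { id: input.id, name: input.name } );
  return devices;
}
```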
MIDI Message syntax
MIDI has 16 virtual channels, blah blah blah.
Enumerating MIDI input devices
function onMIDIInit( midi ) {
  for (var input of midi.inputs.values())
    input.onmidimessage = midiMessageReceived;
}
Parsing MIDI messages
function midiMessageReceived( ev ) {
  var cmd = ev.data[0] >> 4;
  var channel = ev.data[0] & 0xf;
  var noteNumber = ev.data[1];
  var velocity = 0;
  if (ev.data.length > 2)
    velocity = ev.data[2];

  // MIDI noteon with velocity=0 is the same as noteoff
  if ( cmd == 8 || ((cmd == 9) && (velocity == 0)) ) { // noteoff
    noteOff( noteNumber );
  } else if (cmd == 9) { // note on
    noteOn( noteNumber, velocity );
  } else if (cmd == 11) { // controller message
    controller( noteNumber, velocity );
  } else {
    // probably sysex!
  }
}
The Incredible Thing about Web MIDI
ZERO-FRICTION
controller and device access!
Web MIDI support
Shipping in Chrome
Works on Android Chrome with USB OTG!
Web Audio Support on Desktop
Chrome, Safari, Firefox, Edge
Web Audio Support on Mobile
Chrome for Android has support (higher latency)
iOS Safari 6.0 has Web Audio (with some caveats)
Firefox OS
Future App Opportunities
- Immersive gaming audio
- Audio feedback and input in app UX
- Audio education - explore techniques, share with others
- Super-low-friction music applications - synthesis to DAW
What's NOT there (yet) for web audio
- Any kind of plugin/VSTi hooks
- Multi-interface hooks
(multi-channel, partly; multi-device, no.)
Next Things for Web Audio & Web MIDI
- Audio Workers (processing audio in JS)
- Figuring out device access and selection
- latency alignment
- Finalizing v1
- Low-level access to audio stream and resampling
Web Audio Weekly
If you're not already reading Chris Lowis' Web Audio Weekly, you should.
What I want from you:
- More developers to explore Web Audio and Web MIDI
- Tell us what's not there
- Help us prioritize to make the web platform awesome for audio apps!
- Explore audio techniques and share with others
End
Questions?
cwilso@google.com
@cwilso
+Chris Wilson