How To Create A Responsive 8-Bit Drum Machine Using Web Audio, SVG And Multitouch
In this little tutorial, I’m going to share some tips I recently used to build a fun demo for the Build 2016 conference. The idea was to create a small 8-bit drum machine, with 8-bit sounds and graphics:
This small web app was used in one of our demos to illustrate how you can easily provide a temporary offline experience when your hosted web app loses Internet connectivity.
Building this drum machine might sound trivial, but it raises some interesting questions. For instance, how do you handle the hit testing for the various pads of this bitmap image? How do you guarantee the same experience across all devices and browsers, accounting for resolution and touch support?
Options To Handle Hit Testing
Consider the image used in the demo just below. I built it by degrading the original image to 8-bit, as part of our little retro joke. The full online experience would provide the best graphics and sounds, while the offline experience would offer a degraded 8-bit version.
You’ll probably be tempted to press on the various black circles to make some noises. Let’s figure out how to handle clicks on those parts of the image using HTML.
The older folks among us might be tempted to use an image map, with the map and area tags, which let us define areas of the image with various geometric shapes, including ellipses and circles. That might do the trick.
Unfortunately, this approach has two problems, the first one being major:
- It’s pixel-based, so it won’t scale across resolutions and devices. Suppose your bitmap image had a fixed size of 600 × 400 pixels. If you defined areas inside it, you would build them based on this 600 × 400 resolution. As soon as you expand or contract the image to match the device’s resolution, the areas wouldn’t match the image any longer. The jQuery RWD Image Maps plugin will compute the area’s coordinates dynamically. But I thought that adding the jQuery library and this plugin just for this simple hit-testing demo would be too much. I was looking for something simpler.
- The feature was originally defined to enable navigation to URLs. I’d rather call a JavaScript function that uses web audio to play the sounds. Still, you could use this trick by defining an href="#" and registering the click event on the area, as shown in the sketch just after this list.
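For reference, a minimal sketch of that image-map approach might look like the following. The coordinates, IDs and handler body are hypothetical; only the image file name comes from the demo.

<img src="8bitsdrummachine.jpg" usemap="#drums" alt="8-bit drum machine">
<map name="drums">
  <!-- circle coords are x,y,radius in the bitmap's native pixel space -->
  <area id="button1" shape="circle" coords="120,280,45" href="#" alt="drum pad 1">
</map>
<script>
  document.getElementById("button1").addEventListener("click", function (e) {
    e.preventDefault();
    // call a JavaScript function that plays the matching sound here
  });
</script>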
My next idea was to draw the image in a 2D canvas, which I could stretch to the current resolution using CSS. This solution might work, but it would be a really complex implementation for such a simple web app. Indeed, it would entail:
- manually defining the hit zones by code;
- handling the click event on the canvas, finding the exact x and y mouse coordinates of the click, and scaling that to the current resolution;
- computing the hit-testing algorithm by checking whether those 2D coordinates are in one of the ellipses that we have defined in the code.
It’s not tremendously complex, but it’s a fair bit of math for something that should be simple to build. Also, if you’d like to keep the aspect ratio of the image, the 2D canvas method entails resizing the canvas by computing the ratio in code, both during the loading phase and inside the onresize event.
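To give you an idea of what that would look like, here is a minimal sketch of the canvas approach. It assumes the bitmap’s native 600 × 400 resolution mentioned earlier; the ellipse coordinates and the canvas ID are hypothetical.

// Hypothetical hit zones, defined against the bitmap's native 600 × 400 pixels
var hitZones = [
    { id: 0, cx: 120, cy: 280, rx: 45, ry: 40 } // one ellipse per drum pad
];

var canvas = document.getElementById("drumCanvas");
canvas.addEventListener("click", function (e) {
    var rect = canvas.getBoundingClientRect();
    // scale the mouse coordinates back to the bitmap's native resolution
    var x = (e.clientX - rect.left) * (canvas.width / rect.width);
    var y = (e.clientY - rect.top) * (canvas.height / rect.height);

    hitZones.forEach(function (zone) {
        // standard ellipse containment test
        var dx = (x - zone.cx) / zone.rx;
        var dy = (y - zone.cy) / zone.ry;
        if (dx * dx + dy * dy <= 1) {
            // play the sound associated with zone.id here
        }
    });
});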
Finally, I thought of what should have been the most obvious solution. When you think about what scales across devices, facilitates hit testing and handles aspect ratios, the answer is naturally SVG. The “S” stands for “scalable”: we can define a viewBox with some parameters to lock the aspect ratio.
You’re probably going to say, “Sure, but how does that help us define the hit zones in the image? We’re working with a bitmap, not a vector.”
Let me show you what I’ve done. Follow these very same steps for any similar experience you’d like to build on top of a bitmap image.
Save the image above of the 8-bit drum machine on your computer.
We need a tool that will generate the XML of our SVG content. I’ll use Inkscape, a free application, but you can probably do the same thing with SVG-edit in the browser.
Open Inkscape, go to “File” → “Document Properties,” and change the “Custom size” values to 160 × 90.
Go to “File” → “Import,” and choose the 8bitsdrummachine.jpg file that you’ve saved. Choose “Link” as the import type, and leave the other options as is. Then, stretch the image to cover our drawing zone:
We’re now going to draw ellipses on top of our image. Choose red, and make sure that your ellipses perfectly cover the black drums:
Those seven ellipses will be our hit zones.
To be able to find those SVG forms easily in our code, right-click on each of them, choose “Object Properties,” and change the “ID” and “Label” properties to button1 (up to button7) and #button1 (up to #button7), respectively:
Now that we have defined the hit zones precisely, let’s make them transparent, rather than hiding them under the image. Right-click on each of them, choose “Fill and Stroke,” and set the alpha value (“A”) to 0:
Note: We’re using simple ellipses here, but you could draw any complex shape allowed by SVG on top of the image to achieve a similar result.
Save the result on your hard drive by going to “File” → “Save as…” in the menu.
Open this file with your favorite editor (Notepad++, Sublime, Visual Studio Code, whatever), and add the following attribute just after the viewBox attribute:

preserveAspectRatio="xMidYMin meet"
Mozilla’s documentation explains what this SVG attribute does.
You’re done with this part. This piece of SVG embeds the bitmap image, adds a layer of transparent vector shapes that can be clicked on, and scales across resolutions while keeping the aspect ratio, thanks to the preserveAspectRatio attribute.
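If you open the saved file, it should look roughly like the trimmed-down example below; the exact coordinates and the Inkscape metadata will differ.

<svg xmlns="http://www.w3.org/2000/svg"
     xmlns:xlink="http://www.w3.org/1999/xlink"
     viewBox="0 0 160 90" preserveAspectRatio="xMidYMin meet">
  <!-- the bitmap, stretched to fill the 160 × 90 drawing area -->
  <image xlink:href="8bitsdrummachine.jpg" x="0" y="0" width="160" height="90" />
  <!-- one fully transparent ellipse per drum pad (coordinates are illustrative) -->
  <ellipse id="button1" cx="22" cy="60" rx="12" ry="9" style="fill:#ff0000;fill-opacity:0" />
  <!-- … button2 to button7 … -->
</svg>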
Now, we need to set the code to tie those SVG ellipses to our event handler.
What About Web Audio?
I’ve already covered web audio elsewhere in detail. I’m reusing part of that code here, including the Sound object:
var Sound = (function () {
    function Sound(url, audioContext, masterGain, loop, callback) {
        this.url = url;
        this.audioContext = audioContext;
        this.masterGain = masterGain;
        this.loop = loop;
        this.callback = callback;
        this.gain = this.audioContext.createGain();
        this.gain.connect(this.masterGain);
        this.isReadyToPlay = false;
        this.loadSoundFile(url);
    }
    Sound.prototype.loadSoundFile = function () {
        if (canUseWebAudio) {
            var that = this;
            // make XMLHttpRequest (AJAX) on server
            var xhr = new XMLHttpRequest();
            xhr.open('GET', this.url, true);
            xhr.responseType = 'arraybuffer';
            xhr.onload = function (e) {
                // decode the binary response
                that.audioContext.decodeAudioData(this.response,
                    function (decodedArrayBuffer) {
                        // get the decoded buffer
                        that.buffer = decodedArrayBuffer;
                        that.isReadyToPlay = true;
                        if (that.callback) {
                            that.callback();
                        }
                    }, function (e) {
                        console.log('Error decoding file', e);
                    });
            };
            xhr.send();
        }
    };
    Sound.prototype.play = function () {
        if (canUseWebAudio && this.isReadyToPlay) {
            // make source
            this.source = this.audioContext.createBufferSource();
            // connect buffer to source
            this.source.buffer = this.buffer;
            this.source.loop = this.loop;
            // connect source to receiver
            this.source.connect(this.gain);
            // play
            this.source.start(0);
        }
    };
    return Sound;
})();
Then, as I’ve also explained elsewhere, we need to handle web audio in a special way on iOS, by unlocking the AudioContext:
try {
    if (typeof AudioContext !== 'undefined') {
        audioContext = new AudioContext();
        canUseWebAudio = true;
    } else if (typeof webkitAudioContext !== 'undefined') {
        audioContext = new webkitAudioContext();
        canUseWebAudio = true;
    }
    if (/iPad|iPhone|iPod/.test(navigator.platform)) {
        unlockiOSaudio();
    } else {
        audioUnlocked = true;
    }
} catch (e) {
    console.error("Web Audio: " + e.message);
}

function unlockiOSaudio() {
    var unlockaudio = function () {
        // play a silent one-sample buffer from a user gesture to unlock the context
        var buffer = audioContext.createBuffer(1, 1, 22050);
        var source = audioContext.createBufferSource();
        source.buffer = buffer;
        source.connect(audioContext.destination);
        source.start(0);
        setTimeout(function () {
            if (source.playbackState === source.PLAYING_STATE || source.playbackState === source.FINISHED_STATE) {
                audioUnlocked = true;
                window.removeEventListener('touchend', unlockaudio, false);
            }
        }, 0);
    };
    window.addEventListener('touchend', unlockaudio, false);
}
Handling Multitouch Across Devices
In my opinion, the best touch specification for the web is pointer events. I’ve already covered the specification in detail. We’ll use it to wire our event handler to our SVG shapes.
In the code below, we’re referring to all of the SVG shapes that we built with Inkscape, the shapes with which we’ve associated IDs (button1, button2, etc.). Then, we’re loading the various 8-bit sounds from the web server and decoding them via our Sound object. Finally, using the pointerdown event, we’re playing the sound associated with each SVG shape:
var soundsCollection = [];
var buttonsCollection = [];
buttonsCollection.push(document.getElementById("button1"));
buttonsCollection.push(document.getElementById("button2"));
buttonsCollection.push(document.getElementById("button3"));
buttonsCollection.push(document.getElementById("button4"));
buttonsCollection.push(document.getElementById("button5"));
buttonsCollection.push(document.getElementById("button6"));
buttonsCollection.push(document.getElementById("button7"));

if (canUseWebAudio) {
    masterGain = audioContext.createGain();
    masterGain.connect(audioContext.destination);
    soundsCollection.push(new Sound("./8bits_sounds/clap.wav", audioContext, masterGain, false, newSoundLoaded));
    soundsCollection.push(new Sound("./8bits_sounds/cowbell.wav", audioContext, masterGain, false, newSoundLoaded));
    soundsCollection.push(new Sound("./8bits_sounds/hihat1.wav", audioContext, masterGain, false, newSoundLoaded));
    soundsCollection.push(new Sound("./8bits_sounds/kick1.wav", audioContext, masterGain, false, newSoundLoaded));
    soundsCollection.push(new Sound("./8bits_sounds/snare1.wav", audioContext, masterGain, false, newSoundLoaded));
    soundsCollection.push(new Sound("./8bits_sounds/tom1.wav", audioContext, masterGain, false, newSoundLoaded));
    soundsCollection.push(new Sound("./8bits_sounds/kick3.wav", audioContext, masterGain, false, newSoundLoaded));
}

var soundsLoaded = 0;
function newSoundLoaded() {
    soundsLoaded++;
    if (soundsLoaded == 7) {
        // Ready to rock & roll!
        for (var i = 0; i < 7; i++) {
            buttonsCollection[i].addEventListener("pointerdown", onPointerDown);
        }
    }
}

function onPointerDown(eventArgs) {
    var buttonClicked = eventArgs.currentTarget.id;
    // the last character of the id ("button1" … "button7") maps to the sound index
    var soundId = buttonClicked.substr(buttonClicked.length - 1) - 1;
    var soundToPlay = soundsCollection[soundId];
    soundToPlay.play();
}
With this code, we’re supporting multitouch in a simple way. But it has one drawback: it only works in the Microsoft Edge browser. To support all browsers and devices, you can use the Pointer Events Polyfill (PEP) maintained by the jQuery team, by adding this script reference to your page:
<script src="https://code.jquery.com/pep/0.4.1/pep.min.js"></script>
Then, add the following attribute to the HTML element that contains the relevant UI (the body element in our demo):

touch-action="none"
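Put together, the relevant parts of the page might look like this sketch (your own markup will differ):

<head>
  <!-- Pointer Events Polyfill (PEP) -->
  <script src="https://code.jquery.com/pep/0.4.1/pep.min.js"></script>
</head>
<!-- PEP reads the touch-action attribute to know where to fire pointer events -->
<body touch-action="none">
  <!-- the SVG drum machine goes here -->
</body>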
Going Further With Sound Synthesis
In this small demo, I’ve downloaded recorded samples of 8-bit sounds. But web audio has some great features that could help you generate sounds via oscillators, for example, as Chris Lowis explains on Dev.Opera.
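As a rough illustration, here is a minimal sketch, not taken from the demo, that synthesizes a chiptune-style blip with a square-wave oscillator instead of a recorded sample. The function name and parameter values are made up; it reuses the audioContext and masterGain objects created earlier.

// Hypothetical helper: synthesize a short square-wave blip instead of playing a sample
function playBlip(audioContext, masterGain) {
    var oscillator = audioContext.createOscillator();
    var envelope = audioContext.createGain();

    oscillator.type = 'square';       // square waves sound suitably 8-bit
    oscillator.frequency.value = 440; // A4; tweak per drum pad

    // quick exponential decay so the blip sounds percussive, not sustained
    envelope.gain.setValueAtTime(1, audioContext.currentTime);
    envelope.gain.exponentialRampToValueAtTime(0.001, audioContext.currentTime + 0.15);

    oscillator.connect(envelope);
    envelope.connect(masterGain);

    oscillator.start(0);
    oscillator.stop(audioContext.currentTime + 0.15);
}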
I hope that you’ve enjoyed this little tutorial and that it helps you solve some of your issues or create cool experiences.
"This article is part of the web development series from Microsoft tech evangelists and engineers on practical JavaScript learning, open source projects, and interoperability best practices including Microsoft Edge browser. We encourage you to test across browsers and devices including Microsoft Edge – the default browser for Windows 10 – with free tools on dev.microsoftedge.com, including F12 developer tools — seven distinct, fully-documented tools to help you debug, test, and speed up your webpages. Also, visit the Edge blog to stay updated and informed from Microsoft developers and experts."