https://www.youtube.com/watch?v=FO9Bbnv0wpI
When we think of web audio, we tend to think of 1998-style background music. Browsers now enforce autoplay policies that, for the most part, stop audio from playing without user interaction.
So how do we go about creating web audio experiences?
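Because of those autoplay policies, an AudioContext created outside a user gesture may start in the "suspended" state. A common pattern, sketched below under assumed names (`getAudioContext` is illustrative, not from the talk), is to lazily create or resume the context inside a gesture handler:

```javascript
// Lazily create a single AudioContext, resuming it if the browser
// started it "suspended" under its autoplay policy.
let _context = null;

function getAudioContext() {
  if (!_context) _context = new AudioContext();
  // resume() is a no-op if the context is already running.
  if (_context.state === 'suspended') _context.resume();
  return _context;
}

// Call it from a user-gesture handler, for example:
// button.addEventListener('click', () => {
//   const context = getAudioContext();
//   // ...start playback here
// });
```

This keeps the document to one context while deferring its creation until the user has interacted with the page.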
// The context is connected to the device speakers.
// You only need one of these per document.
const context = new AudioContext();
// Fetch the file
fetch('sound.mp4')
// Read it into memory as an arrayBuffer
.then(response => response.arrayBuffer())
// Turn it from mp3/aac/whatever into raw audio data
.then(arrayBuffer => context.decodeAudioData(arrayBuffer))
.then(audioBuffer => {
  // Now we're ready to play!
  // Create a source:
  // This represents a playback head.
  const source = context.createBufferSource();
  // Give it the audio data we loaded:
  source.buffer = audioBuffer;
  // Plug it into the output:
  source.connect(context.destination);
  // And off we go!
  source.start();
});
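One detail worth knowing: an AudioBufferSourceNode is one-shot, so `start()` may only be called once per node. The decoded AudioBuffer, however, can be reused. A sketch of how the steps above might be split into reusable helpers (the names `loadSample` and `playSample` are illustrative):

```javascript
// Decode a file once into an AudioBuffer that can be replayed.
async function loadSample(context, url) {
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  // Turn compressed audio into raw audio data.
  return context.decodeAudioData(arrayBuffer);
}

// Create a fresh source node for each playback; the buffer is shared.
function playSample(context, audioBuffer) {
  const source = context.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(context.destination);
  source.start();
  return source;
}
```

With this split, repeated plays of the same sound skip the fetch and decode entirely.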