javascript – Can I play JS generated audio in an HTML <audio> tag?

const context = new AudioContext();
let o = null,
    g = null;

function play() {
    o = context.createOscillator();
    g = context.createGain();
    o.type = "sine";
    o.connect(g);
    g.connect(context.destination);
    o.start();
}

function stop() {
    o.stop();
}
When the play() function is called, a sine wave sound is played, which is then stopped by calling the stop() function. I want to send this audio to an HTML <audio> tag. Is it possible?

How to solve:

Curious myself about how this could be done, I stumbled on an MDN article doing this exact thing.

It uses the MediaRecorder interface together with a MediaStreamAudioDestinationNode. The sound wave created by your oscillator must be sent to the MediaStreamAudioDestinationNode. The stream of this node is then used by the MediaRecorder, which captures the data streamed to the node whenever the sound is playing. When the playing is stopped, all the captured data is converted into a Blob. You can specify the kind of blob by setting its type property to the desired MIME type, for example audio/mp3.

With URL.createObjectURL() you can create a URL that references this newly created blob. This URL can then be used as the src of the <audio> tag. Now the audio element has a source to play from, which is your recorded sound.
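As a minimal sketch of that last step (the byte arrays below are placeholders standing in for real recorded chunks, since the actual data comes from the recorder's dataavailable events):

```javascript
// Placeholder chunks; in the real code these are the Blobs received
// from the MediaRecorder's 'dataavailable' events.
const chunks = [new Uint8Array([1, 2, 3]), new Uint8Array([4, 5])];

// Combine the chunks into a single Blob with the desired MIME type.
const blob = new Blob(chunks, { type: 'audio/aac' });

// Create an object URL that references the Blob. Assigning this URL
// to audio.src lets the <audio> element play the recorded sound.
const url = URL.createObjectURL(blob);
```

Remember to call URL.revokeObjectURL() on a URL you no longer need, otherwise the blob stays in memory until the page is closed.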

Below I’ve made an example, based on the code in the article, which records your sine wave and allows it to be replayed in the <audio> element. Note: whenever you re-record, the previous recording is lost.

// Select button and audio elements.
const button = document.querySelector('button');
const audio = document.querySelector('audio');

// Define global variables for oscillator and gain.
let oscillator = null;
let gain = null;
let source = null;

// Create context, stream destination and recorder.
const ctx = new AudioContext();
const mediaStreamDestination = ctx.createMediaStreamDestination();
const recorder = new MediaRecorder(mediaStreamDestination.stream);

// Store the chunks of audio data in an array.
let chunks = [];

// Dump the previous stored blob from memory and clear the chunks array.
// Otherwise, all recorded data will be stored until the page is closed. 
recorder.addEventListener('start', function(event) {
  if (source !== null) {
    URL.revokeObjectURL(source);
  }
  chunks.length = 0;
});

// When all the sound has been recorded, store the recorded data
// in the chunks array. The chunks will later be converted into
// a workable file for the audio element.
recorder.addEventListener('dataavailable', function(event) {
  const { data } = event;
  chunks.push(data);
});

// Whenever the recorder has stopped recording, create a Blob 
// out of the chunks that you've recorded, then create a object url
// to the Blob and pass that url to the audio src property.
recorder.addEventListener('stop', function(event) {
  const blob = new Blob(chunks, { 'type': 'audio/aac' });
  source = URL.createObjectURL(blob);
  audio.src = source;
});

// Click on the button to start and stop the recording.
button.addEventListener('click', function(event) {
  if (recorder.state !== 'recording') {
    // Create new oscillator and gain.
    oscillator = ctx.createOscillator();
    gain = ctx.createGain();
    // Connect the oscillator and gain to the MediaStreamDestination.
    oscillator.connect(gain);
    gain.connect(mediaStreamDestination);
    // And play the sound on the speakers.
    gain.connect(ctx.destination);
    // Start recording and playing.
    recorder.start();
    oscillator.start();
    button.textContent = 'Stop recording';
  } else {
    // Stop recording and playing.
    recorder.stop();
    oscillator.stop();
    button.textContent = 'Record sine wave';
  }
});
<button>Make sine wave</button>
<audio controls></audio>

If you have any questions regarding the code above, or I haven’t explained it properly, let me know.
