AudioContext

The AudioContext represents a set of AudioNode objects linked together, along with the audio parameters (buffer size, number of channels, sample rate).

An audio context controls both the creation of the nodes it contains and the execution of the audio processing or decoding. You need to create an AudioContext before doing anything else, as everything happens inside a context.

var dsp = Audio.getContext();
var source = dsp.createNode("custom-source", 0, 2); // JavaScript-generated source, 2 outputs
var target = dsp.createNode("target", 2, 0);        // audio output (speakers), 2 inputs

dsp.connect(source.output[0], target.input[0]);
dsp.connect(source.output[1], target.input[1]);

// Fill every channel of each buffer with a half sine wave.
source.assignProcessor(function(frames, scope) {
    for (var i = 0; i < frames.data.length; i++) { // each channel
        for (var j = 0; j < frames.size; j++) {    // each sample
            var t = j / frames.size;
            frames.data[i][j] = Math.sin(Math.PI * t);
        }
    }
});
source.play();

connect(output, input)

Connect an output channel to an input channel.

Arguments:
output (AudioNodeLink)

Output Channel

input (AudioNodeLink)

Input Channel

createNode(type, inputChannels, outputChannels)

Create a new AudioNode of the specified type.

The following nodes are currently supported:

|| Type || Description || Input || Output ||
|| source || Play a file or a stream || 0 || 1 or 2 ||
|| custom-source || Generate sound with JavaScript || 0 || 1 or 2 ||
|| custom || Process sound in JavaScript || up to 32 || up to 32 ||
|| reverb || Simulate sound reverberation || up to 32 || up to 32 ||
|| delay || Delay a sound || up to 32 || up to 32 ||
|| gain || Lower or raise the amplitude of the sound || up to 32 || up to 32 ||
|| target || An audio node that represents the audio output (computer speakers) || 1 or 2 || 0 ||

Arguments:
type (String)

Type of node ('source' | 'target' | 'custom-source' | 'custom' | 'reverb' | 'delay' | 'gain' | 'stereo-enhancer')

inputChannels (Integer)

The number of input channels

outputChannels (Integer)

The number of output channels

Returns: AudioNode

An AudioNode of the specified type
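
For example, a stereo 'source' node can be routed through a 'gain' node before reaching the speakers. The following is a minimal sketch; configuring the actual gain amount is node-specific and not shown here.

var dsp = Audio.getContext();

// A file/stream source with 2 outputs, a gain node with 2 inputs and
// 2 outputs, and the speaker target with 2 inputs.
var source = dsp.createNode("source", 0, 2);
var gain = dsp.createNode("gain", 2, 2);
var target = dsp.createNode("target", 2, 0);

// Route the source through the gain node, then to the speakers.
dsp.connect(source.output[0], gain.input[0]);
dsp.connect(source.output[1], gain.input[1]);
dsp.connect(gain.output[0], target.input[0]);
dsp.connect(gain.output[1], target.input[1]);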

disconnect(output, input)

Disconnects two channels.

One channel needs to be an input channel and the other an output channel. The order of the parameters does not matter.

Arguments:
output (AudioNodeLink)

Channel 1

input (AudioNodeLink)

Channel 2
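
For example, to tear down the routing created in the introduction (a minimal sketch reusing the source and target nodes from the first example):

// Either argument order works, as long as one channel is an output
// and the other an input.
dsp.disconnect(source.output[0], target.input[0]);
dsp.disconnect(target.input[1], source.output[1]);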

run(callback)

Run a function inside the audio thread.

Arguments:
callback (Function)

Function to execute on the audio thread.
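
A minimal sketch; whether console.log is usable from the audio thread is an assumption, the point being that the callback body executes on the audio thread rather than the main thread.

var dsp = Audio.getContext();

dsp.run(function() {
    // This code runs on the audio thread.
    console.log("Hello from the audio thread");
});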

pFFT(xArray, yArray, length, dir) (static, slow method)

Perform a Fast Fourier Transform (FFT).

Arguments:
xArray (Float[])

An array of length points

yArray (Float[])

An array of length points

length (Integer)

Length of the transformed axis

dir (Integer)

Direction (-1 for reverse or +1 for forward)
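
The sketch below assumes pFFT is exposed as a static method on AudioContext, that xArray holds the real part and yArray the imaginary part, and that both arrays are transformed in place; adjust it if these assumptions do not hold in your build.

var length = 512;
var x = [];
var y = [];

// One cycle of a sine wave as the real part, zeroed imaginary part.
for (var i = 0; i < length; i++) {
    x[i] = Math.sin(2 * Math.PI * i / length);
    y[i] = 0;
}

AudioContext.pFFT(x, y, length, 1);  // forward transform
AudioContext.pFFT(x, y, length, -1); // reverse transform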

bufferSize (readonly)

  • Type: Integer

Get the number of samples per buffer for the AudioContext.

channels (readonly)

  • Type: Integer

Get the number of channels for the AudioContext.

sampleRate (readonly)

  • Type: Integer

Get the sample rate (in Hz) for the AudioContext.
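
These read-only properties can be inspected directly on the context:

var dsp = Audio.getContext();
console.log("Buffer size: " + dsp.bufferSize + " samples");
console.log("Channels: " + dsp.channels);
console.log("Sample rate: " + dsp.sampleRate + " Hz");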

volume

  • Type: Number

Set or get the volume of the AudioContext.

var dsp = Audio.getContext();
console.log("Volume is " + dsp.volume);
dsp.volume = 0.5;
console.log("Volume is now " + dsp.volume);
