{{APIRef("Web Audio API")}}
The **`AudioWorkletGlobalScope`** interface of the Web Audio API represents a global execution context for user-supplied code, which defines custom {{domxref("AudioWorkletProcessor")}}-derived classes.

Each {{domxref("BaseAudioContext")}} has a single {{domxref("AudioWorklet")}} available under the {{domxref("BaseAudioContext.audioWorklet", "audioWorklet")}} property, which runs its code in a single `AudioWorkletGlobalScope`.

As the global execution context is shared across the current `BaseAudioContext`, it's possible to define any other variables and perform any actions allowed in worklets, in addition to defining `AudioWorkletProcessor`-derived classes.
{{InheritanceDiagram}}
## Instance properties
- {{domxref("AudioWorkletGlobalScope.currentFrame", "currentFrame")}} {{ReadOnlyInline}}
  - : Returns an integer that represents the ever-increasing current sample-frame of the audio block being processed. It is incremented by 128 (the size of a render quantum) after the processing of each audio block.
- {{domxref("AudioWorkletGlobalScope.currentTime", "currentTime")}} {{ReadOnlyInline}}
  - : Returns a double that represents the ever-increasing context time of the audio block being processed. It is equal to the {{domxref("BaseAudioContext.currentTime", "currentTime")}} property of the {{domxref("BaseAudioContext")}} the worklet belongs to.
- {{domxref("AudioWorkletGlobalScope.sampleRate", "sampleRate")}} {{ReadOnlyInline}}
  - : Returns a float that represents the sample rate of the associated {{domxref("BaseAudioContext")}}.
## Instance methods
- {{domxref("AudioWorkletGlobalScope.registerProcessor", "registerProcessor()")}}
  - : Registers a class derived from the {{domxref("AudioWorkletProcessor")}} interface. The class can then be used by creating an {{domxref("AudioWorkletNode")}}, providing its registered name.
## Examples

In this example we log all of the global properties to the console in the constructor of a custom {{domxref("AudioWorkletProcessor")}}.
First we need to define the processor and register it. Note that this should be done in a separate file.
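A minimal sketch of such a processor follows; the file name `test-processor.js` and the registered name `test-processor` are placeholders chosen for this example:

```js
// test-processor.js
class TestProcessor extends AudioWorkletProcessor {
  constructor() {
    super();
    // These globals are properties of the AudioWorkletGlobalScope
    // in which this code runs.
    console.log(currentFrame);
    console.log(currentTime);
    console.log(sampleRate);
  }

  // The process() method is required; returning true keeps the
  // processor alive even when it produces no output.
  process(inputs, outputs, parameters) {
    return true;
  }
}

// Make the processor available under the name "test-processor".
registerProcessor("test-processor", TestProcessor);
```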
Next, in our main script file we'll load the processor, create an instance of `AudioWorkletNode`, passing it the name of the processor, and connect the node to an audio graph. We should see the output of the `console.log()` calls in the console:
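A sketch of that main script, assuming the processor was registered as `test-processor` in a file named `test-processor.js`:

```js
// main.js
const audioContext = new AudioContext();

// Load the module that defines and registers the processor.
await audioContext.audioWorklet.addModule("test-processor.js");

// Create a node backed by the registered processor and connect
// it to the audio graph.
const testNode = new AudioWorkletNode(audioContext, "test-processor");
testNode.connect(audioContext.destination);
```

Note that `AudioContext` construction and playback may be gated on a user gesture in some browsers, so in a real page this code would typically run from a click handler.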
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}