MediaStreamAudioDestinationNode


The MediaStreamAudioDestinationNode interface represents an audio destination consisting of a WebRTC MediaStream with a single audio MediaStreamTrack, which can be used in a similar way to a MediaStream obtained from Navigator.getUserMedia.

It is an AudioNode that acts as an audio destination, created using the AudioContext.createMediaStreamDestination method.

Number of inputs: 1
Number of outputs: 0
Channel count: 2
Channel count mode: "explicit"
Channel count interpretation: "speakers"
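
For example, such a node can be created and connected like this (a minimal sketch, assuming an AudioContext and an oscillator as the source; the variable names are illustrative):

const audioCtx = new AudioContext();

// Create the destination node; anything connected to it ends up in dest.stream
const dest = audioCtx.createMediaStreamDestination();

// Route an oscillator into the destination instead of the speakers
const osc = audioCtx.createOscillator();
osc.connect(dest);
osc.start();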

Constructor

MediaStreamAudioDestinationNode.MediaStreamAudioDestinationNode()
Creates a new MediaStreamAudioDestinationNode instance.
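
Under the current Web Audio API specification the constructor takes the AudioContext the node belongs to, so its use might look like the following sketch (note the compatibility table below: at the time of writing the constructor form was not yet supported, so createMediaStreamDestination() is the safer choice):

const audioCtx = new AudioContext();

// Constructor form, equivalent to audioCtx.createMediaStreamDestination()
const dest = new MediaStreamAudioDestinationNode(audioCtx);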

Properties

Inherits properties from its parent, AudioNode.

MediaStreamAudioDestinationNode.stream
Is a MediaStream containing a single audio MediaStreamTrack with the same number of channels as the node itself. You can use this property to get a stream out of the audio graph and feed it into another construct, such as a MediaRecorder.
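
For instance, the stream can be handed straight to a MediaRecorder (a minimal sketch; the full example below shows the complete recording flow):

const audioCtx = new AudioContext();
const dest = audioCtx.createMediaStreamDestination();

// dest.stream is an ordinary MediaStream, so it can be recorded like any other
const recorder = new MediaRecorder(dest.stream);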

Methods

Inherits methods from its parent, AudioNode.

Example

In the following simple example, we create a MediaStreamAudioDestinationNode, an OscillatorNode and a MediaRecorder (at the time of writing, the example therefore only works in Firefox). The MediaRecorder is set up to record information from the MediaStreamAudioDestinationNode.

When the button is clicked, the oscillator starts and the MediaRecorder is started. When the button is clicked again, the oscillator and MediaRecorder both stop. Stopping the MediaRecorder causes the dataavailable event to fire, and the event data is pushed into the chunks array. After that, the stop event fires, a new Blob of type Opus is created from the data in the chunks array, and a new window (tab) is then opened that points to a URL created from the blob.

From here, you can play and save the opus file.

<!DOCTYPE html>
<html>
  <head>
    <title>createMediaStreamDestination() demo</title>
  </head>
  <body>
    <h1>createMediaStreamDestination() demo</h1>
    <p>Encoding a pure sine wave to an Opus file </p>
    <button>Make sine wave</button>
    <script>
     var b = document.querySelector("button");
     var clicked = false;
     var chunks = [];
     var ac = new AudioContext();
     var osc = ac.createOscillator();
     var dest = ac.createMediaStreamDestination();
     var mediaRecorder = new MediaRecorder(dest.stream);
     osc.connect(dest);
     // Toggle: first click starts the oscillator and recorder, second click stops them
     b.addEventListener("click", function(e) {
       if (!clicked) {
         mediaRecorder.start();
         osc.start(0);
         e.target.innerHTML = "Stop recording";
         clicked = true;
       } else {
         mediaRecorder.stop();
         osc.stop(0);
         e.target.disabled = true;
       }
     });
     mediaRecorder.ondataavailable = function(evt) {
       // push each chunk (blobs) in an array
       chunks.push(evt.data);
     };
     mediaRecorder.onstop = function(evt) {
       // Make blob out of our blobs, and open it.
       var blob = new Blob(chunks, { 'type' : 'audio/ogg; codecs=opus' });
       window.location.href = URL.createObjectURL(blob);
     };
    </script>
  </body>
</html>

Note: You can view this example live, or study the source code, on GitHub.

Specification

Specification: Web Audio API (the definition of 'MediaStreamAudioDestinationNode' in that specification)
Status: Working Draft

Browser compatibility

Desktop
Basic support: Chrome 10.0 (webkit), Firefox (Gecko) 25.0 (25.0), Internet Explorer no support, Opera 15.0 (webkit) / 22 (unprefixed), Safari (WebKit) 6.0 (webkit)
Constructor: no support in any desktop browser

Mobile
Basic support: Android ?, Firefox Mobile (Gecko) 26.0, IE Mobile ?, Opera Mobile ?, Safari Mobile ?, Chrome for Android 33.0
Constructor: no support in any mobile browser
