diff --git a/index.html b/index.html
index 7057205..7c84cc2 100644
--- a/index.html
+++ b/index.html
@@ -1214,6 +1214,179 @@

Examples

// Show to user.
const videoElement = document.querySelector("video");
videoElement.srcObject = stream;
+</script>
+

Exposing MediaStreamTrack voice activity detection support

+

Some platforms or User Agents may provide built-in support for voice activity detection. Web applications may want to know whether a user is speaking while the microphone is muted, so that an unmute notification can be displayed. For that reason, we extend {{MediaStreamTrack}} with the following properties.

+

MediaTrackSupportedConstraints Dictionary Extensions

+
+partial dictionary MediaTrackSupportedConstraints {
+  boolean voiceActivityDetection = true;
+};
+
+

Dictionary {{MediaTrackSupportedConstraints}} Members

+
+
voiceActivityDetection of type {{boolean}}, defaulting to true
+
See voiceActivityDetection for details.
+
+
+
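For example, a page could check whether the User Agent recognizes this constraint before relying on it. A minimal, non-normative sketch:

<script>
// Check whether the User Agent knows about the voiceActivityDetection constraint.
const supported = navigator.mediaDevices.getSupportedConstraints();
if (supported.voiceActivityDetection) {
  // The constraint can be used in getUserMedia() or applyConstraints().
}
</script>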

MediaTrackCapabilities Dictionary Extensions

+
+partial dictionary MediaTrackCapabilities {
+  sequence<boolean> voiceActivityDetection;
+};
+
+

Dictionary {{MediaTrackCapabilities}} Members

+
+
voiceActivityDetection of type sequence<{{boolean}}>
+
+

If the source does not support voice activity detection, a single false is reported. If the source supports voice activity detection, a list with both true and false is reported. See voiceActivityDetection for additional details.

+
+
+
+
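For example, a page could inspect a microphone track's capabilities to learn whether its source can perform voice activity detection. A minimal, non-normative sketch:

<script>
// Open a microphone and inspect its capabilities.
const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const [audioTrack] = stream.getAudioTracks();
const capabilities = audioTrack.getCapabilities();
// [false] if unsupported; [true, false] if the source supports it.
const canDetectVoiceActivity =
    (capabilities.voiceActivityDetection || []).includes(true);
</script>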

MediaTrackConstraintSet Dictionary Extensions

+
+partial dictionary MediaTrackConstraintSet {
+  ConstrainBoolean voiceActivityDetection;
+};
+
+

Dictionary {{MediaTrackConstraintSet}} Members

+
+
voiceActivityDetection of type {{ConstrainBoolean}}
+
See voiceActivityDetection for details.
+
+
+
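For example, voice activity detection could be required on an already-open audio track (audioTrack as obtained in the sketch above). A minimal, non-normative sketch:

<script>
try {
  // Require voice activity detection; the promise rejects if the source cannot provide it.
  await audioTrack.applyConstraints({ voiceActivityDetection: { exact: true } });
} catch (e) {
  // OverconstrainedError: the source does not support voice activity detection.
}
</script>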

MediaTrackSettings Dictionary Extensions

+
+partial dictionary MediaTrackSettings {
+  boolean voiceActivityDetection;
+};
+
+

Dictionary {{MediaTrackSettings}} Members

+
+
voiceActivityDetection of type {{boolean}}
+
See voiceActivityDetection for details.
+
+
+
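The value that was actually applied can then be read back from the track's settings. A minimal, non-normative sketch, reusing audioTrack from the earlier sketches:

<script>
// Read the current setting back from the track.
if (audioTrack.getSettings().voiceActivityDetection) {
  // Voice activity detection is active for this track.
}
</script>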

Constrainable Properties

+

The following constrainable properties are defined to apply only to audio {{MediaStreamTrack}} objects:

Property Name          | Values               | Notes
voiceActivityDetection | {{ConstrainBoolean}} | Voice activity detection allows web applications to be notified when voice activity starts.
+

MediaStreamTrack Interface Extensions

+
+partial interface MediaStreamTrack {
+  attribute EventHandler onvoiceactivitydetected;
+};
+

Let {{MediaStreamTrack/[[LastVoiceActivityDetectedTimestamp]]}} be an internal slot of the {{MediaStreamTrack}}, initialized to undefined.

+

The onvoiceactivitydetected attribute is an [=event handler IDL attribute=] for the `onvoiceactivitydetected` [=event handler=], whose [=event handler event type=] is voiceactivitydetected.

+
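For example, the event handler attribute could be assigned directly (audioTrack as in the Examples section below). A minimal, non-normative sketch:

<script>
// React to detected voice activity through the event handler attribute.
audioTrack.onvoiceactivitydetected = () => {
  console.log("voice activity detected at", performance.now());
};
</script>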

+

When the [=User Agent=] detects that voice activity has started in a track's underlying source, the [=User Agent=] MUST run the following steps:

+
1. If the {{voiceActivityDetection}} setting of track is set to false by
   the ApplyConstraints algorithm, abort these steps.
2. Let voiceActivityDetectionMinimalInterval be a [=User Agent=] defined
   value, which depends on the [=User Agent=]'s policy on privacy and
   power efficiency.
3. If track.{{MediaStreamTrack/[[LastVoiceActivityDetectedTimestamp]]}}
   is not undefined, and {{Performance.now()}} -
   track.{{MediaStreamTrack/[[LastVoiceActivityDetectedTimestamp]]}}
   is less than voiceActivityDetectionMinimalInterval, abort these steps.
4. [=Queue a task=] to perform the following steps:
   1. If track.{{MediaStreamTrack/readyState}} is "ended", abort these
      steps.
   2. [=Fire an event=] named {{voiceactivitydetected}} on track.
   3. Set track.{{MediaStreamTrack/[[LastVoiceActivityDetectedTimestamp]]}}
      to {{Performance.now()}}.
+
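The steps above are not author-visible code, but their effect can be illustrated with a non-normative JavaScript sketch; fireVoiceActivityDetected, lastDetectedTimestamp, and minimalInterval below are hypothetical names standing in for the internal slot and the User-Agent-chosen interval:

<script>
// Non-normative model of the dispatch steps; not an API exposed to pages.
const lastDetectedTimestamp = new WeakMap();  // models [[LastVoiceActivityDetectedTimestamp]]

function fireVoiceActivityDetected(track, minimalInterval) {
  // Step 1: honor the applied voiceActivityDetection setting.
  if (track.getSettings().voiceActivityDetection === false) return;
  // Steps 2-3: rate-limit using the timestamp of the last dispatched event.
  const last = lastDetectedTimestamp.get(track);
  if (last !== undefined && performance.now() - last < minimalInterval) return;
  // Step 4: queue a task that fires the event and records the timestamp.
  setTimeout(() => {
    if (track.readyState === "ended") return;
    track.dispatchEvent(new Event("voiceactivitydetected"));
    lastDetectedTimestamp.set(track, performance.now());
  });
}
</script>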

+
+

Examples

+
+<script>
+// Open microphone with voice activity detection enabled.
+const stream = await navigator.mediaDevices.getUserMedia({
+  audio: { voiceActivityDetection: true }
+});
+const [audioTrack] = stream.getAudioTracks();
+
+audioTrack.addEventListener("voiceactivitydetected", () => {
+  if (audioTrack.muted) {
+    // Show unmute notification.
+  }
+});
 </script>