feat(Call): Dynascale support for Plain-JS SDK (#914)
Introduces the `call.bindVideoElement` and `call.bindAudioElement` methods,
which apply dynascale handlers to the provided video and audio elements and
simplify SDK-specific implementations.

### Breaking Changes
- `call.viewportTracker` has been replaced with `call.dynascaleManager`
- In `call.updateSubscriptionsPartial`, the `video` and `screen` kinds are deprecated and will be removed soon.
  Please switch to `videoTrack` or `screenShareTrack`
- `<Audio />` component:
  - the `audioStream` prop is dropped; the component now requires a `participant` prop
  - `sinkId` no longer has to be provided, as our SDK handles this
- `<Video />` component:
  - the `kind` prop has been replaced with `trackType`. Accepted values:
    `videoTrack`, `screenShareTrack` and `none`
- `<ParticipantView />` component:
  - `videoMode` has been renamed to `trackType`. Accepted values:
    `videoTrack`, `screenShareTrack` and `none`
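
As an illustration, the kind-to-`trackType` rename can be captured in a small lookup table. This is a hypothetical migration helper for updating call sites, not part of the SDK:

```typescript
// Hypothetical helper: maps the deprecated `kind` values to the new
// `trackType` values accepted by <Video /> and <ParticipantView />.
const trackTypeForDeprecatedKind: Record<string, string> = {
  video: 'videoTrack',
  screen: 'screenShareTrack',
};

const migrate = (kind: string): string =>
  trackTypeForDeprecatedKind[kind] ?? 'none';

console.log(migrate('video')); // → videoTrack
console.log(migrate('screen')); // → screenShareTrack
```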

---------

Co-authored-by: Oliver Lazoroski <[email protected]>
Co-authored-by: Zita Szupera <[email protected]>
Co-authored-by: Khushal Agarwal <[email protected]>
5 people authored Aug 30, 2023
1 parent 809f2f7 commit d295fd3
Showing 60 changed files with 2,032 additions and 2,987 deletions.
3 changes: 2 additions & 1 deletion package.json
@@ -75,7 +75,8 @@
"nx": "16.0.1",
"prettier": "^2.8.4",
"typescript": "^4.9.5",
"vercel": "^32.1.0",
"vite": "^4.4.9"
},
"resolutions": {
"react-native-incall-manager@^4.0.0": "patch:react-native-incall-manager@npm%3A4.0.1#./.yarn/patches/react-native-incall-manager-npm-4.0.1-b8859eeb2a.patch",
@@ -20,11 +20,7 @@
const apiKey = 'your-api-key';
const token = 'authentication-token';
const user: User = { id: 'user-id' };

const client = new StreamVideoClient({ apiKey, token, user });

const call = client.call('default', 'call-id');

@@ -80,11 +76,7 @@
call.state.participants$.subscribe((participants) => {
Now let's see what happens in the `renderParticipant` method, because that's what does the heavy lifting:

```typescript
import { StreamVideoParticipant, Call } from '@stream-io/video-client';

// The quickstart uses fixed video dimensions for simplicity
const videoDimension = {
@@ -102,32 +94,20 @@
const renderVideo = (call: Call, participant: StreamVideoParticipant) => {
videoEl.id = `video-${participant.sessionId}`;
videoEl.width = videoDimension.width;
videoEl.height = videoDimension.height;
// bind the video element to the participant
// that way, our SDK will automatically render the video and
// make sure the element stays in sync with the participant's state
const unbind = call.bindVideoElement(
videoEl,
participant.sessionId,
'videoTrack',
);
}

return videoEl;
};

const renderAudio = (call: Call, participant: StreamVideoParticipant) => {
// We don't render audio for local participant
if (participant.isLocalParticipant) return;

@@ -138,11 +118,10 @@
if (!audioEl) {
audioEl = document.createElement('audio');
audioEl.id = `audio-${participant.sessionId}`;

// bind the audio element to the participant
// that way, our SDK will automatically play the audio and
// make sure the element stays in sync with the participant's state
const unbind = call.bindAudioElement(audioEl, participant.sessionId);
}

return audioEl;
@@ -153,7 +132,7 @@
export const renderParticipant = (
participant: StreamVideoParticipant,
) => {
return {
audioEl: renderAudio(call, participant),
videoEl: renderVideo(call, participant),
};
};
@@ -82,21 +82,22 @@

The `StreamVideoParticipant` object contains the following information:

| Name | Description |
| ------------------------- | --------------------------------------------------------------------------- |
| `user` | The user object for this participant. |
| `publishedTracks`         | The track types the participant is currently publishing.                     |
| `joinedAt` | The time the participant joined the call. |
| `connectionQuality` | The participant's connection quality. |
| `isSpeaking` | It's `true` if the participant is currently speaking. |
| `isDominantSpeaker` | It's `true` if the participant is the current dominant speaker in the call. |
| `audioLevel` | The audio level of the participant. |
| `audioStream` | The published audio `MediaStream`. |
| `videoStream` | The published video `MediaStream`. |
| `screenShareStream` | The published screen share `MediaStream`. |
| `isLocalParticipant` | It's `true` if the participant is the local participant. |
| `pin` | Holds pinning information. |
| `reaction` | The last reaction this user has sent to this call. |
| `viewportVisibilityState` | The viewport visibility state of the participant. |
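
For illustration, here is how a UI might derive a display label from a few of these fields. The `ParticipantLike` type below is a hand-rolled stub for demonstration, not the SDK's `StreamVideoParticipant` type:

```typescript
// Hypothetical stub mirroring a few StreamVideoParticipant fields.
type ParticipantLike = {
  user: { name: string };
  isSpeaking: boolean;
  isDominantSpeaker: boolean;
  isLocalParticipant: boolean;
};

// Builds a label such as "Alice (speaking)" from the participant state.
const displayLabel = (p: ParticipantLike): string => {
  const flags = [
    p.isLocalParticipant ? 'you' : null,
    p.isDominantSpeaker ? 'dominant speaker' : p.isSpeaking ? 'speaking' : null,
  ].filter(Boolean);
  return flags.length ? `${p.user.name} (${flags.join(', ')})` : p.user.name;
};

console.log(
  displayLabel({
    user: { name: 'Alice' },
    isSpeaking: true,
    isDominantSpeaker: false,
    isLocalParticipant: false,
  }),
); // → Alice (speaking)
```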

## Client state

@@ -94,16 +94,7 @@
this.camera.selectDirection(defaultDirection);

#### In call

Follow our [Playing Video and Audio guide](../../guides/playing-video-and-audio/).

#### Lobby preview

@@ -114,10 +105,15 @@
const call = client.call('default', '123');

await call.camera.enable();
const videoEl = document.createElement('video');
videoEl.playsInline = true;
videoEl.muted = true;
videoEl.autoplay = true;

if (videoEl.srcObject !== call.camera.state.mediaStream) {
videoEl.srcObject = call.camera.state.mediaStream || null;
if (videoEl.srcObject) {
videoEl.play();
}
}
```

@@ -176,16 +172,7 @@
call.microphone.state.selectedDevice$.subscribe(console.log); // Reactive value

#### In call

Follow our [Playing Video and Audio guide](../../guides/playing-video-and-audio/).

#### Lobby preview

@@ -0,0 +1,78 @@
---
title: Playing Video and Audio
description: Learn how to correctly play participants' video and audio.
---

In this guide, we'll learn how to render and play participants' video and audio using the primitives provided by the JS SDK.

## Playing Video

Our JS SDK exposes a low-level method that binds a video element to a participant's video track.
This method can be found in `call.bindVideoElement`. It takes three arguments:

- the video element to bind to
- the participant's `sessionId`
- the kind of video track to bind to (either `videoTrack` or `screenShareTrack` for screen sharing)

This method needs to be called only once, usually after the element is mounted in the DOM.

```js
let videoElement = document.getElementById('my-video-element');
if (!videoElement) {
  videoElement = document.createElement('video');
  videoElement.id = 'my-video-element';
  document.body.appendChild(videoElement);

  // bind the video element to the participant's video track
  // use the returned `unbind()` function to unbind the video element
  const unbind = call.bindVideoElement(
    videoElement,
    participant.sessionId,
    'videoTrack',
  );
}
```
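
The returned `unbind()` function should be kept around and called once the element is removed from the DOM. The sketch below illustrates this bind/unbind lifecycle against a stubbed `call` object; `makeStubCall` is an assumption for demonstration only, not the real SDK client:

```typescript
// Stub standing in for the SDK's bind/unbind contract.
const makeStubCall = () => {
  const bound = new Set<string>();
  return {
    bindVideoElement(_el: unknown, sessionId: string, _trackType: string) {
      bound.add(sessionId);
      // the returned function acts as `unbind`
      return () => bound.delete(sessionId);
    },
    isBound: (sessionId: string) => bound.has(sessionId),
  };
};

const call = makeStubCall();
const unbind = call.bindVideoElement({}, 'session-1', 'videoTrack');
console.log(call.isBound('session-1')); // → true

// ...later, when the element is removed from the DOM:
unbind();
console.log(call.isBound('session-1')); // → false
```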

### Playing Screen Sharing

Similarly to rendering video, a screen share track can be rendered like this:

```js
let screenElement = document.getElementById('my-screenshare-element');
if (!screenElement) {
  screenElement = document.createElement('video');
  screenElement.id = 'my-screenshare-element';
  document.body.appendChild(screenElement);

  // bind the video element to the participant's screen share track
  // use the returned `unbind()` function to unbind the video element
  const unbind = call.bindVideoElement(
    screenElement,
    participant.sessionId,
    'screenShareTrack', // note the 'screenShareTrack' argument
  );
}
```

## Playing Audio

Our JS SDK exposes a low-level method that binds an audio element to a participant's audio track.
This method can be found in `call.bindAudioElement`. It takes two arguments:

- the audio element to bind to
- the participant's `sessionId`

This method needs to be called only once, usually after the element is mounted in the DOM.

```js
let audioElement = document.getElementById('my-audio-element');
if (!audioElement) {
  audioElement = document.createElement('audio');
  audioElement.id = 'my-audio-element';
  document.body.appendChild(audioElement);

  // bind the audio element to the participant's audio track
  // use the returned `unbind()` function to unbind the audio element
  const unbind = call.bindAudioElement(audioElement, participant.sessionId);
}
```
@@ -0,0 +1,85 @@
---
title: Participant visibility tracking
description: Learn how to set up participant visibility tracking on the screen.
---

Depending on the UI layout of your call, some participants may not be visible on the screen at all times.
This is especially true for large calls with many participants. We can optimize the performance and bandwidth usage of our SDK
by only subscribing to video for participants that are visible on the screen or within a certain viewport.

To help you keep track of which participants are visible on the screen, our SDK provides a few helpers.
Once you set up visibility tracking for a participant, the client can automatically subscribe to and unsubscribe from that participant's video and screen share streams.

## Set up a visibility tracker

Any DOM element can be tracked for visibility. In a typical scenario, that would be the element that wraps the participant's "box".
You can register a visibility tracker with the SDK by calling the `call.trackElementVisibility` method:

```js
const myParticipantElement = document.getElementById('my-participant-element');

// track the visibility of the participant's element;
// use the returned function to stop tracking the element
const untrack = call.trackElementVisibility(
  myParticipantElement,
  participant.sessionId,
  'videoTrack', // or 'screenShareTrack' to track screen share visibility
);
```

## Set up a viewport

In our context, a _Viewport_ is a section or container on the screen that wraps the participants' video elements.
Typically, this is a scrollable container. You can register a viewport with the SDK by calling the `call.setViewport` method:

```js
const viewport = document.getElementById('my-participant-container');

// sets the viewport;
// use the returned function to unset the viewport
const unset = call.setViewport(viewport);
```
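
Conceptually, this kind of tracking reduces to checking whether an element's rectangle intersects the viewport's rectangle. Below is a simplified, framework-free sketch of the idea; it is an illustration of the principle, not the SDK's actual implementation:

```typescript
type Rect = { top: number; left: number; bottom: number; right: number };

// True when the element's rectangle overlaps the viewport rectangle at all.
const isVisibleIn = (el: Rect, viewport: Rect): boolean =>
  el.left < viewport.right &&
  el.right > viewport.left &&
  el.top < viewport.bottom &&
  el.bottom > viewport.top;

const viewport: Rect = { top: 0, left: 0, bottom: 600, right: 800 };

// A box inside the viewport:
console.log(
  isVisibleIn({ top: 100, left: 100, bottom: 300, right: 400 }, viewport),
); // → true

// A box scrolled below the viewport:
console.log(
  isVisibleIn({ top: 700, left: 0, bottom: 900, right: 400 }, viewport),
); // → false
```

In the browser, the SDK can rely on viewport observation APIs for this instead of manual math; the sketch only shows the geometric test being made.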

## Access the visible participants

Each participant's video and screen share tracks can be in one of three visibility states:

- `VISIBLE` - the track is visible on the screen
- `INVISIBLE` - the track is not visible on the screen
- `UNKNOWN` - the track's visibility is unknown (e.g. the participant is not tracked)

Visibility state is available in the participant's state:

```js
import { VisibilityState } from '@stream-io/video-client';

call.state.participants.forEach((participant) => {
const { viewportVisibilityState } = participant;

// The participant's video visibility in the viewport
switch (viewportVisibilityState.videoTrack) {
case VisibilityState.VISIBLE:
// The participant's video track is visible in the viewport
break;
case VisibilityState.INVISIBLE:
// The participant's video track is not visible in the viewport
break;
case VisibilityState.UNKNOWN:
// The participant's video track visibility in the viewport is unknown
break;
}

// The participant's screen share visibility in the viewport
switch (viewportVisibilityState.screenShareTrack) {
case VisibilityState.VISIBLE:
// The participant's screen share track is visible in the viewport
break;
case VisibilityState.INVISIBLE:
// The participant's screen share track is not visible in the viewport
break;
case VisibilityState.UNKNOWN:
// The participant's screen share track visibility in the viewport is unknown
break;
}
});
```
3 changes: 2 additions & 1 deletion packages/client/index.ts
@@ -18,9 +18,10 @@
export * from './src/StreamSfuClient';
export * from './src/devices';
export * from './src/store';
export * from './src/sorting';
export * from './src/helpers/DynascaleManager';
export * from './src/helpers/ViewportTracker';

export * from './src/helpers/sound-detector';
export * as Browsers from './src/helpers/browsers';

export * from './src/client-details';
export * from './src/logger';
13 changes: 7 additions & 6 deletions packages/client/package.json
@@ -46,20 +46,21 @@
"devDependencies": {
"@openapitools/openapi-generator-cli": "^2.6.0",
"@rollup/plugin-replace": "^5.0.2",
"@rollup/plugin-typescript": "^11.1.2",
"@types/jsonwebtoken": "^9.0.1",
"@types/rimraf": "^3.0.2",
"@types/sdp-transform": "^2.4.6",
"@types/ua-parser-js": "^0.7.36",
"@types/ws": "^8.5.4",
"@vitest/coverage-v8": "^0.34.2",
"dotenv": "^16.3.1",
"happy-dom": "^10.11.0",
"prettier": "^2.8.4",
"rimraf": "^3.0.2",
"rollup": "^3.28.1",
"typescript": "^4.9.5",
"vite": "^4.4.9",
"vitest": "^0.34.3",
"vitest-mock-extended": "^1.2.0"
}
}