In traditional live streaming it is commonplace to send per-frame picture timing information, typically using H.264 picture timing SEI messages (ITU-T Rec. H.264, Annex D.2.3).
This timing information is later used to synchronize out-of-band events such as ad insertion using SCTE-35 splice events (SCTE 35 2023r1, section 6.3). Per-frame picture timing is also widely used in video editing tools.
Implementation/spec-wise, we can use the abs-capture-time header extension, which is exposed as captureTimestamp on RTCRtpContributingSource in the WebRTC extensions.
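As a rough sketch of what consuming that value could look like: this assumes a browser that implements the WebRTC extension exposing captureTimestamp on contributing/synchronization sources and that abs-capture-time was negotiated in the SDP; the helper name is our own, not an existing API.

```javascript
// Illustrative helper: pick the capture timestamp of the most recent
// source entry that carries one. The entries mirror the dictionary shape
// returned by receiver.getSynchronizationSources() under the WebRTC
// extensions (timestamp + optional captureTimestamp); captureTimestamp
// is only present when the abs-capture-time header extension was received.
function latestCaptureTimestamp(sources) {
  let latest = null;
  for (const s of sources) {
    if (s.captureTimestamp !== undefined &&
        (latest === null || s.timestamp > latest.timestamp)) {
      latest = s;
    }
  }
  return latest ? latest.captureTimestamp : null;
}

// In a real page (hypothetical usage):
//   const [receiver] = pc.getReceivers();
//   const ts = latestCaptureTimestamp(receiver.getSynchronizationSources());
```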
Retrieving the picture timing information of the rendered frame is quite clumsy at the moment, as it requires interpolating values using the RTP timestamps of the frame, which are the only values exposed by the APIs that surface frames.
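The interpolation workaround described above can be sketched as follows. The anchor pairs (an RTP timestamp observed together with a known capture time, e.g. derived from RTCP sender reports) and all names here are illustrative assumptions, not an existing API:

```javascript
// Estimate a frame's capture time (ms) from its RTP timestamp by linear
// interpolation between two known (rtpTimestamp, captureTimeMs) anchors.
// This is the clumsy fallback when per-frame capture timestamps are not
// exposed directly by the frame API.
function interpolateCaptureTime(frameRtpTimestamp, anchorA, anchorB) {
  const rtpSpan = anchorB.rtpTimestamp - anchorA.rtpTimestamp;
  const timeSpan = anchorB.captureTimeMs - anchorA.captureTimeMs;
  return anchorA.captureTimeMs +
    ((frameRtpTimestamp - anchorA.rtpTimestamp) / rtpSpan) * timeSpan;
}
```

For a 90 kHz video clock, an RTP timestamp halfway between the two anchors maps to the midpoint of their capture times.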
An attempt to add the capture timestamp values is being made in different APIs: