Ffmpeg latency and gstreamer pipeline "green screen" #81
With ffmpeg, I wonder if they are doing any sort of buffering. I suspect the latency comes from the gap between when FFmpeg starts consuming the live source and when livekit-cli is launched to consume the socket. It's also possible there are different ffmpeg params at play to encode at low/no latency. With GStreamer, I'm surprised to hear you are seeing a green screen; that's usually indicative of something else happening in the source. What if you use the pipeline to dump into a .h264 file? Would that contain the expected frames?
Yes, for ffmpeg I haven't investigated too much yet, but I think it's the encoding parameters indeed. For GStreamer, when I use the same pipeline but save to an mp4 file like
I did some other tests, played with x264 parameters, and changed the video source, but I still have the issue. I made a screen capture of two GStreamer pipelines; the only change is the video dimensions.
I've been working on implementing pion's gstreamer-send example in livekit-cli, and it works as expected.

import (
    "flag"
    "fmt"

    lksdk "github.com/livekit/server-sdk-go"
    "github.com/pion/webrtc/v3"

    // Assumption: a local copy of pion's gstreamer-src helper, since the
    // example's internal package cannot be imported directly.
    gst "github.com/pion/example-webrtc-applications/v3/internal/gstreamer-src"
)

func publishPipeline(room *lksdk.Room) error {
    audioSrc := flag.String("audio-src", "audiotestsrc", "GStreamer audio src")
    videoSrc := flag.String("video-src", "videotestsrc pattern=ball", "GStreamer video src")
    flag.Parse() // parse flags so -audio-src / -video-src overrides take effect

    // Local tracks that receive encoded samples from the GStreamer pipelines.
    audioTrack, err := webrtc.NewTrackLocalStaticSample(webrtc.RTPCodecCapability{MimeType: "audio/opus"}, "audio", "pion1")
    if err != nil {
        return err
    }
    videoTrack, err := webrtc.NewTrackLocalStaticSample(webrtc.RTPCodecCapability{MimeType: "video/h264"}, "video", "pion2")
    if err != nil {
        return err
    }

    pub, err := room.LocalParticipant.PublishTrack(audioTrack, &lksdk.TrackPublicationOptions{})
    if err != nil {
        return err
    }
    pub2, err := room.LocalParticipant.PublishTrack(videoTrack, &lksdk.TrackPublicationOptions{})
    if err != nil {
        return err
    }
    fmt.Println("pub", pub.SID())
    fmt.Println("pub2", pub2.SID())

    // Start the GStreamer pipelines; they push encoded samples into the tracks.
    gst.CreatePipeline("opus", []*webrtc.TrackLocalStaticSample{audioTrack}, *audioSrc).Start()
    gst.CreatePipeline("h264", []*webrtc.TrackLocalStaticSample{videoTrack}, *videoSrc).Start()
    return nil
}
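For context, here is a minimal usage sketch for the snippet above. connectToRoom is a hypothetical placeholder for however livekit-cli already obtains its *lksdk.Room; it is not a real function in the SDK or CLI.

func main() {
    // connectToRoom is hypothetical: substitute the CLI's existing connection code.
    room, err := connectToRoom()
    if err != nil {
        panic(err)
    }
    if err := publishPipeline(room); err != nil {
        panic(err)
    }
    select {} // block forever so the GStreamer pipelines keep publishing
}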
Another way to send audio/video from a third-party encoder like ffmpeg or GStreamer is RTP. I have implemented this example https://github.com/pion/webrtc/tree/master/examples/rtp-to-webrtc in livekit-cli; there are fewer dependencies, so it may be a better way. I will work on a PR before simulcast (#63).
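A rough sketch of that RTP approach follows, under a few assumptions: the encoder sends H264 RTP to UDP port 5004 (e.g. GStreamer's rtph264pay into udpsink, or ffmpeg's rtp muxer), and the port, track names, and room wiring are illustrative rather than livekit-cli's actual code.

import (
    "net"

    lksdk "github.com/livekit/server-sdk-go"
    "github.com/pion/webrtc/v3"
)

// publishRTP forwards raw RTP arriving on a UDP socket into a LiveKit room.
func publishRTP(room *lksdk.Room) error {
    // TrackLocalStaticRTP accepts already-packetized RTP written to it.
    videoTrack, err := webrtc.NewTrackLocalStaticRTP(
        webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeH264}, "video", "rtp-video")
    if err != nil {
        return err
    }
    if _, err = room.LocalParticipant.PublishTrack(videoTrack, &lksdk.TrackPublicationOptions{}); err != nil {
        return err
    }

    listener, err := net.ListenUDP("udp", &net.UDPAddr{IP: net.ParseIP("127.0.0.1"), Port: 5004})
    if err != nil {
        return err
    }
    go func() {
        defer listener.Close()
        buf := make([]byte, 1600) // comfortably larger than the expected MTU
        for {
            n, _, err := listener.ReadFrom(buf)
            if err != nil {
                return
            }
            // Write re-stamps SSRC/payload type before fanning out to subscribers.
            if _, err := videoTrack.Write(buf[:n]); err != nil {
                return
            }
        }
    }()
    return nil
}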
@nums I just gave your example a try and can reproduce the green screen. The issue seems to be that it's not sending buffers fast enough. If I produce a 640x360 stream with
Then I do see the ball attempting to move, but the motion isn't smooth at all. It's good to hear that the approach of embedding GStreamer has worked well for you. I'm a bit wary about pulling GStreamer into livekit-cli, though. I think a better way to approach it might be to turn your snippet above into an example in
@nums I am interested in this PR if you are able to do it. |
Hello @afgarcia86, I didn't release the PR because I didn't succeed in handling PLI requests properly between GStreamer and LiveKit. I'll let you know if I make progress.
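For anyone picking this up: at the Pion level a PLI arrives as an RTCP packet readable from the sender behind the published track. Below is a rough sketch of detecting it, with loud assumptions: rtpSender is the *webrtc.RTPSender for the track (server-sdk-go may not expose it directly for published tracks), and requestKeyframe stands in for whatever makes the GStreamer pipeline emit an IDR frame (e.g. a force-key-unit event).

import (
    "github.com/pion/rtcp"
    "github.com/pion/webrtc/v3"
)

// watchForPLI reads RTCP from a sender and triggers a keyframe on PLI/FIR.
func watchForPLI(rtpSender *webrtc.RTPSender, requestKeyframe func()) {
    buf := make([]byte, 1500)
    for {
        n, _, err := rtpSender.Read(buf)
        if err != nil {
            return
        }
        pkts, err := rtcp.Unmarshal(buf[:n])
        if err != nil {
            continue
        }
        for _, pkt := range pkts {
            switch pkt.(type) {
            case *rtcp.PictureLossIndication, *rtcp.FullIntraRequest:
                // The encoder (x264enc in the pipelines above) should produce an IDR here.
                requestKeyframe()
            }
        }
    }
}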
When I enable tune=zerolatency in my GStreamer pipeline, the media server only shows a black screen. If I don't enable it, I have about a second of latency, which I do not want. Is there any way to make zerolatency work with LiveKit?
Closing due to being stale. We recommend using GStreamer Publisher if folks are looking to publish custom GStreamer pipelines: https://github.com/livekit/gstreamer-publisher/
I'm trying to use ffmpeg or gstreamer to send a screen capture stream to a livekit room.
I have some issues / questions regarding that:
I'm using TCP or unix socket publishing with FFmpeg: the stream plays well in my LiveKit room, but I have 5 seconds of latency when I watch the video.
Is that normal, or do you get lower latency on your side?
When I test TCP publishing with GStreamer (I can't do it with the unix socket), I get some frames, but most of the frames are green.
Maybe my GStreamer pipeline was not good, so I did some research and stumbled upon this:
https://github.com/pion/example-webrtc-applications/tree/master/gstreamer-send
The sample works great with VP8 and H264. I made a fork to test whether H264 was OK (be careful, you must install some additional GStreamer plugins to be able to test H264), and I get typical WebRTC latency (less than 1 second).
I logged the video pipeline (not far from mine) and used the same pipeline with livekit-cli over TCP:
gst-launch-1.0 -v ximagesrc remote=1 use-damage=0 ! video/x-raw,framerate=30/1 ! videoconvert ! queue ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency key-int-max=20 ! video/x-h264,stream-format=byte-stream ! tcpserversink port=16400 host=127.0.0.1
The same issue occurs.
I saw that there is a discussion here (I hesitated whether to post there):
livekit/server-sdk-go#20
and I didn't see the pion gstreamer-send implementation used as an example there. I am not yet very familiar with Go or livekit/pion, but can we work on an implementation close to pion's gstreamer-send?