Setting a custom capture framerate #11
Comments
Sorry, no idea. If someone does figure this out, pull requests are welcome.
…On Wed, Sep 6, 2017 at 3:17 PM, Boscop wrote:

I bought a webcam that can do 30fps Full HD. I tested it in OBS Studio; at first I got a very low capture framerate, about 5 fps. But then I read this post (https://obsproject.com/forum/threads/getting-the-most-out-of-your-webcam.1036/) and it worked: by setting the "Resolution/FPS Type" to Custom and setting Resolution to 1920x1080 (screenshot: https://i.imgur.com/9JECr6q.png), I got the 30 FPS capture framerate.

Then I tried to use the webcam in my OpenGL application, capturing it with Escapi and rendering it to the screen. But in my application I ALSO only get that low capture framerate with Escapi :( (Btw, I get high FPS with a friend's webcam, so it's not my application's fault.)

I talked on IRC with the developer of OBS about what OBS does when you set a custom capture FPS: it uses DirectShow functions to change the pin type. He pointed me to this:
https://github.com/jp9000/libdshowcapture/blob/master/source/device.cpp#L304

Could the same be done with Windows Media Foundation?

It seems that some cameras (like mine) use a lower capture framerate by default than what they could, so it would be useful to be able to set a custom capture framerate with Escapi, too :)
|
I can help a bit since I am working on the same subject. The first thing you need to watch for is to find the supported "capture modes". What I call a capture mode is a combination of encoding (like RGBA32, MJPG, NV12, etc.), width, height and framerate. For that, I recommend using a tool like GraphStudioNext. Then, you have to note a few things:

1. Escapi only handles a few encodings, so the mode you want has to be in (or added to) its list of supported formats.
2. For any encoding you add, Escapi needs a conversion function to its RGBA output.
3. You have to make sure Escapi actually selects the capture mode you want.
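For reference, the same information can be pulled programmatically. A minimal sketch using the Media Foundation source reader (this assumes MFStartup has already been called and that reader wraps the camera; it is not Escapi's actual code):

```cpp
// Sketch only: enumerate the native capture modes a camera exposes through
// Media Foundation. Link with mfplat.lib, mfreadwrite.lib and mfuuid.lib.
#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>
#include <cstdio>

void listCaptureModes(IMFSourceReader *reader)
{
    IMFMediaType *type = nullptr;
    for (DWORD i = 0;
         SUCCEEDED(reader->GetNativeMediaType(
             (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, i, &type));
         ++i)
    {
        GUID subtype = GUID_NULL;
        UINT32 width = 0, height = 0, fpsNum = 0, fpsDen = 1;
        type->GetGUID(MF_MT_SUBTYPE, &subtype);
        MFGetAttributeSize(type, MF_MT_FRAME_SIZE, &width, &height);
        MFGetAttributeRatio(type, MF_MT_FRAME_RATE, &fpsNum, &fpsDen);
        // For video subtypes the FOURCC (YUY2, MJPG, NV12, ...) lives in
        // the first DWORD of the GUID.
        printf("mode %lu: %.4s %ux%u @ %u/%u fps\n",
               (unsigned long)i, (const char *)&subtype.Data1,
               width, height, fpsNum, fpsDen);
        type->Release();
        type = nullptr;
    }
}
```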
|
I'd really appreciate your help!
Btw, I've been using my own Rust bindings, which I started before Escapi had official Rust bindings. |
|
I was quite busy lately. Now I tried that, and when I open the properties, I get this:

So that means YUY2 is the first / default capture mode, and Escapi chooses the default / first? Thanks so much!

Btw, any idea why my camera appears twice in Filters?

Btw, IMO Escapi should convert from the camera format to a consistent output format that can be easily scaled in a shader, but it shouldn't scale on the CPU. If you want, we can work on a fork together that does this. It could also have the raw mode that you want, but I think for most use cases the unscaled RGB24 mode is the most appropriate. |
It seems that your computer and/or camera is not able to stream YUY2 @ 1920x1080 at 30 FPS. This is not surprising, to be honest. Note that if you lowered the resolution you might have been able to choose a higher framerate in YUY2. I think this is a bandwidth problem: YUY2 is probably less space-efficient than MJPG, so it consumes more bandwidth, and you have probably reached the limit at 5 fps (for 1920x1080).

If your goal is to display and/or process the camera stream at 1920x1080, you will need to "tweak" Escapi. My fork is kind of ugly (and I tweaked some more locally; I might push those changes later) and not there yet, but I can give you some pointers. The first thing to do is to implement the conversion for the MJPG encoding (you might want to double-check whether there is another format that can stream your camera at 1920x1080@30fps, because MJPG is not easy to support). To do that, you should add the MJPG format to the list of supported formats in Escapi (like here). But then you have to provide a conversion function. The conversion function should take the MJPG buffer of each frame and convert it to RGBA. Unless you plan to write a JPEG decoder yourself (which is quite a hard task), you might want to use a library. This one should work, and if you want to understand the reason for the fork, look at this issue. Maybe libjpeg would work too, but if you want to find out you will have to test it yourself. :)

At this point, you just need to make sure that Escapi selects the correct capture mode, like I explained before (point 3 of my previous message). For your case, you should select the capture mode with MJPG encoding, a resolution of 1920x1080 and 30 frames per second. For testing purposes you might want to hardcode the capture mode selection; see the sketch below.

This is currently the part I want to improve in Escapi: I want to return each possible capture mode and be able to select one using some kind of id. It is a bit complicated since I want to do it from another language (C# in my case).

My camera shows up twice in GraphStudioNext as well and I have no idea why, but it does not seem important.

EDIT: If you manage to do that part cleanly, you might want to submit a pull request, because I would welcome the addition.
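A minimal sketch of that hardcoded selection, under the same assumptions as the enumeration sketch above (MFVideoFormat_MJPG is Media Foundation's MJPG subtype; this is not Escapi's actual code):

```cpp
// Sketch only: find the native media type matching MJPG 1920x1080 @ 30 fps
// and make it the reader's current type, which switches the camera to it.
#include <mfapi.h>
#include <mfreadwrite.h>

bool selectMjpg1080p30(IMFSourceReader *reader)
{
    IMFMediaType *type = nullptr;
    for (DWORD i = 0;
         SUCCEEDED(reader->GetNativeMediaType(
             (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, i, &type));
         ++i)
    {
        GUID subtype = GUID_NULL;
        UINT32 w = 0, h = 0, num = 0, den = 1;
        type->GetGUID(MF_MT_SUBTYPE, &subtype);
        MFGetAttributeSize(type, MF_MT_FRAME_SIZE, &w, &h);
        MFGetAttributeRatio(type, MF_MT_FRAME_RATE, &num, &den);
        if (subtype == MFVideoFormat_MJPG && w == 1920 && h == 1080 &&
            den != 0 && num / den == 30)
        {
            HRESULT hr = reader->SetCurrentMediaType(
                (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, type);
            type->Release();
            return SUCCEEDED(hr);
        }
        type->Release();
        type = nullptr;
    }
    return false; // no matching capture mode found
}
```
|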
I tried with your fork at this commit, but when I give it a desired framerate of 30 or 20, I only get a completely blue cam buffer from Escapi. Only when I give 5 for the desired framerate does it work like before. |
I don't think you should use my fork (at least in this state), especially since it does not support MJPG. I don't know what happens for sure, but I think Escapi does not find a compatible capture mode at 1920x1080@30 fps. |
About your remarks on the output mode, I am not sure I agree. Ideally I would like things to work this way: by default, Escapi picks a reasonable capture mode on its own, but you can also enumerate the available capture modes and select one explicitly (see the sketch below).

This way, the API is easy to use and the entry barrier is quite low (for the case where the defaults suit you), but in case you want control you still have it without tweaking the library internals. About the fork, if you want something that only works in your case, it might not fit this library's goals. But I don't think there is a need for it (except maybe temporarily, for testing stuff). @jarikomppa seems to welcome pull requests, and I think if you come up with something that works well and is simple to use, he will accept the PRs. Anyway, if you need help I will try my best (in the little time I have).
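A hypothetical sketch of that API shape (every name here is made up to illustrate the idea, except SimpleCapParams, which is meant to be Escapi's existing parameter struct):

```cpp
// Hypothetical API sketch; none of these functions exist in Escapi today.
struct CaptureMode
{
    unsigned int mId;                 // opaque id used to select this mode
    unsigned int mFourcc;             // e.g. 'YUY2', 'MJPG'
    unsigned int mWidth, mHeight;
    unsigned int mFpsNumerator, mFpsDenominator;
};

// Fills aModes (up to aMaxCount entries); returns the number of modes found.
int getCaptureModes(unsigned int aDevice, CaptureMode *aModes, int aMaxCount);

// Like initCapture, but forces a specific capture mode by id instead of
// letting the library pick one from the requested width/height.
int initCaptureWithMode(unsigned int aDevice, unsigned int aModeId,
                        struct SimpleCapParams *aParams);
```
|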
Ah yes, getting a list of supported capture modes from Escapi would be fancy. So it would work if I just get the MJPG buffer out and then use the jpeg-decoder crate to decode every frame on the client side? Is this what you are planning to do with your fork already? Btw, in your TransformImage_MJPG you can do the same as in TransformImage_RGB32. If I return the raw MJPG buffer, can it be larger than the decoded frame would be in RGB32 (the size of the user-provided buffer)? |
Yes, I think it should work if you add MJPG as a supported format and then send the buffer through the jpeg-decoder crate. This is indeed what I did with my fork (among other things, like getting the available capture modes). You just need to be cautious, as the buffer length will probably vary with each frame because of the compression. I think it will always be smaller than the one allocated by Escapi (which should be 1920x1080x4 bytes), so you are pretty safe regarding access violations. Yet, I am not sure the jpeg-decoder will work properly if you pass it trailing (garbage) bytes, so just give it the exact number of bytes taken by the JPG frame; a sketch of that is below.

The code of my "fork" is indeed neither clean nor optimal; that is why I tried to explain the overall picture before sending you pieces of code :). memcpy makes sense for copying the MJPG buffer (and I think I use it in my more up-to-date local version), but MFCopyImage is at best confusing: it gives the illusion of dealing with pixels while you are dealing with compressed data.
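A minimal sketch of getting that exact byte count from a Media Foundation sample (assuming an IMFSample like the one the source reader hands back; not Escapi's actual code):

```cpp
// Sketch only: copy exactly the bytes the MJPG frame occupies.
// Lock reports the valid data size, which varies per frame.
#include <mfobjects.h>
#include <cstring>

// Copies the frame into aDest (assumed large enough, e.g. the
// width*height*4 bytes Escapi allocates); returns the byte count,
// or 0 on failure.
DWORD copyMjpgFrame(IMFSample *aSample, BYTE *aDest)
{
    IMFMediaBuffer *buffer = nullptr;
    if (FAILED(aSample->ConvertToContiguousBuffer(&buffer)))
        return 0;

    BYTE *src = nullptr;
    DWORD maxLen = 0, curLen = 0, copied = 0;
    if (SUCCEEDED(buffer->Lock(&src, &maxLen, &curLen)))
    {
        memcpy(aDest, src, curLen); // curLen is the actual MJPG byte count
        copied = curLen;
        buffer->Unlock();
    }
    buffer->Release();
    return copied;
}
```
|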
Ah, I see. Yeah, it would be cool to get MJPG support working. Not sure if the MJPG frame has header info that tells the decoder how long the compressed frame data is. If it has a header, we can just always copy the whole buffer.
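For what it's worth, JPEG does not store a total length in its header; decoders read until the EOI marker (0xFF 0xD9). A crude but workable sketch of finding the real end:

```cpp
// Sketch only: find the length of a JPG/MJPG frame by locating the EOI
// marker (0xFF 0xD9). Inside entropy-coded data, 0xFF is byte-stuffed
// (followed by 0x00) or part of a restart marker, so FF D9 only appears
// at the actual end of the stream.
#include <cstddef>

size_t mjpgFrameLength(const unsigned char *aBuf, size_t aMaxLen)
{
    for (size_t i = 0; i + 1 < aMaxLen; ++i)
        if (aBuf[i] == 0xFF && aBuf[i + 1] == 0xD9)
            return i + 2;  // include the EOI marker itself
    return aMaxLen;        // no EOI found; fall back to the whole buffer
}
```
|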
You might want to look at that line. It seems that I found a way to find out the actual MJPG buffer length. Sadly, I don't remember it anymore, and my code being poorly documented, it is not very helpful. |
I just tried with this function (same as TransformImage_RGB32):

```cpp
void TransformImage_MJPG(
    BYTE* aDest,
    LONG aDestStride,
    const BYTE* aSrc,
    LONG aSrcStride,
    DWORD aWidthInPixels,
    DWORD aHeightInPixels
    )
{
    MFCopyImage(aDest, aDestStride, aSrc, aSrcStride, aWidthInPixels * 4, aHeightInPixels);
}
```

And I get an access violation when it calls MFCopyImage. |
No idea, but like I said, MFCopyImage does not make sense for MJPG buffers. You should try a plain memcpy instead. But then you will have "garbage" bytes after the end of the actual JPG buffer; I recommend that you look at my previous message and get the correct length of the buffer.
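A sketch of what that could look like (aSrcLength is a hypothetical extra parameter, added so the caller can pass the sample's actual size, e.g. from IMFMediaBuffer's current length):

```cpp
// Sketch only: MJPG frames are compressed, so there is no row stride to
// honor; just move the raw bytes. aSrcLength is hypothetical and should
// carry the sample's actual byte count.
#include <cstring>

void TransformImage_MJPG(
    BYTE* aDest,
    const BYTE* aSrc,
    DWORD aSrcLength)
{
    memcpy(aDest, aSrc, aSrcLength);
}
```
|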
Thanks, I'm getting the MJPG data now; I can see it as noise in my OpenGL window.
Caused here: How much is MJPG different from JPG? |
You are almost there; noise is indeed the expected display of the MJPG buffer if you treat it as an RGBA texture. MJPG is not very different from JPG. The only difference I noted is the lack of Huffman tables in MJPG (but this is not even standardized). MJPG is so similar to JPG that if you dump the bytes of a frame into a file with the .jpg extension, many programs should be able to display it properly (including GIMP, Chrome and Firefox). I suggest that you dump the first few bytes of the buffer and post them here. There should be a magic number at the beginning identifying the buffer as JPG-encoded, and the first few bytes generally have a pretty noticeable pattern. About what is really happening, my guess is that there is an offset somewhere which screws up the parser.
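Dumping a frame to disk can be as simple as this sketch (the name is made up):

```cpp
// Sketch only: write one captured frame to disk so an image viewer (GIMP,
// a browser, ...) can sanity-check that the buffer really is a JPG stream.
#include <cstdio>
#include <cstddef>

void dumpFrame(const unsigned char *aData, size_t aLen)
{
    FILE *f = fopen("frame.jpg", "wb");
    if (f)
    {
        fwrite(aData, 1, aLen, f);
        fclose(f);
    }
}
```
|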
You have made great progress. I did not run into those issues, and I am mostly clueless about the artifact, the fact that the whole image is flipped, or the crash. My advice would be to make sure that there is no threading issue (i.e. the decoder is not trying to decode a frame while the next one is being written into the same buffer at the same time). Maybe you could upload your code so I can take a look at it in case I see something obvious. |
Isn't Escapi running on the same thread as my decoder? |
I think it is, but that is the only thing that comes to my mind. EDIT: The jpeg crate has been updated to support decoding of MJPG frames, so I recommend using it instead of my own fork. EDIT2: Can you post the code for the "JPG buffer copy" you used inside the C/C++ Escapi DLL? |
You were right! I shouldn't call do_capture() again until the frame has been decoded:

```rust
if self.cam.is_capture_done() {
    use std::slice;
    use jpeg_decoder::*;
    // View the u32 capture buffer as raw bytes for the JPEG decoder.
    let data = unsafe { slice::from_raw_parts(self.cam_frame.as_ptr() as *const u8, self.cam_frame.len() * 4) };
    // Skip frames that don't start with the JPEG SOI marker.
    if !(data[0] == 0xFF && data[1] == 0xD8 /* SOI */) { return Ok(()); }
    let mut decoder = Decoder::new(data);
    let decoded = decoder.decode().expect("frame decoding failed");
    self.cam_tex.main_level().write(Rect {
            left: 0,
            bottom: 0,
            width: CAM_WIDTH as u32,
            height: CAM_HEIGHT as u32
        }, texture::RawImage2d {
            data: Cow::Owned(decoded),
            width: CAM_WIDTH as u32,
            height: CAM_HEIGHT as u32,
            format: texture::ClientFormat::U8U8U8
        }
    );
    // Only request the next frame once this one is decoded and uploaded.
    self.cam.do_capture();
}
```

Thanks a lot! Now I get 30fps, but in a really hacky way... Btw, there is also this lib: https://github.com/jp9000/libdshowcapture |
I agree that the current code of Escapi does not make it easy to achieve what we are trying to do, but after looking at libdshowcapture I am not sure it would be simpler. That library seems to be a low-level wrapper around the DirectShow API, and I have no idea how to get a camera stream using it. Honestly, I don't think there is a single blocking issue preventing Escapi from evolving to a point where it can be used easily for our use cases. There is quite a lot to do for sure, and it would probably take a bit of time, but I don't think it would be very hard in itself. |
Btw, libdshowcapture doesn't seem very complicated.
|
Thanks for the info. If you manage to put together a sample using this library, please let me know. |