gralloc: Add support for IMPLEMENTATION_DEFINED #44
base: lineage-18.1
Conversation
Currently, gralloc lacks support for HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, which other implementations treat as NV12. Change gralloc to support HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED as NV12 in line with these other implementations. Validated with the libcamera Android HAL.
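The diff itself is not reproduced in this thread. As a hedged sketch only, modeled on gbm_gralloc's get_gbm_format() helper, the change described above would amount to something like the following; the exact hunk in the PR may differ.

```cpp
/*
 * Sketch of the change described in the PR, modeled on gbm_gralloc's
 * get_gbm_format(); the actual patch may look different.
 */
#include <cstdint>

#include <gbm.h>		/* GBM_FORMAT_* */
#include <system/graphics.h>	/* HAL_PIXEL_FORMAT_* */

static uint32_t get_gbm_format(int format)
{
	switch (format) {
	case HAL_PIXEL_FORMAT_RGBA_8888:
		return GBM_FORMAT_ABGR8888;
	/* ... other RGB cases unchanged ... */
	case HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED:
		/* Treat the wildcard format as NV12, matching other
		 * gralloc implementations (see the PR description). */
		return GBM_FORMAT_NV12;
	default:
		return 0;	/* unsupported */
	}
}
```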
I'm not reviewing but just commenting that I'm stuck at this point as well...
As I understand it, IMPLEMENTATION_DEFINED is a wild-card format that each platform has to define, usually to a format the graphics device can display. The gralloc module does the translation to a DRM_FORMAT when it receives an allocation request, and the camera HAL maps it to a compatible format (likely the same format), so that higher layers can use IMPLEMENTATION_DEFINED as a wild card and the platform-specific layers do the translation (of course, they should agree on what to translate it to).
I'm now struggling to understand how Waydroid, being platform-agnostic, handles this platform-specific mapping, and I'm a bit stuck staring at https://github.com/waydroid/android_external_minigbm trying to find out what backend it uses and how I should infer the correct mapping...
Maybe @aleasto could help clarify this?
We use hardware/waydroid/gralloc by default on DRI platforms. We also build the gbm_mesa_driver of https://github.com/waydroid/android_external_minigbm which is similar but exposes some more modern features used in upstream mesa. We will probably switch to the latter by default in the future.
Thanks @aleasto, but I'm now missing where gbm_bo_create() is implemented, if not in external-minigbm (which, confusingly, contains the gbm_mesa_driver as well as other backends). I'm still trying to figure out what backend is used for my device (PinePhone Pro) and why I have a failure at gbm_bo_create() after I modified gralloc_gbm to map IMPLEMENTATION_DEFINED to DRM_FORMAT_NV12.
Oh, that's directly from mesa: https://github.com/waydroid/android_external_mesa3d/tree/lineage-18.1/src/gbm
And yes, minigbm is terribly confusing. It started as an implementation of the gbm interface with different backends for different GPU vendors, and now one of the backends calls into mesa's libgbm instead.
Yeah, but if I'm not mistaken that's a bare clone of mesa meant to be built externally and not integrated into the AOSP build?
No, mainline mesa nowadays has the capability of building inline in AOSP. That's the mesa we're using.
That's my understanding as well: it has to be built with the NDK and manually deployed. How can it be in use if I haven't manually deployed it on my device? :) edit: wrong quote
No, it can build both against the NDK and inside AOSP.
btw: cross-referencing https://bugs.libcamera.org/show_bug.cgi?id=172
@nroth waydroid/android_external_mesa3d#4
Is that used at all? I had a look at Waydroid's gralloc implementation with Jacopo, and at least on the PinePhone Pro the implementation being loaded at runtime is gbm_gralloc, which links directly to mesa's libgbm. When is minigbm's gralloc module used?
It's not used at the moment.
OK, thanks for the confirmation. What was the reason for adding a mesa libgbm backend to minigbm (by @rsglobal in waydroid/android_external_minigbm@4957355)? Was it envisioned that minigbm would be used at that point, with a later change of plans? What is the advantage of using that backend compared to using gbm_gralloc directly, which also uses mesa's libgbm with fewer levels of indirection?
FWIW, I tried using at least two of the alternative gbms we build a few weeks ago and the image did not boot due to a crash loop.
It's not used at the moment.
You could still manually select it by setting ro.hardware.gralloc=minigbm_gbm_mesa
P.S. Some context if it helps: I have a list of TODO emails like this to get through, and since I'm moving, one of my highest priorities is adding stuff to our Frankenstein design+instructions doc (see the libcamera mailing list) so that I remember what's going on and why once I regain access to my build machine. Please email me or comment on the doc if anything seems unclear, or if you have questions.
Agreed, gbm_gralloc is limited and is likely not a good path forward for cameras and codecs. Part of the reason is that it relies on libgbm from mesa, where gbm stands for graphics buffer allocator, not generic buffer allocation. Mesa's libgbm doesn't support YUV formats. While that could be added, it will probably encounter pushback from upstream, given libgbm's nature (we would have to design YUV support from a graphics point of view, and prove that it is useful for GPUs and display controllers without codecs and cameras).
No disagreement that minigbm is the way to go, but I don't think a mesa libgbm backend will help much, as it doesn't support YUV formats.
In the long term, it seems like we want to go with minigbm to handle Android gralloc() because it's more full-featured. For now, do folks mind continuing to use the thin gralloc_gbm implementation while we get the libcamera pipeline working? If we use libcamera's HAL, the requirements are simple for us: support MJPEG and NV12. For now, if we use libcamera's HAL, we will HAVE to map IMPLEMENTATION_DEFINED to NV12; from src/android/camera_capabilities.cpp:
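The quoted snippet from camera_capabilities.cpp did not survive extraction. As an illustration only, and not the verbatim libcamera source, the mapping in question amounts to a format table in which HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED resolves to NV12:

```cpp
/*
 * Paraphrased illustration -- not the verbatim libcamera code. The Android
 * camera HAL keeps a table mapping Android HAL pixel formats to the
 * libcamera formats it will actually request; IMPLEMENTATION_DEFINED
 * resolves to NV12 there, which is why gralloc must be able to allocate it.
 */
#include <map>
#include <vector>

#include <libcamera/formats.h>
#include <system/graphics.h>	/* HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED */

struct Camera3Format {
	std::vector<libcamera::PixelFormat> libcameraFormats;
	bool mandatory;
	const char *name;
};

static const std::map<int, Camera3Format> camera3FormatsMap = {
	{ HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, {
		{ libcamera::formats::NV12 },
		true,
		"IMPLEMENTATION_DEFINED"
	} },
	/* ... other formats elided ... */
};
```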
I don't mind at all, but you will need to find a way to provide an NV12 buffer with two planes. The mesa libgbm doesn't support YUV formats, and is not likely to accept YUV support upstream easily. It may be possible to hack Waydroid's version of mesa, but I don't know how much work that would require, and if switching to minigbm wouldn't be simpler.
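A small probe sketch of the allocation problem being described, assuming a libgbm development setup and /dev/dri/card0 as the device node (both assumptions): it asks mesa's libgbm for an NV12 buffer directly, which is expected to fail on backends without YUV support.

```cpp
#include <fcntl.h>
#include <unistd.h>

#include <cstdio>

#include <gbm.h>

int main()
{
	/* Device node is an assumption; adjust for the target platform. */
	int fd = open("/dev/dri/card0", O_RDWR);
	if (fd < 0)
		return 1;

	struct gbm_device *dev = gbm_create_device(fd);
	if (!dev) {
		close(fd);
		return 1;
	}

	/* Ask the backend whether it claims NV12 support at all. */
	printf("NV12 supported: %d\n",
	       gbm_device_is_format_supported(dev, GBM_FORMAT_NV12,
					      GBM_BO_USE_LINEAR));

	/* Then try the actual allocation, the call that fails in practice. */
	struct gbm_bo *bo = gbm_bo_create(dev, 1280, 720, GBM_FORMAT_NV12,
					  GBM_BO_USE_LINEAR);
	printf("gbm_bo_create(NV12): %s\n", bo ? "ok" : "failed");

	if (bo)
		gbm_bo_destroy(bo);
	gbm_device_destroy(dev);
	close(fd);
	return 0;
}
```

Running something like this on the device in question would show whether the failure happens at format negotiation or at allocation time.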
Just as a note, last week I tried moving my setup to use minigbm. The minigbm rockchip backend requires out-of-tree kernel modifications to support a custom ioctl that allows buffer allocation on the Rockchip display driver. As I didn't want to backport support for that ioctl, I used the dumb_device backend instead. The result is that the container starts but I don't see any buffer being drawn to the screen, which makes me wonder: how does rendering work with Waydroid? I see the minigbm backend operating on /dev/dri/card0; does mesa instead operate on the GPU device at /dev/dri/renderD128?
@aleasto any thoughts?
The mesa libgbm doesn't support YUV formats, and is not likely to accept YUV support upstream easily. It may be possible to hack Waydroid's version of mesa, but I don't know how much work that would require, and if switching to minigbm wouldn't be simpler.
Correct me if I'm wrong, but it seems like the existing code for gbm_gralloc already supports YUV. For example, search for HAL_PIXEL_FORMAT_YCrCb_420_SP and HAL_PIXEL_FORMAT_YV12 in gralloc_gbm.cpp. It seems like adding NV12 support should be a natural extension of this.
In fact, I've tried to implement this myself, and I get a half-working greyscale image. Half-working because there are artifacts (green stripes) and camera3 itself crashes while initializing a new stream whenever I try to take a picture.
I'm unclear on why the image is greyscale and artifacted, and why the crashes happen. Memory corruption is one possible explanation. However, it seems like this should be fundamentally possible since my PoC does kind of work.
-Nicholas
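One plausible (unconfirmed) source of greyscale output and green stripes is the plane layout reported back when the buffer is locked for CPU access. The helper below is hypothetical and not part of gralloc_gbm.cpp; it shows how an NV12 buffer could be described through Android's android_ycbcr structure, assuming the chroma plane shares the luma stride.

```cpp
#include <cstdint>
#include <cstring>

#include <system/graphics.h>	/* struct android_ycbcr */

/* Hypothetical helper, not part of gralloc_gbm.cpp. */
static void fill_nv12_ycbcr(void *base, uint32_t height, uint32_t stride,
			    struct android_ycbcr *ycbcr)
{
	uint8_t *y = static_cast<uint8_t *>(base);
	/* NV12: full-resolution Y plane, then one interleaved CbCr plane. */
	uint8_t *cbcr = y + static_cast<size_t>(stride) * height;

	memset(ycbcr, 0, sizeof(*ycbcr));
	ycbcr->y = y;
	ycbcr->cb = cbcr;	 /* Cb comes first in NV12 */
	ycbcr->cr = cbcr + 1;	 /* Cr is interleaved right after Cb */
	ycbcr->ystride = stride;
	ycbcr->cstride = stride; /* assumption: chroma shares the Y stride */
	ycbcr->chroma_step = 2;	 /* Cb/Cr samples alternate every 2 bytes */
}
```

If the consumer gets wrong chroma pointers or strides from the lock path, a mostly-luma (greyscale) image with colour artifacts is a typical symptom.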
I see several things in gralloc_gbm that make me wonder if YUV support is actually tested. Look at get_gbm_format(): YCbCr420 formats are mapped to 0 (which, afaict, is not a valid GBM_FORMAT_ entry), and the buffer size gets "adjustments" to include the extra CbCr plane (not sure about the width halving though).
The fact that you see a greyscale image might mean that only the Y plane is rendered, as the extra CbCr plane doesn't exist for mesa at rendering time? Anyway, it seems all too fragile to consider it a viable way forward.
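For reference, a paraphrase of the pattern described above, not the verbatim gralloc_gbm.cpp code: planar YUV is faked by requesting a single-plane GBM format and padding the allocation so the chroma data fits after the Y plane.

```cpp
#include <cstdint>

#include <gbm.h>		/* GBM_FORMAT_* */
#include <system/graphics.h>	/* HAL_PIXEL_FORMAT_* */

/* Paraphrase of the hack being discussed; names and factors illustrative. */
static uint32_t get_gbm_format_sketch(int format)
{
	switch (format) {
	case HAL_PIXEL_FORMAT_YV12:
		/* Planar YUV squeezed into one 2-bytes-per-pixel buffer. */
		return GBM_FORMAT_GR88;
	case HAL_PIXEL_FORMAT_YCbCr_420_888:
	default:
		return 0;	/* not a valid GBM_FORMAT_* code */
	}
}

static void adjust_yuv_size_sketch(int format, int *width, int *height)
{
	if (format == HAL_PIXEL_FORMAT_YV12) {
		*width /= 2;		/* GR88 carries 2 bytes per "pixel" */
		*height += *height / 2;	/* make room for the chroma planes */
	}
}
```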
gbm_gralloc has limited YUV support implemented as a hack, but libgbm doesn't. Note how
Furthermore, if you look at home
In mesa's libgbm implementation, 0 is considered to be
I wouldn't even call it close :-)
I don't think that's related. Rendering is completely decoupled from capture.
That I agree with. It may be possible to get it working with further hacks on selected platforms and for selected use cases, but I think it's a dead end.