Replies: 5 comments
-
Yes, this is possible, assuming you can output the labels as binary mask images from LabelMe. There’s a function “convert segmentation to annotations” in the extras menu that should convert binary masks to RootPainter annotation files.
Please let me know if you run into any issues.
The labels in RootPainter are just PNG files, so they can be easily viewed or edited in image viewing/editing software, although I suggest using RootPainter to edit and create them.
If you use RootPainter normally you will see the saved annotation files (labels) in your project annotation folder.
You can often train models in RootPainter in just a few hours (including adding the labels within RootPainter), so it could be easier just to try it and label from scratch following the protocol (that’s what I would do). There’s a small risk that starting with a large existing labeled dataset could in some cases lead to worse results.

RootPainter is designed to facilitate a corrective annotation protocol, where annotation is assigned in response to weaknesses in the model. I think this can lead to certain advantages, such as the labels ending up balanced between the classes, or between different subsets of the classes that may be more challenging at different stages of training, akin to hard example mining or active learning. During corrective annotation, the annotator also gets to see how the model is doing during the annotation procedure itself, allowing you to be better informed about how much and what annotation is necessary for your objective.

From this point of view, RootPainter is designed as an annotation tool for training models and not as a way to train models from existing labels, i.e. it’s designed for annotation and training to be integrated as a combined process with a feedback loop.
By using RootPainter with a large existing labeled dataset, you’re just training a U-Net model in a fairly standard way (which can also work super well sometimes). RootPainter is designed to facilitate corrective annotation, so you might be missing out on some advantages if using external labelling tools.
That said, if your goal is just to train a model to process a larger dataset, given you already have the annotations, you may as well give it a try if they are easy to convert.
Kind regards,
Abraham
-
If I get time, I'll see if I can do some tests later today and get back to you about labelme export options.
-
Here's how my testing went: I created a project in RootPainter using a biopores dataset (https://zenodo.org/record/3754046/). Then I looked at the first two file names in the generated project file (B44-1_002.jpg, B92-1_003.jpg), as these are the ones that appear first in the annotation procedure, and created annotations for these files in labelme.

I copied the two images (B44-1_002.jpg, B92-1_003.jpg) from the biopores dataset into their own folder named bp. Then I installed labelme using pip on macOS, following the instructions here:

I ran labelme locally with the following command:
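The exact commands weren't preserved in this thread, but the usual install-and-launch sequence for labelme looks like this (a sketch, assuming a working Python environment):

```shell
# Install labelme into the active Python environment (one-time step),
# then launch the annotation GUI from the terminal.
pip install labelme
labelme
```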
Then in the labelme UI I clicked the ‘open dir’ option and opened the bp folder. I annotated biopores using the polygon tool and labeled them as bp. I actually found it a bit awkward; I prefer the RootPainter annotation tools, though of course I am more familiar with the brush tools in RootPainter. After annotating all the obvious biopores in the first image I clicked ‘next-image’ in labelme, then annotated the second image.

I found the polygon annotation painful, but overall labelme was easy to get started with and had no problem opening my images. If anyone else is following this, I suggest annotating in RootPainter, but perhaps polygons are more convenient for more angular and less round structures, e.g. buildings in a city, or simply when per-pixel segmentation is not of interest and the focus is more on approximate object localisation and classification.

Labelme saves the polygon information as a list of points in a JSON file, but I couldn’t see an option in the interface to export to a binary mask. It appears that this script, which is part of labelme, should be used: I created a labels.txt file which looks like the following (the position/order of the labels is important, as I want bp on the 6th line; that will make the masks blue, which is what RootPainter expects when it converts segmentations to annotations):
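The file contents weren't preserved in the thread. Following labelme's convention of putting __ignore__ and _background_ on the first two lines, a labels.txt with bp on the 6th line could be created like this (class_a/class_b/class_c are placeholder names of mine, not from the original):

```shell
# Sketch of a labels.txt with "bp" on the 6th line. With __ignore__ and
# _background_ first (labelme's convention), bp receives class index 4,
# which the VOC colormap renders as blue (0, 0, 128).
cat > labels.txt <<'EOF'
__ignore__
_background_
class_a
class_b
class_c
bp
EOF
```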
And then I ran:
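The command itself wasn't captured here. Labelme's semantic segmentation example documents the invocation as input directory, output directory, and labels file, so with the folders from this example it would be something like:

```shell
# Convert the labelme JSON annotations in bp/ to VOC-style masks in
# output/ (folder names follow the example above; requires labelme).
python labelme2voc.py bp output --labels labels.txt
```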
Which generated the following in output:
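The directory listing wasn't preserved; labelme2voc.py typically writes a structure like the following (directory names as in labelme's semantic segmentation example; exact contents may differ):

```
output/
├── JPEGImages/
├── SegmentationClass/               # label arrays
├── SegmentationClassPNG/            # indexed-colour PNG masks
├── SegmentationClassVisualization/  # masks overlaid on the images
└── class_names.txt
```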
This includes binary masks for the labelme annotations in SegmentationClassPNG; see SegmentationClassPNG/B44-1_002_annot_proj.png. I do not recommend polygon tools for annotating this type of data; you can see how they led me to create some weird, angular biopores.

The masks have _annot_proj on the end of the file name, but RootPainter expects each annotation to have the same name as the image it belongs to (with a .png extension). There are a few different ways you could batch rename the files, but to address this I modified the labelme2voc.py script by adding the following on line 67.
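The exact line-67 change wasn't captured in the thread. As an alternative, a shell rename achieves the same result (a sketch: the two touched files stand in for the real masks, and the output folder name is an assumption):

```shell
# Alternative to editing labelme2voc.py: batch-rename the masks so each
# matches its source image name, as RootPainter expects. The touch lines
# create stand-in files so the demo is self-contained.
mkdir -p output/SegmentationClassPNG
touch output/SegmentationClassPNG/B44-1_002_annot_proj.png \
      output/SegmentationClassPNG/B92-1_003_annot_proj.png

# Strip the _annot_proj suffix from every mask file name.
for f in output/SegmentationClassPNG/*_annot_proj.png; do
  mv "$f" "$(printf '%s\n' "$f" | sed 's/_annot_proj//')"
done

ls output/SegmentationClassPNG   # B44-1_002.png  B92-1_003.png
```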
The output masks now have the same names as the original images that were labeled. Next I used the RootPainter ‘convert segmentations to annotations’ function to convert these binary masks to annotations in the standard RootPainter annotation format. I then added one of the annotations to the annotations/train folder in my RootPainter biopores project and the other to annotations/val, and confirmed that model training works.

One thing I found interesting is that the model segmentations seemed to be more accurate than some of my polygon annotations. U-Net tends to make the shapes rounder and less angular in its predictions.

I'm open to adding all this functionality to RootPainter as a ‘convert annotations from labelme’ feature or something similar, but I feel this could be quite a niche use case, and (for people running labelme, who are likely already very technical) it may be easy enough to do with what is provided already, i.e. the labelme2voc.py script that ships with labelme and the ‘convert segmentations to annotations’ function that is already part of RootPainter. Based on my limited experience with labelme, I also think it makes a lot more sense to add the annotation in RootPainter rather than in labelme.

What do you think? Perhaps you could share a little more about your use case? I hope that helps!
-
Here's a screenshot showing my annotation process in labelme. Even though I found the annotation tool a bit awkward, after training for a while with the imported data the model is doing really well.
-
Wow, thank you so much Abe! That looks great. Good to know, since there are many labeled sets available online. We have some root labels that were done in LabelMe.
-
Hello,
Is there a way to import labels when starting a project? Say someone had already begun labeling in LabelMe. Is there a way to train with these labels before using the predictive labeling in RootPainter? What is the preferred format of the imported labels?
Thanks!
Elizabeth