Examples
In this example, we show how to read images from a directory, label the images in classes, extract features, select features, select a classifier, and evaluate the performance in only 12 lines (see Balu Code). See the methodology in this paper.
See how this 10-line code can be used to design a computer vision system that detects faces automatically. The dataset contains 60 faces and 200 non-faces. Using only LBP features, Balu is able to select 15 features and a classifier with a performance of 95%, validated with cross-validation (warning: the 95% is achieved on this dataset; there is no warranty for other datasets). See Balu Code.
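For orientation, the sketch below illustrates the extract-select-classify-evaluate pattern behind examples like these. It is only a sketch: the Bfs_sfs option field is an assumption about the API (type help Bfs_sfs to check), while Bcl_lda and Bev_performance use the signatures documented later on this page, and X, d, Xt, dt follow the definitions given further down.
opsfs.m = 15;                      % number of features to select (assumed option field)
s = Bfs_sfs(X,d,opsfs);            % SFS feature selection (assumed signature)
op.p = [0.5 0.5];                  % a priori probabilities (documented below)
ds = Bcl_lda(X(:,s),d,Xt(:,s),op); % train LDA and classify the test data (documented)
p = Bev_performance(ds,dt)         % classification performance (documented)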
In this example, we show how to detect pen tips in pencil cases using a tracking algorithm (see Balu Code). Details of the method are published in this paper and video. If you want to use the graphic user interface shown in this video, execute the command Btr_gui and load the file pencase.mat.
With only this command Bio_segshow('testimg1.jpg')
you can obtain this figure. In addition, you can test any segmentation algorithm using simple commands like the following. Details of the method are published in this paper.
I = imread('testimg2.jpg');
Bio_segshow(I,'Bim_segpca');
or
R = Bim_segpca(I);
imshow(R)
Try these commands to segment the sky, the clouds, and the palm in this image:
I = imread('testimg9.jpg');
Bim_segkmeans(I,3,1,1);
With only the following commands you can segment the well-known rice image of MATLAB:
I = imread('rice.png');
figure;imshow(I)
[F,m] = Bim_segmowgli(I,[],40,1.5);
figure;imshow(F,[])
In this example, you can see how welding defects can be detected using sliding windows. The example selects 100 'no-defect' and 100 'defect' windows from a training image (where the ideal segmentation is known a priori). Afterward, LBP features are extracted, and an LDA classifier is trained using SFS-selected features. Finally, the trained classifier is used to segment a test image (see Balu Code). Details of the method are published in this paper.
Sometimes you need to separate certain objects of an image, and no segmentation approach works well. Try the function Bim_regiongrow with this image; for example, it is easy to separate the pen tips using this tool.
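A minimal usage sketch follows; the argument list here (seed coordinates and an intensity tolerance) is an assumption about Bim_regiongrow's interface, so type help Bim_regiongrow for the real one.
I = imread('testimg2.jpg');         % one of the test images used on this page
R = Bim_regiongrow(I,[120 200],20); % assumed signature: seed (i,j) and tolerance
figure; imshow(R,[])                % display the grown region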
If you want to calibrate your computer vision system in order to process L*a*b* color images (see our paper), first you have to estimate the parameters of the model M that converts from RGB to L*a*b* using the function Bim_labparam, and second you can convert an RGB image X using the command:
Y = Bim_rgb2lab(X,M);
If you don't want to calibrate the computer vision system, you can use the theoretical conversion implemented in the function Bim_rgb2lab0.
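The two paths can be sketched as follows; the inputs of Bim_labparam (RGB measurements of color patches and their reference L*a*b* values) are an assumption about its interface.
M = Bim_labparam(RGB_patches,Lab_reference); % assumed inputs: measured vs. reference colors
X = imread('testimg2.jpg');                  % any RGB image
Y = Bim_rgb2lab(X,M);                        % calibrated conversion (documented call)
Y0 = Bim_rgb2lab0(X);                        % theoretical conversion (assumed one-argument call)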
In this example, we show how to separate the characters 'T' and 'Y' using the eccentricity of the segmented regions. We use the training image to establish the threshold automatically and the test image to evaluate it (see Balu Code).
In this example, we show how to use Hu moments to obtain good separability in the recognition of the characters '1', '2', and '3' in this image (see Balu Code).
In this example, we show how to build a function that computes the centroid of a binary region (see Balu Code).
In this example, we show how to separate three types of arrows using simple Balu commands. The classification is performed by thresholding a single feature, as sketched below. See Training Image, Test Image, and Balu Code.
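The idea of classifying by thresholding one feature can be sketched in plain MATLAB; all values below are hypothetical.
f = [0.2 0.3 0.5 0.6 0.8 0.9];  % one shape feature per arrow region (made-up values)
th1 = 0.4; th2 = 0.7;           % thresholds estimated from the training image (assumed)
ds = 1 + (f > th1) + (f > th2)  % three classes from two thresholds on one feature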
In this example, we show how to fit a binary region to an ellipse. In example 1, we find the best ellipses fitted to the binary regions of this image. In example 2, we detect elliptical objects oriented at a given angle (see Balu Code).
This example shows how to use the Bfs_balu algorithm. The algorithm has three steps: (1) normalize (using Bft_norm), (2) clean (using Bfs_clean), and (3) select features (using Bfs_sfs). In this example, the objective function maximized by the SFS algorithm is the performance of an SVM-RBF classifier (see Balu Code).
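A hedged sketch of the three steps is given below; every signature and option field is an assumption about the API (type help Bft_norm, help Bfs_clean, and help Bfs_sfs to check).
Xn = Bft_norm(X,1);             % (1) normalize the features (assumed signature)
sc = Bfs_clean(Xn);             % (2) remove constant/correlated features (assumed signature)
opsfs.m = 10;                   % number of features to select (assumed option field)
s = Bfs_sfs(Xn(:,sc),d,opsfs);  % (3) SFS selection on the cleaned features (assumed)
Xs = Xn(:,sc(s));               % resulting feature subset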
LSEF selects feature subsets based on their capacity to reproduce sample projections on the principal axes. It can be used to estimate an approximation of PCA using a linear projection of some of the original features (see Balu Code).
In this example, we preselect 10 features using SFS with the Fisher criterion, and afterward we select 4 of them using exhaustive search. In this example, the objective function maximized by the SFS algorithm is the performance of a KNN classifier with 5 neighbors (see Balu Code).
In this example, we show how to test several feature selection algorithms and their combinations (see Balu Code).
In Balu, there are some definitions:
- X is the training data (one sample per row). It is a matrix of N x m elements: N samples, each with m features.
- d is the ideal classification of X. It is a vector of N x 1 elements. For example, d(i) is 1 if sample i belongs to class 1.
- Xt is the testing data, defined in the same way as X. It is a matrix of Nt x m elements: Nt samples, each with m features.
- dt is the ideal classification of Xt. It is a vector of Nt x 1 elements. dt is never used to train a classifier; it is used to evaluate the classification of Xt.
- ds is the real classification of Xt, i.e., the prediction using the trained classifier. It is a vector of Nt x 1 elements. ds has to be compared with dt to evaluate the performance of the classification of Xt. If the prediction is perfect, ds(i) is equal to dt(i) for every testing sample i = 1, ..., Nt.
- op contains the options of the classifier. For example, for KNN with 3 neighbors, we define op.k = 3.
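To make these definitions concrete, here is a tiny hypothetical illustration (all values are made up; only the shapes and roles of the variables matter):
X = [0.1 0.2; 0.9 0.8; 0.2 0.1];  % training data: N=3 samples, m=2 features
d = [1; 2; 1];                    % ideal classification of X (N x 1)
Xt = [0.15 0.25; 0.85 0.90];      % testing data: Nt=2 samples, m=2 features
dt = [1; 2];                      % ideal classification of Xt (evaluation only)
op.k = 3;                         % classifier options, e.g., KNN with 3 neighbors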
In Balu, the classifiers are implemented as functions Bcl_name, where name is the name of the classifier. See the following example, which shows Linear Discriminant Analysis using Bcl_lda.
With a few lines of code, you can train and test a classifier. In the first variant, the classifier is trained and tested in a single call:
load datagauss % simulated data (2 classes, 2 features)
Bio_plotfeatures(X,d) % plot feature space
op.p = [0.5 0.5]; % a priori probabilities
ds = Bcl_lda(X,d,Xt,op); % train LDA on (X,d) and classify Xt in one call
p = Bev_performance(ds,dt) % performance on test data
Alternatively, training and testing can be separated into two calls:
load datagauss % simulated data (2 classes, 2 features)
Bio_plotfeatures(X,d) % plot feature space
op.p = [0.5 0.5]; % a priori probabilities
op = Bcl_lda(X,d,op); % training: learn the LDA model
ds = Bcl_lda(Xt,op); % testing: classify Xt with the trained model
p = Bev_performance(ds,dt) % performance on test data
The same code can be used with Bcl_qda, Bcl_knn, ...; you only have to define the options variable op correctly. Each classifier has an example; e.g., for KNN you only have to type help Bcl_knn to see a very simple example that shows you how to define the options.
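For instance, following the same calling convention as Bcl_lda above and the op.k = 3 option mentioned earlier:
load datagauss              % simulated data (2 classes, 2 features)
op.k = 3;                   % KNN options: 3 neighbors
ds = Bcl_knn(X,d,Xt,op);    % KNN classifier, same convention as Bcl_lda
p = Bev_performance(ds,dt)  % performance on test data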
In this example, we show how you can plot the decision lines of different classifiers (see Balu Code).
With Balu it is very easy to evaluate many classifiers on the same data. Take a look at this code to see how to compute the 10-fold cross-validation performance of 9 classifiers (this example is given by typing help Bev_crossval).
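As a hedged sketch of what such an evaluation might look like (the option fields b, v, and c below are assumptions about Bev_crossval's interface; the real example appears under help Bev_crossval):
load datagauss                                  % simulated data (2 classes, 2 features)
b(1).name = 'lda'; b(1).options.p = [0.5 0.5];  % candidate classifier 1 (assumed format)
b(2).name = 'knn'; b(2).options.k = 3;          % candidate classifier 2 (assumed format)
op.b = b;                                       % classifiers to evaluate (assumed field)
op.v = 10;                                      % number of folds (assumed field)
op.c = 0.95;                                    % confidence level (assumed field)
p = Bev_crossval(X,d,op)                        % cross-validated performance (assumed)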