Tutorials:
https://becominghuman.ai/tensorflow-object-detection-api-tutorial-training-and-evaluating-custom-object-detector-ed2594afcf73
https://github.com/EdjeElectronics/TensorFlow-Object-Detection-API-Tutorial-Train-Multiple-Objects-Windows-10
https://www.youtube.com/watch?v=Rgpfk6eYxJA&t=1568s
Configuring the TensorFlow Object Detection API
2. Directory Structure:
images/ - This directory will contain our dataset.
training/ - In this directory we will save our trained model.
eval/ - Results of evaluating the trained model will be saved here.
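One way to create this layout on the dgx, starting inside the object_detection folder (a sketch; images/train and images/test are used by the annotation step below):
mkdir -p images/train images/test training eval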
3. Annotation:
Create annotations using an annotation tool such as LabelImg and put them into the images/train and images/test folders separately.
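For reference, LabelImg typically saves one Pascal VOC style XML file per image; a minimal sketch with placeholder filename, image size, box coordinates and class name:
<annotation>
  <filename>img_0001.jpg</filename>
  <size><width>800</width><height>600</height><depth>3</depth></size>
  <object>
    <name>helmet</name>  <!-- placeholder class label -->
    <bndbox>
      <xmin>120</xmin><ymin>80</ymin><xmax>260</xmax><ymax>240</ymax>
    </bndbox>
  </object>
</annotation>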
4. Convert the XML annotations into CSV files (the xml_to_csv.py command is given below, after the environment is set up).
Make sure you are in this folder:
(tensorflow1) C:\tensorflow1\models\research\object_detection>
Then set PYTHONPATH. On Windows:
set PYTHONPATH=C:\tensorflow1\models;C:\tensorflow1\models\research;C:\tensorflow1\models\research\slim
On the dgx:
export PYTHONPATH=/home/dgxuser104/Balmukund/tensorflow1/models:/home/dgxuser104/Balmukund/tensorflow1/models/research:/home/dgxuser104/Balmukund/tensorflow1/models/research/slim:/home/dgxuser104/Balmukund/tensorflow1/models/research/object_detection
export PATH=$PATH:$PYTHONPATH
NOTE: The DGX runs Linux, so Windows-style '%VAR%' variable references become '$VAR'.
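A quick sanity check that the path took effect (a sketch):
echo $PYTHONPATH
python -c "import object_detection; print('object_detection is importable')"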
Error:
AttributeError: module 'tensorflow' has no attribute 'contrib'
Solution:
This code targets TensorFlow 1.x (tf.contrib was removed in TensorFlow 2.x), so install a 1.x GPU build:
pip install tensorflow-gpu==1.8.0
Protobuf Commands:
cd .. (run the command below from Balmukund/tensorflow1/models/research; on the dgx the './' prefix is not recognised, hence this adaptation)
protoc --python_out=. object_detection/protos/anchor_generator.proto object_detection/protos/argmax_matcher.proto object_detection/protos/bipartite_matcher.proto object_detection/protos/box_coder.proto object_detection/protos/box_predictor.proto object_detection/protos/calibration.proto object_detection/protos/eval.proto object_detection/protos/faster_rcnn.proto object_detection/protos/faster_rcnn_box_coder.proto object_detection/protos/flexible_grid_anchor_generator.proto object_detection/protos/graph_rewriter.proto object_detection/protos/grid_anchor_generator.proto object_detection/protos/hyperparams.proto object_detection/protos/image_resizer.proto object_detection/protos/input_reader.proto object_detection/protos/keypoint_box_coder.proto object_detection/protos/losses.proto object_detection/protos/matcher.proto object_detection/protos/mean_stddev_box_coder.proto object_detection/protos/model.proto object_detection/protos/multiscale_anchor_generator.proto object_detection/protos/optimizer.proto object_detection/protos/pipeline.proto object_detection/protos/post_processing.proto object_detection/protos/preprocessor.proto object_detection/protos/region_similarity_calculator.proto object_detection/protos/square_box_coder.proto object_detection/protos/ssd.proto object_detection/protos/ssd_anchor_generator.proto object_detection/protos/string_int_label_map.proto object_detection/protos/train.proto
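Since the dgx shell expands wildcards, the whole protos folder can usually be compiled in one go from models/research instead of listing every file:
protoc object_detection/protos/*.proto --python_out=.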
Creating CSV files for the train and test data:
python xml_to_csv.py
Generating TFRecords for training and testing:
python generate_tfrecord.py --csv_input=images\train_labels.csv --image_dir=images\train --output_path=train.record
python generate_tfrecord.py --csv_input=images\test_labels.csv --image_dir=images\test --output_path=test.record
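Before running generate_tfrecord.py, its class_text_to_int function has to be edited to return an integer id for each of your own labels; a minimal sketch, assuming two hypothetical classes 'helmet' and 'no_helmet' (replace with the labels used during annotation, keeping the ids consistent with the label map used for training):

# In generate_tfrecord.py: map each label text from the CSV to the integer id
# that will also be used in the label map for training.
def class_text_to_int(row_label):
    if row_label == 'helmet':        # hypothetical class name
        return 1
    elif row_label == 'no_helmet':   # hypothetical class name
        return 2
    else:
        return None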
Training:
python train.py --logtostderr --train_dir=training/ --pipeline_config_path=training/faster_rcnn_inception_v2_pets.config
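Before launching train.py, the pipeline config in training/ usually needs a few edits; a sketch of the fields to check, assuming the layout above and a training/labelmap.pbtxt that lists the same classes and ids as class_text_to_int (all paths, the class count and the checkpoint location are placeholders):

# training/faster_rcnn_inception_v2_pets.config (values are placeholders)
model { faster_rcnn { num_classes: 2 } }
train_config {
  fine_tune_checkpoint: "<path to>/faster_rcnn_inception_v2_coco_2018_01_28/model.ckpt"
}
train_input_reader {
  tf_record_input_reader { input_path: "<path to>/object_detection/train.record" }
  label_map_path: "<path to>/object_detection/training/labelmap.pbtxt"
}
eval_config { num_examples: <number of test images> }
eval_input_reader {
  tf_record_input_reader { input_path: "<path to>/object_detection/test.record" }
  label_map_path: "<path to>/object_detection/training/labelmap.pbtxt"
}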
For Evaluation:
python eval.py --logtostderr --pipeline_config_path=training/faster_rcnn_inception_v2_pets.config --checkpoint_dir=training --eval_dir=eval
Visualise the results:
Evaluation:
tensorboard --logdir=eval/
Training:
tensorboard --logdir=training/
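Since the dgx is remote, TensorBoard either has to bind to an external interface or be tunnelled over SSH; a sketch (the port and host name are assumptions):
tensorboard --logdir=training/ --host 0.0.0.0 --port 6006
ssh -L 6006:localhost:6006 dgxuser104@<dgx-host>   (run locally, then open http://localhost:6006)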
After training and evaluation, export the inference graph that will be used for inference:
python export_inference_graph.py --input_type image_tensor --pipeline_config_path training/faster_rcnn_inception_v2_pets.config --trained_checkpoint_prefix training/model.ckpt-XXXX --output_directory inference_graph
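The exported inference_graph/frozen_inference_graph.pb can then be loaded for inference with the TF 1.x API; a minimal sketch (the tensor names are the standard ones written by export_inference_graph.py; the input image here is only a placeholder):

import numpy as np
import tensorflow as tf  # TF 1.x, matching tensorflow-gpu==1.8.0 above

# Load the frozen graph produced by export_inference_graph.py.
detection_graph = tf.Graph()
with detection_graph.as_default():
    graph_def = tf.GraphDef()
    with tf.gfile.GFile('inference_graph/frozen_inference_graph.pb', 'rb') as f:
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

with tf.Session(graph=detection_graph) as sess:
    image_tensor = detection_graph.get_tensor_by_name('image_tensor:0')
    boxes = detection_graph.get_tensor_by_name('detection_boxes:0')
    scores = detection_graph.get_tensor_by_name('detection_scores:0')
    classes = detection_graph.get_tensor_by_name('detection_classes:0')
    num = detection_graph.get_tensor_by_name('num_detections:0')

    # Placeholder input: replace with a real image loaded as a (1, H, W, 3) uint8 array.
    image = np.zeros((1, 600, 800, 3), dtype=np.uint8)
    out_boxes, out_scores, out_classes, out_num = sess.run(
        [boxes, scores, classes, num], feed_dict={image_tensor: image})
    print(out_scores[0][:5])  # top detection scores for the first image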
Presentation:
Add a short background/problem description
Diagrams of the methods
After the results, a detailed analysis of the results