
Commit 9b9a3c2

Merge branch 'feature/new_api_documentation' of github.com:tier4/CalibrationTools into feature/new_api_documentation

knzo25 committed Apr 25, 2024
2 parents 896c3bd + efe2ca2
Showing 21 changed files with 236 additions and 86 deletions.
37 changes: 22 additions & 15 deletions sensor/docs/tutorials/mapping_based_calibrator.md
@@ -11,7 +11,7 @@ Installation instructions can be found [here](../../README.md)

## Data preparation

Please download the data (rosbag) from [here](https://drive.google.com/drive/folders/1e0rajkGfXrKl-6E5oouALdbjeva1c5X1?usp=drive_link).
Please download the data (rosbag) from [here](https://drive.google.com/drive/folders/1e0rajkGfXrKl-6E5oouALdbjeva1c5X1).

The rosbag includes four pointcloud topics published by different lidar sensors and also includes `/tf_static` information.
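
If you want to double-check the contents before starting, a quick way is to inspect the bag; this assumes the downloaded file is named `lidar_lidar.db3`, as used later in this tutorial.

```bash
# List the recorded topics, their message types, and message counts
ros2 bag info lidar_lidar.db3

# While the bag is playing, the pointcloud topics and /tf_static should also be visible
ros2 topic list
```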

@@ -23,7 +23,7 @@ The required space for calibration depends on the vehicle and sensors used. For

### Vehicle

When doing the calibration, user needs to drive the vehicle in order to collect the pointcloud for buliding map. While recording the data during the experiment, slow down the vehicle speed as munch as possible. For instance, drive slower than 5 km/hr is a good speed for recording a good data. Also during the experiment, try to avoid people walking around the vehicle, try to make to surrounding static.
When doing the calibration, the user needs to drive the vehicle to collect the point cloud for building the map. While recording data during the experiment, slow down the vehicle's speed as much as possible. For instance, driving slower than 5 km/hr is good for recording quality data. Also, during the experiment, try to avoid people walking around the vehicle and aim to keep the surroundings static.

## Launching the tool

@@ -36,26 +36,28 @@ ros2 run sensor_calibration_manager sensor_calibration_manager

In `project`, select `rdv`, and in `calibrator`, select `mapping_based_calibrator`. Then, press `Continue`.

![segment](../images/mapping_based_calibrator/menu1.jpg)
<p align="center">
<img src="../images/mapping_based_calibrator/menu1.jpg" alt="menu1">
</p>

A menu titled `Launcher configuration` should appear in the UI, and the user may change any parameter they deem convenient. However, for this tutorial, we will use the default values. After configuring the parameters, click `Launch`.

![segment](../images/mapping_based_calibrator/menu2.jpg)
![mapping_based_calibrator](../images/mapping_based_calibrator/menu2.jpg)

The following UI should be displayed. When the `Calibrate` button becomes available, click it.
If it does not become available, it means that either the required `tf` or services are not available.

In this tutorial, since the `tf` data is published by the provided rosbag, run the rosbag (`ros2 bag play lidar_lidar.db3 --clock -r 0.1`) first and launch the tool afterward to trigger the `Calibrate` button.

![segment](../images/mapping_based_calibrator/menu3.jpg)
![mapping_based_calibrator](../images/mapping_based_calibrator/menu3.jpg)
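
For reference, the command order used in this tutorial is sketched below; the terminal split is only illustrative.

```bash
# Terminal 1: play the tutorial rosbag slowly with simulated time
ros2 bag play lidar_lidar.db3 --clock -r 0.1

# Terminal 2 (once the bag is playing): start the calibration manager,
# then select the project/calibrator and press `Calibrate` as described above
ros2 run sensor_calibration_manager sensor_calibration_manager
```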

Note: With the default values in `/calibration_tools/sensor/sensor_calibration_manager/launch/rdv/mapping_based_lidar_lidar_calibrator.launch.xml`, the RDV vehicle sets the top_lidar as the `mapping lidar` and the other lidars as `calibration lidars`.

## Data collection (Mapping & Data pairing)

Once you have clicked the `Calibrate` button, the first step of the calibration process automatically starts building the map using the NDT/GICP algorithm with the `mapping lidar`. You can visualize the map-building process in `rviz`.

![segment](../images/mapping_based_calibrator/map1.jpg)
![mapping_based_calibrator](../images/mapping_based_calibrator/map1.jpg)

You can also see log output in the console showing that the map is being built.

@@ -71,25 +73,25 @@ You can also see the log in the console showing that the map is building.
[mapping_based_calibrator-1] [calibration_mapper]: New frame (id=28 | kid=-1). Distance=2.26 Delta_distance0.11 Delta_time0.10. Unprocessed=0 Frames=29 Keyframes=3
```

When the roabag is finished playing, you should see the pointcloud map and the path of the lidar frames like the picture below.
When the rosbag has finished playing, you should see the point cloud map and the path of the lidar frames, as shown in the picture below.

![segment](../images/mapping_based_calibrator/map2.jpg)
![mapping_based_calibrator](../images/mapping_based_calibrator/map2.jpg)

## Calibration

Calibration starts anytime when the user send the command `ros2 service call /stop_mapping std_srvs/srv/Empty`. User can also send this command before the rosbag ended if user think that the data collection is enough for calibration.
Calibration starts whenever the user sends the command `ros2 service call /stop_mapping std_srvs/srv/Empty`. The user can also send this command before the rosbag ends if they think the data collected is sufficient for calibration.
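
For convenience, the same call is shown below as a standalone snippet; run it from any terminal in which the workspace has been sourced.

```bash
# Stop the mapping stage and begin the lidar-lidar calibration step
ros2 service call /stop_mapping std_srvs/srv/Empty
```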

In this tutorial, we send the command after the rosbag run until the end. Once the command is sent, the displayed text should be as follows:
In this tutorial, we send the command after the rosbag has played to the end. Once the command is sent, the displayed text should be as follows:

```bash
[mapping_based_calibrator-1] [mapping_based_calibrator_node]: Mapper stopped through service (operator()())
[mapping_based_calibrator-1] [calibration_mapper]: Mapping thread is exiting (mappingThreadWorker())
[mapping_based_calibrator-1] [mapping_based_calibrator_node]: Beginning lidar calibration for pandar_front (operator()())
```

The calibration process may take some time as it involves multiple lidars. Users should remain patient and monitor the console output to follow the calibration progress.
The calibration process may take some time, as it involves multiple lidars. Users should remain patient and monitor the console output to track the progress of the calibration.

Once the calibration process is completed, the displayed text should be as follows:
Once the calibration process is complete, the displayed text should be as follows:

```bash
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: Calibration result as a tf main lidar -> lidar_calibrator(pandar_left)
@@ -108,15 +110,20 @@ Once the calibration process is completed, the displayed text should be as follows:

The user can also see three different colors of point cloud in `rviz`: white for the map from the `mapping lidar`, red for the initial map from the `calibration lidars`, and green for the calibrated map from the `calibration lidars`.

![segment](../images/mapping_based_calibrator/map3.jpg)
![mapping_based_calibrator](../images/mapping_based_calibrator/map3.jpg)

## Results

After the calibration process is finished, the sensor_calibration_manager will display the results in the tf tree and allow the user to save the calibration data to a file.
![segment](../images/mapping_based_calibrator/menu4.jpg)

<p align="center">
<img src="../images/mapping_based_calibrator/menu4.jpg" alt="menu4" width="500">
</p>

To assess the calibration results, users can precisely measure static objects within the point cloud map, such as stationary vehicles, traffic cones, and walls.
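
As an additional, optional check, once the calibrated transform is available on `/tf` (or after it has been applied to your sensor description), it can be printed directly with `tf2_echo`; the frame names below are placeholders, so replace them with the actual frames of your `mapping lidar` and one of the `calibration lidars`.

```bash
# Print the transform currently broadcast between two frames (frame names are placeholders)
ros2 run tf2_ros tf2_echo <mapping_lidar_frame> <calibration_lidar_frame>
```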

The image below displays the vehicle within the pointcloud, allowing for a comparison of results before and after calibration. It is evident that the initial point cloud from `calibration lidars` (shown in red) has been successfully calibrated (shown in green) and is now aligned with the `mapping lidar` (shown in white).

![segment](../images/mapping_based_calibrator/vehicle_calibrated.jpg)
<p align="center">
<img src="../images/mapping_based_calibrator/vehicle_calibrated.jpg" alt="vehicle_calibrated" width="500">
</p>
155 changes: 135 additions & 20 deletions sensor/docs/tutorials/marker_radar_lidar_calibrator.md
@@ -1,38 +1,153 @@
# Marker radar lidar calibrator
# marker_radar_lidar_calibrator

Commands for running the tools (make sure to source the setup.bash before launching.)
In this tutorial, we will present a hands-on tutorial of the `marker_radar_lidar_calibrator`. Although we provide pre-recorded rosbags, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors.

Terminal 1: Launch autoware
General documentation regarding this calibrator can be found [here](../../marker_radar_lidar_calibrator/README.md).

```sh
ros2 launch autoware_launch logging_simulator.launch.xml map_path:=/home/yihsiangfang/autoware_map/sample-rosbag vehicle_model:=j6_gen1 sensor_model:=aip_x2 vehicle_id:=j6_gen1_01 rviz:=false localization:=false perception:=true map:=false control:=false planning:=false
```
## Setup

This tutorial assumes that the user has already built the calibration tools.
Installation instructions can be found [here](../../README.md)

## Data preparation

Please download the data (rosbag) from [here](https://drive.google.com/drive/folders/1S3Cz_VomvHBRgiCSt8JCOgN53UGz5TpZ).

The rosbag includes four different topics, including `object_raw`, `pointcloud_raw`, and `/tf_static`.
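
If you want to confirm the topic names and message types before configuring the calibrator (the radar message type matters for the `msg_type` parameter later on), you can inspect the bag first; this assumes the downloaded file is named `radar_lidar.db3`, as used later in this tutorial.

```bash
# List the recorded topics, their message types, and message counts
ros2 bag info radar_lidar.db3
```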

## Environment preparation

### Overall calibration environment

The required space for calibration depends on the vehicle and sensors used. For a normal consumer-level car, a space of `5m x 10m` should be sufficient.

### Radar reflector

The radar reflector is the only moving element during the calibration process and must be detected by both the radar and the lidar. It is recommended to use a tripod to adjust the height of the radar reflector and to align its center with the radar sensor.

## Launching the tool

Terminal 2: Launch the calibration tool
In this tutorial, we will use the X2 vehicle of Tier IV.
First, run the sensor calibration manager:

```sh
```bash
ros2 run sensor_calibration_manager sensor_calibration_manager
```

Change the parameters if needed, and make sure that you select the correct radar name.
In `project`, select `x2`, and in `calibrator`, select `marker_radar_lidar_calibrator`. Then, press `Continue`.

<p align="center">
<img src="../images/marker_radar_lidar_calibrator/menu1.jpg" alt="menu1">
</p>

A menu titled `Launcher configuration` should appear in the UI, and the user may change any parameter they deem convenient.
For this tutorial, we will modify the default value of `radar_name` from `front_left` to `front_center`. For `msg_type` and `transformation_type`, since the `object_raw` topic is of type `radar_tracks` and the radar in this tutorial is 2D, we keep the default values. After configuring the parameters, click `Launch`.

![marker_radar_lidar_calibrator](../images/marker_radar_lidar_calibrator/menu2.jpg)

The following UI should be displayed. When the `Calibrate` button becomes available, click it.
If it does not become available, it means that either the required `tf` or services are not available.

In this tutorial, since the `tf` data is published by the provided rosbag, run the rosbag (`ros2 bag play radar_lidar.db3 --clock -r 0.1`) first and launch the tool afterward to trigger the `Calibrate` button.

![marker_radar_lidar_calibrator](../images/marker_radar_lidar_calibrator/menu3.jpg)

### Extract background model

Once the user starts running the tutorial rosbag, the point cloud will appear in `rviz` as shown in the example below. Press the `Extract Background Model` button in the UI to begin extracting the background.

<p align="center">
<img src="../images/marker_radar_lidar_calibrator/rviz1.jpg" alt="rviz1" width="500">
</p>

Once the user clicks the button, the UI will look like the image below.

<p align="center">
<img src="../images/marker_radar_lidar_calibrator/rviz2.jpg" alt="rviz2" width="500">
</p>

Press the calibrate button to start the tool and then you can start to play the bag
Once the background has been extracted, the UI will look like the image below. The user can see that the `Add lidar-radar pair` button is now enabled.

Terminal 3: Play bag
<p align="center">
<img src="../images/marker_radar_lidar_calibrator/rviz3.jpg" alt="rviz3" width="500">
</p>

```sh
ros2 bag play name_of_rosbag --clock --remap /tf:=/tf_old /tf_static:=/tf_static_old -r 0.2
UI, Rviz and Metric plotter
Also, the following text should be shown in the console.

```bash
[marker_radar_lidar_calibrator]: Finished background model initialization
```

After the lidar pointcloud shows on the rviz.
### Add lidar-radar pair

After the background model has been extracted, the user can carry the radar reflector with the tripod and place it in front of the radar sensor. In the tutorial rosbag, the user will see that both the human and the radar reflector (with tripod) are identified as foreground objects in the image below.

Also, the green points represent the lidar foreground points, while the purple points indicate radar foreground detections. The blue point is the estimated center of the radar reflector derived from the lidar point cloud.

<p align="center">
<img src="../images/marker_radar_lidar_calibrator/add1.jpg" alt="add1" width="300" height="300">
</p>

When the purple line connects the purple point (the radar estimation of the reflector) and the blue point (the lidar estimation of the reflector), the user can press the `Add lidar-radar pair` button to register them as a pair.

Afterward, if the pair that the user added converges, it will become a converged pair, which will then be used for calibration. Additionally, the colors of the markers will change: the white point indicates the lidar estimation, the red point marks the initial radar estimation, and the green point signifies the calibrated estimation.

### Delete previous lidar-radar pair

During the calibration, if there are any mismatched pairs (e.g., a human appearing in front of both the radar and lidar), the user can click the `Delete previous lidar-radar pair` button to remove the previous outlier pair.

Using the tutorial rosbag as an example, we can delete the latest radar and lidar pair by clicking this button. The before and after changes should look like the images shown below.

<table>
<tr>
<td><img src="../images/marker_radar_lidar_calibrator/delete1.jpg" alt="delete1" width = 400px ></td>
<td><img src="../images/marker_radar_lidar_calibrator/delete2.jpg" alt="delete2" width = 400px ></td>
</tr>
<tr>
<td><p style="text-align: center;">Before deletion.</p></td>
<td><p style="text-align: center;">After deletion.</p></td>
</tr>
</table>

The console should also show the following text.

```bash
[marker_radar_lidar_calibrator]: The last track was successfully deleted. Remaining converged tracks: 4
```

### Metric plotter

The tool also provides a metric plotter for real-time visualization, shown below.

![marker_radar_lidar_calibrator](../images/marker_radar_lidar_calibrator/metric_plotter1.jpg)

The subplots at the top display the cross-validation errors, while the bottom subplot shows the average errors in the calibration procedure. Plotting for the average errors begins after three pairs have been collected. For the cross-validation errors, plotting starts after four pairs have been collected.

Consider the top-left subplot, which plots the cross-validation errors for distance, as an example of how these errors are calculated. When the x-axis value is 3, it indicates that we estimate the transformation using 3 samples from the 5 converged tracks. We then calculate the distance errors using the remaining 2 samples. This process is repeated for all 5-choose-3 (5C3) combinations, 10 in total, and the errors are then averaged. The light blue area represents the variance of the 10 calculated distance errors.
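
In other words, for the point at x = 3 with 5 converged tracks, the plotted value and the shaded band correspond to the following computation (a sketch of the procedure described above):

```latex
% Number of train/test splits when calibrating with 3 of the 5 converged tracks
\binom{5}{3} = \frac{5!}{3!\,2!} = 10

% Plotted value: mean distance error over the 10 splits
% Shaded band: the variance of those 10 errors
\bar{e} = \frac{1}{10} \sum_{i=1}^{10} e_i, \qquad
\sigma^2 = \frac{1}{10} \sum_{i=1}^{10} \left( e_i - \bar{e} \right)^2
```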

### Send calibration

The user can click the `Send calibration` button once they are satisfied. It is recommended to collect more pairs to increase the accuracy; therefore, in this tutorial, we add all of the pairs in the rosbag. Additionally, the user can stop the calibration once the cross-validation error curve has converged.

Once the `Send calibration` button is clicked, the result will be sent to the sensor calibration manager. No pairs can be added or deleted afterward, as shown in the images below. Please make sure you want to end the calibration process before clicking the button.

First press the extract background model to extract the background
<table>
<tr>
<td><img src="../images/marker_radar_lidar_calibrator/end_calibration1.jpg" alt = "end_calibration1" width = 700px></td>
<td><img src="../images/marker_radar_lidar_calibrator/end_calibration2.jpg" alt = "end_calibration2" width = 700px></td>
</tr>
<tr>
<td><p style="text-align: center;">Rosbag ended.</p></td>
<td><p style="text-align: center;">After clicking send calibration.</p></td>
</tr>
</table>

Afterward, when the lidar and the radar detection show up, press the add lidar-radar pair to add them for calibration,
## Results

After you add more than three lidar-radar pairs, the metric plotter will show the average calibration error. After the pairs are more than four, the cross-validation error will also show on the plotter with an additional std error.
After the calibration process is finished, the sensor_calibration_manager will display the results in the tf tree and allow the user to save the calibration data to a file.

During the calibration, if you add some pairs that are not stable or mismatched, you can click the delete the previous pair to delete them
<p align="center">
<img src="../images/marker_radar_lidar_calibrator/menu4.jpg" alt="menu4" width="500">
</p>

Finally, when the cross-validation error is converged, you can press the send calibration to stop the calibration and then click the save calibration to save the calibration result in yaml
To evaluate the calibration result, the user can verify that the calibrated radar points (green) are closer to the lidar points than the initial radar points (red).