Commit 5081700

Merge branch 'feature/new_api_documentation' of github.com:tier4/CalibrationTools into feature/new_api_documentation

knzo25 committed Apr 24, 2024
2 parents f6e9583 + 8021480, commit 5081700

Showing 35 changed files with 716 additions and 7 deletions.
6 changes: 3 additions & 3 deletions common/tier4_ground_plane_utils/src/ground_plane_utils.cpp

```diff
@@ -85,9 +85,9 @@ std::tuple<bool, Eigen::Vector4d, pcl::PointCloud<PointType>::Ptr> extractGround
   seg.setMaxIterations(parameters.max_iterations_);

   pcl::PointCloud<PointType>::Ptr iteration_cloud = pointcloud;
-  int iteration_size = iteration_cloud->height * iteration_cloud->width;
+  int iteration_cloud_size = iteration_cloud->height * iteration_cloud->width;

-  while (iteration_size > parameters.min_plane_points_) {
+  while (iteration_cloud_size > parameters.min_plane_points_) {
     seg.setInputCloud(iteration_cloud);
     seg.segment(*inliers, *coefficients);
@@ -176,7 +176,7 @@
     extract.filter(next_cloud);

     iteration_cloud->swap(next_cloud);
-    iteration_size = iteration_cloud->height * iteration_cloud->width;
+    iteration_cloud_size = iteration_cloud->height * iteration_cloud->width;
   }
   return std::make_tuple(false, model, inliers_pointcloud);
 }
```
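The renamed variable tracks the size of the shrinking cloud across segmentation passes: each pass segments the dominant plane, removes its inliers, and re-checks how many points remain. The following is a minimal, self-contained Python sketch of that iterative plane-extraction loop; it is an illustration only (the actual implementation uses PCL's `SACSegmentation` in C++, and the names `segment_plane` and `extract_planes` are hypothetical):

```python
import random


def fit_plane(p1, p2, p3):
    # Plane through three points: normal = (p2 - p1) x (p3 - p1).
    ux, uy, uz = (p2[i] - p1[i] for i in range(3))
    vx, vy, vz = (p3[i] - p1[i] for i in range(3))
    a, b, c = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    d = -(a * p1[0] + b * p1[1] + c * p1[2])
    return a, b, c, d


def segment_plane(points, iterations=100, threshold=0.05):
    # RANSAC-style search for the plane with the most inliers.
    best = []
    for _ in range(iterations):
        a, b, c, d = fit_plane(*random.sample(points, 3))
        norm = (a * a + b * b + c * c) ** 0.5
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c * p[2] + d) / norm < threshold]
        if len(inliers) > len(best):
            best = inliers
    return best


def extract_planes(pointcloud, min_plane_points=10):
    # Mirrors the loop shape in extractGroundPlane: segment the dominant
    # plane, remove its inliers, then re-check the remaining cloud size.
    planes = []
    iteration_cloud = list(pointcloud)
    iteration_cloud_size = len(iteration_cloud)
    while iteration_cloud_size > min_plane_points:
        inliers = segment_plane(iteration_cloud)
        if len(inliers) <= min_plane_points:
            break
        planes.append(inliers)
        removed = set(inliers)
        iteration_cloud = [p for p in iteration_cloud if p not in removed]
        iteration_cloud_size = len(iteration_cloud)
    return planes
```

Tracking the remaining cloud size in the loop condition is what guarantees termination once too few points are left to support another plane.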
122 changes: 122 additions & 0 deletions sensor/docs/tutorials/mapping_based_calibrator.md
# mapping_based_calibrator

This tutorial presents a hands-on walkthrough of the `mapping_based_calibrator`, in particular its lidar-lidar calibration capabilities. Although we provide pre-recorded rosbags, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors.

General documentation regarding this calibrator can be found [here](../../mapping_based_calibrator/README.md).

## Setup

This tutorial assumes that the user has already built the calibration tools.
Installation instructions can be found [here](../../README.md).

## Data preparation

Please download the data (rosbag) from [here](https://drive.google.com/drive/folders/1e0rajkGfXrKl-6E5oouALdbjeva1c5X1?usp=drive_link).

The rosbag includes four pointcloud topics published by different lidar sensors and also includes `/tf_static` information.

## Environment preparation

### Overall calibration environment

The required space for calibration depends on the vehicle and sensors used. For a normal consumer-level car, a space of `5m x 30m` should be sufficient.

### Vehicle

To perform the calibration, the user needs to drive the vehicle in order to collect the pointclouds used for building the map. While recording data during the experiment, drive the vehicle as slowly as possible; for instance, driving slower than 5 km/h is a good speed for recording good data. Also, during the experiment, try to keep the surroundings static and avoid having people walk around the vehicle.

## Launching the tool

In this tutorial, we will use the RDV of Tier IV (R&D Vehicle).
First, run the sensor calibration manager:

```bash
ros2 run sensor_calibration_manager sensor_calibration_manager
```

In `project`, select `rdv`, and in `calibrator`, select `mapping_based_calibrator`. Then, press `Continue`.

![segment](../images/mapping_based_calibrator/menu1.jpg)

A menu titled `Launcher configuration` should appear in the UI, and the user may change any parameter they deem convenient. However, for this tutorial, we will use the default values. After configuring the parameters, click `Launch`.

![segment](../images/mapping_based_calibrator/menu2.jpg)

The following UI should be displayed. When the `Calibrate` button becomes available, click it.
If it does not become available, it means that either the required `tf` or services are not available.

In this tutorial, since the `tf` are published by the provided rosbags, run the bag (`ros2 bag play lidar_lidar.db3 --clock -r 0.1`) first and launch the tools afterward to trigger the `Calibrate` button.

![segment](../images/mapping_based_calibrator/menu3.jpg)

Note: with the default values in `/calibration_tools/sensor/sensor_calibration_manager/launch/rdv/mapping_based_lidar_lidar_calibrator.launch.xml`, the RDV vehicle sets the top lidar as the `mapping lidar` and the other lidars as the `calibration lidars`.

## Data collection (mapping & data pairing)

Once you have clicked the `Calibrate` button, the first step of the calibration process starts automatically: building the map using the NDT/GICP algorithm with the `mapping lidar`. You can visualize the map-building process in `rviz`.

![segment](../images/mapping_based_calibrator/map1.jpg)

You can also see logs in the console showing that the map is being built.

```bash
[mapping_based_calibrator-1] [calibration_mapper]: ROS: New pointcloud. Unprocessed=1 Frames=26 Keyframes=2
[mapping_based_calibrator-1] [calibration_mapper]: Registrator innovation=0.00. Score=0.04
[mapping_based_calibrator-1] [calibration_mapper]: New frame (id=26 | kid=-1). Distance=2.04 Delta_distance0.11 Delta_time0.10. Unprocessed=0 Frames=27 Keyframes=2 (mappingThreadWorker())
[mapping_based_calibrator-1] [calibration_mapper]: ROS: New pointcloud. Unprocessed=1 Frames=27 Keyframes=2
[mapping_based_calibrator-1] [calibration_mapper]: Registrator innovation=0.00. Score=0.04
[mapping_based_calibrator-1] [calibration_mapper]: New frame (id=27 | kid=3). Distance=2.15 Delta_distance0.11 Delta_time0.10. Unprocessed=0 Frames=28 Keyframes=3 (mappingThreadWorker())
[mapping_based_calibrator-1] [calibration_mapper]: ROS: New pointcloud. Unprocessed=1 Frames=28 Keyframes=3
[mapping_based_calibrator-1] [calibration_mapper]: Registrator innovation=0.00. Score=0.01
[mapping_based_calibrator-1] [calibration_mapper]: New frame (id=28 | kid=-1). Distance=2.26 Delta_distance0.11 Delta_time0.10. Unprocessed=0 Frames=29 Keyframes=3
```
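The `Frames` and `Keyframes` counters above grow as the vehicle moves. One common scheme for this kind of mapper, sketched below as a guess at the behavior rather than the calibrator's actual code, is to promote a frame to a keyframe whenever the distance traveled since the last keyframe exceeds a spacing threshold (the `2.0` value is purely illustrative):

```python
def select_keyframes(frame_distances, keyframe_spacing=2.0):
    # frame_distances: cumulative distance traveled at each frame (meters).
    # A frame becomes a keyframe once the distance since the last keyframe
    # exceeds keyframe_spacing. Both names and the spacing are hypothetical.
    keyframes = []
    last_kf_distance = 0.0
    for frame_id, distance in enumerate(frame_distances):
        if distance - last_kf_distance >= keyframe_spacing:
            keyframes.append(frame_id)
            last_kf_distance = distance
    return keyframes
```

This explains why `Keyframes` increases much more slowly than `Frames` in the log: at 0.1 m per frame, many frames pass between consecutive keyframes.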

When the rosbag finishes playing, you should see the pointcloud map and the path of the lidar frames, as in the picture below.

![segment](../images/mapping_based_calibrator/map2.jpg)

## Calibration

Calibration starts whenever the user sends the command `ros2 service call /stop_mapping std_srvs/srv/Empty`. The user can also send this command before the rosbag ends if they consider that enough data has been collected for calibration.

In this tutorial, we send the command after the rosbag has played to the end. Once the command is sent, the displayed text should be as follows:

```bash
[mapping_based_calibrator-1] [mapping_based_calibrator_node]: Mapper stopped through service (operator()())
[mapping_based_calibrator-1] [calibration_mapper]: Mapping thread is exiting (mappingThreadWorker())
[mapping_based_calibrator-1] [mapping_based_calibrator_node]: Beginning lidar calibration for pandar_front (operator()())
```

The calibration process may take some time as it involves multiple lidars. Users should remain patient and monitor the console output to follow the calibration progress.

Once the calibration process is completed, the displayed text should be as follows:

```bash
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: Calibration result as a tf main lidar -> lidar_calibrator(pandar_left)
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: translation:
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: x: -0.001519
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: y: -0.609573
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: z: -0.366957
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: rotation:
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: x: 0.346912
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: y: 0.000018
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: z: -0.005994
[mapping_based_calibrator-1] [lidar_calibrator(pandar_left)]: w: 0.937887
[mapping_based_calibrator-1] [mapping_based_calibrator_node]: Lidar calibration for pandar_left finished
[mapping_based_calibrator-1] [mapping_based_calibrator_node]: Sending the results to the calibrator manager
```
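As a quick sanity check of the result printed above, the rotation quaternion can be converted to roll/pitch/yaw angles. The sketch below uses the standard unit-quaternion-to-Euler conversion and the values from the `pandar_left` log; it is a reader-side check, not part of the calibrator:

```python
import math


def quaternion_to_rpy(x, y, z, w):
    # Standard conversion from a unit quaternion to roll/pitch/yaw (radians).
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw


# Values taken from the pandar_left calibration log above.
x, y, z, w = 0.346912, 0.000018, -0.005994, 0.937887
norm = math.sqrt(x * x + y * y + z * z + w * w)
roll, pitch, yaw = quaternion_to_rpy(x, y, z, w)
```

The norm should be very close to 1 for a valid rotation; the roll of roughly 40 degrees with near-zero pitch and yaw suggests the left lidar is mounted rolled relative to the mapping lidar.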

The user can also see three different colors of pointclouds in `rviz`: white for the map from the `mapping lidar`, red for the initial map from the `calibration lidars`, and green for the calibrated map from the `calibration lidars`.

![segment](../images/mapping_based_calibrator/map3.jpg)

## Results

After the calibration process is finished, the `sensor_calibration_manager` will display the results in the tf tree and allow the user to save the calibration data to a file.
![segment](../images/mapping_based_calibrator/menu4.jpg)

To assess the calibration results, users can precisely measure static objects within the point cloud map, such as stationary vehicles, traffic cones, and walls.

The image below displays the vehicle within the pointcloud, allowing for a comparison of results before and after calibration. It is evident that the initial point cloud from `calibration lidars` (shown in red) has been successfully calibrated (shown in green) and is now aligned with the `mapping lidar` (shown in white).

![segment](../images/mapping_based_calibrator/vehicle_calibrated.jpg)
38 changes: 38 additions & 0 deletions sensor/docs/tutorials/marker_radar_lidar_calibrator.md
# Marker radar lidar calibrator

Commands for running the tool (make sure to source `setup.bash` before launching):

Terminal 1: Launch autoware

```sh
ros2 launch autoware_launch logging_simulator.launch.xml map_path:=/home/yihsiangfang/autoware_map/sample-rosbag vehicle_model:=j6_gen1 sensor_model:=aip_x2 vehicle_id:=j6_gen1_01 rviz:=false localization:=false perception:=true map:=false control:=false planning:=false
```

Terminal 2: Launch the calibration tool

```sh
ros2 run sensor_calibration_manager sensor_calibration_manager
```

Change the parameters if needed, and make sure that you select the correct radar name.

Press the `Calibrate` button to start the tool; then you can start playing the bag.

Terminal 3: Play bag

```sh
ros2 bag play name_of_rosbag --clock --remap /tf:=/tf_old /tf_static:=/tf_static_old -r 0.2
```

The UI, rviz, and the metric plotter will then be displayed.

After the lidar pointcloud shows up in rviz:

First, press the extract background model button to extract the background.

Afterward, when the lidar and radar detections show up, press the add lidar-radar pair button to add them for calibration.

After you add more than three lidar-radar pairs, the metric plotter will show the average calibration error. Once there are more than four pairs, the cross-validation error will also be shown on the plotter, together with its standard deviation.
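The two plotted metrics can be understood with a small sketch. This is a hypothetical illustration only, assuming each accepted pair contributes a scalar residual (e.g., the Euclidean distance between a radar detection and the corresponding transformed lidar detection); the tool's actual metric code may differ:

```python
import math


def average_error(residuals):
    # Mean residual over all accepted lidar-radar pairs.
    return sum(residuals) / len(residuals)


def crossval_error(residuals):
    # Leave-one-out style estimate: average the error obtained when each
    # pair in turn is held out, and report the mean with its spread (std).
    scores = []
    for i in range(len(residuals)):
        held_in = residuals[:i] + residuals[i + 1:]
        scores.append(sum(held_in) / len(held_in))
    mean = sum(scores) / len(scores)
    std = math.sqrt(sum((s - mean) ** 2 for s in scores) / len(scores))
    return mean, std
```

This also shows why the pair-count thresholds exist: the average needs several pairs to be meaningful, and a held-out estimate with a standard deviation only makes sense once there are enough pairs to leave some out.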

During the calibration, if you add pairs that are unstable or mismatched, you can click the delete previous pair button to remove them.

Finally, when the cross-validation error has converged, you can press the send calibration button to stop the calibration, and then click the save calibration button to save the calibration result as a YAML file.
89 changes: 89 additions & 0 deletions sensor/docs/tutorials/tag_based_pnp_calibrator.md
# tag_based_pnp_calibrator

This tutorial presents a hands-on walkthrough of the `tag_based_pnp_calibrator`. Although we provide pre-recorded rosbags, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors.

General documentation regarding this calibrator can be found [here](../../tag_based_pnp_calibrator/README.md).

## Setup

This tutorial assumes that the user has already built the calibration tools.
Installation instructions can be found [here](../../README.md).

## Data preparation

Please download the data (rosbag) from [here](https://drive.google.com/drive/folders/1gFrrchW9mWM1huWMYuJ0nWB2n1BfCJag?usp=drive_link).

The rosbag includes four topics: `camera_info`, `image_rect_color/compressed`, `pointcloud_raw`, and `/tf_static`.

## Environment preparation

### Overall calibration environment

The required space for calibration depends on the vehicle and sensors used. For a normal consumer-level car, a space of `5m x 10m` should be sufficient.

### Apriltag

Apriltags are the only moving elements during the calibration process and must be detected by both the cameras and the lidars.
Depending on the lidar model and the available space, the required Apriltag size may vary, but so far we have had good results with 0.6m and 0.8m tags (the provided sizes correspond to an edge's length; in these cases the payloads are 0.45m and 0.6m).

In addition to the Apriltag size, which determines the physical positions in which a tag can be detected, the structure on which the Apriltag is mounted is of equal importance. Depending on the structure's shape and size, it may interfere with the lidar detection algorithm, so it is recommended to prepare a mount that holds the tag in a way that is not visible to the sensor (see the provided example).

## Launching the tool

In this tutorial, we will use the X2 vehicle of Tier IV.
First, run the sensor calibration manager:

```bash
ros2 run sensor_calibration_manager sensor_calibration_manager
```

In `project`, select `x2`, and in `calibrator`, select `tag_based_pnp_calibrator`. Then, press `Continue`.

![segment](../images/tag_based_pnp_calibrator/menu1.jpg)

A menu titled `Launcher configuration` should appear in the UI, and the user may change any parameter they deem convenient.
For this tutorial, we will modify the default value of `calibration_pairs` from `9` to `8`, since the bag contains 8 Apriltag detections, and also change `camera_name` from `camera0` to `camera6`. After configuring the parameters, click `Launch`.

![segment](../images/tag_based_pnp_calibrator/menu2.jpg)

The following UI should be displayed. When the `Calibrate` button becomes available, click it.
If it does not become available, it means that either the required `tf` or services are not available.

In this tutorial, since the `tf` are published by the provided rosbags, run the bag (`ros2 bag play camera_lidar.db3 --clock -r 0.1`) first and launch the tools afterward to trigger the `Calibrate` button.

![segment](../images/tag_based_pnp_calibrator/menu3.jpg)

## Calibration

The calibration starts automatically after clicking the `Calibrate` button. The tool will keep collecting lidartag and apriltag detections until the number of detections reaches the user-defined `calibration_pairs` set in the `Launcher configuration`.

When the user starts the calibration, `rviz` and the `image view` should be displayed as below.

![segment](../images/tag_based_pnp_calibrator/visualization1.jpg)

After the tool detects the lidartag and apriltag, it will show the detection markers in `rviz` and in the `image view`. The text in rviz also displays the current number of lidartag and apriltag detection pairs.

![segment](../images/tag_based_pnp_calibrator/visualization2.jpg)

Once the user gets a converged detection, they can start moving the tag to another position. Please make sure the moving distance is larger than `calibration_min_pair_distance`, and also make sure the tag stays within the FOV of both the lidar and the camera.
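The distance requirement can be checked mechanically: a new tag position should only be accepted if it is far enough from every previously collected one. Below is a minimal sketch of such a check; the helper name and the `1.5` m default are hypothetical illustrations, not the calibrator's actual code or default:

```python
import math


def far_enough(new_pos, previous_positions, min_pair_distance=1.5):
    # Accept the new tag position (x, y, z in meters) only if it is at least
    # min_pair_distance away from every previously collected position.
    return all(
        math.dist(new_pos, p) >= min_pair_distance for p in previous_positions
    )
```

Spreading the tag positions out like this is what makes the PnP problem well conditioned: nearly coincident pairs add little new constraint to the camera-lidar transform.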

At the end of the calibration, we get the 8 detection pairs shown below.

![segment](../images/tag_based_pnp_calibrator/visualization3.jpg)

## Results

After the calibration process is finished, the `sensor_calibration_manager` will display the results in the tf tree and allow the user to save the calibration data to a file.
![segment](../images/tag_based_pnp_calibrator/menu4.jpg)

The user can modify the `visualization options` on the right side of the `image view`. To compare the results more easily, set the `Marker size (m)` to `0.04` and the `PC subsample factor` to `1`.

![segment](../images/tag_based_pnp_calibrator/visualization_bar.jpg)

After setting the options above, change `/initial_tf` (in the `visualization options`) to `/current_tf`. By doing this, the user can easily measure the difference produced by the calibration.

The images below show that, with the calibrated transformation, the projected pointcloud aligns better with the image.

![segment](../images/tag_based_pnp_calibrator/init_tf.jpg)

![segment](../images/tag_based_pnp_calibrator/calibrated_tf.jpg)