diff --git a/sensor/docs/images/marker_radar_lidar_calibrator/menu2.jpg b/sensor/docs/images/marker_radar_lidar_calibrator/menu2.jpg
index d497e80d..350a9720 100644
Binary files a/sensor/docs/images/marker_radar_lidar_calibrator/menu2.jpg and b/sensor/docs/images/marker_radar_lidar_calibrator/menu2.jpg differ
diff --git a/sensor/docs/tutorials/mapping_based_calibrator.md b/sensor/docs/tutorials/mapping_based_calibrator.md
index 335bf80d..51f0c7de 100644
--- a/sensor/docs/tutorials/mapping_based_calibrator.md
+++ b/sensor/docs/tutorials/mapping_based_calibrator.md
@@ -1,6 +1,6 @@
 # mapping_based_calibrator
 
-In this tutorial, we will present a hands-on tutorial of the `mapping_based_calibrator`, in particular, of its lidar-lidar calibration capabilities. Although we provide pre-recorded rosbags, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors.
+In this tutorial, we present a hands-on walkthrough of the `mapping_based_calibrator`, in particular of its lidar-lidar calibration capabilities. Although we provide a pre-recorded rosbag, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors.
 
 General documentation regarding this calibrator can be found [here](../../mapping_based_calibrator/README.md).
 
@@ -47,7 +47,7 @@ A menu titled `Launcher configuration` should appear in the UI, and the user may
 
 The following UI should be displayed. When the `Calibrate` button becomes available, click it. If it does not become available, it means that either the required `tf` or services are not available.
 
-In this tutorial, since the `tf` are published by the provided rosbags, run the rag (`ros2 bag play lidar_lidar.db3 --clock -r 0.1`) first and launch the tools afterward to trigger the `Calibrate` button.
+In this tutorial, since the `tf` are published by the provided rosbag, play the rosbag (`ros2 bag play lidar_lidar.db3 --clock -r 0.1`) first and launch the tools afterward to trigger the `Calibrate` button.
 
 ![mapping_based_calibrator](../images/mapping_based_calibrator/menu3.jpg)
 
@@ -55,7 +55,7 @@ Note: In the default values in the `/calibration_tools/sensor/sensor_calibration
 
 ## Data collection (Mapping & Data paring)
 
-Once you have clicked the `Calibrate` button, the first step of calibration process will automatically start building the map by using NDT/GICP algorithm with the `mapping lidar`. You can visualize process of building the map on the `rviz`.
+Once you have clicked the `Calibrate` button, the first step of the calibration process automatically starts building the map with the `mapping lidar`, using the NDT/GICP algorithm. You can visualize the map-building process in `rviz`.
 
 ![mapping_based_calibrator](../images/mapping_based_calibrator/map1.jpg)
 
@@ -79,7 +79,7 @@ When the rosbag has finished playing, you should see the point cloud map and the
 
 ## Calibration
 
-Calibration starts anytime when the user sends the command `ros2 service call /stop_mapping std_srvs/srv/Empty`. User can also send this command before the rosbag ends if they think the data collected is sufficient for calibration.
+Calibration starts whenever the user sends the command `ros2 service call /stop_mapping std_srvs/srv/Empty`. The user can also send this command before the rosbag ends if they think the data collected is sufficient for calibration.
 In this tutorial, we send the command after the rosbag runs until the end.
 
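+For reference, the complete command sequence used in this tutorial is sketched below. This is only a convenience summary of the commands already mentioned above; the rosbag path assumes the file was downloaded to the current directory:
+
+```bash
+# Terminal 1: play the tutorial rosbag slowly, publishing /clock for the tools
+ros2 bag play lidar_lidar.db3 --clock -r 0.1
+
+# Terminal 2: once enough data has been collected (here, after the bag ends),
+# stop mapping and start the calibration step
+ros2 service call /stop_mapping std_srvs/srv/Empty
+```
+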
 Once the command is sent, the displayed text should be as follows:
diff --git a/sensor/docs/tutorials/marker_radar_lidar_calibrator.md b/sensor/docs/tutorials/marker_radar_lidar_calibrator.md
index b3f0308a..0efedc84 100644
--- a/sensor/docs/tutorials/marker_radar_lidar_calibrator.md
+++ b/sensor/docs/tutorials/marker_radar_lidar_calibrator.md
@@ -1,6 +1,6 @@
 # marker_radar_lidar_calibrator
 
-In this tutorial, we will present a hands-on tutorial of the `marker_radar_lidar_calibrator`. Although we provide pre-recorded rosbags, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors.
+In this tutorial, we present a hands-on walkthrough of the `marker_radar_lidar_calibrator`. Although we provide a pre-recorded rosbag, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors.
 
 General documentation regarding this calibrator can be found [here](../../marker_radar_lidar_calibrator/README.md).
 
@@ -23,7 +23,7 @@ The required space for calibration depends on the vehicle and sensors used. For
 
 ### Radar reflector
 
-Radar reflector is the only moving element during the calibration process and must be detected by both radar and lidar. It is recommended to utilize a tripod to adjust the height of the radar reflector and also modify its center to align with the radar sensor.
+The radar reflector is the only moving element during the calibration process and must be detected by both the radar and the lidar. It is recommended to use a tripod to adjust the height of the radar reflector and to align its center with the radar sensor.
 
 ## Launching the tool
 
@@ -41,14 +41,14 @@ In `project`, select `x2`, and in `calibrator`, select `marker_radar_lidar_calib

 A menu titled `Launcher configuration` should appear in the UI, and the user may change any parameter he deems convenient.
-For this tutorial, we will modify the default value `radar_name` from `front_left` to `front_center`. For the `msg_type` and `transformation_type`, as `object_raw` topic is type of `radar_tracks` and the radar type is 2D in this tutorial, we keep them as default. After configuring the parameters, click `Launch`.
+For this tutorial, we will modify the default value of `radar_name` from `front_left` to `front_center`. After configuring the parameters, click `Launch`.
 
 ![marker_radar_lidar_calibrator](../images/marker_radar_lidar_calibrator/menu2.jpg)
 
 The following UI should be displayed. When the `Calibrate` button becomes available, click it. If it does not become available, it means that either the required `tf` or services are not available.
 
-In this tutorial, since the `tf` are published by the provided rosbags, run the rag (`ros2 bag play radar_lidar.db3 --clock -r 0.1`) first and launch the tools afterward to trigger the `Calibrate` button.
+In this tutorial, since the `tf` are published by the provided rosbag, play the rosbag (`ros2 bag play radar_lidar.db3 --clock -r 0.1`) first and launch the tools afterward to trigger the `Calibrate` button.
 
 ![marker_radar_lidar_calibrator](../images/marker_radar_lidar_calibrator/menu3.jpg)
 
@@ -60,19 +60,19 @@ Once the user starts running the tutorial rosbag, the point cloud will appear in
 
 rviz1

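+If the point cloud does not appear, or the `Calibrate` button mentioned above never became available, the prerequisites can be checked with standard ROS 2 CLI tools. The sketch below is a generic troubleshooting aid rather than part of the original tutorial; the exact topic, service, and frame names depend on the vehicle configuration:
+
+```bash
+# Confirm that the sensor topics and the calibrator services are up
+ros2 topic list
+ros2 service list
+
+# Dump the current tf tree to a PDF to confirm the required frames are published
+# (on older ROS 2 distributions the executable is named view_frames.py)
+ros2 run tf2_tools view_frames
+```
+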
-Once user click the button, it will show like the image below.
+Once the user clicks the button, the view should look like the image below.

rviz2

-Once the background is extracted, it will show like the image below. User can see that there are the `Add lidar-radar pair` button is enabled.
+Once the background is extracted, the view should look like the image below. The user can see that the `Add lidar-radar pair` button is now enabled.

rviz3

-Also, following text should be shown in the console.
+Also, the following text should be shown in the console.
 
 ```bash
 [marker_radar_lidar_calibrator]: Finished background model initialization
 ```
 
@@ -117,19 +117,19 @@ The console should also show the following text.
 
 ### Metric plotter
 
-The tool also provides a metric plotter for real time visualization shown in below.
+The tool also provides a metric plotter for real-time visualization, shown below.
 
 ![marker_radar_lidar_calibrator](../images/marker_radar_lidar_calibrator/metric_plotter1.jpg)
 
 The subplots at the top display the cross-validation errors, while the bottom subplot shows the average errors in the calibration procedure. Plotting for the average errors begins after three pairs have been collected. For the cross-validation errors, plotting starts after four pairs have been collected.
 
-Consider the left-top subplot, which plots the cross-validation errors for distance, as an example of how these errors are calculated. When the x-axis value is 3, it indicates that we estimate the transformation using 3 samples from the 5 converged tracks. We then calculate the distance errors using the remaining 2 samples. This process is repeated for 5 choose 3 (5C3) times, which totals 10 times, and the errors are then averaged. he light blue area represents the variance of the 10 calculated distance errors.
+Consider the top-left subplot, which plots the cross-validation errors for distance, as an example of how these errors are calculated. When the x-axis value is 3, it indicates that we estimate the transformation using 3 samples from the 5 converged tracks. We then calculate the distance errors using the remaining 2 samples. This process is repeated for all 5-choose-3 (5C3 = 10) combinations, and the errors are then averaged. The light blue area represents the variance of the 10 calculated distance errors.
 
 ### Send calibration
 
-User can click the `Send calibration` button once they are satisfied. It is recommended that user collect more pairs to increase the accuarcy. Therefore, in this tutorial, we will add all of the pairs in the rosbag. Additional, user can also stop the calibration when the line in the cross validation error is converged.
+The user can click the `Send calibration` button once they are satisfied. It is recommended that the user collect more pairs to increase the accuracy; therefore, in this tutorial, we will add all of the pairs in the rosbag. Additionally, the user can also stop the calibration once the cross-validation error curve has converged.
 
-Once the `Send calibration` button are clicked, the result will be sent to the sensor calibration manager. No pairs can be added or deleted afterward like the image shown below. Please make sure you want to end the calibration process when you click the button.
+Once the `Send calibration` button is clicked, the result will be sent to the sensor calibration manager. No pairs can be added or deleted afterward, as shown in the image below. Please make sure you want to end the calibration process when you click the button.
 
@@ -144,10 +144,10 @@ Once the `Send calibration` button are clicked, the result will be sent to the s
 
 ## Results
 
-After the calibration process is finished, the sensor_calibration_manager will display the results in the tf tree and allow user to save the calibration data to a file.
+After the calibration process is finished, the sensor_calibration_manager will display the results in the tf tree and allow the user to save the calibration data to a file.

menu4

-To evaluate the calibration result, user can measure the calibrated radar points (green) are closer than the initial radar points (red) to the lidar points.
+To evaluate the calibration result, the user can verify that the calibrated radar points (green) are closer than the initial radar points (red) to the lidar points.
diff --git a/sensor/docs/tutorials/tag_based_pnp_calibrator.md b/sensor/docs/tutorials/tag_based_pnp_calibrator.md
index 7ad1bcb5..87a04771 100644
--- a/sensor/docs/tutorials/tag_based_pnp_calibrator.md
+++ b/sensor/docs/tutorials/tag_based_pnp_calibrator.md
@@ -1,6 +1,6 @@
 # tag_based_pnp_calibrator
 
-In this tutorial, we will present a hands-on tutorial of the `tag_based_pnp_calibrator`. Although we provide pre-recorded rosbags, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors.
+In this tutorial, we present a hands-on walkthrough of the `tag_based_pnp_calibrator`. Although we provide a pre-recorded rosbag, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors.
 
 General documentation regarding this calibrator can be found [here](../../tag_based_pnp_calibrator/README.md).
 
@@ -51,37 +51,37 @@ For this tutorial, we will modify the default values `calibration_pairs` from `9
 
 The following UI should be displayed. When the `Calibrate` button becomes available, click it. If it does not become available, it means that either the required `tf` or services are not available.
 
-In this tutorial, since the `tf` are published by the provided rosbags, run the rag (`ros2 bag play camera_lidar.db3 --clock -r 0.1`) first and launch the tools afterward to trigger the `Calibrate` button.
+In this tutorial, since the `tf` are published by the provided rosbag, play the rosbag (`ros2 bag play camera_lidar.db3 --clock -r 0.1`) first and launch the tools afterward to trigger the `Calibrate` button.
 
 ![segment](../images/tag_based_pnp_calibrator/menu3.jpg)
 
 ## Calibration
 
-The calibration start automatically after click the `Calibrate` button. It will keep calibrate the LidarTag detections and AprilTag detections until the number of the detections fit the user defined `calibration_pairs` in the `Launcher configuration`.
+The calibration starts automatically after clicking the `Calibrate` button. It keeps collecting LidarTag and AprilTag detections until the number of detection pairs reaches the user-defined `calibration_pairs` in the `Launcher configuration`.
 
 When user start the calibration, `rviz` and the `image view` should be displayed like below.
 
 ![segment](../images/tag_based_pnp_calibrator/visualization1.jpg)
 
-After the tools detect the LidarTag and AprilTag, it will shows the detection markers on the `rviz` and the `image view`. The text in the rviz will also display the current number of pairs of lidar detections and AprilTag detections.
+After the tools detect the LidarTag and AprilTag, the detection markers are shown in `rviz` and the `image view`. The text in rviz also displays the current number of pairs of LidarTag detections and AprilTag detections.
 
 ![segment](../images/tag_based_pnp_calibrator/visualization2.jpg)
 
-Once user get the converged detection, user can start moving the tag to another position. Please make sure the moving distance is larger than the `calibration_min_pair_distance` and also make sure the tag is in the view of FOV of the lidar and camera.
+Once the user gets a converged detection, the user can start moving the tag to another position. Please make sure the moving distance is larger than `calibration_min_pair_distance`, and also make sure the tag is within the FOV of both the lidar and the camera.
 
-In the end of the calibration, we can get 8 detection pairs which shown as below.
+At the end of the calibration, we obtain 8 detection pairs, as shown below.
 
 ![segment](../images/tag_based_pnp_calibrator/visualization3.jpg)
 
 ## Results
 
-After the calibration process is finished, the sensor_calibration_manager will display the results in the tf tree and allow user to save the calibration data to a file.
+After the calibration process is finished, the sensor_calibration_manager will display the results in the tf tree and allow the user to save the calibration data to a file.

menu4

-User can modify the `visualization options` in the right side of the `image view`. To compare the results, please set the `Marker size (m)` to `0.04` and set the `PC subsample factor` to `1`.
+The user can modify the `visualization options` on the right side of the `image view`. To compare the results, please set the `Marker size (m)` to `0.04` and set the `PC subsample factor` to `1`.

 visualization_bar
diff --git a/sensor/mapping_based_calibrator/README.md b/sensor/mapping_based_calibrator/README.md
index 51940473..3b08f033 100644
--- a/sensor/mapping_based_calibrator/README.md
+++ b/sensor/mapping_based_calibrator/README.md
@@ -4,7 +4,7 @@ A tutorial for this calibrator can be found [here](../docs/tutorials/mapping_bas
 
 ## Purpose
 
-The package `mapping_based_calibrator` allows extrinsic calibration among lidar sensor and lidar sensor used in autonomous driving and robotics.
+The package `mapping_based_calibrator` allows extrinsic calibration between lidar sensors used in autonomous driving and robotics.
 
 Note: depending on how this tool is configured it can perform the following calibrations:
 
@@ -19,7 +19,7 @@ This algorithm aims to calibrate multiple lidars by using registration algorithm
 
 #### Step 1: Mapping (using mapping lidar)
 
-First of all, the calibrator will designate one of the lidars (as defined in the launch file) as the mapping lidar for mapping purposes. The point cloud from this lidar utilizes either the NDT or GICP algorithm to calculate the pose, and also stores the point cloud as a map for future usage.
+First of all, the calibrator designates one of the lidars (as defined in the launch file) as the mapping lidar. The point cloud from this lidar is registered with either the NDT or GICP algorithm to calculate the pose, and the point cloud is also stored as a map for later use.
 
 #### Step 2: Calibration data preparation (using calibration lidars)
 
diff --git a/sensor/marker_radar_lidar_calibrator/README.md b/sensor/marker_radar_lidar_calibrator/README.md
index 4d0d32ed..e9440d01 100644
--- a/sensor/marker_radar_lidar_calibrator/README.md
+++ b/sensor/marker_radar_lidar_calibrator/README.md
@@ -4,7 +4,7 @@ A tutorial for this calibrator can be found [here](../docs/tutorials/marker_rada
 
 ## Purpose
 
-The package `marker_radar_lidar_calibrator` allows extrinsic calibration among radar sensor and lidar sensor used in autonomous driving and robotics.
+The package `marker_radar_lidar_calibrator` allows extrinsic calibration between the radar sensor and the lidar sensor used in autonomous driving and robotics.
 
 ## Inner-workings / Algorithms
 
@@ -34,7 +34,7 @@ Additionally, we provide a metric plotter that can indicate whether the calibrat
 
 ### Diagram
 
-Below, you can see the how the algorithm is implemented in the `marker_radar_lidar_calibrator` package.
+Below, you can see how the algorithm is implemented in the `marker_radar_lidar_calibrator` package.
 
 ![marker_radar_lidar_calibrator](../docs/images/marker_radar_lidar_calibrator/marker_radar_lidar_calibrator.jpg)
diff --git a/sensor/marker_radar_lidar_calibrator/src/marker_radar_lidar_calibrator.cpp b/sensor/marker_radar_lidar_calibrator/src/marker_radar_lidar_calibrator.cpp
index 38f8fe14..3fa24520 100644
--- a/sensor/marker_radar_lidar_calibrator/src/marker_radar_lidar_calibrator.cpp
+++ b/sensor/marker_radar_lidar_calibrator/src/marker_radar_lidar_calibrator.cpp
@@ -1557,7 +1557,7 @@ void ExtrinsicReflectorBasedCalibrator::visualizationMarkers(
   marker.scale.y = parameters_.reflector_radius;
   marker.scale.z = parameters_.reflector_radius;
   marker.color.a = 0.6;
-  marker.color.r = 1.0;
+  marker.color.r = 0.0;  // lidar detection markers are now blue (r=0, g=0, b=1) instead of magenta
   marker.color.g = 0.0;
   marker.color.b = 1.0;
   lidar_detections_marker_array.markers.push_back(marker);
diff --git a/sensor/tag_based_pnp_calibrator/README.md b/sensor/tag_based_pnp_calibrator/README.md
index 57303b6f..94e0c226 100644
--- a/sensor/tag_based_pnp_calibrator/README.md
+++ b/sensor/tag_based_pnp_calibrator/README.md
@@ -4,19 +4,19 @@ A tutorial for this calibrator can be found [here](../docs/tutorials/tag_based_p
 
 ## Purpose
 
-The package `tag_based_pnp_calibrator` allows extrinsic calibration among Camera sensor and lidar sensor used in autonomous driving and robotics.
+The package `tag_based_pnp_calibrator` allows extrinsic calibration between the camera sensor and the lidar sensor used in autonomous driving and robotics.
 
 ## Inner-workings / Algorithms
 
-The `tag_based_pnp_calibrator` utilizes the PnP algorithm to calculate the transformation between the lidar and camera. To run this package, you also need to operate the `apriltag_ros` package and the `lidartag` package to calculate the transformation.
+The `tag_based_pnp_calibrator` utilizes the PnP algorithm to calculate the transformation between the camera and the lidar. To run this package, you also need to run the `apriltag_ros` package and the `lidartag` package, which provide the tag detections used to calculate the transformation.
 
 The `apriltag_ros` package detects the AprilTag and outputs the detection results. Conversely, the `lidartag` package detects the LidarTag and outputs its detection results.
 
-The `tag_based_pnp_calibrator` utilizes the detections from both apriltag_ros and lidartag, employing a Kalman Filter to track these detections. If the detections converge, the calibrator applies the SQPnP algorithm provided by OpenCV to estimate the transformation between the image points from AprilTag and the object points from LidarTag.
+The `tag_based_pnp_calibrator` utilizes the detections from both `apriltag_ros` and `lidartag`, employing a Kalman filter to track these detections. If the detections converge, the calibrator applies the SQPnP algorithm provided by OpenCV to estimate the transformation between the image points from the AprilTag and the object points from the LidarTag.
 
 ### Diagram
 
-Below, you can see the how the algorithm is implemented in the `tag_based_pnp_calibrator` package.
+Below, you can see how the algorithm is implemented in the `tag_based_pnp_calibrator` package.
 
 ![segment](../docs/images/tag_based_pnp_calibrator/tag_based_pnp_calibrator.jpg)
 
@@ -99,10 +99,6 @@ References/External links
 
 ## Known issues/limitations
 
-Our version of LidarTag only supports the family `16h5`
-
-Our codebase only supports AprilTag detections for `36h11`
-
 ## Pro tips/recommendations
 
 During calibration, ensure that the lidar scan covers the tag, similar to the first example shown in the image below.
 However, if the tag resolution is low, as in the second example, and the lidar still detects the tag, it is acceptable. The third example demonstrates a scenario where the lidar scan fails to cover the tag, resulting in the inability to detect the LidarTag.
@@ -111,7 +107,7 @@ During calibration, ensure that the lidar scan covers the tag, similar to the fi
 
 lidarscan_on_tag

-Also noted that when doing the calibration, it is necessary to rotate the tag in order to face to the camera like the image shown below.
+Also note that when doing the calibration, it is necessary to rotate the tag so that it faces the camera, as in the image shown below.

tag_position
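+
+As a final sanity check after saving the results, the calibrated transform can be inspected directly from the tf tree. This is a generic suggestion rather than part of the documented workflow, and `<parent_frame>`/`<child_frame>` are placeholders for the actual sensor frames used in the project:
+
+```bash
+# Print the transform published between the two calibrated sensor frames
+ros2 run tf2_ros tf2_echo <parent_frame> <child_frame>
+```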