From 3b8a2f086f557b00e673915218eaa9d72e458c9e Mon Sep 17 00:00:00 2001 From: Kenzo Lobos-Tsunekawa Date: Tue, 24 Sep 2024 21:09:58 +0900 Subject: [PATCH] chore: fixed dead links and spelling Signed-off-by: Kenzo Lobos-Tsunekawa --- .cspell.json | 1 + ...e_camera_lidar_calibrator_issue.launch.xml | 42 ------------------- .../tag_based_pnp_calibrator/README.md | 2 +- .../tag_based_sfm_calibrator/README.md | 2 +- docs/tutorials/mapping_based_calibrator.md | 6 +-- .../marker_radar_lidar_calibrator.md | 10 ++--- docs/tutorials/tag_based_pnp_calibrator.md | 4 +- docs/tutorials/tag_based_sfm_calibrator.md | 6 +-- 8 files changed, 16 insertions(+), 57 deletions(-) delete mode 100644 calibrators/sensor_calibration_manager/launch/default_project/interactive_camera_lidar_calibrator_issue.launch.xml diff --git a/.cspell.json b/.cspell.json index 7f5a192a..2a97aa78 100644 --- a/.cspell.json +++ b/.cspell.json @@ -37,6 +37,7 @@ "imread", "imshow", "imwrite", + "innoviz", "intrinsics", "kalman", "keyframes", diff --git a/calibrators/sensor_calibration_manager/launch/default_project/interactive_camera_lidar_calibrator_issue.launch.xml b/calibrators/sensor_calibration_manager/launch/default_project/interactive_camera_lidar_calibrator_issue.launch.xml deleted file mode 100644 index 225a0eac..00000000 --- a/calibrators/sensor_calibration_manager/launch/default_project/interactive_camera_lidar_calibrator_issue.launch.xml +++ /dev/null @@ -1,42 +0,0 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - diff --git a/calibrators/tag_based_pnp_calibrator/README.md b/calibrators/tag_based_pnp_calibrator/README.md index 809ed5e0..a9a26290 100644 --- a/calibrators/tag_based_pnp_calibrator/README.md +++ b/calibrators/tag_based_pnp_calibrator/README.md @@ -103,7 +103,7 @@ A complete example with real data is provided in our [tutorial](../../docs/tutor - Place the available tags within the field-of-view of the sensors (see the [Pro tips/recommendations](#pro-tipsrecommendations) for more information). - Ensure that both sensors detect the tags, and wait until the detections converge. - Once the detections are added to the calibration data, move the tags to a new location, making sure to fulfill the `new_hypothesis_distance` criteria. -- If needed, just after moving the tag, stop its oscilattions with your hand. +- If needed, just after moving the tag, stop its oscillations with your hand. - Repeat the process until `calibration_convergence_min_pairs` pairs have been obtained and the calibration process finishes. ## Known issues/limitations diff --git a/calibrators/tag_based_sfm_calibrator/README.md b/calibrators/tag_based_sfm_calibrator/README.md index f587a3f1..7d18300c 100644 --- a/calibrators/tag_based_sfm_calibrator/README.md +++ b/calibrators/tag_based_sfm_calibrator/README.md @@ -212,7 +212,7 @@ Any intrinsic calibration board that can be used for this purpose is acceptable, A complete example with real data is provided in our [tutorial](../../docs/tutorials/tag_based_sfm_calibrator.md), so we recommend users interested in this tools to get hands-on-experience directly with that example. Nevertheless, the calibration process consists of the following steps: -- External camera intrinsic calibration. This can be done directly throught this tool if you have either apriltags or dot boards. +- External camera intrinsic calibration. This can be done directly with this tool if you have either apriltags or dot boards. 
- Place waypoint tags in positions where the calibration sensors can detect them accurately. - Obtain detections with the calibration sensors (through the UI). - Using the external camera, obtain samples of the environment, taking care to avoid motion blur. The user should focus on connecting the calibrator sensors by taking photos containing multiple waypoints, or at least photos that contain one waypoint and multiple other tags. diff --git a/docs/tutorials/mapping_based_calibrator.md b/docs/tutorials/mapping_based_calibrator.md index 81a08e11..0186b9a4 100644 --- a/docs/tutorials/mapping_based_calibrator.md +++ b/docs/tutorials/mapping_based_calibrator.md @@ -2,12 +2,12 @@ In this tutorial, we present a hands-on tutorial of the `mapping_based_calibrator`, particularly its lidar-lidar calibration capabilities. Although we provide a pre-recorded rosbag, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors. -General documentation regarding this calibrator can be found [here](../../mapping_based_calibrator/README.md). +General documentation regarding this calibrator can be found [here](../../calibrators/mapping_based_calibrator/README.md). ## Setup This tutorial assumes that the user has already built the calibration tools. -Installation instructions can be found [here](../../../README.md). +Installation instructions can be found [here](../../README.md). ## Data preparation @@ -154,7 +154,7 @@ The image below displays the vehicle within the pointcloud, allowing for a compa - Mapping failed. Acceleration is too high. - Bad initial calibration is also a common cause for failures in the calibration process. If the mapping succeeds and there are good calibration features, but the calibration still fails, it is usually due to the nature of classic pointcloud registration algorithms. - Check the `RViz` to see if any keyframe number on the path is red (normally it is white). If it is red, there is a chance that the motion of the vehicle is not smooth. We recommend that the user calibrate again with more stable movement. - - If it is not feasible to restart the experiment, the user could tune the parameters in the `Calibration criteria parameters` described in the [documentation](../../mapping_based_calibrator/README.md). However, keep in mind that if the user sets the threshold too high, the accuracy of the calibration result will also decrease. + - If it is not feasible to restart the experiment, the user could tune the parameters in the `Calibration criteria parameters` described in the [documentation](../../calibrators/mapping_based_calibrator/README.md). However, keep in mind that if the user sets the threshold too high, the accuracy of the calibration result will also decrease. - Check whether all of the lidars apply time synchronization. - Make sure that the environment is rich in natural landmarks suitable for registration-based mapping in all directions. This will help the lidar capture sufficient details beyond simple features like lane surfaces or walls. diff --git a/docs/tutorials/marker_radar_lidar_calibrator.md b/docs/tutorials/marker_radar_lidar_calibrator.md index 7c92ecb5..a50a2ee0 100644 --- a/docs/tutorials/marker_radar_lidar_calibrator.md +++ b/docs/tutorials/marker_radar_lidar_calibrator.md @@ -2,12 +2,12 @@ In this tutorial, we present a hands-on tutorial of the `marker_radar_lidar_calibrator`.
Although we provide a pre-recorded rosbag, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors. -General documentation regarding this calibrator can be found [here](../../marker_radar_lidar_calibrator/README.md). +General documentation regarding this calibrator can be found [here](../../calibrators/marker_radar_lidar_calibrator/README.md). ## Setup This tutorial assumes that the user has already built the calibration tools. -Installation instructions can be found [here](../../../README.md). +Installation instructions can be found [here](../../README.md). ## Data preparation @@ -27,7 +27,7 @@ On the other side, the ground being leveled is not always a hard requirement. Ho ### Radar reflector -It is recommended to utilize a tripod to adjust the height of the radar reflector and also modify its center to align with the radar sensor. More information about the radar reflector can be found in the [general documentation](../../marker_radar_lidar_calibrator/README.md#radar-reflector). +It is recommended to utilize a tripod to adjust the height of the radar reflector and also modify its center to align with the radar sensor. More information about the radar reflector can be found in the [general documentation](../../calibrators/marker_radar_lidar_calibrator/README.md#radar-reflector). Before attempting to take calibration data from the reflectors, make sure that they can be detected by both sensors, which can be done visually with `RViz`. @@ -116,7 +116,7 @@ In the same fashion, the upper right and lower right images correspond to the `l ### Adding lidar-radar pairs -After the background model has been extracted, the user can carry the radar reflector (with the tripod) and place it in the [calibration area](../../marker_radar_lidar_calibrator/README.md#pro-tipsrecommendations). Once the reflector is positioned, the user should move away from it to avoid (otherwise, there is a hight risk of both objects coming part of the same cluster). +After the background model has been extracted, the user can carry the radar reflector (with the tripod) and place it in the [calibration area](../../calibrators/marker_radar_lidar_calibrator/README.md#pro-tipsrecommendations). Once the reflector is positioned, the user should move away from it (otherwise, there is a high risk of both objects becoming part of the same cluster). In the tutorial rosbag, the user will see that both the human and the radar reflector (with a tripod) are identified as foreground objects in the image below. @@ -134,7 +134,7 @@ Afterward, if the pair that the user added converges, it will be added to the da add2

-As described in the [Step 3: Matching and filtering](../../marker_radar_lidar_calibrator/README.md#step-3-matching-and-filtering) in the general documentation, we rely on the initial calibration to pair each lidar detection with its closest radar detection, and vice versa. Below, we show examples of good and bad initial calibration. +As described in the [Step 3: Matching and filtering](../../calibrators/marker_radar_lidar_calibrator/README.md#step-3-matching-and-filtering) in the general documentation, we rely on the initial calibration to pair each lidar detection with its closest radar detection, and vice versa. Below, we show examples of good and bad initial calibration. In both images, the calibrator detects one lidar detection (blue points) and two radar detections (purple lines). The radar detections identify two potential reflectors: a human and a radar reflector (located closer to the bottom of the image). In the good initial calibration image, the blue point from the lidar detection will correctly match the radar detection, as they are the closest to each other. However, in the bad initial calibration image, the lidar detection will incorrectly match the radar detection of the human. If the user identifies that the initial calibration is not good enough for the matching process to succeed, they will need to address this issue before attempting to continue with this process. diff --git a/docs/tutorials/tag_based_pnp_calibrator.md b/docs/tutorials/tag_based_pnp_calibrator.md index af217702..8ee10a35 100644 --- a/docs/tutorials/tag_based_pnp_calibrator.md +++ b/docs/tutorials/tag_based_pnp_calibrator.md @@ -2,12 +2,12 @@ In this tutorial, we present a hands-on tutorial of the `tag_based_pnp_calibrator`. Although we provide a pre-recorded rosbag, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors. -General documentation regarding this calibrator can be found [here](../../tag_based_pnp_calibrator/README.md). +General documentation regarding this calibrator can be found [here](../../calibrators/tag_based_pnp_calibrator/README.md). ## Setup This tutorial assumes that the user has already built the calibration tools. -Installation instructions can be found [here](../../../README.md). +Installation instructions can be found [here](../../README.md). ## Data preparation diff --git a/docs/tutorials/tag_based_sfm_calibrator.md b/docs/tutorials/tag_based_sfm_calibrator.md index c97cd185..d0e7ccad 100644 --- a/docs/tutorials/tag_based_sfm_calibrator.md +++ b/docs/tutorials/tag_based_sfm_calibrator.md @@ -2,12 +2,12 @@ In this tutorial, we present a hands-on tutorial of the `tag_based_sfm_calibrator`, in particular, of its base-lidar calibration capabilities. Although we provide pre-recorded rosbags, the flow of the tutorial is meant to show the user the steps they must perform in their own use cases with live sensors. -General documentation regarding this calibrator can be found [here](../../tag_based_sfm_calibrator/README.md). +General documentation regarding this calibrator can be found [here](../../calibrators/tag_based_sfm_calibrator/README.md). ## Setup This tutorial assumes that the user has already built the calibration tools. -Installation instructions can be found [here](../../../README.md). +Installation instructions can be found [here](../../README.md).
## Data preparation @@ -32,7 +32,7 @@ Since we need the initial intrinsics for the external camera, we need a way to c ### Calibration tags -Although information on the different types of tags can be found in the [base documentation](../../tag_based_sfm_calibrator/README.md), here we provide additional practical guidelines: +Although information on the different types of tags can be found in the [base documentation](../../calibrators/tag_based_sfm_calibrator/README.md), here we provide additional practical guidelines: #### Ground tags