
feat(pointcloud_preprocessor): update azimuth and distance in distortion_correction_node #5560

Closed
Changes from all commits (22 commits)
08d9e45
distortion node is able to update azimuth and distance value based on…
vividf Nov 13, 2023
404e59a
wrong opencv head file
vividf Nov 13, 2023
6a1c6da
update the distortion node docs
vividf Nov 13, 2023
c99dbb7
update the distortion node docs
vividf Nov 13, 2023
598bd42
fix if else condition to update value even in sensor frame
vividf Nov 13, 2023
447a71d
update readme for distortion node
vividf Nov 13, 2023
ab81c47
style(pre-commit): autofix
pre-commit-ci[bot] Nov 13, 2023
1ed9e29
Update distortion-corrector.md
vividf Nov 14, 2023
8502d89
remove parameter default value
vividf Nov 14, 2023
2058788
remove get_parameter
vividf Nov 14, 2023
dffbd56
style(pre-commit): autofix
pre-commit-ci[bot] Nov 14, 2023
d779a36
Update distortion-corrector.md
vividf Nov 15, 2023
e2b9526
style(pre-commit): autofix
pre-commit-ci[bot] Nov 15, 2023
78dde17
Update distortion-corrector.md
vividf Nov 15, 2023
ef03a19
style(pre-commit): autofix
pre-commit-ci[bot] Nov 15, 2023
4929958
Update distortion-corrector.md
vividf Nov 15, 2023
a7ccf4a
Merge branch 'autowarefoundation:main' into feat/distortion_correctio…
vividf Nov 16, 2023
686ec6a
Merge branch 'autowarefoundation:main' into feat/distortion_correctio…
vividf Nov 16, 2023
7b51d4f
Merge branch 'autowarefoundation:main' into feat/distortion_correctio…
vividf Mar 4, 2024
192a89d
Merge branch 'autowarefoundation:main' into feat/distortion_correctio…
vividf Mar 7, 2024
fb2ab65
Merge branch 'autowarefoundation:main' into feat/distortion_correctio…
vividf Mar 12, 2024
55682dd
modify ring outlier filter node for supporting counter clockwise azimuth
vividf Mar 14, 2024
20 changes: 15 additions & 5 deletions sensing/pointcloud_preprocessor/docs/distortion-corrector.md
@@ -4,7 +4,7 @@

The `distortion_corrector` is a node that compensates pointcloud distortion caused by ego vehicle's movement during 1 scan.

Since the LiDAR sensor scans by rotating an internal laser, the resulting point cloud will be distorted if the ego-vehicle moves during a single scan (as shown by the figure below). The node corrects this by interpolating sensor data using odometry of ego-vehicle.
Since the LiDAR sensor scans by rotating an internal laser, the resulting point cloud will be distorted if the ego-vehicle moves during a single scan (as shown by the figure below). The node corrects this by interpolating sensor data using the odometry of the ego-vehicle.

## Inner-workings / Algorithms

@@ -39,9 +39,19 @@ $ ExactPointTime = TimeStamp + TimeOffset $

### Core Parameters

| Name | Type | Default Value | Description |
| ---------------------- | ------ | ------------- | ----------------------------------------------------------- |
| `timestamp_field_name` | string | "time_stamp" | time stamp field name |
| `use_imu` | bool | true | use gyroscope for yaw rate if true, else use vehicle status |
| Name | Type | Default Value | Description |
| ----------------------------- | ------ | ------------- | ----------------------------------------------------------- |
| `timestamp_field_name` | string | "time_stamp" | time stamp field name |
| `use_imu` | bool | true | use gyroscope for yaw rate if true, else use vehicle status |
| `update_azimuth_and_distance` | bool | false | update the azimuth and distance based on undistorted xyz |

## Assumptions / Known limits

When the parameter `update_azimuth_and_distance` is set to `true`, the node recalculates each point's azimuth and distance from the undistorted XYZ coordinates using the `cv::fastAtan2` function. The azimuth origin lies along the x-axis. Please make sure the input frame uses the same coordinate system as the one described above. For Autoware users, the `frame_id` of the input pointcloud is `base_link`, whose coordinate system matches, so this causes no issues.

Please note that setting the parameter `update_azimuth_and_distance` to `true` increases processing time by around 13%.
Also note that because the `cv::fastAtan2` algorithm is only accurate to about 0.3 degrees, the **beam order may change for LiDARs with high azimuth resolution**.

## References/External links

<https://docs.opencv.org/3.4/db/de0/group__core__utils.html#ga7b356498dd314380a0c386b059852270>
@@ -36,6 +36,7 @@
#include <tf2_ros/transform_listener.h>

// Include tier4 autoware utils
#include <opencv2/opencv.hpp>
#include <tier4_autoware_utils/ros/debug_publisher.hpp>
#include <tier4_autoware_utils/system/stop_watch.hpp>

@@ -81,6 +82,8 @@ class DistortionCorrectorComponent : public rclcpp::Node
std::string base_link_frame_ = "base_link";
std::string time_stamp_field_name_;
bool use_imu_;
bool update_azimuth_and_distance_;
float azimuth_factor_ = 100.0;
};

} // namespace pointcloud_preprocessor
@@ -39,6 +39,7 @@
// Parameter
time_stamp_field_name_ = declare_parameter("time_stamp_field_name", "time_stamp");
use_imu_ = declare_parameter("use_imu", true);
update_azimuth_and_distance_ = declare_parameter<bool>("update_azimuth_and_distance");

// Publisher
undistorted_points_pub_ =
@@ -202,116 +203,122 @@
sensor_msgs::PointCloud2Iterator<float> it_x(points, "x");
sensor_msgs::PointCloud2Iterator<float> it_y(points, "y");
sensor_msgs::PointCloud2Iterator<float> it_z(points, "z");
sensor_msgs::PointCloud2Iterator<float> it_azimuth(points, "azimuth");
sensor_msgs::PointCloud2Iterator<float> it_distance(points, "distance");
sensor_msgs::PointCloud2ConstIterator<double> it_time_stamp(points, time_stamp_field_name_);

float theta{0.0f};
float x{0.0f};
float y{0.0f};
double prev_time_stamp_sec{*it_time_stamp};
const double first_point_time_stamp_sec{*it_time_stamp};

auto twist_it = std::lower_bound(
std::begin(twist_queue_), std::end(twist_queue_), first_point_time_stamp_sec,
[](const geometry_msgs::msg::TwistStamped & x, const double t) {
return rclcpp::Time(x.header.stamp).seconds() < t;
});
twist_it = twist_it == std::end(twist_queue_) ? std::end(twist_queue_) - 1 : twist_it;

decltype(angular_velocity_queue_)::iterator imu_it;
if (use_imu_ && !angular_velocity_queue_.empty()) {
imu_it = std::lower_bound(
std::begin(angular_velocity_queue_), std::end(angular_velocity_queue_),
first_point_time_stamp_sec, [](const geometry_msgs::msg::Vector3Stamped & x, const double t) {
return rclcpp::Time(x.header.stamp).seconds() < t;
});
imu_it =
imu_it == std::end(angular_velocity_queue_) ? std::end(angular_velocity_queue_) - 1 : imu_it;
}

const tf2::Transform tf2_base_link_to_sensor_inv{tf2_base_link_to_sensor.inverse()};

// For performance, do not instantiate `rclcpp::Time` inside of the for-loop
double twist_stamp = rclcpp::Time(twist_it->header.stamp).seconds();

// For performance, instantiate outside of the for-loop
tf2::Quaternion baselink_quat{};
tf2::Transform baselink_tf_odom{};
tf2::Vector3 point{};
tf2::Vector3 undistorted_point{};

// For performance, avoid transform computation if unnecessary
bool need_transform = points.header.frame_id != base_link_frame_;

for (; it_x != it_x.end(); ++it_x, ++it_y, ++it_z, ++it_time_stamp) {
for (; it_x != it_x.end(); ++it_x, ++it_y, ++it_z, ++it_azimuth, ++it_distance, ++it_time_stamp) {
while (twist_it != std::end(twist_queue_) - 1 && *it_time_stamp > twist_stamp) {
++twist_it;
twist_stamp = rclcpp::Time(twist_it->header.stamp).seconds();
}

float v{static_cast<float>(twist_it->twist.linear.x)};
float w{static_cast<float>(twist_it->twist.angular.z)};

if (std::abs(*it_time_stamp - twist_stamp) > 0.1) {
RCLCPP_WARN_STREAM_THROTTLE(
get_logger(), *get_clock(), 10000 /* ms */,
"twist time_stamp is too late. Could not interpolate.");
v = 0.0f;
w = 0.0f;
}

if (use_imu_ && !angular_velocity_queue_.empty()) {
// For performance, do not instantiate `rclcpp::Time` inside of the for-loop
double imu_stamp = rclcpp::Time(imu_it->header.stamp).seconds();

for (;
(imu_it != std::end(angular_velocity_queue_) - 1 &&
*it_time_stamp > rclcpp::Time(imu_it->header.stamp).seconds());
++imu_it) {
}

while (imu_it != std::end(angular_velocity_queue_) - 1 && *it_time_stamp > imu_stamp) {
++imu_it;
imu_stamp = rclcpp::Time(imu_it->header.stamp).seconds();
}

if (std::abs(*it_time_stamp - imu_stamp) > 0.1) {
RCLCPP_WARN_STREAM_THROTTLE(
get_logger(), *get_clock(), 10000 /* ms */,
"imu time_stamp is too late. Could not interpolate.");
} else {
w = static_cast<float>(imu_it->vector.z);
}
}

const auto time_offset = static_cast<float>(*it_time_stamp - prev_time_stamp_sec);

point.setValue(*it_x, *it_y, *it_z);

if (need_transform) {
point = tf2_base_link_to_sensor_inv * point;
}

theta += w * time_offset;
baselink_quat.setValue(
0, 0, tier4_autoware_utils::sin(theta * 0.5f),
tier4_autoware_utils::cos(theta * 0.5f)); // baselink_quat.setRPY(0.0, 0.0, theta);
const float dis = v * time_offset;
x += dis * tier4_autoware_utils::cos(theta);
y += dis * tier4_autoware_utils::sin(theta);

baselink_tf_odom.setOrigin(tf2::Vector3(x, y, 0.0));
baselink_tf_odom.setRotation(baselink_quat);

undistorted_point = baselink_tf_odom * point;

if (need_transform) {
undistorted_point = tf2_base_link_to_sensor * undistorted_point;
}

*it_x = static_cast<float>(undistorted_point.getX());
*it_y = static_cast<float>(undistorted_point.getY());
*it_z = static_cast<float>(undistorted_point.getZ());

if (update_azimuth_and_distance_ && need_transform) {
*it_distance = std::sqrt(*it_x * *it_x + *it_y * *it_y + *it_z * *it_z);
*it_azimuth = cv::fastAtan2(*it_y, *it_x) * azimuth_factor_;
}

CodeScene Delta Analysis / CodeScene Cloud Delta Analysis (main) — check warning on line 321 in sensing/pointcloud_preprocessor/src/distortion_corrector/distortion_corrector.cpp: ❌ Getting worse: Complex Method. DistortionCorrectorComponent::undistortPointCloud increases in cyclomatic complexity from 21 to 23, threshold = 9. This function has many conditional statements (e.g. if, for, while), leading to lower code health. Avoid adding more conditionals and code to it without refactoring.
prev_time_stamp_sec = *it_time_stamp;
}
return true;
@@ -116,8 +116,8 @@ void RingOutlierFilterComponent::faster_filter(
*reinterpret_cast<const float *>(&input->data[current_data_idx + azimuth_offset]);
const float & next_azimuth =
*reinterpret_cast<const float *>(&input->data[next_data_idx + azimuth_offset]);
float azimuth_diff = next_azimuth - current_azimuth;
azimuth_diff = azimuth_diff < 0.f ? azimuth_diff + 36000.f : azimuth_diff;
float azimuth_diff = std::abs(next_azimuth - current_azimuth);
azimuth_diff = azimuth_diff > 18000.f ? 36000.f - azimuth_diff : azimuth_diff;

const float & current_distance =
*reinterpret_cast<const float *>(&input->data[current_data_idx + distance_offset]);