
TF Tree in blue stack #391

Answered by evan-palmer
Manouselis asked this question in Q&A

Hi everyone,

I am trying to implement a custom position controller. To do that, I need to transform my position error from the map frame to the base_link frame. When trying to do that, I noticed that there is no TF connectivity between map and base_link. By inspecting the TF tree, I also saw that map, odom, and base_link are all disconnected from each other. I would normally expect a transform chain like map --> odom --> base_link. Why is that not the case?

frames_disconnected.pdf
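For reference, the kind of transform I am trying to perform is roughly the following (a minimal rclpy/tf2 sketch; the frame names and how the error vector is produced are placeholders):

```python
# Rough sketch only: rotate a map-frame position error into base_link with tf2.
from geometry_msgs.msg import Vector3Stamped
from rclpy.node import Node
from rclpy.time import Time
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener
import tf2_geometry_msgs  # provides do_transform_* helpers for geometry_msgs


class ErrorFrameConverter(Node):
    def __init__(self) -> None:
        super().__init__('error_frame_converter')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)

    def error_in_base_link(self, error_map: Vector3Stamped) -> Vector3Stamped:
        # This lookup is what fails when map and base_link are disconnected.
        transform = self.tf_buffer.lookup_transform(
            'base_link', error_map.header.frame_id, Time())
        return tf2_geometry_msgs.do_transform_vector3(error_map, transform)
```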


Replies: 2 comments 6 replies


The easiest way to achieve this is to add a ROS bridge from Gazebo. This is the solution that I used when implementing the new demos for the UVMS controllers here.

The map -> odom link doesn't exist in this simulated system because it uses the absolute Gazebo state measurements. In practice, when you are using a USBL + DVL, those TFs are more important due to the low sampling rates of the sensors.
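Roughly, the bridge boils down to a ros_gz_bridge parameter_bridge entry in a launch file. The sketch below is illustrative only; the Gazebo topic name and the remapping are placeholders rather than the configuration used in the linked demos:

```python
# Sketch of a ros_gz_bridge entry forwarding Gazebo odometry into ROS.
# The Gazebo topic name is a placeholder, not the one used by the blue demos.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    bridge = Node(
        package='ros_gz_bridge',
        executable='parameter_bridge',
        arguments=[
            # GZ_TOPIC@ROS_MSG_TYPE@GZ_MSG_TYPE; '[' makes it one-way Gazebo -> ROS
            '/model/my_auv/odometry@nav_msgs/msg/Odometry[gz.msgs.Odometry',
        ],
        remappings=[('/model/my_auv/odometry', '/odometry')],
        output='screen',
    )
    return LaunchDescription([bridge])
```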

6 replies

On hardware, we obtain the odom -> base_link transform using our state estimator (robot_localization). We treat the odom frame as our pseudo-map frame. However, if you happen to receive a periodic GPS lock, you could likely define a true map -> odom transform.

With regards to MAVROS and ArduPilot, I previously spent time trying to integrate the ArduPilot EKF (exposed in ROS via the /mavros/local_position/odom topic); however, I ran into issues integrating additional sensor modalities and configuring it in a way that worked for my specific use-case. This encouraged my adoption of the robot_localization package.
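As a rough illustration (not our actual hardware configuration), an ekf_node producing odom -> base_link from DVL odometry and an IMU could be launched like this; the topic names and fused fields are assumptions:

```python
# Illustrative only: an ekf_node fusing assumed DVL and IMU topics to publish
# odom -> base_link. Not the configuration used on our hardware.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    ekf = Node(
        package='robot_localization',
        executable='ekf_node',
        name='ekf_odom',
        parameters=[{
            'frequency': 30.0,
            'odom_frame': 'odom',
            'base_link_frame': 'base_link',
            'world_frame': 'odom',  # world == odom, so the published TF is odom -> base_link
            # config order: x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az
            'odom0': '/dvl/odometry',  # assumed DVL topic: fuse body-frame velocities
            'odom0_config': [False] * 6 + [True] * 3 + [False] * 6,
            'imu0': '/imu/data',       # assumed IMU topic: fuse orientation + angular rates
            'imu0_config': [False] * 3 + [True] * 3 + [False] * 3 + [True] * 3 + [False] * 3,
        }],
    )
    return LaunchDescription([ekf])
```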


Thanks for the quick reply @evan-palmer. Can you guide me to where exactly the robot_localization package was implemented in the blue repo (or in the docker image)? I cannot find any of the topics I would expect from robot_localization, such as odometry/filtered, or nodes like the ekf_localization_node.


I haven't included that implementation in this project because the configuration is tailored specifically to our own hardware deployment. To integrate it with your AUV, you can simply change the endpoint from which you receive odometry estimates (i.e., instead of receiving odometry from Gazebo, you receive it via odometry/filtered). An example demonstrating how to configure the EKF node is here.
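In practice, that change is usually just a topic remapping at launch time. For example (package, executable, and topic names below are hypothetical, not the actual blue launch files):

```python
# Placeholder sketch: point the odometry consumer at the EKF output instead of
# the Gazebo-bridged topic. Package, executable, and topic names are hypothetical.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    controller = Node(
        package='my_auv_control',           # hypothetical package
        executable='position_controller',   # hypothetical node
        remappings=[
            # previously subscribed to the Gazebo-bridged '/odometry' topic
            ('/odometry', '/odometry/filtered'),
        ],
    )
    return LaunchDescription([controller])
```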


Hi @evan-palmer. In one of your previous replies, you mentioned that:

We treat the odom frame as our pseudo-map frame.

I was wondering whether you do this with a simple method, such as publishing an identity static transform between the two frames (ros2 run tf2_ros static_transform_publisher 0.0 0.0 0.0 0.0 0.0 0.0 1.0 map odom), or a different, more complex way?


At the moment, yes, our map frame is equivalent to the odom frame. The sensors that we use to measure position are static and are treated as the inertial frame for our system. We are continuously updating our localization framework, though, so that may change in the near future. However, treating the fixed position sensor as the inertial frame for your EKF is a simple solution that can give you a baseline to improve from. A more complex (and likely better) solution would be to use your DVL and IMU estimates for dead reckoning (i.e., odom -> base_link) and use your USBL or sonar to provide low-frequency corrections.
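Building on the earlier ekf_node sketch for odom -> base_link, the low-frequency corrections could come from a second robot_localization instance that also fuses an assumed USBL pose topic and publishes map -> odom (again, topic names and fused fields are illustrative, not our configuration):

```python
# Illustrative second EKF: fuses a low-rate USBL fix on top of dead reckoning
# and publishes map -> odom. Topic names are placeholders.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    map_ekf = Node(
        package='robot_localization',
        executable='ekf_node',
        name='ekf_map',
        parameters=[{
            'world_frame': 'map',   # with map as the world frame, the published TF is map -> odom
            # config order: x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az
            'pose0': '/usbl/pose',  # assumed low-rate absolute position fix
            'pose0_config': [True] * 3 + [False] * 12,
            'odom0': '/dvl/odometry',  # same dead-reckoning inputs as the odom-level EKF
            'odom0_config': [False] * 6 + [True] * 3 + [False] * 6,
            'imu0': '/imu/data',
            'imu0_config': [False] * 3 + [True] * 3 + [False] * 3 + [True] * 3 + [False] * 3,
        }],
        remappings=[('odometry/filtered', 'odometry/global')],
    )
    return LaunchDescription([map_ekf])
```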

Answer selected by Manouselis

As a quick side comment, you may find the mavros_controllers package helpful as a reference for your position controller.

0 replies