Convert tfs and sensor messages to enu frame (#112)
* added comment to static transform publisher

* added enu conversion and publisher

* odom enu working

* removed useless tf broadcaster

* fix blueprint to load blue cones

* cam tf being published in airsim_ros_wrapper for consistent parsing

* sensor tfs and images correspond to each other

* only publishing odom in enu and imu in enu, but found some bugs

* added tf dependency

* removed static transform publisher

* made topics docs a bit clearer and added units documentation

* update docs

* update deps in actions

* lidar points transformed and IMU bugfix and tested

* Remove temporary testing topics and rename /fsds/odom_ned to /fsds/odom

* Support multiple lidars

* Fix lidar and camera rotating to the wrong side

* Add units to integration guide

* Set vehicle starting point in world close to the ground to ensure the car drives on a positive Z relative to the starting position

* revert settings.json to master

* Add frame and timestamp to odometry message

* Added docs about odom map frame

Co-authored-by: Sijmen Huizenga <[email protected]>
davidoort and SijmenHuizenga authored Jun 17, 2020
1 parent 7730cd6 commit 3f84a48
Showing 16 changed files with 262 additions and 218 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/main.yml
@@ -32,7 +32,7 @@ jobs:
# Root directory of the catkin workspace
workspace: ./ros
- name: Install ROS deps
run: sudo apt-get install python-catkin-tools ros-melodic-tf2-geometry-msgs python-catkin-tools ros-melodic-rqt-multiplot ros-melodic-joy ros-melodic-cv-bridge ros-melodic-image-transport libyaml-cpp-dev libcurl4-openssl-dev
run: sudo apt-get install python-catkin-tools ros-melodic-tf2-geometry-msgs ros-melodic-tf python-catkin-tools ros-melodic-rqt-multiplot ros-melodic-joy ros-melodic-cv-bridge ros-melodic-image-transport libyaml-cpp-dev libcurl4-openssl-dev
- name: Compilation settings
run: |
sudo apt-get install gcc-8 g++-8
Binary file modified UE4Project/Content/TrainingMap.umap
4 changes: 2 additions & 2 deletions UE4Project/Content/blueprint/RefereeBlueprint.uasset
@@ -170,6 +170,12 @@ bool UnrealLidarSensor::shootLaser(const msr::airlib::Pose& lidar_pose, const ms

// transform to lidar frame
point = VectorMath::transformToBodyFrame(point_v_i, lidar_pose + vehicle_pose, true);

// Convert to ENU frame
point.y() = - point.y();
point.z() = - point.z();



// The above should be same as first transforming to vehicle-body frame and then to lidar frame
// Vector3r point_v_b = VectorMath::transformToBodyFrame(point_v_i, vehicle_pose, true);
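
For reference, the sign flip above can be sketched outside the simulator. A hypothetical numpy helper (not code from this commit) that mirrors the C++ change by negating y and z, going from AirSim's NED-style body axes (x forward, y right, z down) to the ROS body convention (x forward, y left, z up):

```python
import numpy as np

def flip_airsim_body_to_ros_body(points):
    """Negate y and z of an (N, 3) point array, as in the C++ change above.

    Converts AirSim's body axes (x forward, y right, z down) to the ROS
    body convention (x forward, y left, z up). Helper name is made up.
    """
    out = np.asarray(points, dtype=float).copy()
    out[:, 1] *= -1.0  # right -> left
    out[:, 2] *= -1.0  # down -> up
    return out
```
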
Binary file added docs/images/enu_ned.png
2 changes: 1 addition & 1 deletion docs/imu.md
@@ -4,7 +4,7 @@ The IMU sensor in FSDS is using AirSim's built in IMU sensor simulation, that ha

The maximum achievable internal IMU frequency is 1000 Hz, outputting the vehicle's 3-axis angular velocity, 3-axis linear acceleration, and its orientation as a quaternion.

IMU gyroscope and accelerometer bias and accuracy parameters can be found and fine-tuned in AirLib/include/sensors/imu/ImuSimpleParams.hpp, or can be initialized with custom parameters in your settings.json file.
IMU gyroscope and accelerometer bias and accuracy parameters can be found and fine-tuned in AirLib/include/sensors/imu/ImuSimpleParams.hpp, or can be initialized with custom parameters in your `settings.json` file.

The angular velocity and linear acceleration outputs, as well as their biases, have artificially introduced Gaussian noise (zero mean, standard deviation 1) that is updated on each IMU cycle.
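
As a rough illustration of that noise model (a sketch only; the real parameters live in ImuSimpleParams.hpp, and the function and argument names here are made up):

```python
import numpy as np

rng = np.random.default_rng()

def noisy_imu_sample(true_gyro, true_accel, gyro_bias, accel_bias, sigma=1.0):
    """Add zero-mean Gaussian noise (standard deviation 1 by default) to the
    3-axis gyroscope and accelerometer readings on each IMU cycle."""
    gyro = np.asarray(true_gyro) + np.asarray(gyro_bias) + rng.normal(0.0, sigma, 3)
    accel = np.asarray(true_accel) + np.asarray(accel_bias) + rng.normal(0.0, sigma, 3)
    return gyro, accel
```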

28 changes: 16 additions & 12 deletions docs/integration-handbook.md
@@ -152,30 +152,34 @@ The IMU captures the acceleration, orientation and angular rate of the vehicle.
More detailed technical information about the IMU implementation can be found [here](imu.md).

### Sensor specification
Teams are expected to provide their sensor suite as a single AirSim settings.json file.
Teams are expected to provide their sensor suite as a single AirSim `settings.json` file.
Most of the parameters in the settings.json file will be set by the officials to ensure fairness during competition.
You are allowed to configure the following subset of parameters within the boundaries of the above rules.

* Cameras
* * camera name
* * Width, Height
* * FOV_Degrees
* * X, Y, Z
* * Pitch, Roll, Yaw
* * Width, Height (pixels)
* * FOV_Degrees (degrees)
* * X, Y, Z (meters)
* * Pitch, Roll, Yaw (degrees)
* Lidars
* * NumberOfChannels
* * PointsPerSecond
* * RotationsPerSecond
* * HorizontalFOVStart
* * HorizontalFOVEnd
* * VerticalFOVUpper
* * VerticalFOVLower
* * X, Y, Z
* * Pitch, Roll, Yaw
* * HorizontalFOVStart (degrees)
* * HorizontalFOVEnd (degrees)
* * VerticalFOVUpper (degrees)
* * VerticalFOVLower (degrees)
* * X, Y, Z (meters)
* * Pitch, Roll, Yaw (degrees)

The GPS and IMU are configured equally for all teams according to the rules in the previous chapter.

We recommend to copy the settings.json in this repository as a base and configure the cameras and lidar from thereon.
X, Y, Z, Pitch, Roll and Yaw have to be specified in the NED frame (it might take some experimentation to find the correct signs to use).
The transforms you will get from ROS, however, will be in the ENU frame (the default ROS coordinate system).
Distance values are in meters and rotation values are in degrees.
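
As a starting point for that experimentation, the position part of the conversion is a plain sign flip (a sketch assuming the usual body-frame conventions; the rotation signs are best verified in the simulator, as noted above):

```python
def ros_body_to_settings_position(x, y, z):
    """Flip a sensor position from the ROS body frame (x forward, y left, z up)
    to the NED-style frame settings.json expects (x forward, y right, z down)."""
    return x, -y, -z
```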

We recommend copying the `settings.json` in this repository as a base and configuring the cameras and lidar from there.

## Launching the simulator
To run the simulation, read the [simulation guide](how-to-simulate.md).
102 changes: 33 additions & 69 deletions docs/ros-bridge.md
@@ -23,91 +23,55 @@ roslaunch fsds_ros_bridge fsds_ros_bridge.launch
This launches the following nodes:
* `/fsds/ros_bridge` node responsible for IMU, GPS, lidar, vehicle setpoints and go/finish signals.
* `/fsds/camera/CAMERANAME` node is run for each camera configured in the `settings.json`. The nodes are launched using the `cameralauncher.py` script.
* `/fsds/camera/CAMERANAME/tf` node that publishes the transform with the camera position and orientation.
* `/fsds/ned_to_enu_pub` node that translates between NED and ENU frame. See 'Coordinate frames and transforms' for more information about this.

## Publishers
- `/fsds/gps` [sensor_msgs/NavSatFix](https://docs.ros.org/api/sensor_msgs/html/msg/NavSatFix.html)
These are the current GPS coordinates of the vehicle in AirSim, published at 10 Hz.
[Read all about the gps simulation model here.](gps.md)
Data is in the `fsds/FSCar` frame.

- `/fsds/imu` [sensor_msgs/Imu](http://docs.ros.org/melodic/api/sensor_msgs/html/msg/Imu.html)
Velocity, orientation and acceleration information at 250 Hz.
[Read all about the IMU model here.](imu.md)
Data is in the `fsds/FSCar` frame.

- `/fsds/testing_only/odom` [nav_msgs/Odometry](https://docs.ros.org/api/nav_msgs/html/msg/Odometry.html)
Ground truth car position and orientation in NED frame. THIS WILL NOT BE STREAMED DURING COMPETITION.

- `/fsds/testing_only/track` [fs_msgs/Track](https://github.com/FS-Online/fs_msgs/blob/master/msg/Track.msg)
Ground truth cone position and color with respect to the starting location of the car. THIS WILL NOT BE STREAMED DURING COMPETITION.
Currently this only publishes the *initial position* of cones that are part of the track spline. Any cones placed manually in the world are not published here.
Additionally, the track is published once and the message is latched (meaning it is always available for a newly created subscriber).

- `/fsds/camera/CAMERA_NAME` [sensor_msgs/Image](https://docs.ros.org/api/sensor_msgs/html/msg/Image.html)
One topic of this type will exist for every camera specified in the `settings.json` file.
On this topic, camera frames are published in bgra8 format.
`CAMERA_NAME` will be replaced by the corresponding camera name in the `Cameras` object in the `settings.json` file.
`IMAGE_TYPE` is determined by the `SensorType` field; when choosing 0, it will be 'Scene'.

- `/fsds/lidar/LIDARNAME` [sensor_msgs/PointCloud](http://docs.ros.org/melodic/api/sensor_msgs/html/msg/PointCloud.html)
Publishes the lidar points for each lidar sensor.
All points are in the `fsds/LIDARNAME` frame.
Transformations between the `fsds/LIDARNAME` and `fsds/FSCar` frame are being published regularly.

- `/fsds/signal/go` [fs_msgs/GoSignal](https://github.com/FS-Online/fs_msgs/blob/master/msg/GoSignal.msg)
GO signal that is sent every second by the ROS bridge.
The car is only allowed to drive once this message has been received.
If no GO signal is received for more than 4 seconds, the AS can assume that `fsds_ros_bridge` has been shut down.
This message also includes the mission type and track.
More info about signal topics can be found in the [integration handbook](integration-handbook.md)

- `/fsds/signal/finished` [fs_msgs/FinishedSignal](https://github.com/FS-Online/fs_msgs/blob/master/msg/FinishedSignal.msg)
Finished signal that is sent by the AS to stop the mission. More info about signal topics can be found in the [integration handbook](integration-handbook.md)

- `/tf` [tf2_msgs/TFMessage](https://docs.ros.org/api/tf2_msgs/html/msg/TFMessage.html)
See 'Coordinate frames and transforms'



## Subscribers
- `/fsds/control_command` [fs_msgs/ControlCommand](https://github.com/FS-Online/fs_msgs/blob/master/msg/ControlCommand.msg)
This message includes the dimensionless values throttle, steering and brake.
Throttle and brake range from 0 to 1.
For steering `-1` steers full to the left and `+1` steers full to the right.
The contents of this message fill the essential parts of the `msr::airlib::CarApiBase::CarControl` struct.
This is the only way to control the car when the airsim ROS client is connected (keyboard will no longer work!).

- `/fsds/signal/finished` [fs_msgs/FinishedSignal](https://github.com/FS-Online/fs_msgs/blob/master/msg/FinishedSignal.msg)
Finished signal that is sent by the AS to stop the mission.
The ros bridge will forward the signal to the operator which in turn will stop the ros bridge and finish the run.

## Published topics


| Topic name | Description | Message | Rate (hz) |
|---|---|---|---|
| `/fsds/gps` | The current GPS coordinates of the vehicle in AirSim. Read all about the GPS simulation model [here](gps.md). Data is in the `fsds/FSCar` frame. | [sensor_msgs/NavSatFix](https://docs.ros.org/api/sensor_msgs/html/msg/NavSatFix.html) | 10 |
| `/fsds/imu` | Velocity, orientation and acceleration information. Read all about the IMU model [here](imu.md). Data is in the `fsds/FSCar` (ENU) frame. | [sensor_msgs/Imu](http://docs.ros.org/melodic/api/sensor_msgs/html/msg/Imu.html) | 250 |
| `/fsds/testing_only/odom` | Ground truth car position and orientation in the ENU frame about the CoG of the car (`fsds/FSCar`). The units are `m` for distance and `rad` for angles. The message is in the `fsds/map` frame. This frame is not (yet) used anywhere else and is just here so you can easily reference it if needed. *THIS WILL NOT BE STREAMED DURING COMPETITION.* | [nav_msgs/Odometry](https://docs.ros.org/api/nav_msgs/html/msg/Odometry.html) | 250 |
| `/fsds/testing_only/track` | Ground truth cone position and color with respect to the starting location of the car in ENU. Currently this only publishes the *initial position* of cones that are part of the track spline. Any cones placed manually in the world are not published here. Additionally, the track is published once and the message is latched (meaning it is always available for a newly created subscriber). *THIS WILL NOT BE STREAMED DURING COMPETITION.* | [fs_msgs/Track](https://github.com/FS-Online/fs_msgs/blob/master/msg/Track.msg) | Latched |
| `/fsds/camera/CAMERA_NAME` | One topic of this type will exist for every camera specified in the `settings.json` file. On this topic, camera frames are published in bgra8 format. `CAMERA_NAME` will be replaced by the corresponding camera name in the `Cameras` object in the `settings.json` file. `IMAGE_TYPE` is determined by the `SensorType` field; when choosing 0, it will be 'Scene'. | [sensor_msgs/Image](https://docs.ros.org/api/sensor_msgs/html/msg/Image.html) | ~18 |
| `/fsds/lidar/LIDARNAME` | Publishes the lidar points for each lidar sensor. All points are in the `fsds/LIDARNAME` frame. Transformations between the `fsds/LIDARNAME` and `fsds/FSCar` frame are being published regularly. More info on the lidar sensor can be found [here](integration-handbook.md) | [sensor_msgs/PointCloud](http://docs.ros.org/melodic/api/sensor_msgs/html/msg/PointCloud.html) | 10 |
| `/fsds/signal/go` | GO signal that is sent every second by the ROS bridge. The car is only allowed to drive once this message has been received. If no GO signal is received for more than 4 seconds, the AS can assume that `fsds_ros_bridge` has been shut down. This message also includes the mission type and track. More info about signal topics can be found in the [integration handbook](integration-handbook.md) | [fs_msgs/GoSignal](https://github.com/FS-Online/fs_msgs/blob/master/msg/GoSignal.msg) | 1 |
| `/tf_static` | See [Coordinate frames and transforms](#coordinate-frames-and-transforms) | [tf2_msgs/TFMessage](https://docs.ros.org/api/tf2_msgs/html/msg/TFMessage.html) | 1 |
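
For example, the 4-second rule in the `/fsds/signal/go` row above could be watched on the AS side with a small rospy sketch (class and node names are made up; construct the watchdog after `rospy.init_node`):

```python
import rospy
from fs_msgs.msg import GoSignal

class GoWatchdog:
    """Track the periodic GO signal; assume the bridge is gone after 4 s of silence."""

    def __init__(self):
        # Construct after rospy.init_node(...) so rospy.get_time() is valid.
        self.last_go = rospy.get_time()
        rospy.Subscriber('/fsds/signal/go', GoSignal, self.on_go)

    def on_go(self, msg):
        self.last_go = rospy.get_time()

    def bridge_alive(self):
        return rospy.get_time() - self.last_go < 4.0
```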


## Subscribed topics

| Topic name | Description | Message |
|---|---|---|
| `/fsds/control_command` | This message includes the dimensionless values throttle, steering and brake. Throttle and brake range from 0 to 1. For steering `-1` steers full to the left and `+1` steers full to the right. The contents of this message fill the essential parts of the `msr::airlib::CarApiBase::CarControl` struct. This is the only way to control the car when the airsim ROS client is connected (keyboard will no longer work!). | [fs_msgs/ControlCommand](https://github.com/FS-Online/fs_msgs/blob/master/msg/ControlCommand.msg) |
| `/fsds/signal/finished` | Finished signal that is sent by the AS to stop the mission. The ros bridge will forward the signal to the operator which in turn will stop the ros bridge and finish the run. | [fs_msgs/FinishedSignal](https://github.com/FS-Online/fs_msgs/blob/master/msg/FinishedSignal.msg) |
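
A minimal rospy sketch of driving the car through `/fsds/control_command` (node name and values are placeholders):

```python
import rospy
from fs_msgs.msg import ControlCommand

rospy.init_node('example_controller')
pub = rospy.Publisher('/fsds/control_command', ControlCommand, queue_size=1)

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    cmd = ControlCommand()
    cmd.throttle = 0.2   # dimensionless, 0 to 1
    cmd.steering = -0.5  # -1 full left, +1 full right
    cmd.brake = 0.0      # dimensionless, 0 to 1
    pub.publish(cmd)
    rate.sleep()
```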


## Services

- `/fsds/reset` [fsds_ros_bridge/Reset](https://github.com/FS-Online/Formula-Student-Driverless-Simulator/blob/master/ros/src/fsds_ros_bridge/srv/Reset.srv)
Resets the car to its start location.
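
Calling it from Python might look like this (a sketch; check Reset.srv for the exact request fields):

```python
import rospy
from fsds_ros_bridge.srv import Reset

rospy.init_node('reset_example')
rospy.wait_for_service('/fsds/reset')
reset = rospy.ServiceProxy('/fsds/reset', Reset)
reset()  # default-constructed request; fill in fields if Reset.srv defines any
```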

## Units

If a topic streams a standard ROS message (like [sensor_msgs/Imu](http://docs.ros.org/melodic/api/sensor_msgs/html/msg/Imu.html)) then the units will be the recommended units in the message documentation. Custom messages (from the [fs_msgs](https://github.com/FS-Online/fs_msgs) package) use the units specified in the message documentation as well. If in doubt, interpret distances in meters, angles in radians and rates in m/s and rad/s, etc.

## Coordinate frames and transforms

The primary frame is the `fsds/FSCar` frame.
This frame is centered on the center of the car.
The primary frame is the `fsds/FSCar` frame, which is fixed at the center of the car following the ENU coordinate system convention.
The center of the car is the Unreal Engine car pawn position, which in turn is also the center of gravity.
It is using the NED coordinate system.
The `fsds/FSCar/enu` frame is the same point, translated to the ENU system.
Read more about the differences between ENU and NED [here](https://en.wikipedia.org/wiki/Local_tangent_plane_coordinates).

The ros bridge regularly publishes static transforms between the `fsds/FSCar` frame and each of the cameras and lidars.
Naming of these frames is `fsds/SENSORNAME`.
For example, a lidar named `Example` will publish its points in the `fsds/Example` frame.
The position and orientation of a camera named `Test` will become available in the frame `/fsds/Test`.

*PLEASE NOTE*: the transforms published on the /tf_static topic are a direct conversion of the transforms specified in the `settings.json` file, but expressed in an ENU coordinate system instead of the NED coordinate system that the `settings.json` file uses. Read more about the differences between ENU and NED [here](https://en.wikipedia.org/wiki/Local_tangent_plane_coordinates). For a quick illustration of the two frames, see the image below:

![ENUvsNED](images/enu_ned.png)

Only static transforms within the vehicle are published.
Transforms to the ground truth are disabled because this would take away the challenge of the competition.
Transforms to the ground truth are disabled because this would take away the state estimation challenge of the competition.
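
Reading one of these static transforms back on the AS side could look like this (a rospy/tf2 sketch reusing the `Example` lidar name from above):

```python
import rospy
import tf2_ros

rospy.init_node('tf_example')
tf_buffer = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(tf_buffer)

rospy.sleep(1.0)  # give the latched static transforms time to arrive
# Pose of the 'Example' lidar relative to the car body frame.
t = tf_buffer.lookup_transform('fsds/FSCar', 'fsds/Example', rospy.Time(0))
print(t.transform.translation, t.transform.rotation)
```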

## Parameters
- `/fsds/ros_bridge/update_gps_every_n_sec` [double]
1 change: 1 addition & 0 deletions ros/src/fsds_ros_bridge/CMakeLists.txt
@@ -32,6 +32,7 @@ find_package(catkin REQUIRED COMPONENTS
sensor_msgs
std_msgs
std_srvs
tf
tf2
tf2_ros
fs_msgs
6 changes: 3 additions & 3 deletions ros/src/fsds_ros_bridge/include/airsim_ros_wrapper.h
@@ -37,12 +37,14 @@ STRICT_MODE_OFF //todo what does this do?
#include <sensor_msgs/NavSatFix.h>
#include <sensor_msgs/PointCloud2.h>
#include <std_srvs/Empty.h>
#include <tf/tf.h>
#include <tf2/LinearMath/Matrix3x3.h>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2_geometry_msgs/tf2_geometry_msgs.h>
#include <tf2_ros/static_transform_broadcaster.h>
#include <tf2_ros/transform_broadcaster.h>
#include <tf2_ros/transform_listener.h>
#include <std_msgs/Float32.h>
#include <unordered_map>
#include <fstream>
#include <curl/curl.h>
@@ -160,7 +162,7 @@ class AirsimROSWrapper


AirSimSettingsParser airsim_settings_parser_;
std::map<std::string, std::string> vehicle_lidar_map_;
std::vector<std::string> lidar_names_vec_;
std::vector<geometry_msgs::TransformStamped> static_tf_msg_vec_;
std::string mission_name_; // rosparam obtained from launch file
std::string track_name_; // rosparam obtained from launch file
@@ -180,9 +182,7 @@
// std::recursive_mutex lidar_mutex_;

/// ROS tf
tf2_ros::TransformBroadcaster tf_broadcaster_;
tf2_ros::StaticTransformBroadcaster static_tf_pub_;
tf2_ros::Buffer tf_buffer_;

/// ROS Timers.
ros::Timer odom_update_timer_;
3 changes: 0 additions & 3 deletions ros/src/fsds_ros_bridge/launch/fsds_ros_bridge.launch
@@ -24,8 +24,5 @@

<node pkg="fsds_ros_bridge" name="cameralauncher" type="cameralauncher.py" output="screen" required="true"/>

<!-- Static transforms -->
<include file="$(find fsds_ros_bridge)/launch/static_transforms.launch"/>

</group>
</launch>
3 changes: 0 additions & 3 deletions ros/src/fsds_ros_bridge/launch/static_transforms.launch

This file was deleted.

2 changes: 1 addition & 1 deletion ros/src/fsds_ros_bridge/package.xml
@@ -21,7 +21,7 @@
<build_depend>sensor_msgs</build_depend>
<build_depend>fs_msgs</build_depend>
<build_depend>std_msgs</build_depend>

<build_depend>tf</build_depend>
<!-- <exec_depend>nodelet</exec_depend> -->
<exec_depend>geometry_msgs</exec_depend>
<exec_depend>image_transport</exec_depend>
14 changes: 7 additions & 7 deletions ros/src/fsds_ros_bridge/scripts/cameralauncher.py
@@ -32,13 +32,13 @@ def args(argsmap):
'framerate': CAMERA_FRAMERATE,
'host_ip': AIRSIM_HOSTIP
})))
launch.launch(
roslaunch.core.Node(
'tf', 'static_transform_publisher',
namespace="fsds/camera/"+cameraname, name='tf',
required=True, output='screen',
# see http://wiki.ros.org/tf#static_transform_publisher why this works
args="static_transform_publisher "+str(camsettings['X'])+" "+str(camsettings['Y'])+" "+str(camsettings['Z'])+" "+str(camsettings['Yaw'])+" "+str(camsettings['Pitch'])+" "+str(camsettings['Roll'])+" /fsds/FSCar /fsds/"+cameraname+" 1000"))
# launch.launch(
# roslaunch.core.Node(
# 'tf', 'static_transform_publisher',
# namespace="fsds/camera/"+cameraname, name='tf',
# required=True, output='screen',
# # see http://wiki.ros.org/tf#static_transform_publisher why this works
# args="static_transform_publisher "+str(camsettings['X'])+" "+str(camsettings['Y'])+" "+str(camsettings['Z'])+" "+str(camsettings['Yaw'])+" "+str(camsettings['Pitch'])+" "+str(camsettings['Roll'])+" /fsds/FSCar /fsds/"+cameraname+" 1000"))

try:
launch.spin()
