Harmonic and Jazzy tutorial
Adding Lidar
Introduction and explanation
What software are we using and why
All of these instructions have been tested using the current version of Ubuntu: Noble Numbat. We are going to use two packages: Gazebo for the physics-based simulator and ROS2 (Robot Operating System) for the control system. There are many releases of each of these, and only some combinations are mutually compatible. Also, since we are focusing on a stable development platform, we will use the Long Term Support (LTS) versions that are current at the time we are posting this. As of 28 Aug 2024, this means we will use Gazebo Harmonic, which is supported until 2028, and ROS2 Jazzy.
How will the demonstration proceed? We will go through 7 steps:
1) Install Gazebo Harmonic and confirm that it is correct.
2) Examine an empty world.
3) Build a simple robot and test it using keyboard controls.
4) Build a 'world' for our robot to explore.
5) Install ROS2 (and its components) and confirm the installation.
6) Add Lidar to the robot.
7) Write control software in ROS2 to make the robot autonomous.
This post covers Step 6.
Starting world description (from the last tutorial): hjn-pong-world-3.sdf
Ending world description: hjn-pong-world-4.sdf
Create a Sensing Robot
In the last section, we installed and tested ROS2, to make sure that we can send commands from ROS2 to the Simulator and get data back from the Simulator into the ROS2 environment.
It basically functions like this: ROS2 sends commands over a bridge to the Gazebo simulator, and the simulator sends data back over the bridge to ROS2.
Of course we haven't written the robot controller yet, but we know all the connections are working.
An autonomous robot needs to do four things:
1) Observe the world - gather sensor data about the current world
2) Orient itself - assess what the data means relative to its goals
3) Decide - given the current world and its goals, what should it do?
4) Act - implement that decision
This is the "OODA Loop" pioneered by Col. John Boyd.
The start of the loop is to sense the world.
So let's get to work.
Add a Lidar sensor
We have a robot, but it is basically just a remote-controlled vehicle. As long as a human (you) is watching the simulation's display, they can control the bot. But we want more - we want to build a system that can see what the robot sees, make decisions about what to do next, and then send commands to the robot to change its behavior.
To do that, the robot needs a sensor. We will add a lidar to the bot and test it in Gazebo, then set up ROS2 to grab the data via another ROS2-Gazebo bridge. Before we can use a sensor in the Gazebo Simulator, we need to add a new plugin to the simulation setup. Up at the top of the sdf file, we have plugins for Physics, UserCommands, SceneBroadcaster, and Contact. We got those for free when we created the empty world. Now we need to add another plugin in the same section. Under the Contact plugin add:
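Here is a sketch of that plugin block, using the Sensors system that ships with Gazebo Harmonic (ogre2 is Harmonic's default render engine):

<plugin
    filename="gz-sim-sensors-system"
    name="gz::sim::systems::Sensors">
    <render_engine>ogre2</render_engine>
</plugin>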
Here is the code for the lidar. There are three parts: a frame, the actual sensor, and a sensor housing attached to the robot chassis.

We will start with the frame. A frame is used to set up a relative reference in space and pose, which enables a complex object to be designed in isolation and attached to another model. This frame definition defines a point 0.9m in X and 0.3m in Z from the center point of the chassis, aligned with the chassis orientation.

Since the lidar frame is part of the bot, we want to place it in the pong-bot model, and we put frames up at the top. So, right under:

<model name='pong-bot' canonical_link='chassis'>
  <pose relative_to='world'>0 0 0 0 0 0</pose>

Add:
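A minimal frame definition matching those offsets might look like this (lidar_frame is the name the sensor pose will reference below):

<frame name='lidar_frame' attached_to='chassis'>
  <pose>0.9 0 0.3 0 0 0</pose>
</frame>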
Now that we have our reference frame, we can add a lidar - this is a built-in sensor from the Gazebo package.
We will place this sensor inside the chassis link at the top.
Under
<link name='chassis'>
Add:
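Here is a sketch of the sensor block that matches the specs called out in the notes below. The gpu_lidar sensor type and the range limits (0.08m to 10.0m) are assumptions; adjust them to match the lidar you want to model:

<sensor name='lidar' type='gpu_lidar'>
  <pose relative_to='lidar_frame'>0 0 0 0 0 0</pose>
  <topic>lidar</topic>
  <update_rate>10</update_rate>
  <lidar>
    <scan>
      <horizontal>
        <samples>16</samples>
        <resolution>1</resolution>
        <min_angle>-2.79</min_angle>
        <max_angle>2.79</max_angle>
      </horizontal>
      <vertical>
        <samples>1</samples>
        <resolution>0.01</resolution>
        <min_angle>0</min_angle>
        <max_angle>0</max_angle>
      </vertical>
    </scan>
    <range>
      <min>0.08</min>
      <max>10.0</max>
      <resolution>0.01</resolution>
    </range>
  </lidar>
  <always_on>1</always_on>
  <visualize>true</visualize>
</sensor>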
A couple of things to note:
1) the pose of the lidar is in reference to the lidar_frame
2) A couple of sensor specs are enumerated:
a) the update rate is in Hz - 10 means do a scan 10 times a second
b) in the horizontal section, we call out 16 samples per scan line (you can adjust this to match a specific lidar)
c) the angles determine the width of the scan line in radians (this is about 320 degrees)
d) the vertical section calls out a single narrow scan line
e) the range section defines the effective range (both minimum and maximum) in meters
3) always_on tells the simulator that the lidar always sends data
4) visualize tells the simulator to calculate the end points of the rays
Okay, this will create the lidar, but it is just the sensor system; we also need to model the housing.
So we add a new link to the model and a fixed joint to attach it to the robot chassis.
This needs to be added inside the robot model:
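A minimal sketch of that link and joint, assuming the housing is a small cylinder; the radius, length, mass, and the green material are placeholder values:

<link name='lidar_housing'>
  <pose relative_to='lidar_frame'>0 0 0 0 0 0</pose>
  <inertial>
    <mass>0.1</mass>
    <inertia>
      <ixx>0.0001</ixx>
      <iyy>0.0001</iyy>
      <izz>0.0001</izz>
    </inertia>
  </inertial>
  <visual name='lidar_housing_visual'>
    <geometry>
      <cylinder>
        <radius>0.1</radius>
        <length>0.1</length>
      </cylinder>
    </geometry>
    <material>
      <ambient>0 1 0 1</ambient>
      <diffuse>0 1 0 1</diffuse>
    </material>
  </visual>
  <collision name='lidar_housing_collision'>
    <geometry>
      <cylinder>
        <radius>0.1</radius>
        <length>0.1</length>
      </cylinder>
    </geometry>
  </collision>
</link>
<joint name='lidar_housing_joint' type='fixed'>
  <parent>chassis</parent>
  <child>lidar_housing</child>
</joint>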
The lidar housing is highlighted in green in the image below.
So at this point you can do a quick test.
1) Fire up Gazebo with your sdf file
2) From the pull-down menu, start the Key Publisher so that you can send commands to the robot
3) Add a new plugin called Visualize Lidar from the same pull-down menu
4) Scroll down to the Visualize Lidar interface and find the data source selector (the orange circle with the arrow), click in the box, and select /lidar
5) Start the simulator
As you drive the robot around you can see the lidar simulation. The dark segments are rays that hit; the lighter segments are out of range (either below the minimum or above the maximum range).
You can also toggle the check box labeled "Show Non Hitting Rays" to suppress the misses.
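If you want to see the raw messages Gazebo is publishing, independent of the GUI, you can echo the sensor topic from a terminal with the gz CLI:

gz topic -e -t /lidar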
Talking To ROS2
Now that we know everything is working on the Gazebo Harmonic side, let's send some data over to ROS2.
In the last tutorial, we set up a bridge from ROS2 to Harmonic that passed simple integers representing single characters. Now we need to send more complex data types. Both Gazebo and ROS2 have built-in data types for many sensors, and lidar is one of these. The data type is specified as sensor_msgs/msg/LaserScan on the ROS2 side and gz.msgs.LaserScan on the Gazebo side.
We can use a ros_gz bridge to translate from Gazebo to ROS. So let's set up a bridge for lidar data:
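Here is the command, which we will break down piece by piece (it appears again in the full launch sequence below):

source /opt/ros/jazzy/setup.bash && ros2 run ros_gz_bridge parameter_bridge /lidar@sensor_msgs/msg/LaserScan[gz.msgs.LaserScan --ros-args -r /lidar:=/laser_scan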
As we discussed before, this command does several things:
1) It launches ROS2 and runs the parameter_bridge node from the ros_gz_bridge package
2) It tells the bridge to create ROS2-based sensor_msgs/msg/LaserScan messages out of the Gazebo gz.msgs.LaserScan messages that come over the /lidar topic (the [ makes the bridge one-way, from Gazebo to ROS2)
This is the same thing we did in the last tutorial with Int32 messages. But then we tell it to do one more thing:
3) It publishes the new messages on a topic with a different name, mapping the /lidar topic onto the new /laser_scan topic.
This capability lets us merge different packages that might have namespace collisions by remapping the topic names.
So, we can now setup a system to run the simulation in Gazebo, collect the lidar data and send it over to ROS, and then let a ROS program pick the data up.
To do this we need to launch three applications: the ros_gz bridge, the Gazebo simulation, and the ROS monitor:
1) source /opt/ros/jazzy/setup.bash && ros2 run ros_gz_bridge parameter_bridge /lidar@sensor_msgs/msg/LaserScan[gz.msgs.LaserScan --ros-args -r /lidar:=/laser_scan
2) gz sim pong_world_5.sdf
3) source /opt/ros/jazzy/setup.bash && ros2 topic echo /laser_scan
Once the Gazebo Sim is up, activate the Key Publisher and start the sim.
You should immediately see the lidar data being displayed on the ROS screen. As you drive the robot around you should see the data on the ROS side change.
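As a quick sanity check, you can also confirm that the message rate on the ROS2 side matches the 10 Hz update_rate we set in the sensor:

source /opt/ros/jazzy/setup.bash && ros2 topic hz /laser_scan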
What we did
This has been a long one, but we got a lot done. We have:
1) Installed a lidar sensor by setting up a reference frame
2) Added a lidar housing link and a fixed joint to mount it to the chassis
3) Used the built-in "Visualize Lidar" display in Gazebo to see the lidar in action
4) Set up a bridge to pass the lidar data over to ROS2, and
5) Used a built-in ROS2 app to show the actual lidar data that is being sent over the bridge.
Now that we have the ability to get data from the simulation over to the ROS2 side, and the ability to send commands from the ROS2 side over to the simulated robot, wouldn't it be sweet if we could put in a robot controller to drive the robot autonomously? Next step: build and run an autonomous robot controller.