Raspberry Pi 5, Ubuntu, ROS2 Jazzy Jalisco, iRobot Create 3 – all playing together

This will be a rolling post as I work through this, in the hope that others can also find the help they need to put these things together. ROS can be a difficult mess just to get the initial plumbing up and running, before you even get to the fun stuff. Compatibility problems, difficult documentation, and the niche nature of ROS mean that those who attempt to jump in often get disappointed and stop before they can make the robot move. There will be times where this guide makes mistakes, skips parts, or is generally confusing, but I write it in the hope that my ramble as I work through the latest LTS stable release of ROS2 on the latest Raspberry Pi hardware will help others get past these difficult early parts.

For those who don’t know, the Raspberry Pi 5 only supports Ubuntu 23.10 onwards, because older kernels lack support for the Raspberry Pi 5 hardware. This creates a complex situation for those wanting to use the hardware with ROS2 Humble Hawksbill, as it only (easily) supports Ubuntu 22.04.

The only clean and easily replicable method here is to move onto the just-released LTS version of ROS2: Jazzy Jalisco. This is made for Ubuntu 24.04 LTS, which means we can finally fit these things together.

These are the broad steps to get rolling:
1. Image the SD card
Use the Raspberry Pi Imager tool to image Ubuntu 24.04 Server onto an SD card. You’ll find this option in the “Other general-purpose OS” section of the menu, then Ubuntu, and finally “Ubuntu Server 24.04.x LTS (64 bit)”. (We use Server because Desktop is not needed: we don’t want to eat up all the processing power and RAM rendering a desktop, SSH will suffice.)
IMPORTANT: make sure you use the options screen when you’re about to image to set up things like wifi and login credentials, and also enable SSH. This will mean your Pi will turn up on your network with credentials set and SSH ready to log in.

2. Update the iRobot Create 3
You’ll need to follow the instructions provided by iRobot. This usually involves going to the iRobot Create 3 web page, downloading the latest firmware file, powering on the robot, waiting for it to create its own wifi access point, connecting to it, opening the robot’s default config website, uploading the firmware file, and waiting for it to finish.
Update: It seems at the time of writing that they officially only support up to ROS2 “Iron” (an older stable release), but it should still be compatible with “Jazzy” (the latest long term support release).
https://iroboteducation.github.io/create3_docs/releases/i_0_0/
At this point I manually downloaded and uploaded to the robot the Cyclone DDS version of the firmware, as this didn’t need a discovery server and should “just work” happily with Jazzy Jalisco. (Note that Jazzy’s default middleware is actually Fast DDS, so we’ll explicitly switch the Raspberry Pi over to Cyclone DDS in a later step.)

3. Mount the Raspberry Pi 5 in the back case of the robot
This may involve 3D printing one of the many mounts available for Raspberry Pis to fit in the case, or doing whatever you see best to mount it safely. Be mindful of contact with metal etc.

4. Plugging the Raspberry Pi 5 into onboard power
There is a hidden USB C connector inside the back caddy of the robot, so with a short USB C to USB C cable you can power the Raspberry Pi, as well as provide it with USB networking to control the robot.

5. Setting up the Raspberry Pi 5 to run networking over USB
This one was a little complex, as it wasn’t immediately clear what was wrong, and there are mixed messages about the USB C port on the Raspberry Pi 5. Many say that it’s for power only, and various sources say data isn’t officially supported, but the USB C connector can carry data as well as power. Basically you have to load a kernel module to enable gadget ethernet over USB, then configure the new network interface onto the right subnet to reach the robot.

First, add the module to load on boot:
echo "g_ether" | sudo tee -a /etc/modules
This sets the “g_ether” (gadget ethernet) module to load on boot, which will create a new network interface called “usb0” when the Pi is plugged into the robot.
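If you don’t want to reboot to pick this up, you should be able to load the module immediately and sanity-check that it took (assuming it loads cleanly, usb0 will appear):
sudo modprobe g_ether
lsmod | grep g_ether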

Next, add the network config for this new network connection:
sudo bash -c 'cat > /etc/netplan/99-usb0.yaml << EOF
network:
  version: 2
  ethernets:
    usb0:
      dhcp4: no
      addresses:
        - 192.168.186.3/24
EOF'

This creates a new config file called “99-usb0.yaml” in the /etc/netplan folder and puts the config for the new network interface in place. Notice the address/subnet? That’s because the iRobot Create uses address 192.168.186.2/24 by default on this link. If your robot is configured differently, then change the address accordingly.

Apply the new netplan config:
sudo netplan apply

Check it worked:
ip addr show usb0
This should show your connection with an address assigned, and up.
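As a final end-to-end check, try pinging the robot’s default address over the new link (assuming your robot is still on the default 192.168.186.2):
ping -c 3 192.168.186.2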

6. Installing ROS2
This step I won’t spell out, as it’s covered well here:
https://docs.ros.org/en/jazzy/Installation/Ubuntu-Install-Debs.html
I would however recommend keeping it to the base packages (sudo apt install ros-jazzy-ros-base), because you don’t have a desktop installation of Ubuntu: keep the Pi to the basics, and connect using a laptop with ROS2 Jazzy installed on it to run any visualisation etc.
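One thing the install docs cover but that’s easy to miss: you need to source the ROS2 environment in every shell (on the laptop too), so it’s worth adding it to your bashrc:
source /opt/ros/jazzy/setup.bash
echo 'source /opt/ros/jazzy/setup.bash' >> ~/.bashrc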

7. Set up the correct middleware for ROS2 and the iRobot Create
The middleware is what is used to pass messages between different components in ROS2. We have to have the robot using the same middleware as the Raspberry Pi in order for all of the components to talk to each other.

You should have installed the CycloneDDS version of the firmware in a previous step. Now we want to install and set up the matching middleware on the Raspberry Pi.

Run:
sudo apt install ros-jazzy-rmw-cyclonedds-cpp
Which will install the CycloneDDS middleware for ROS2
Then:
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
Which will tell ROS2 to use it.
echo 'export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp' >> ~/.bashrc
To make this permanent, so you don’t have to export it each login.
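To double-check the middleware change took effect in a fresh shell, print the variable, or ask ros2 doctor (its report includes a middleware section):
printenv RMW_IMPLEMENTATION
ros2 doctor --report | grep -i middleware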

We then need to confirm that the robot is set up properly:
Go to the Application->Configuration menu of the robot’s webpage
Ensure that in “RMW_IMPLEMENTATION” setting it has “rmw_cyclonedds_cpp” selected
IMPORTANT: untick the “Enable Fast DDS discovery server?” setting (if you don’t it still appears to try and use FastDDS instead of CycloneDDS)
Press the menu item Application->Restart Application to restart it. The robot should then turn up in ROS2 discovery on the network.

Finally:
Run the command ros2 topic list and you should see something like this (probably way more):
/cmd_audio
/cmd_lightring
/cmd_vel
/cmd_vel_stamped
/parameter_events
/rosout

This means it’s visible and things are talking! If not, check you’ve done everything above, and ensured that the networking is up.
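You can also confirm actual data is flowing (not just discovery) by grabbing a single message from one of the robot’s topics, e.g. its odometry:
ros2 topic echo /odom --once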

8. Have some fun testing manual driving
Because we have our /cmd_vel topic exposed now to ROS2, we have the ability to send commands to drive.

First we’ll need to install a ROS2 package which is a lightweight command line keyboard controller:
sudo apt install ros-jazzy-teleop-twist-keyboard

Then we’ll run it with standard arguments (expects /cmd_vel to exist etc):
ros2 run teleop_twist_keyboard teleop_twist_keyboard
(there are instructions on screen)

You should now be able to drive the robot around!
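If you’d rather script a movement than drive by keyboard, a one-shot publish to /cmd_vel should give the robot a gentle spin (same geometry_msgs/msg/Twist type that teleop publishes):
ros2 topic pub --once /cmd_vel geometry_msgs/msg/Twist "{linear: {x: 0.0}, angular: {z: 0.5}}"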

9. Connect an RPLIDAR A1 on top and scan the room
A remote control robot is pretty boring; we want it to make its own decisions and drive itself, so we need some sensing of the environment and some mapping to be able to decide how and where to drive. This is called SLAM (Simultaneous Localisation And Mapping), and to get there we need spatial awareness.

We’re going to use a cheap and easy to get RPLIDAR A1 2D lidar to see with. I 3D printed a bracket to mount it on top in the middle of the robot to make it simple for now. Connect it to the Raspberry Pi with a USB cable.

We will now create a build environment and grab the source for the LIDAR driver to build.

Create the ROS workspace to build from in the home directory:
mkdir -p ~/ros2_ws/src
Move to the src directory inside:
cd ~/ros2_ws/src
Clone the source code from the Slamtec github:
git clone https://github.com/Slamtec/sllidar_ros2.git
Move back to the top of the ROS workspace:
cd ~/ros2_ws
Grab the system dependencies this source code will want:
rosdep install --from-paths src --ignore-src -r -y
Build the driver/utilities for the LIDAR:
colcon build --symlink-install
“Source” the build environment to overlay the current ROS2 system environment (allows for the new driver to be used in place, not having to install it systemwide):
source install/setup.bash

We’re ready to launch the ROS node for the lidar (this is for the defaults with my A1 LIDAR, if yours is different you will need a different launch file – my launch file is sllidar_a1_launch.py):
ros2 launch sllidar_ros2 sllidar_a1_launch.py serial_port:=/dev/ttyUSB0 serial_baudrate:=115200 frame_id:=laser

Let’s check that the topic for /scan exists with:
ros2 topic list

If you see it there, great – we appear to be running.

But it’s not much use unless we can actually see the output.
One method is to just run:
ros2 topic echo /scan
But you’ll be quickly overwhelmed with data – we humans need visuals!
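A gentler sanity check is to measure the publish rate rather than dumping the data (the A1 should land somewhere around 5–10 Hz per full scan, roughly):
ros2 topic hz /scan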

ROS2 uses rviz2 as a tool to visualise data.

It’s best you don’t run this on the Raspberry Pi, so install the ROS2 Jazzy Desktop package onto your own laptop. This can be messy if your system isn’t compatible, but I’ll leave you to follow the instructions and figure that out. On my laptop running Ubuntu 22.04 it was messy, so I just decided to run up a virtual Ubuntu 24.04 desktop that I could install ROS2 Jazzy in.

Then we can run rviz2 to see the scan data from the /scan topic

Or can we? No…

We’ve created a new problem here: we are now jumping across multiple networks, and hoping that the middleware (CycloneDDS) will jump across these worlds with us via its multicast discovery. It unfortunately won’t.

We’ll have to set up cycloneDDS to unicast to particular endpoints or subnets instead using the only unifying point on the network that all parties can reach: the Raspberry Pi onboard the robot.

So we’ll configure ROS on both the Raspberry Pi, and the laptop/VM to unicast to definite endpoints instead of relying on network broadcasts for discoveries.

On the Raspberry Pi, we’ll create a new file in the home directory called cyclonedds.xml and put this in it (using nano or another commandline text editor of choice):
<CycloneDDS>
  <Domain id="any">
    <General>
      <Interfaces>
        <NetworkInterface name="wlan0"/>
        <NetworkInterface name="usb0"/>
      </Interfaces>
    </General>
    <Discovery>
      <Peers>
        <Peer address="192.168.20.101"/> <!-- The IP of the laptop/VM -->
        <Peer address="192.168.186.2"/> <!-- The IP of the iRobot -->
      </Peers>
    </Discovery>
  </Domain>
</CycloneDDS>


And to export this as an environment variable for ROS to find it, type this at the commandline:
export CYCLONEDDS_URI=file://$HOME/cyclonedds.xml

And to make this persist across logins/reboots, add it to your bashrc file that is read/sourced each time you login:
echo 'export CYCLONEDDS_URI=file://$HOME/cyclonedds.xml' >> ~/.bashrc

This ensures that ROS2 on the Raspberry Pi points both at the robot via its USB network link, and at the laptop/VM via the wifi network link.

Now we must do the same on the Laptop/VM to make it point back at the Raspberry Pi:

Again, we put the following in a file called cyclonedds.xml in the home directory (enp0s3 is the name of the network adaptor on mine, adjust yours accordingly by checking with “ip address” at the commandline on the laptop/VM):
<CycloneDDS>
  <Domain id="any">
    <General>
      <Interfaces>
        <NetworkInterface name="enp0s3"/>
      </Interfaces>
    </General>
    <Discovery>
      <Peers>
        <Peer address="192.168.20.117"/> <!-- The IP of the RPi5 -->
      </Peers>
    </Discovery>
  </Domain>
</CycloneDDS>

And again export this environment variable, and add it to the bashrc of the laptop/VM:
export CYCLONEDDS_URI=file://$HOME/cyclonedds.xml

echo 'export CYCLONEDDS_URI=file://$HOME/cyclonedds.xml' >> ~/.bashrc

Now we can run the LIDAR driver on the Raspberry Pi:
ros2 launch sllidar_ros2 sllidar_a1_launch.py serial_port:=/dev/ttyUSB0 serial_baudrate:=115200 frame_id:=laser

Making sure that runs successfully, we then jump to our laptop/VM and try to look for published ROS topics made available by the LIDAR – this should be at minimum /scan:
ros2 topic list

If you’re lucky you’ll see a whole bunch more. I’m not a super expert on DDS messaging, but it seems to me like my Raspberry Pi is also acting as a relay, passing through the topics from the Robot itself, which is more than I had hoped for!

If you’ve been trying to make this work unsuccessfully to this point, reboot both machines, you may have hanging processes, or topics stuck with one of the instances that keep causing conflicts.
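Before reaching for a full reboot, it’s also worth a lighter reset: the ros2 command line tools run a background discovery daemon that caches what it has seen, and it won’t pick up the new CYCLONEDDS_URI until restarted (the next ros2 command starts it again automatically):
ros2 daemon stop
ros2 topic list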

We can now finally run rviz2 on the laptop/VM, since the topics are turning up.

Type at the commandline on the laptop/VM:
rviz2

You’ll see a window open. First things first: because we just want to see the scan output initially, and we don’t have a proper stack with a map, origin body etc, find the “Fixed Frame” setting in the top left pane and change it from the default of “map” to “laser”, which is the frame we’re running the LIDAR driver with (remember we put “laser” at the end for the “frame_id” in the command?).

Now press the “Add” button, go to the “By topic” tab in the window that pops up, and you should see “/scan” in the available topics. Choose the “LaserScan” entry under it, and you should now see your scan output results from the LIDAR!

Take a breath – this is a good moment!

So we now have 2D LIDAR scan results flowing, we have topics and messaging passing around our robotics network, and we have the ability to drive the robot.

10. Build a map of a space to keep for future navigation

Now we are going to use a SLAM engine to gather a couple of things:
– Wheel odometry
– 2D Lidar Scan results

And try to create a map so that the robot in future can decide how to navigate places on its own. We will remote control the robot via SSH on the Raspberry Pi, and watch the map grow on screen with rviz2.

We’re going to use the built-in “SLAM Toolbox” in ROS2 as a general all-rounder for this.
Install it on the Raspberry Pi with:
sudo apt install ros-jazzy-slam-toolbox

But before we run it:
Previously we’d launched the LIDAR driver with:
ros2 launch sllidar_ros2 sllidar_a1_launch.py serial_port:=/dev/ttyUSB0 serial_baudrate:=115200 frame_id:=laser

But that frame_id is not linked to the base of the robot; it’s off in its own world. So we will kill off that process and launch it instead with:
ros2 launch sllidar_ros2 sllidar_a1_launch.py serial_port:=/dev/ttyUSB0 serial_baudrate:=115200 frame_id:=base_link

Now this is slightly lazy, as ideally we have a topology set up that places the laser where it actually sits on the robot, but for now, let’s just treat the laser as the origin at the centre of the robot to make things easy. Later on we’ll build a proper model of the robot, with transforms putting the sensors where they actually live on the robot.
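For reference, the non-lazy version is just a static transform placing the laser frame relative to base_link – something like this sketch, where the offsets are made up (measure your own mount), and you’d keep frame_id:=laser on the driver:
ros2 run tf2_ros static_transform_publisher --x 0.0 --y 0.0 --z 0.15 --frame-id base_link --child-frame-id laser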

Now it’s time to actually launch the SLAM Toolbox which will take all available sensor inputs (wheel odometry from the robot – /odom, distances in 360 degrees from the 2D LIDAR from /scan) and begin to build a map which will be available at topic /map:
ros2 launch slam_toolbox online_async_launch.py

Back to rviz2, if you set the fixed frame back to “map”, and add by topic the /map, you’ll now start to see the beginning of a simple map being built.

We’ll need to drive the robot around to make the map grow and refine, so in another terminal SSH to your Raspberry Pi and run the remote control tool we used above to drive it around your room/home:
ros2 run teleop_twist_keyboard teleop_twist_keyboard

So it doesn’t work? Yes, that’s right. It will begin to show an occupancy map, but we’re not going to get much sense out of it (it’ll make a huge mess), because the odometry from the robot base isn’t being transformed properly to work with SLAM against the base link of the body and everything else going on, and it needs filtering and fusion with other sensors to work properly.

QUICK STOP HERE: I’ve come back from the future to point out that although we can see these things, they’re not working properly, because the clock is slightly off on the iRobot Create (unsure why – probably the remote time servers it uses by default being slightly off), and messaging only works properly when everyone shares a clock close enough to avoid alignment problems. This took me a while to figure out further down, as my SLAM engine just couldn’t get it together.

So? We have to install something called chrony (a network time keeper) on the Raspberry Pi to act as the master clock, then reconfigure the iRobot Create to point at it for network time so that their clocks align closely.

Install the time server on the RPi:
sudo apt install chrony

Configure the timeserver using nano to edit the config:
sudo nano /etc/chrony/chrony.conf
Go to the bottom of the file and put:
allow 192.168.186.0/24
local stratum 10

This allows the iRobot subnet (the usb gadget link) to be able to access time from the RPi.
Restart the chrony service to read the config:
sudo systemctl restart chrony

Now we have to go to the web interface of the iRobot Create, go to the “beta features” menu, and click “edit ntp.conf”. All you need to have in here is:
server 192.168.186.1 prefer iburst minpoll 4 maxpoll 4

Be sure to restart the iRobot and give it some time to catch up its clock. It won’t always happen immediately, as it doesn’t like big time skips.
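You can confirm the sync from the Raspberry Pi’s side with chrony’s own tools – clients should show the robot polling us, and tracking shows the state of our own clock:
sudo chronyc clients
chronyc tracking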

Now back to robot localisation:

We’re going to install a package:
sudo apt install ros-jazzy-robot-localization

Now we will create some preferences for the localisation. Let’s put this in a new config folder under the ros2_ws folder in our home directory for neatness – call it ekf_odom.yaml (full path ~/ros2_ws/config/ekf_odom.yaml):
mkdir -p ~/ros2_ws/config
nano ~/ros2_ws/config/ekf_odom.yaml


This is my config file contents:

ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    sensor_timeout: 1.0
    two_d_mode: true                # Assuming Create 3 is 2D motion only
    odom0: /odom
    odom0_config: [true,  true,  false,   # x, y, z
                   false, false, true,    # roll, pitch, yaw
                   true,  true,  false,   # vx, vy, vz
                   false, false, true,    # vroll, vpitch, vyaw
                   false, false, false]   # ax, ay, az
    odom0_differential: false
    odom0_relative: false
    imu0: /imu/data
    imu0_config: [false, false, false,
                  true,  true,  true,
                  false, false, false,
                  false, false, true,
                  true,  true,  true]
    imu0_differential: false
    publish_tf: true
    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    transform_time_offset: 0.1

Why EKF? Extended Kalman Filter: it takes noisy and imperfect inputs, weighs them against each other (e.g. wheel measurements vs IMU) and decides roughly where it thinks the robot must be (its pose).

Let’s now launch our localization node using the preferences file above:
ros2 run robot_localization ekf_node --ros-args --params-file ~/ros2_ws/config/ekf_odom.yaml
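Before moving on, it’s worth checking the fused output is actually publishing, and at roughly the 30 Hz we configured:
ros2 topic hz /odometry/filtered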

Great! So we have inputs being smoothed now, ready for a SLAM engine to process and confidently build a map.

The SLAM engine can run in two modes:
1. Mapping – creating a map
2. Localisation – using an existing map to figure out where we are

We don’t run in mapping mode constantly, as we could possibly get lost and make a messy map, so generally speaking we map, then use a nice clean map as a base for localising the robot (pose).

Let’s create a config file for the mapping mode. Create a new file in the config folder:
nano ~/ros2_ws/config/slam_toolbox_mapping_config.yaml

The contents of mine:

slam_toolbox:
  ros__parameters:
    solver_plugin: "solver_plugins::CeresSolver"
    solver_threads: 4               # max number compiled
    ceres_solver_options:
      linear_solver_type: "SPARSE_NORMAL_CHOLESKY"
    mode: "mapping"
    map_frame: "map"
    odom_frame: "odom"              # <-- This matches what EKF publishes
    odom_topic: "/odometry/filtered"
    base_frame: "base_link"
    scan_frame: "base_link"
    scan_topic: "/scan"
    use_scan_matching: true
    use_scan_barycenter: false
    map_update_interval: 0.2
    resolution: 0.05                # Map resolution (meters per pixel)
    max_laser_range: 5.0            # 12m is full range, reliability drops past 5-6m
    update_min_d: 0.02              # Minimum movement before update (distance)
    update_min_a: 0.02              # Minimum rotation before update (radians)
    transform_publish_period: 0.05  # How often to publish tf transforms
    use_pose_extrapolator: true
    transform_timeout: 1.5
    tf_buffer_duration: 10.0        # seconds

You may want to tweak these for your purposes, but they work for me.

Let’s now launch the Slam Toolbox mapping node:
ros2 launch slam_toolbox online_async_launch.py slam_params_file:=$HOME/ros2_ws/config/slam_toolbox_mapping_config.yaml log_level:=debug

You’ll see some messages flowing by, but don’t be alarmed if you see some messages like:
Failed to compute odom pose
for just a few repeats. This is just everything being a little slow to load, and while mapping it may happen some more as the CPU on the RPi struggles to keep up. It’s only a concern if you see it rolling non-stop; in that case you probably don’t have a synchronised clock on the iRobot, as described above.

Launch rviz2 on your laptop/VM to see the beginnings of the map being built by adding the /map topic (“Add” → “By topic”), and also add /pose to see where the SLAM engine thinks your robot is.

In another terminal window launch the keyboard remote control again, and carefully and slowly drive your iRobot Create around your space, watching the /map topic with rviz2 (above) and trying to stay out of its vision. You may need a couple of goes before you get the hang of it. To clear the map, just kill off the SLAM engine and relaunch it (in case it gets messy etc).

Once you have a solid map happening and all of the walls around your space are mapped enough, close your keyboard remote control in the terminal, and make a couple of service calls to save the map and data.

Save the occupancy grid if you want to – we don’t seem to need this anymore (change the filename to whatever you’re mapping):
ros2 service call /slam_toolbox/save_map slam_toolbox/srv/SaveMap "{name: {data: 'apartment_map'}}"
(you should get a result of 0 – if not, then it didn’t save)

Save the serialised pose graph (this is important for localisation):
ros2 service call /slam_toolbox/serialize_map slam_toolbox/srv/SerializePoseGraph "{filename: 'apartment_map'}"

These will turn up in the folder relative to where slam_toolbox was launched in our other terminal. You’ll want to move these maps to somewhere you know, like a maps folder in our workspace:
mkdir -p ~/ros2_ws/maps
And from the folder in which the maps have been saved (there should be a few files named apartment_map, each with a different extension on the end):
mv ./apartment_map.* ~/ros2_ws/maps/

That’s it – we have a solid map saved! Now quit the slam-toolbox (ctrl-c in the terminal running it).

We are now going to create a config file for the localisation mode only of slam_toolbox:
nano ~/ros2_ws/config/slam_toolbox_localisation_config.yaml

This is the contents of mine:

slam_toolbox:
  ros__parameters:
    solver_plugin: "solver_plugins::CeresSolver"
    solver_threads: 4               # max number compiled
    ceres_solver_options:
      linear_solver_type: "SPARSE_NORMAL_CHOLESKY"
    mode: "localization"
    map_file_name: "/home/jesse/ros2_ws/maps/apartment_map"
    map_frame: "map"
    odom_frame: "odom"              # <-- This matches what EKF publishes
    odom_topic: "/odometry/filtered"
    base_frame: "base_link"
    scan_frame: "base_link"
    scan_topic: "/scan"
    use_scan_matching: true
    resolution: 0.05                # Map resolution (meters per pixel)
    max_laser_range: 5.0            # 12m is full range, reliability drops past 5-6m
    update_min_d: 0.02              # Minimum movement before update (distance)
    update_min_a: 0.02              # Minimum rotation before update (radians)
    transform_publish_period: 0.05  # How often to publish tf transforms
    use_pose_extrapolator: true
    transform_timeout: 1.5
    tf_buffer_duration: 10.0        # seconds

Now we can launch a localisation only engine that doesn’t change the map, but still provides mapping services:
ros2 launch slam_toolbox localization_launch.py slam_params_file:=$HOME/ros2_ws/config/slam_toolbox_localisation_config.yaml

Not very impressive, huh? It only really works slowly, and only if you start from the same position each time (where the map started).

So we will launch a map server, host the map on a node, then use other tools to provide localisation, and navigation (route planning, control etc).

But… things are getting messy, and our config files and maps are just sitting in a folder under our ~/ros2_ws/ folder… It’s time to neaten up and create a total robot package, because a robot isn’t just the hardware, it’s the collection of software and configuration as a whole.

We’re going to create a new package for ROS2 under ~/ros2_ws/src/ that will begin to hold our configs, maps, and any special launch packages, so eventually it’ll all be in a single place.

Do:
cd ~/ros2_ws/src

ros2 pkg create --build-type ament_python myrobot --dependencies launch_ros rclpy
(of course change the name of “myrobot” if you want to name it something else)
This will create the directory structure in the src folder ready to begin adding your files.

We’ll create a config folder inside our new structure:
mkdir ~/ros2_ws/src/myrobot/config

And move things we’ve maybe been keeping in other folders here (adjust to suit):
mv ~/ros2_ws/config/* ~/ros2_ws/src/myrobot/config/
mv ~/ros2_ws/maps/* ~/ros2_ws/src/myrobot/config/

For the rest of this blog I’ll assume that our configs are all in this new folder structure.

For the Navigation and localisation components, we’re going to need a bunch of configs and install some packages, as well as launch them:
(for the packages, you could also just install with sudo apt install ros-jazzy-navigation2 which would pull most if not all of the below)

1. AMCL (Adaptive Monte Carlo Localisation – figures out where the robot is on the map):
Install package:
sudo apt install ros-jazzy-nav2-amcl
Config:
nano ~/ros2_ws/src/myrobot/config/amcl_params.yaml
Contents of my config file:
amcl:
  ros__parameters:
    use_sim_time: false
    scan_topic: "/scan"
    odom_topic: "/odometry/filtered"   # use fused odometry
    odom_frame_id: "odom"
    base_frame_id: "base_link"
    global_frame_id: "map"
    update_min_d: 0.2
    update_min_a: 0.2
    min_particles: 500
    max_particles: 2000
    alpha1: 0.2
    alpha2: 0.2
    alpha3: 0.2
    alpha4: 0.2
    alpha5: 0.2
    laser_max_range: 5.0               # match your RPLIDAR range if limiting
    laser_min_range: 0.05
    pf_err: 0.05
    pf_z: 0.99
    resample_interval: 1
    transform_tolerance: 1.0

Launch:
ros2 run nav2_amcl amcl --ros-args --params-file ~/ros2_ws/src/myrobot/config/amcl_params.yaml
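AMCL also needs an initial pose guess before it starts tracking. You can set one from rviz2 with the “2D Pose Estimate” button, or publish one directly – a sketch assuming the robot is sitting at the map origin:
ros2 topic pub --once /initialpose geometry_msgs/msg/PoseWithCovarianceStamped "{header: {frame_id: 'map'}, pose: {pose: {position: {x: 0.0, y: 0.0}, orientation: {w: 1.0}}}}"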

2. NAV2 map server (hosts the pre-built map we made earlier)
Install package:
sudo apt install ros-jazzy-nav2-map-server
Config:
nano ~/ros2_ws/src/myrobot/config/map_server_params.yaml
Contents of my config file:
map_server:
  ros__parameters:
    use_sim_time: false
    yaml_filename: "/home/jesse/ros2_ws/src/myrobot/config/apartment_map.yaml"

Launch:
ros2 run nav2_map_server map_server --ros-args --params-file ~/ros2_ws/src/myrobot/config/map_server_params.yaml

3. NAV2 Planner (plans global path through map)

Install packages:
sudo apt install ros-jazzy-nav2-navfn-planner ros-jazzy-nav2-planner ros-jazzy-nav2-util
Config:

nano ~/ros2_ws/src/myrobot/config/planner_server_params.yaml
Contents of my config file:
planner_server:
  ros__parameters:
    odom_topic: "/odometry/filtered"   # use fused odometry
    expected_planner_frequency: 1.0
    planner_plugins: ["GridBased"]
    GridBased:
      plugin: "nav2_navfn_planner::NavfnPlanner"

Launch:
ros2 run nav2_planner planner_server --ros-args --params-file ~/ros2_ws/src/myrobot/config/planner_server_params.yaml

4. NAV2 Controller (drives the robot along the path)
Install packages:
sudo apt install ros-jazzy-nav2-controller ros-jazzy-dwb-core ros-jazzy-dwb-critics ros-jazzy-dwb-plugins
Config:
nano ~/ros2_ws/src/myrobot/config/controller_server_params.yaml
Contents of my config file:
controller_server:
  ros__parameters:
    odom_topic: "/odometry/filtered"   # use fused odometry
    use_sim_time: false
    controller_frequency: 10.0
    controller_plugins: ["FollowPath"]
    FollowPath:
      plugin: "dwb_core::DWBLocalPlanner"
      debug_trajectory_details: false
      min_vel_x: 0.0
      max_vel_x: 0.26
      min_vel_theta: -1.0
      max_vel_theta: 1.0
      acc_lim_x: 2.5
      acc_lim_theta: 3.2
      decel_lim_x: -2.5
      decel_lim_theta: -3.2
      vx_samples: 20
      vtheta_samples: 40
      xy_goal_tolerance: 0.25
      yaw_goal_tolerance: 0.25
      stateful: True
      critics:
        - RotateToGoal
        - Oscillation
        - BaseObstacle
        - GoalAlign
        - PathAlign
        - PathDist
        - GoalDist
      # Example critic configs (optional tuning)
      PathAlign:
        scale: 32.0
      GoalAlign:
        scale: 24.0
      PathDist:
        scale: 32.0
      GoalDist:
        scale: 24.0
      RotateToGoal:
        scale: 32.0

Launch:
ros2 run nav2_controller controller_server --ros-args --params-file ~/ros2_ws/src/myrobot/config/controller_server_params.yaml

5. NAV2 Behaviours Server (provides recovery behaviours like spin, wait, and backup – needed by the BT Navigator that follows):
Install packages:
sudo apt install ros-jazzy-nav2-behaviors
Config:
nano ~/ros2_ws/src/myrobot/config/behavior_server_params.yaml
Contents of my config:
behavior_server:
  ros__parameters:
    costmap_topic: "/local_costmap/costmap_raw"
    footprint_topic: "/local_costmap/footprint"
    use_sim_time: false
    behavior_plugins: ["spin", "backup", "wait"]   # recent NAV2 uses behavior_plugins (formerly recovery_plugins)
    spin:
      plugin: "nav2_behaviors::Spin"
    backup:
      plugin: "nav2_behaviors::BackUp"
    wait:
      plugin: "nav2_behaviors::Wait"

Launch:
ros2 run nav2_behaviors behavior_server --ros-args --params-file ~/ros2_ws/src/myrobot/config/behavior_server_params.yaml

6. NAV2 Behaviour Tree Navigator (coordinates behaviour like “go to pose” etc)
Install packages:
sudo apt install ros-jazzy-nav2-bt-navigator ros-jazzy-nav2-behavior-tree
Config:
nano ~/ros2_ws/src/myrobot/config/bt_navigator_params.yaml
Contents of my config:
bt_navigator:
  ros__parameters:
    odom_topic: "/odometry/filtered"   # use fused odometry
    default_nav_to_pose_bt_xml: "/opt/ros/jazzy/share/nav2_bt_navigator/behavior_trees/navigate_w_replanning_only_if_path_becomes_invalid.xml"
    use_sim_time: false

Launch:
ros2 run nav2_bt_navigator bt_navigator --ros-args --params-file ~/ros2_ws/src/myrobot/config/bt_navigator_params.yaml

7. NAV2 lifecycle manager (coordinates starting and stopping of NAV2 components)**
None of the above NAV2 components, including AMCL, will actually fully start on their own; they wait in a deactivated state until told to configure and activate through lifecycle commands, as part of the complex NAV2 lifecycle ecosystem. See the note below this block on the lifecycle manager.
Install Packages:
sudo apt install ros-jazzy-nav2-lifecycle-manager
Config:
nano ~/ros2_ws/src/myrobot/config/lifecycle_manager_nav2.yaml
Contents of my config:
lifecycle_manager_navigation:
  ros__parameters:
    autostart: true                 # Automatically configures and activates the below
    node_names:
      - map_server
      - amcl
      - planner_server
      - controller_server
      - behavior_server
      - bt_navigator
    use_sim_time: false

Launch:
ros2 run nav2_lifecycle_manager lifecycle_manager --ros-args --params-file ~/ros2_ws/src/myrobot/config/lifecycle_manager_nav2.yaml

**NOTES ON THE LIFECYCLE MANAGER:
You can manually start each of the above NAV2 components yourself without using the lifecycle manager initially to check that they run and ensure they’re configured correctly by bringing them up one at a time in order and activating them so you can see the logs.
For example if we want to configure and enable the AMCL server:
Run it like above:
ros2 run nav2_amcl amcl --ros-args --params-file ~/ros2_ws/src/myrobot/config/amcl_params.yaml
In another terminal, tell it to configure (the node name is the 4th argument above, “amcl”):
ros2 lifecycle set /amcl configure
(this reads the config file, and you’ll see errors if there is a problem loading)
Now set it to activate and start running:
ros2 lifecycle set /amcl activate
This way you can make sure each component runs properly before bundling it all together with the lifecycle manager which can get difficult.
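You can also inspect where each node sits in its lifecycle at any point, which is handy when something silently refuses to publish:
ros2 lifecycle nodes
ros2 lifecycle get /amcl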

It would be fun to just keep going here, but we are really starting to get messy, and starting each thing up is becoming a pain, so before we continue, let’s:
1. test the individual components (including adding the NAV2 components)
2. Create a unified launch for all of these components to start and stop neatly, so we’re not continuing to chew up terminal windows.

Recap on tools we’re bringing up now that we already have a saved map:

1. EK filter to fuse and filter IMU and wheel odometry for better localisation:
ros2 run robot_localization ekf_node --ros-args --params-file ~/ros2_ws/src/myrobot/config/ekf_odom.yaml
(provides /odometry/filtered topic)

2. LIDAR driver for RPLIDAR A1 sensor:
ros2 launch sllidar_ros2 sllidar_a1_launch.py serial_port:=/dev/ttyUSB0 serial_baudrate:=115200 frame_id:=base_link
(provides /scan topic)

3. NAV2 map server with our existing map:
ros2 run nav2_map_server map_server --ros-args --params-file ~/ros2_ws/src/myrobot/config/map_server_params.yaml
(provides /map topic)

4. AMCL localising server to know where we are:
ros2 run nav2_amcl amcl --ros-args --params-file ~/ros2_ws/src/myrobot/config/amcl_params.yaml

5. NAV2 planner server to plan routes
ros2 run nav2_planner planner_server --ros-args --params-file ~/ros2_ws/src/myrobot/config/planner_server_params.yaml

6. NAV2 controller server to drive the robot on route
ros2 run nav2_controller controller_server --ros-args --params-file ~/ros2_ws/src/myrobot/config/controller_server_params.yaml

7. NAV2 Behaviours Server (provides recovery behaviours like spin, wait, and backup – needed by following BT Navigator):
ros2 run nav2_behaviors behavior_server --ros-args --params-file ~/ros2_ws/src/myrobot/config/behavior_server_params.yaml

8. NAV2 behaviour tree navigator to set behaviour
ros2 run nav2_bt_navigator bt_navigator --ros-args --params-file ~/ros2_ws/src/myrobot/config/bt_navigator_params.yaml

9. NAV2 lifecycle manager to activate and monitor the NAV2 components (if using yet)
ros2 run nav2_lifecycle_manager lifecycle_manager --ros-args --params-file ~/ros2_ws/src/myrobot/config/lifecycle_manager_nav2.yaml

Try testing each of the above. It’s messy, isn’t it? It’s also very hard to manage, with a high risk of accidentally running two instances of a service if you’re not careful. So we’ll now create what is called a bringup package that does all of the above in a single command and dies gracefully when quitting.

We’ll put a special launch file into our myrobot package (create the launch folder first):
mkdir -p ~/ros2_ws/src/myrobot/launch
nano ~/ros2_ws/src/myrobot/launch/bringup_launch.py

Contents of the file (sorry if formatting gets messed up, this is not easy in WordPress):

from launch import LaunchDescription
from launch_ros.actions import Node
from launch.actions import TimerAction
from ament_index_python.packages import get_package_share_directory

def generate_launch_description():
	bringup_dir = get_package_share_directory('myrobot')
	ekf_params = bringup_dir + '/config/ekf_odom.yaml'
	map_params = bringup_dir + '/config/map_server_params.yaml'
	amcl_params = bringup_dir + '/config/amcl_params.yaml'
	controller_params = bringup_dir + '/config/controller_server_params.yaml'
	planner_params = bringup_dir + '/config/planner_server_params.yaml'
	behavior_params = bringup_dir + '/config/behavior_server_params.yaml'
	bt_params = bringup_dir + '/config/bt_navigator_params.yaml'

	# EKF node
	ekf_node = Node(
		package='robot_localization',
		executable='ekf_node',
		name='ekf_filter_node',
		output='screen',
		parameters=[ekf_params]
	)
	# LIDAR node
	lidar_node = Node(
		package='sllidar_ros2',
		executable='sllidar_node',
		name='sllidar_node',
		output='screen',
		parameters=[{'serial_port': '/dev/ttyUSB0', 'serial_baudrate': 115200, 'frame_id': 'base_link'}]  # the driver expects 'serial_baudrate' (not 'baud_rate'); frame_id matches the SLAM configs
	)
	# Map server
	map_server = Node(
		package='nav2_map_server',
		executable='map_server',
		name='map_server',
		output='screen',
		parameters=[map_params]
	)
	# AMCL localization
	amcl = Node(
		package='nav2_amcl',
		executable='amcl',
		name='amcl',
		output='screen',
		parameters=[amcl_params]
	)
	# Controller server (local costmap)
	controller_server = Node(
		package='nav2_controller',
		executable='controller_server',
		name='controller_server',
		output='screen',
		parameters=[controller_params]
	)
	# Planner server (global costmap)
	planner_server = Node(
		package='nav2_planner',
		executable='planner_server',
		name='planner_server',
		output='screen',
		parameters=[planner_params]
	)
	# Behaviour server (recovery behaviours)
	behavior_server = Node(
		package='nav2_behaviors',
		executable='behavior_server',
		name='behavior_server',
		output='screen',
		parameters=[behavior_params]
	)
	# BT Navigator
	bt_navigator = Node(
		package='nav2_bt_navigator',
		executable='bt_navigator',
		name='bt_navigator',
		output='screen',
		parameters=[bt_params]
	)
	# Lifecycle Manager
	lifecycle_manager_node = Node(
		package='nav2_lifecycle_manager',
		executable='lifecycle_manager',
		name='lifecycle_manager_navigation',
		output='screen',
		parameters=[{
			'use_sim_time': False,
			'autostart': True,
			'node_names': [
				'map_server',
				'amcl',
				'controller_server',
				'planner_server',
				'behavior_server',
				'bt_navigator'
			]
		}]
	)

	lifecycle_manager = TimerAction(
		period=5.0,  # wait 5 seconds for other nodes
		actions=[lifecycle_manager_node]
	)



	return LaunchDescription([
		ekf_node,
		lidar_node,
		map_server,
		amcl,
		controller_server,
		planner_server,
		behavior_server,
		bt_navigator,
		lifecycle_manager
	])

Now add/edit a file in your ~/ros2_ws/src/myrobot folder called setup.py (we’re running a python ament build). This is what the build system uses to put files in the right place etc:
nano ~/ros2_ws/src/myrobot/setup.py
Contents of mine:
from setuptools import find_packages, setup
import os
from glob import glob

package_name = 'myrobot'

setup(
    name=package_name,
    version='0.0.0',
    packages=find_packages(exclude=['test']),
    data_files=[
        ('share/ament_index/resource_index/packages',
            ['resource/' + package_name]),
        ('share/' + package_name, ['package.xml']),
        # THIS installs the launch files:
        (os.path.join('share', package_name, 'launch'), glob('launch/*.py')),
        # And this installs your config:
        (os.path.join('share', package_name, 'config'), glob('config/*.yaml')),
    ],
    install_requires=['setuptools'],
    zip_safe=True,
    maintainer='jesse',
    maintainer_email='jesse@cake.net.au',
    description='Basic package for my robot',
    license='TODO: License declaration',
    tests_require=['pytest'],
    entry_points={
        'console_scripts': [],
    },
)

You should already have a package.xml in the folder from creating the package earlier on.

Now we can build the “myrobot” package ready for ros2 to call in standard commands:
Go to the root directory of the workspace:
cd ~/ros2_ws/
Run the build process:
colcon build --symlink-install
Source the workspace again:
source install/setup.bash

Now we can finally just call a single launch command to bring the whole lot up!
ros2 launch myrobot bringup_launch.py
(if you have problems, you can get debugging info from your launch with ros2 launch myrobot bringup_launch.py --debug )
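A quick way to confirm the whole stack came up is to list the running nodes – you should see the EKF, the lidar driver, and all of the NAV2 servers from the launch file:
ros2 node list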

Hmmm… we seem to have transforms not working, let’s check our lidar is working properly

To be continued…




This is a rolling blog of my journey on this particular project, so I’ll keep adding as I work through it.
