This will be a rolling post as I work through this, in the hope that others can also find the help they need to put these things together.
For those who don't know, the Raspberry Pi 5 only supports Ubuntu 23.10 onwards, because earlier kernels lack support for the Raspberry Pi 5 hardware. This creates an awkward situation for those wanting to use the hardware with ROS2 Humble Hawksbill, as it only (easily) supports Ubuntu 22.04.
The only clean and easily replicable method here is to move onto the newly released LTS version of ROS2: Jazzy Jalisco. This is built for Ubuntu 24.04 LTS, which means we can finally fit these things together.
These are the broad steps to get rolling:
1. Image the SD card
Use the Raspberry Pi Imager tool to image Ubuntu 24.04 Server onto an SD card. You'll find this option in the "Other general-purpose OS" section of the menu, then Ubuntu, and finally "Ubuntu Server 24.04.x LTS (64 bit)". (We use Server because Desktop isn't needed; we don't want to eat up all the processing power and RAM rendering a desktop, and SSH will suffice.)
IMPORTANT: make sure you use the options screen when you're about to image to set up things like wifi and login information, and also to enable SSH. This means your Pi will turn up on your network with credentials set and SSH ready to log in to.
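Once it boots, you can confirm it's reachable from your workstation with something like the below (the username is whatever you set in the imager options, and you can find the Pi's address from your router or a network scan):
ssh <your-user>@<pi-ip-address>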
2. Update the iRobot Create 3
You'll need to follow the instructions provided by iRobot. This usually involves going to the iRobot Create 3 web page, downloading the latest firmware file, powering on the robot, waiting for it to create its own wifi access point, connecting to that access point, opening the robot's default config website, uploading the firmware file, and waiting for it to finish.
Update: at the time of writing, iRobot officially only support up to ROS2 "Iron" (an older stable release), but the firmware should still be compatible with "Jazzy" (the latest long term support release).
https://iroboteducation.github.io/create3_docs/releases/i_0_0/
At this point I manually downloaded, and uploaded to the robot, the Cyclone DDS version of the firmware, as this didn't need a discovery server and should "just work" happily with Jazzy Jalisco. (Note that Jazzy still ships with Fast DDS as the default middleware, so we'll explicitly switch the Raspberry Pi over to CycloneDDS in a later step.)
3. Mount the Raspberry Pi 5 in the back case of the robot
This may involve 3D printing one of the many mounts available for Raspberry Pis to fit in the case, or doing whatever you see best to mount it safely. Be mindful of contact with metal etc.
4. Plugging the Raspberry Pi 5 into onboard power
There is a hidden USB C connector inside the back caddy of the robot, so with a short USB C to USB C cable you can power the Raspberry Pi, as well as provide it with USB networking to control the robot.
5. Setting up the Raspberry Pi 5 to run networking over USB
This one was a little complex, as it wasn't immediately clear what was wrong, and there are mixed messages about the USB C port on the Raspberry Pi 5: many say it's for power only, and various sources say data over it isn't officially supported, but the connector can carry data as well as power. Basically, you have to load a kernel module to enable gadget ethernet over USB, then configure the new network interface with the right subnet to reach the robot.
First, add the module to load on boot:
echo "g_ether" | sudo tee -a /etc/modules
This sets the "g_ether" (gadget ethernet) module to load on boot, which will create a new network interface called "usb0" when the Pi is plugged into the robot.
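To sanity-check this after a reboot, you can confirm the module is loaded and the interface exists (usb0 may only show as up while the cable is actually connected to the robot):
lsmod | grep g_ether
ip link show usb0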
Next, add the network config for this new network connection:
sudo bash -c 'cat > /etc/netplan/99-usb0.yaml << EOF
network:
  version: 2
  ethernets:
    usb0:
      dhcp4: no
      addresses:
        - 192.168.186.3/24
EOF'
This creates a new config file called "99-usb0.yaml" in the /etc/netplan folder and puts the config for the new network interface in place. Notice the address/subnet? That's because the iRobot Create 3 uses address 192.168.186.2/24 by default. If your robot is configured differently, change the address accordingly.
Apply the new netplan config:
sudo netplan apply
Check it worked:
ip addr show usb0
This should show the interface up, with the address assigned.
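If the robot is powered on and connected, a quick end-to-end test of the link is to ping the robot's default address (adjust if yours differs):
ping -c 3 192.168.186.2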
6. Installing ROS2
I won't step through this one, as it's covered well here:
https://docs.ros.org/en/jazzy/Installation/Ubuntu-Install-Debs.html
I would, however, recommend keeping it to the base packages (sudo apt install ros-jazzy-ros-base): you don't have a desktop installation of Ubuntu, so you want to keep it to the basics, and connect using your laptop with ROS2 Jazzy installed on it to run any visualisation etc.
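One thing worth doing straight after the install (it's in the official instructions, but easy to miss) is to source the ROS2 environment on every login:
echo 'source /opt/ros/jazzy/setup.bash' >> ~/.bashrc
source ~/.bashrc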
7. Set up the correct middleware for ROS2 and the iRobot Create
The middleware is what is used to pass messages between different components in ROS2. We have to have the robot using the same middleware as the Raspberry Pi in order for all of the components to talk to each other.
You should have installed the CycloneDDS version of the firmware in a previous step. Now we want to install and set up the matching middleware on the Raspberry Pi.
Run:
sudo apt install ros-jazzy-rmw-cyclonedds-cpp
which will install the CycloneDDS middleware for ROS2.
Then:
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
which will tell ROS2 to use it. To make this happen permanently, without having to export it on each login:
echo 'export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp' >> ~/.bashrc
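You can verify the variable is set in your current shell with:
printenv RMW_IMPLEMENTATION
which should print rmw_cyclonedds_cpp.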
We then need to confirm that the robot is set up properly:
Go to the Application->Configuration menu of the robot’s webpage
Ensure that the "RMW_IMPLEMENTATION" setting has "rmw_cyclonedds_cpp" selected
IMPORTANT: untick the "Enable Fast DDS discovery server?" setting (if you don't, it still appears to try to use Fast DDS instead of CycloneDDS)
Press the menu item Application->Restart Application to restart it. The robot should then turn up in ROS2 discovery on the network
Finally:
Run:
ros2 topic list
and you should see something like (probably way more):
/cmd_audio
/cmd_lightring
/cmd_vel
/cmd_vel_stamped
/parameter_events
/rosout
This means it's visible and things are talking! If not, check you've done everything above and ensure that the networking is up.
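As a further sanity check, you can ask for a single message from one of the robot's topics – for example the battery topic (assuming your firmware exposes it, as the Create 3 normally does):
ros2 topic echo /battery_state --once
If battery data comes back, messages are flowing both ways.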
8. Have some fun testing manual driving
Because we have our /cmd_vel topic exposed now to ROS2, we have the ability to send commands to drive.
First we'll need to install a ROS2 package which is a lightweight command line keyboard controller:
sudo apt install ros-jazzy-teleop-twist-keyboard
Then we'll run it with standard arguments (it expects /cmd_vel to exist etc):
ros2 run teleop_twist_keyboard teleop_twist_keyboard
(there are instructions on screen)
You should now be able to drive the robot around!
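If the keyboard tool misbehaves over your SSH session, you can also nudge the robot with a single hand-written message – a sketch, assuming /cmd_vel takes geometry_msgs/msg/Twist as the topic list above suggests:
ros2 topic pub --once /cmd_vel geometry_msgs/msg/Twist "{linear: {x: 0.1, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}"
That should briefly creep the robot forward at 0.1 m/s.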
9. Connect an RPLIDAR A1 on top and scan the room
A remote control robot is pretty boring; we want it to make its own decisions and drive itself, so we need some sensing of the environment, and mapping, to be able to decide how and where to drive. This is called SLAM (Simultaneous Localisation And Mapping), and to get there we need spatial awareness.
We’re going to use a cheap and easy to get RPLIDAR A1 2D lidar to see with. I 3D printed a bracket to mount it on top in the middle of the robot to make it simple for now. Connect it to the Raspberry Pi with a USB cable.
We will now create a build environment, and grab the driver for this to build.
Create the ROS workspace to build from in the home directory:
mkdir -p ~/ros2_ws/src
Move to the src directory inside:
cd ~/ros2_ws/src
Clone the source code from the Slamtec github:
git clone https://github.com/Slamtec/sllidar_ros2.git
Move back to the top of the ROS workspace:
cd ~/ros2_ws
Grab the system dependencies this source code will want (if rosdep isn't installed yet, first run sudo apt install python3-rosdep, then sudo rosdep init and rosdep update):
rosdep install --from-paths src --ignore-src -r -y
Build the driver/utilities for the LIDAR:
colcon build --symlink-install
"Source" the build environment to overlay the current ROS2 system environment (this allows the new driver to be used in place, without having to install it systemwide):
source install/setup.bash
We're ready to launch the ROS node for the lidar (this is for the defaults with my A1 LIDAR; if yours is different you will need a different launch file – my launch file is sllidar_a1_launch.py):
ros2 launch sllidar_ros2 sllidar_a1_launch.py serial_port:=/dev/ttyUSB0 serial_baudrate:=115200 frame_id:=laser
Let's check that the topic for /scan exists:
ros2 topic list
If you see it there – great, we appear to be running.
But it’s not much use unless we can actually see the output.
One method is to just run:
ros2 topic echo /scan
But you'll be quickly overwhelmed with data – we humans need visuals!
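A gentler check is to measure the publish rate rather than print the data:
ros2 topic hz /scan
A healthy A1 should settle somewhere around 5–10 Hz, depending on model and settings.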
ROS2 uses rviz2 as a tool to visualise data.
It's best you don't run this on the Raspberry Pi, so install the ROS2 Jazzy desktop package onto your own laptop. This can be messy if your system isn't compatible, but let's leave that for you to follow the instructions and figure out. On my laptop running Ubuntu 22.04 it was messy, so I just decided to run up a virtual Ubuntu 24.04 desktop that I can install ROS2 Jazzy in.
Then we can run rviz2 to see the scan data from the /scan topic.
Or can we? No…
We've created a new problem here: we are now jumping across multiple networks, and hoping that the middleware (CycloneDDS) will jump across these worlds with us via its multicast discovery. It unfortunately won't.
We'll have to set up CycloneDDS to unicast to particular endpoints or subnets instead, using the only unifying point on the network that all parties can reach: the Raspberry Pi onboard the robot. So we'll configure ROS on both the Raspberry Pi and the laptop/VM to unicast to definite endpoints rather than relying on network broadcasts for discovery.
On the Raspberry Pi, we'll create a new file in the home directory called cyclonedds.xml and put this in it (using nano or another commandline text editor of choice):
<CycloneDDS>
  <Domain id="any">
    <General>
      <Interfaces>
        <NetworkInterface name="wlan0"/>
        <NetworkInterface name="usb0"/>
      </Interfaces>
    </General>
    <Discovery>
      <Peers>
        <Peer address="192.168.20.101"/> <!-- The IP of the laptop/VM -->
        <Peer address="192.168.186.2"/> <!-- The IP of the iRobot -->
      </Peers>
    </Discovery>
  </Domain>
</CycloneDDS>
And to export this as an environment variable for ROS to find it, type this at the commandline:
export CYCLONEDDS_URI=file://$HOME/cyclonedds.xml
And to make this persist across logins/reboots, add it to your bashrc file, which is read/sourced each time you login:
echo 'export CYCLONEDDS_URI=file://$HOME/cyclonedds.xml' >> ~/.bashrc
This ensures that ROS2 on the Raspberry Pi points both at the robot via its USB network link, and at the laptop/VM via the wifi network link.
Now we must do the same on the laptop/VM to make it point back at the Raspberry Pi:
Again, we put the following in a file called cyclonedds.xml in the home directory (enp0s3 is the name of the network adaptor on mine; adjust yours accordingly by checking with "ip address" at the commandline on the laptop/VM):
<CycloneDDS>
  <Domain id="any">
    <General>
      <Interfaces>
        <NetworkInterface name="enp0s3"/>
      </Interfaces>
    </General>
    <Discovery>
      <Peers>
        <Peer address="192.168.20.117"/> <!-- The IP of the RPi5 -->
      </Peers>
    </Discovery>
  </Domain>
</CycloneDDS>
And again export this environment variable, and add it to the bashrc of the laptop/VM:
export CYCLONEDDS_URI=file://$HOME/cyclonedds.xml
echo 'export CYCLONEDDS_URI=file://$HOME/cyclonedds.xml' >> ~/.bashrc
Now we can run the LIDAR driver on the Raspberry Pi:
ros2 launch sllidar_ros2 sllidar_a1_launch.py serial_port:=/dev/ttyUSB0 serial_baudrate:=115200 frame_id:=laser
Making sure that runs successfully, we then jump to our laptop/VM and look for the published ROS topics made available by the LIDAR – this should at minimum include /scan:
ros2 topic list
If you're lucky you'll see a whole bunch more. I'm not a super expert on DDS messaging, but it seems my Raspberry Pi is also acting as a relay, passing through the topics from the robot itself, which is more than I had hoped for!
If you've been trying to make this work unsuccessfully to this point, reboot both machines; you may have hanging processes, or topics stuck with one of the instances, that keep causing conflicts.
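Before reaching for a full reboot, it's sometimes enough to restart the ROS2 CLI daemon, which caches discovery information:
ros2 daemon stop
ros2 daemon start
Then try ros2 topic list again on both machines.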
Now that we can see the topics turning up, we can finally run rviz2 on the laptop/VM.
Type at the commandline on the laptop/VM:
rviz2
You'll see a window open. First things first: because we just want to see the scan output initially, and we don't have a proper stack with a map, origin body etc, we want to take the "Fixed Frame" setting in the top left pane and change it from the default of "map" to "laser", which is the frame we're running the LIDAR driver with (remember we put "laser" at the end of the command for the "frame_id"?).
Now we can press the "Add" button, go to the "By topic" tab in the window that pops up, and you should see "/scan" in the available topics. Choose the "LaserScan" entry under it, and you should now see your scan output results from the LIDAR!
Take a breath – this is a good moment!
So we now have 2D LIDAR scan results flowing, we have topics and messaging passing around our robotics network, and we have the ability to drive the robot.
10. Build a map of a space to keep for future navigation
Now we are going to use a SLAM engine to gather a couple of things:
– Wheel odometry
– 2D LIDAR scan results
and try to create a map so that the robot can, in future, decide how to navigate places on its own. We will remote control the robot via SSH to the Raspberry Pi, and watch the map grow on screen in rviz2.
We’re going to use the built-in “SLAM Toolbox” in ROS2 as a general all-rounder for this.
Install it on the Raspberry Pi with:
sudo apt install ros-jazzy-slam-toolbox
But before we run it:
Previously we'd launched the LIDAR driver with:
ros2 launch sllidar_ros2 sllidar_a1_launch.py serial_port:=/dev/ttyUSB0 serial_baudrate:=115200 frame_id:=laser
But that frame_id is not linked to the base of the robot; it's off in its own world. So we will kill off that process and launch it instead with:
ros2 launch sllidar_ros2 sllidar_a1_launch.py serial_port:=/dev/ttyUSB0 serial_baudrate:=115200 frame_id:=base_link
Now this is slightly lazy, as ideally we'd have a topology set up that places the laser where it actually sits on the robot, but for now, let's just treat the laser as the origin at the centre of the robot to make things easy. Later on we'll build a proper model of the robot, with transforms putting the sensors where they actually live on the robot (a quick interim alternative is sketched below).
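For reference, that interim alternative would be to keep frame_id:=laser and publish a static transform placing the LIDAR relative to the robot base – a sketch, where the 10 cm height is a made-up offset you'd replace with your real measurement:
ros2 run tf2_ros static_transform_publisher --z 0.10 --frame-id base_link --child-frame-id laser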
Now it's time to actually launch the SLAM Toolbox, which will take all available sensor inputs (wheel odometry from the robot on /odom, distances in 360 degrees from the 2D LIDAR on /scan) and begin to build a map, which will be available on the topic /map:
ros2 launch slam_toolbox online_async_launch.py
Back in rviz2, if you set the fixed frame back to "map" and add the /map topic, you'll now start to see the beginnings of a simple map being built.
We'll need to drive the robot around to make the map grow and refine, so in another terminal SSH to your Raspberry Pi and run the remote control tool we used above to drive it around your room/home:
ros2 run teleop_twist_keyboard teleop_twist_keyboard
So it doesn't work? Yes, that's right. It will begin to show an occupancy map, but we're not actually going to get much sense out of it, as the odometry from the robot base isn't being transformed properly to work with SLAM with the base link of the body and everything else going on, and it needs some filtering with other sensors to provide a nice fusion that works properly.
QUICK STOP HERE: I've come back from the future to point out that although we can see these things, they're not working properly, because the clock is slightly off on the iRobot Create (unsure why), and messaging only works properly when everyone shares the same clock closely enough not to cause alignment problems. This took me a while to figure out further down, as my SLAM engine just couldn't get it together.
So? We have to install something called chrony (a network time keeper) on the Raspberry Pi, which will be the master clock, and then reconfigure the iRobot Create to point to it for network time so that their clocks align closely.
Install the time server on the RPi:
sudo apt install chrony
Configure the time server using nano to edit the config:
sudo nano /etc/chrony/chrony.conf
Go to the bottom of the file and add:
allow 192.168.186.0/24
local stratum 10
This allows the iRobot subnet (the usb gadget link) to be able to access time from the RPi.
Restart the chrony service to read the config:
sudo /etc/init.d/chrony restart
Now we have to go to the web interface of the iRobot Create, go to the "beta features" menu, and click "edit ntp.conf". All you need to have in here (pointing at the Raspberry Pi's address on the USB link we configured earlier) is:
server 192.168.186.3 prefer iburst minpoll 4 maxpoll 4
Be sure to restart the iRobot and give it some time to catch its clock up. It won't always happen immediately, as it doesn't like big time skips.
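You can confirm the robot is actually polling the Pi by watching chrony's client list on the Pi (the robot's 192.168.186.2 address should appear after a few minutes):
sudo chronyc clients
and you can check the Pi's own notion of time with:
chronyc tracking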
Now back to robot localisation:
We're going to install a package:
sudo apt install ros-jazzy-robot-localization
Now we will create some preferences for the localisation. Let's put this in a new config folder under the ros2_ws folder in our home directory for neatness – call it ekf_odom.yaml (full path ~/ros2_ws/config/ekf_odom.yaml):
mkdir -p ~/ros2_ws/config
nano ~/ros2_ws/config/ekf_odom.yaml
This is my config file contents:
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    sensor_timeout: 1.0
    two_d_mode: true  # Assuming Create 3 is 2D motion only
    odom0: /odom
    odom0_config: [true, true, false,   # x, y, z
                   false, false, true,  # roll, pitch, yaw
                   true, true, false,   # vx, vy, vz
                   false, false, true,  # vroll, vpitch, vyaw
                   false, false, false] # ax, ay, az
    odom0_differential: false
    odom0_relative: false
    imu0: /imu/data
    imu0_config: [false, false, false,
                  true, true, true,
                  false, false, false,
                  false, false, true,
                  true, true, true]
    imu0_differential: false
    publish_tf: true
    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    transform_time_offset: 0.1
Why EKF? Extended Kalman Filter – it takes noisy and imperfect inputs, weighs them against each other (eg wheel measurements vs IMU) and decides roughly where it thinks the robot must be (its pose).
Let's now launch our localisation node using the preferences file above:
ros2 run robot_localization ekf_node --ros-args --params-file ~/ros2_ws/config/ekf_odom.yaml
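If it's running happily, the filter should be publishing a fused pose on its default output topic:
ros2 topic echo /odometry/filtered --once
(that topic name is robot_localization's default; if nothing appears, check the node's console output for warnings about missing /odom or /imu/data messages)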
This is a rolling blog of my journey on this particular project, so I’ll keep adding as I work through it.