Install the required ROS2 drivers:
sudo apt install -y ros-jazzy-depthai-ros ros-jazzy-depthai-bridge ros-jazzy-depthai-examples
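A quick sanity check that the packages are visible to ROS2 (source your environment first if you haven't already):
source /opt/ros/jazzy/setup.bash
ros2 pkg list | grep depthai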
Create the config file:
nano ~/ros2_ws/src/myrobot/config/oak_run.yaml
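If that directory doesn't exist yet, create it first:
mkdir -p ~/ros2_ws/src/myrobot/config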
Contents of my config (basic depth only: no colour, low resolution, and the neural network node disabled to keep power and processing requirements low):
camera:
  ros__parameters:
    camera:
      i_pipeline_type: 'Depth'
      i_tf_camera_model: 'OAK-D-LITE'
      i_tf_camera_name: 'oak'
      i_tf_parent_frame: 'oak-d-base-frame'
      i_enable_ir: true
      i_nn_type: none
    pipeline_gen:
      i_enable_imu: false
    left:
      i_resolution: 400P
    right:
      i_resolution: 400P
    stereo:
      i_width: 640
      i_height: 400
      i_subpixel: false
      i_lr_check: false
      i_publish_topic: true
      i_align_depth: true
      i_output_disparity: false
      i_output_depth: true
      i_depth_preset: HIGH_ACCURACY
    rgb:
      i_disable_node: true
Run the node directly:
ros2 run depthai_ros_driver camera_node --ros-args --params-file ~/ros2_ws/src/myrobot/config/oak_run.yaml
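Once the node is up, you can sanity-check that data is flowing. The exact topic names vary between driver versions (see the note at the end), so list them rather than guess:
ros2 topic list
ros2 topic hz <one of the depth image topics from the list>
Pointing an Image display in rviz2 at the depth topic is the quickest visual confirmation.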
If you watch the kernel messages, you'll notice something important happening here:
When first connected, the OAK-D Lite enumerates as a USB 2 device. This had me scratching my head, trying different cables and ports, until I realised it doesn't switch into USB 3+ (SuperSpeed) mode until the driver is loaded and activates the sensor. Be sure to plug it into one of the blue USB 3 ports on the Raspberry Pi 5, and use a USB 3 capable cable (again, look for the blue connector on the USB-A end).
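You can confirm the negotiated speed from a terminal. lsusb -t reports the speed per device (480M is USB 2, 5000M is USB 3 SuperSpeed); the camera should show up under the Movidius vendor ID (03e7, from memory):
lsusb -t
sudo dmesg -w
Watch the kernel log in a second terminal while the driver starts, and you should see the device disconnect and re-enumerate at the higher speed.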
This should give you a depth field published as a topic. Further work is required here to link the camera to the robot body (a transform), add it to the launch file, and feed it into the costmaps (to do next).
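Until that's in place, here's a minimal sketch of publishing the transform by hand with tf2_ros; the frame names and offsets are assumptions, so substitute your own robot's values:
ros2 run tf2_ros static_transform_publisher --x 0.1 --y 0.0 --z 0.2 --roll 0 --pitch 0 --yaw 0 --frame-id base_link --child-frame-id oak-d-base-frame
Note that the i_tf_parent_frame in the config above has to match the child frame here for the TF tree to connect.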
Also worth noting: the output topics from this sensor differ from those of more common sensors like the Intel RealSense series, but I'll update this with an explanation when I've got some more time.