
ROS2 Development Guide
Architecture, Nav2 & Industrial Deployment

A definitive technical guide to building production-grade robotics systems with ROS2 -- covering DDS middleware internals, Nav2 autonomous navigation, MoveIt2 manipulation, micro-ROS for embedded targets, real-time scheduling, simulation pipelines, containerized deployment, and cloud robotics architectures for industrial APAC operations.

ROBOTICS | January 2026 | 35 min read | Technical Depth: Expert

1. Executive Summary -- ROS2 Ecosystem Overview

ROS2 (Robot Operating System 2) has become the de facto standard middleware for production robotics. Unlike its predecessor ROS1, which was designed primarily for academic research on single machines, with centralized discovery through rosmaster and TCP/UDP transports that offered no configurable delivery guarantees, ROS2 was engineered from the ground up for multi-robot, safety-critical, real-time industrial deployments. Built on the OMG Data Distribution Service (DDS) standard, ROS2 provides deterministic communication, fine-grained Quality of Service (QoS) policies, lifecycle-managed nodes, and native support for security through DDS-Security (SROS2).

The migration from ROS1 (final distribution: Noetic, EOL May 2025) to ROS2 is now an industry imperative. ROS2 Jazzy Jalisco (May 2024, LTS through 2029) provides a stable foundation for commercial deployment, while the continuously updated Rolling distribution tracks ongoing development. The ecosystem has matured dramatically: Nav2 has replaced the ROS1 navigation stack, MoveIt2 provides full manipulation planning, and micro-ROS extends the ROS2 graph down to microcontrollers running FreeRTOS or Zephyr.

This guide covers every layer of the ROS2 stack, from DDS wire-protocol internals to cloud-native deployment patterns, with production-tested code examples and configuration templates. Whether you are migrating an existing ROS1 fleet, building a new AMR from scratch, or deploying industrial arms in a Vietnamese manufacturing facility, this resource provides the architectural decisions and implementation details you need.

10M+
ROS-based Robots Deployed Worldwide
3,800+
Packages in the ROS2 Ecosystem
<1ms
Achievable Latency with RT Kernel + DDS
72%
Commercial Robots Using ROS/ROS2 (ABI Research 2025)

1.1 Why ROS2 Over ROS1

ROS1 served the robotics community for over 15 years, but its architectural limitations became blockers for commercial deployment. ROS1 relied on a single-point-of-failure rosmaster for service discovery. Its TCP/UDP transport layer lacked QoS guarantees. There was no native support for security, lifecycle management, or real-time execution. Multi-robot systems required fragile workarounds with separate ROS masters or multimaster packages.

ROS2 addresses every one of these limitations. DDS provides decentralized peer-to-peer discovery, eliminating the rosmaster bottleneck. QoS policies (reliability, durability, deadline, liveliness) allow developers to tune communication semantics per-topic. Managed nodes with lifecycle states (unconfigured, inactive, active, finalized) enable graceful startup and shutdown sequences critical for industrial systems. SROS2 provides authentication, encryption, and access control at the DDS layer.
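The QoS matching rule underpinning these policies -- a publisher and subscriber only connect if the offered policy is at least as strong as the requested one -- can be modeled in a few lines of plain Python. This is a simplified sketch, not the rclpy API, and covers only two of the policies:

```python
# Simplified model of DDS request-vs-offered QoS matching (not the rclpy API).
# For both policies, a "stronger" offer satisfies any weaker request.
RELIABILITY = {"best_effort": 0, "reliable": 1}
DURABILITY = {"volatile": 0, "transient_local": 1}

def qos_compatible(offered, requested):
    """Return True if a publisher's offered QoS satisfies a subscriber's request."""
    return (RELIABILITY[offered["reliability"]] >= RELIABILITY[requested["reliability"]]
            and DURABILITY[offered["durability"]] >= DURABILITY[requested["durability"]])

# A best-effort sensor publisher cannot satisfy a reliable subscriber,
# which is the most common source of "silent" non-matching topics in practice.
sensor_pub = {"reliability": "best_effort", "durability": "volatile"}
reliable_sub = {"reliability": "reliable", "durability": "volatile"}
```

This is why a `ros2 topic echo` with default (reliable) QoS can show nothing on a best-effort sensor topic even though data is flowing.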

Migration Tip: ROS1 Bridge

For teams running mixed ROS1/ROS2 fleets during migration, the ros1_bridge package translates messages bidirectionally between ROS1 topics and ROS2 topics at runtime. Use it as a transitional tool -- not a permanent architecture. The bridge adds 2-5ms latency per message and does not translate custom message types without explicit mapping configuration. Plan your full migration within 12-18 months.

2. ROS2 Architecture & DDS Middleware

2.1 The DDS Layer

At the heart of ROS2 lies the Data Distribution Service (DDS), an OMG standard for real-time publish-subscribe middleware. ROS2 abstracts DDS through the rmw (ROS Middleware) interface, allowing vendors to be swapped without changing application code. The two dominant DDS implementations in the ROS2 ecosystem are Eclipse Cyclone DDS (the default in Galactic) and eProsima Fast DDS (formerly Fast-RTPS; the default in Foxy, Humble, and Jazzy).

DDS operates through the RTPS (Real-Time Publish-Subscribe) wire protocol, which handles participant discovery, endpoint matching, and data serialization. When a ROS2 node creates a publisher on a topic, DDS announces this endpoint to all other DDS participants on the same DDS domain (identified by ROS_DOMAIN_ID, default 0). Subscribers that match the topic name and compatible QoS profile automatically establish communication -- no broker, no master, no single point of failure.

Feature | Cyclone DDS (Eclipse) | Fast DDS (eProsima) | Connext DDS (RTI)
License | Eclipse Public License 2.0 | Apache 2.0 | Commercial
ROS2 default | Yes (Galactic only) | Yes (Foxy, Humble, Jazzy) | No (optional)
Shared memory | Yes (Iceoryx integration) | Yes (built-in) | Yes
Minimum latency | ~30 µs (SHM) | ~45 µs (SHM) | ~20 µs (SHM)
Discovery | Simple + static | Simple + server-based | Multiple modes
Security (DDS-Security) | Yes | Yes | Yes
Best for | Low-latency edge robots | Multi-robot cloud systems | Safety-critical (DO-178C)
Memory footprint | ~2 MB | ~5 MB | ~8 MB

2.2 Nodes, Topics, Services & Actions

Nodes are the fundamental computational units in ROS2. Each node is an independent process (or thread within a component container) responsible for a single coherent function -- a LiDAR driver, a path planner, a motor controller. Nodes communicate through four mechanisms: topics (asynchronous, many-to-many publish-subscribe streams), services (synchronous request-reply calls), actions (long-running goals with periodic feedback and cancellation), and parameters (per-node configuration values that can be queried and changed at runtime).
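The semantic difference between the fire-and-forget fan-out of topics and the request-reply contract of services can be sketched with a toy in-memory graph. This is plain Python, not the rclpy API; all names are illustrative:

```python
# Toy in-memory model contrasting ROS2 communication semantics (not rclpy).
class Graph:
    def __init__(self):
        self.subscribers = {}   # topic name -> list of callbacks
        self.services = {}      # service name -> request handler

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        # Topics: anonymous many-to-many fan-out, no reply, no knowledge of peers.
        for cb in self.subscribers.get(topic, []):
            cb(msg)

    def create_service(self, name, handler):
        self.services[name] = handler

    def call(self, name, request):
        # Services: exactly one request, exactly one response.
        return self.services[name](request)

g = Graph()
received = []
g.subscribe("/scan", received.append)
g.publish("/scan", {"ranges": [1.0, 2.0]})
g.create_service("/reset", lambda req: {"success": True})
```

Actions extend the service pattern with a goal handle, a feedback stream, and cancellation -- effectively a service plus a topic bound to one goal.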

2.3 Component Containers and Intra-Process Communication

Running each node as a separate OS process introduces inter-process overhead (serialization, context switching). ROS2 component containers load multiple nodes as shared libraries into a single process, enabling zero-copy intra-process communication. For image pipelines -- where a camera driver publishes 30 FPS of 1080p frames consumed by a detector node -- intra-process communication eliminates serialization entirely, reducing latency from ~5ms to ~50 microseconds.

# Launch file loading multiple nodes into a single component container
# This enables zero-copy intra-process communication
from launch import LaunchDescription
from launch_ros.actions import ComposableNodeContainer
from launch_ros.descriptions import ComposableNode

def generate_launch_description():
    container = ComposableNodeContainer(
        name='perception_container',
        namespace='',
        package='rclcpp_components',
        executable='component_container_mt',  # Multi-threaded
        composable_node_descriptions=[
            ComposableNode(
                package='realsense2_camera',
                plugin='realsense2_camera::RealSenseNodeFactory',
                name='camera',
                parameters=[{
                    'enable_depth': True,
                    'depth_module.profile': '640x480x30',
                    'pointcloud.enable': True,
                }],
                extra_arguments=[{'use_intra_process_comms': True}],
            ),
            ComposableNode(
                package='point_cloud_processor',
                plugin='pcl_ros::VoxelGridFilter',
                name='voxel_filter',
                remappings=[('/input', '/camera/depth/color/points')],
                parameters=[{'leaf_size': 0.02}],
                extra_arguments=[{'use_intra_process_comms': True}],
            ),
            ComposableNode(
                package='object_detector',
                plugin='yolo_ros::YoloDetector',
                name='detector',
                parameters=[{
                    'model_path': '/opt/models/yolov8n.onnx',
                    'confidence_threshold': 0.6,
                    'device': 'cuda:0',
                }],
                extra_arguments=[{'use_intra_process_comms': True}],
            ),
        ],
        output='screen',
    )
    return LaunchDescription([container])

3. Nav2 -- Autonomous Navigation

Nav2 (Navigation2) is the production navigation framework for ROS2, replacing the ROS1 move_base stack with a modular, plugin-based architecture built on behavior trees, lifecycle-managed servers, and configurable planning/control algorithms. Nav2 handles everything from global path planning and local trajectory optimization to recovery behaviors and waypoint following.

3.1 Architecture Overview

Nav2 is composed of several lifecycle-managed server nodes coordinated by the BT Navigator, which executes behavior trees defining the navigation logic. The core servers are the planner server (global path computation), the controller server (local trajectory following), the behavior server (recovery behaviors such as spin, back-up, and wait), the smoother server (path post-processing), and the waypoint follower -- all brought up and torn down in order by the lifecycle manager.

3.2 Behavior Trees for Navigation

Nav2 replaced ROS1's finite state machine with BehaviorTree.CPP, a powerful library for building modular, composable navigation strategies. Behavior trees (BTs) offer several advantages over state machines: they are easier to visualize, inherently support fallback and recovery patterns, and can be modified at runtime via XML files without recompilation.

A typical navigate_to_pose BT follows this logic: (1) compute a global path, (2) follow the path with the controller, (3) if the controller fails (obstacle), re-plan, (4) if re-planning fails, execute recovery (spin, backup), (5) if recovery fails, abort with error.

<!-- Nav2 Behavior Tree XML: navigate_w_replanning_and_recovery.xml -->
<root main_tree_to_execute="MainTree">
  <BehaviorTree ID="MainTree">
    <RecoveryNode number_of_retries="6" name="NavigateRecovery">
      <PipelineSequence name="NavigateWithReplanning">
        <RateController hz="1.0">
          <RecoveryNode number_of_retries="1" name="ComputePathRecovery">
            <ComputePathToPose goal="{goal}" path="{path}" planner_id="GridBased"/>
            <ClearEntireCostmap name="ClearGlobalCostmap" service_name="/global_costmap/clear_entirely_global_costmap"/>
          </RecoveryNode>
        </RateController>
        <RecoveryNode number_of_retries="1" name="FollowPathRecovery">
          <FollowPath path="{path}" controller_id="FollowPath"/>
          <ClearEntireCostmap name="ClearLocalCostmap" service_name="/local_costmap/clear_entirely_local_costmap"/>
        </RecoveryNode>
      </PipelineSequence>
      <SequenceStar name="RecoveryActions">
        <ClearEntireCostmap name="ClearGlobalCostmap" service_name="/global_costmap/clear_entirely_global_costmap"/>
        <ClearEntireCostmap name="ClearLocalCostmap" service_name="/local_costmap/clear_entirely_local_costmap"/>
        <Spin spin_dist="1.57"/>
        <Wait wait_duration="3"/>
        <BackUp backup_dist="0.30" backup_speed="0.10"/>
      </SequenceStar>
    </RecoveryNode>
  </BehaviorTree>
</root>

3.3 Nav2 Parameter Configuration

Nav2 performance depends heavily on parameter tuning for your specific robot kinematics, sensor configuration, and operating environment. Below is a production-tested parameter file for a differential-drive AMR with 2D LiDAR:

# nav2_params.yaml -- Production configuration for differential-drive AMR
# Tuned for indoor warehouse with 0.5m aisle clearance
bt_navigator:
  ros__parameters:
    use_sim_time: false
    global_frame: map
    robot_base_frame: base_link
    odom_topic: /odom
    default_bt_xml_filename: "navigate_w_replanning_and_recovery.xml"
    plugin_lib_names:
      - nav2_compute_path_to_pose_action_bt_node
      - nav2_follow_path_action_bt_node
      - nav2_back_up_action_bt_node
      - nav2_spin_action_bt_node
      - nav2_wait_action_bt_node
      - nav2_clear_costmap_service_bt_node
      - nav2_rate_controller_bt_node
      - nav2_pipeline_sequence_bt_node
      - nav2_recovery_node_bt_node

planner_server:
  ros__parameters:
    expected_planner_frequency: 5.0
    planner_plugins: ["GridBased"]
    GridBased:
      plugin: "nav2_smac_planner/SmacPlanner2D"
      tolerance: 0.25
      max_iterations: 1000000
      max_on_approach_iterations: 1000
      max_planning_time: 2.0
      cost_travel_multiplier: 2.0
      allow_unknown: true
      smoother:
        max_iterations: 1000
        w_smooth: 0.3
        w_data: 0.2
        tolerance: 1.0e-10

controller_server:
  ros__parameters:
    controller_frequency: 20.0
    min_x_velocity_threshold: 0.001
    min_theta_velocity_threshold: 0.001
    progress_checker_plugin: "progress_checker"
    goal_checker_plugins: ["general_goal_checker"]
    controller_plugins: ["FollowPath"]
    progress_checker:
      plugin: "nav2_controller::SimpleProgressChecker"
      required_movement_radius: 0.5
      movement_time_allowance: 10.0
    general_goal_checker:
      plugin: "nav2_controller::SimpleGoalChecker"
      xy_goal_tolerance: 0.15
      yaw_goal_tolerance: 0.25
      stateful: true
    FollowPath:
      plugin: "nav2_regulated_pure_pursuit_controller::RegulatedPurePursuitController"
      desired_linear_vel: 0.8
      lookahead_dist: 0.6
      min_lookahead_dist: 0.3
      max_lookahead_dist: 0.9
      lookahead_time: 1.5
      rotate_to_heading_angular_vel: 1.8
      transform_tolerance: 0.1
      use_velocity_scaled_lookahead_dist: true
      min_approach_linear_velocity: 0.05
      approach_velocity_scaling_dist: 0.6
      use_regulated_linear_velocity_scaling: true
      use_cost_regulated_linear_velocity_scaling: false
      regulated_linear_scaling_min_radius: 0.9
      regulated_linear_scaling_min_speed: 0.25
      max_angular_accel: 3.2
      allow_reversing: false

global_costmap:
  global_costmap:
    ros__parameters:
      update_frequency: 1.0
      publish_frequency: 1.0
      global_frame: map
      robot_base_frame: base_link
      robot_radius: 0.30
      resolution: 0.05
      track_unknown_space: true
      plugins: ["static_layer", "obstacle_layer", "inflation_layer"]
      static_layer:
        plugin: "nav2_costmap_2d::StaticLayer"
        map_subscribe_transient_local: true
      obstacle_layer:
        plugin: "nav2_costmap_2d::ObstacleLayer"
        enabled: true
        observation_sources: scan
        scan:
          topic: /scan
          max_obstacle_height: 2.0
          clearing: true
          marking: true
          data_type: "LaserScan"
          raytrace_max_range: 5.0
          raytrace_min_range: 0.0
          obstacle_max_range: 4.5
          obstacle_min_range: 0.0
      inflation_layer:
        plugin: "nav2_costmap_2d::InflationLayer"
        cost_scaling_factor: 3.0
        inflation_radius: 0.55

local_costmap:
  local_costmap:
    ros__parameters:
      update_frequency: 5.0
      publish_frequency: 2.0
      global_frame: odom
      robot_base_frame: base_link
      rolling_window: true
      width: 3
      height: 3
      resolution: 0.05
      robot_radius: 0.30
      plugins: ["voxel_layer", "inflation_layer"]
      voxel_layer:
        plugin: "nav2_costmap_2d::VoxelLayer"
        enabled: true
        observation_sources: scan
        scan:
          topic: /scan
          max_obstacle_height: 2.0
          clearing: true
          marking: true
          data_type: "LaserScan"
      inflation_layer:
        plugin: "nav2_costmap_2d::InflationLayer"
        cost_scaling_factor: 3.0
        inflation_radius: 0.55

3.4 Lifecycle Management

Nav2 nodes implement ROS2 lifecycle states, providing structured startup and shutdown. This is critical for production robots where sensors must initialize before navigation begins. The lifecycle sequence is: unconfigured -> configure() -> inactive -> activate() -> active. The Nav2 lifecycle_manager orchestrates this sequence across all navigation nodes, ensuring correct ordering (costmaps before planner, planner before controller).
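The transition graph described above can be sketched as a table-driven state machine. This is a simplified plain-Python model of the managed-node lifecycle (primary states only), not the rclcpp_lifecycle API:

```python
# Simplified model of the ROS2 managed-node lifecycle (primary states only).
# Invalid transitions are rejected, mirroring how a lifecycle node refuses them.
TRANSITIONS = {
    ("unconfigured", "configure"): "inactive",
    ("inactive", "activate"): "active",
    ("active", "deactivate"): "inactive",
    ("inactive", "cleanup"): "unconfigured",
    ("unconfigured", "shutdown"): "finalized",
    ("inactive", "shutdown"): "finalized",
    ("active", "shutdown"): "finalized",
}

class LifecycleNode:
    def __init__(self, name):
        self.name = name
        self.state = "unconfigured"

    def trigger(self, transition):
        key = (self.state, transition)
        if key not in TRANSITIONS:
            raise ValueError(f"invalid transition {transition!r} from {self.state!r}")
        self.state = TRANSITIONS[key]
        return self.state

# lifecycle_manager-style bringup: configure, then activate, in dependency order.
node = LifecycleNode("planner_server")
node.trigger("configure")
node.trigger("activate")
```

The real lifecycle manager applies exactly this configure-then-activate sequence across all Nav2 servers, in the ordering noted above.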

4. MoveIt2 for Manipulation

MoveIt2 is the standard motion planning framework for robotic arms in ROS2. It provides inverse kinematics solvers, collision-aware motion planning, grasp generation, and trajectory execution with real-time control. MoveIt2 supports any robot described by a URDF/XACRO model and integrates with all major industrial arm platforms (Universal Robots, FANUC, ABB, KUKA, Franka Emika).

4.1 Motion Planning Pipeline

MoveIt2's planning pipeline processes motion requests through several stages:

  1. Kinematics: IK solvers (KDL, TRAC-IK, or BioIK) compute joint configurations for desired end-effector poses. TRAC-IK typically finds solutions 2-3x faster than KDL with higher success rates near joint limits.
  2. Planning: OMPL (Open Motion Planning Library) provides sampling-based planners -- RRTConnect for fast bi-directional planning, RRT* for optimal paths, PRM for multi-query scenarios. Pilz Industrial Motion Planner adds LIN (linear), PTP (point-to-point), and CIRC (circular) motion primitives required by industrial standards.
  3. Collision Checking: FCL (Flexible Collision Library) performs continuous collision detection against the planning scene (robot self-collision, known obstacles, octomap from depth sensors). Checking runs at microsecond granularity per configuration.
  4. Trajectory Processing: Time-parameterization (TOTG or iterative spline) converts geometric paths into time-optimal, jerk-limited trajectories respecting joint velocity, acceleration, and jerk constraints.
  5. Execution: MoveIt2 servo enables real-time Cartesian velocity control for teleoperation or visual servoing, publishing joint commands at 100-1000 Hz.
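The time-parameterization step can be illustrated for a single joint with the classic trapezoidal velocity profile -- a much-simplified stand-in for TOTG that ignores jerk limits: given a rest-to-rest move and velocity/acceleration limits, compute the minimum traversal time.

```python
import math

def trapezoidal_duration(distance, v_max, a_max):
    """Minimum time to travel `distance` from rest to rest under v_max / a_max.

    If the move is too short to ever reach v_max, the profile degenerates
    from a trapezoid into a triangle (accelerate, then immediately decelerate).
    """
    d = abs(distance)
    d_ramps = v_max ** 2 / a_max            # distance consumed by accel + decel ramps
    if d <= d_ramps:                        # triangular profile: v_max never reached
        return 2.0 * math.sqrt(d / a_max)
    t_ramp = v_max / a_max                  # ramp-up time (equals ramp-down time)
    t_cruise = (d - d_ramps) / v_max        # time at constant v_max
    return 2.0 * t_ramp + t_cruise

# Example: a 1.0 rad joint move with v_max = 1 rad/s, a_max = 2 rad/s^2.
# The two ramps cover 0.5 rad total, the cruise covers the remaining 0.5 rad.
```

TOTG extends this idea along a full multi-joint geometric path, with the binding joint limit changing along the way and jerk bounded as well.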

4.2 Grasp Planning

For pick-and-place applications, MoveIt2 integrates with grasp planners that generate candidate grasps from object geometry or point cloud data. The pipeline is: (1) segment the target object from the scene, (2) generate grasp candidates (approach vector, gripper width, pre-grasp offset), (3) filter candidates by IK feasibility and collision, (4) rank by grasp quality metric, (5) execute the highest-ranked grasp with approach-grasp-retreat sequence.
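Steps (3) and (4) of that pipeline reduce to a filter-then-rank pass over candidates. A minimal sketch -- the candidate fields here are illustrative, not any MoveIt2 message type:

```python
def select_grasp(candidates):
    """Filter grasp candidates by feasibility, then return the best by quality.

    Each candidate is a dict with illustrative fields:
      ik_feasible (bool), collision_free (bool), quality (float, higher is better).
    Returns None if no candidate survives filtering.
    """
    feasible = [c for c in candidates
                if c["ik_feasible"] and c["collision_free"]]
    if not feasible:
        return None
    return max(feasible, key=lambda c: c["quality"])

# A high-quality grasp is useless if IK or collision checking rejects it:
candidates = [
    {"name": "top",  "ik_feasible": True,  "collision_free": True,  "quality": 0.82},
    {"name": "side", "ik_feasible": True,  "collision_free": False, "quality": 0.95},
    {"name": "rear", "ik_feasible": False, "collision_free": True,  "quality": 0.90},
]
```

In production the quality metric is typically a wrench-space or antipodal score, and filtering also checks the approach and retreat segments, not just the grasp pose.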

MoveIt2 Servo for Real-Time Control

MoveIt2 Servo enables real-time, collision-aware Cartesian or joint velocity streaming -- essential for visual servoing, teleoperation, and force-guided assembly. Servo runs at up to 1 kHz, continuously checking the commanded trajectory against the planning scene and halting motion if a collision is imminent. Configure with moveit_servo.yaml, setting publish_period, command_in_type (twist or joint), and scale factors for velocity limiting.

5. Perception Pipeline -- SLAM, Point Clouds & Object Detection

5.1 SLAM Algorithms in ROS2

Simultaneous Localization and Mapping (SLAM) is the foundation of autonomous robot navigation. ROS2 supports multiple SLAM implementations, each with distinct strengths:

SLAM System | Sensor Input | Map Type | Best For | CPU Load
Cartographer | 2D/3D LiDAR + IMU | Occupancy grid / 3D submaps | Large-scale warehouses | Medium
RTAB-Map | RGB-D, stereo, LiDAR | 3D point cloud + occupancy | Multi-session visual SLAM | High
SLAM Toolbox | 2D LiDAR | Occupancy grid | Real-time 2D mapping | Low
KISS-ICP | 3D LiDAR | 3D point cloud | Outdoor, large environments | Low
ORB-SLAM3 | Monocular / stereo / RGB-D | Sparse feature map | Visual-only systems | Medium

SLAM Toolbox is the recommended starting point for 2D indoor navigation. It provides both online (real-time) and offline (post-processing) mapping modes, supports serialization of maps for later re-use, and integrates directly with Nav2 through the /slam_toolbox node. For production warehouses, we recommend building a map offline with high-quality scans, then using AMCL (Adaptive Monte Carlo Localization) for online localization against the saved map.

5.2 Point Cloud Processing

3D perception pipelines in ROS2 process point clouds from depth cameras (Intel RealSense, Orbbec) or 3D LiDAR (Velodyne, Ouster, Livox) through several stages: downsampling (voxel-grid filtering to bound point count), outlier removal (statistical or radius filters), ground-plane segmentation (typically RANSAC), Euclidean clustering to isolate candidate objects, and optionally registration (ICP or NDT) against a reference model or map.

5.3 Object Detection Integration

Production perception systems typically fuse 2D detection (faster, more mature) with 3D depth data. The pipeline detects objects in the RGB image using YOLOv8 or RT-DETR, then projects the 2D bounding box into 3D space using the aligned depth frame and camera intrinsics. This yields a 3D bounding box with pose estimation suitable for grasp planning or obstacle representation.

#!/usr/bin/env python3
"""ROS2 node: 2D detection + depth projection for 3D object localization"""
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, CameraInfo
from vision_msgs.msg import Detection3DArray, Detection3D, ObjectHypothesisWithPose
from cv_bridge import CvBridge
import numpy as np
from ultralytics import YOLO


class ObjectDetector3D(Node):
    def __init__(self):
        super().__init__('object_detector_3d')
        # Parameters
        self.declare_parameter('model_path', '/opt/models/yolov8n.pt')
        self.declare_parameter('confidence_threshold', 0.65)
        self.declare_parameter('device', 'cuda:0')
        model_path = self.get_parameter('model_path').value
        self.conf_thresh = self.get_parameter('confidence_threshold').value
        device = self.get_parameter('device').value
        self.model = YOLO(model_path)
        self.model.to(device)
        self.bridge = CvBridge()
        self.camera_info = None
        # Subscribers
        self.create_subscription(Image, '/camera/color/image_raw', self.image_cb, 10)
        self.create_subscription(Image, '/camera/aligned_depth/image_raw', self.depth_cb, 10)
        self.create_subscription(CameraInfo, '/camera/color/camera_info', self.info_cb, 10)
        # Publisher
        self.det_pub = self.create_publisher(Detection3DArray, '/detections_3d', 10)
        self.latest_depth = None
        self.get_logger().info(f'Object detector initialized with {model_path}')

    def info_cb(self, msg):
        self.camera_info = msg
        self.fx = msg.k[0]
        self.fy = msg.k[4]
        self.cx = msg.k[2]
        self.cy = msg.k[5]

    def depth_cb(self, msg):
        self.latest_depth = self.bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')

    def image_cb(self, msg):
        if self.latest_depth is None or self.camera_info is None:
            return
        cv_image = self.bridge.imgmsg_to_cv2(msg, 'bgr8')
        results = self.model(cv_image, conf=self.conf_thresh, verbose=False)
        det_array = Detection3DArray()
        det_array.header = msg.header
        for result in results:
            for box in result.boxes:
                x1, y1, x2, y2 = map(int, box.xyxy[0].tolist())
                cx_px = (x1 + x2) // 2
                cy_px = (y1 + y2) // 2
                # Get median depth in bounding box (robust to noise)
                depth_roi = self.latest_depth[y1:y2, x1:x2]
                valid_depths = depth_roi[depth_roi > 0]
                if len(valid_depths) == 0:
                    continue
                z = float(np.median(valid_depths)) / 1000.0  # mm to meters
                # Back-project to 3D
                x_3d = (cx_px - self.cx) * z / self.fx
                y_3d = (cy_px - self.cy) * z / self.fy
                det = Detection3D()
                det.header = msg.header
                hyp = ObjectHypothesisWithPose()
                hyp.hypothesis.class_id = str(int(box.cls[0]))
                hyp.hypothesis.score = float(box.conf[0])
                hyp.pose.pose.position.x = x_3d
                hyp.pose.pose.position.y = y_3d
                hyp.pose.pose.position.z = z
                det.results.append(hyp)
                det_array.detections.append(det)
        self.det_pub.publish(det_array)


def main(args=None):
    rclpy.init(args=args)
    node = ObjectDetector3D()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()

6. Industrial ROS2 -- micro-ROS, PLC Integration & OPC-UA

6.1 micro-ROS for Embedded Systems

micro-ROS extends the ROS2 communication graph down to resource-constrained microcontrollers (MCUs) running FreeRTOS, Zephyr, or NuttX. This enables sensors, motor controllers, and safety I/O boards to participate as first-class ROS2 nodes without a Linux-capable processor. micro-ROS supports publishers, subscribers, services, parameters, and even actions on MCUs with as little as 50KB of RAM.

The architecture uses a micro-ROS Agent running on a Linux host that bridges the MCU's XRCE-DDS (DDS for eXtremely Resource Constrained Environments) protocol to the full DDS network. Communication between MCU and Agent can use serial (UART), USB, UDP over Wi-Fi/Ethernet, or custom transports.

50KB
Minimum RAM for micro-ROS Node
100us
Publisher Latency on STM32 + UART
15+
Supported MCU Platforms
RTOS
FreeRTOS, Zephyr, NuttX Support

6.2 ROS2 + PLC Integration

Industrial facilities run PLCs (Programmable Logic Controllers) for machine control, safety interlocking, and process automation. Integrating ROS2 robots with existing PLC infrastructure requires bridging two paradigms: ROS2's DDS publish-subscribe model and the PLC's cyclic scan-based execution model.

Common integration patterns include an OPC-UA bridge node that maps ROS2 topics into the PLC's OPC-UA address space (for example, built on an open62541 client), Modbus TCP register polling for legacy controllers, fieldbus gateways for EtherNet/IP or PROFINET networks, and hardwired digital I/O for safety-relevant handshake signals.
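Whatever the transport, the core of the bridge is latching asynchronous ROS2 messages into a register image that the PLC reads once per scan cycle. A minimal sketch of that pattern in plain Python -- independent of any specific Modbus or OPC-UA library, with illustrative register names:

```python
import threading

class RegisterBridge:
    """Latch the latest ROS-side values into a register image for cyclic PLC reads.

    ROS callbacks write asynchronously whenever messages arrive; the PLC scan
    reads one consistent snapshot per cycle, seeing only the most recent value
    of each register (intermediate updates are intentionally dropped).
    """
    def __init__(self):
        self._lock = threading.Lock()
        self._registers = {}

    def ros_callback(self, name, value):
        # Called from a ROS subscription thread: latch the most recent value.
        with self._lock:
            self._registers[name] = value

    def plc_scan(self):
        # Called once per PLC scan cycle: atomic snapshot of all registers.
        with self._lock:
            return dict(self._registers)

bridge = RegisterBridge()
bridge.ros_callback("robot_x_mm", 1250)   # e.g. from an odometry subscription
bridge.ros_callback("estop_ok", 1)        # e.g. from a diagnostics subscription
```

The snapshot semantics are the important design choice: the PLC never blocks on ROS2, and the ROS2 side never waits for the scan cycle.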

6.3 Safety Standards

Industrial ROS2 deployments must comply with safety standards including ISO 10218 (industrial robot safety), ISO/TS 15066 (collaborative robot operation), and IEC 62443 (industrial cybersecurity). While ROS2 itself is not safety-certified, the architecture supports safety-rated implementations by delegating safety functions to certified hardware: safety PLCs and safety-rated laser scanners enforce protective stops independently of the ROS2 stack, ROS2 operates as a non-safety coordination layer behind those hardware interlocks, and networks are segmented into zones and conduits per IEC 62443.

7. Real-Time ROS2 -- DDS QoS, RT Kernels & Scheduling

Real-time performance in robotics means deterministic execution within bounded time limits -- not necessarily fast, but predictable. A motor controller that occasionally takes 50ms instead of 1ms is more dangerous than one that consistently takes 5ms. ROS2 provides the building blocks for real-time systems, but achieving true determinism requires careful configuration at every layer: OS kernel, DDS middleware, memory allocation, and application code.
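That distinction is measurable: a real-time loop is judged by its worst-case latency and deadline-miss count, not by its mean. A small illustrative helper for reading a latency trace this way:

```python
def deadline_report(latencies_ms, deadline_ms):
    """Summarize a latency trace the way a real-time engineer reads it.

    Mean latency is nearly irrelevant for control loops; the worst case and
    the number of deadline misses decide whether the loop is usable.
    """
    misses = [t for t in latencies_ms if t > deadline_ms]
    return {
        "worst_case_ms": max(latencies_ms),
        "mean_ms": sum(latencies_ms) / len(latencies_ms),
        "deadline_misses": len(misses),
    }

# A loop that is mostly 1 ms but spikes to 50 ms misses its 5 ms deadline;
# a loop that is always exactly 5 ms never does, despite its worse mean.
spiky = [1.0] * 99 + [50.0]
steady = [5.0] * 100
```

The spiky trace has the better mean (1.49 ms vs 5 ms) yet is the dangerous one -- exactly the point made above.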

7.1 DDS Quality of Service Profiles

DDS QoS policies are the primary mechanism for controlling communication behavior in ROS2. Critical QoS parameters for real-time systems: reliability (RELIABLE retransmits lost samples, BEST_EFFORT never blocks), history and depth (KEEP_LAST with a small depth bounds queue memory), durability (TRANSIENT_LOCAL replays the latest samples to late-joining subscribers), deadline (the maximum allowed interval between consecutive samples, with a callback fired on violation), liveliness (how a publisher asserts it is still alive), and lifespan (how long a sample remains valid before being dropped).

// C++ ROS2 real-time publisher with custom QoS and pre-allocated message
// Demonstrates deadline, liveliness, and memory pre-allocation patterns
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/joint_state.hpp>
#include <rclcpp/qos.hpp>

class RTJointStatePublisher : public rclcpp::Node {
public:
  RTJointStatePublisher() : Node("rt_joint_publisher") {
    // Configure real-time QoS profile
    auto qos = rclcpp::QoS(rclcpp::KeepLast(1))
        .reliability(rclcpp::ReliabilityPolicy::BestEffort)
        .durability(rclcpp::DurabilityPolicy::Volatile)
        .deadline(std::chrono::milliseconds(5))  // 5ms deadline
        .liveliness(rclcpp::LivelinessPolicy::ManualByTopic)
        .liveliness_lease_duration(std::chrono::milliseconds(20));

    publisher_ = this->create_publisher<sensor_msgs::msg::JointState>(
        "/joint_states", qos);

    // Pre-allocate message to avoid runtime heap allocation
    msg_ = std::make_unique<sensor_msgs::msg::JointState>();
    msg_->name = {"joint_1", "joint_2", "joint_3", "joint_4", "joint_5", "joint_6"};
    msg_->position.resize(6, 0.0);
    msg_->velocity.resize(6, 0.0);
    msg_->effort.resize(6, 0.0);

    // 1kHz timer for joint state publishing
    timer_ = this->create_wall_timer(
        std::chrono::milliseconds(1),
        std::bind(&RTJointStatePublisher::publish_joint_state, this));
  }

private:
  void publish_joint_state() {
    msg_->header.stamp = this->get_clock()->now();
    // Read joint encoders (hardware-specific)
    read_joint_encoders(msg_->position.data(), 6);
    // Publish without any allocation
    publisher_->publish(*msg_);
  }

  void read_joint_encoders(double* positions, int count) {
    // Hardware interface -- populate from EtherCAT, CAN, etc.
    (void)positions;
    (void)count;
  }

  rclcpp::Publisher<sensor_msgs::msg::JointState>::SharedPtr publisher_;
  std::unique_ptr<sensor_msgs::msg::JointState> msg_;
  rclcpp::TimerBase::SharedPtr timer_;
};

int main(int argc, char** argv) {
  rclcpp::init(argc, argv);
  // Use SingleThreadedExecutor for deterministic callback ordering
  rclcpp::executors::SingleThreadedExecutor executor;
  auto node = std::make_shared<RTJointStatePublisher>();
  executor.add_node(node);
  executor.spin();
  rclcpp::shutdown();
  return 0;
}

7.2 RT Kernel Configuration

For sub-millisecond determinism, deploy ROS2 on a PREEMPT_RT patched Linux kernel. Ubuntu 24.04 includes the RT kernel as an optional install (linux-image-realtime). Key system-level configurations for real-time ROS2: isolate CPU cores for control threads (isolcpus, cpusets), run those threads under the SCHED_FIFO policy with elevated priority, lock process memory with mlockall() to prevent page faults, steer IRQ affinity away from the isolated cores, and disable CPU frequency scaling and deep C-states to eliminate wake-up latency.

8. Simulation -- Gazebo, Isaac Sim & Webots

Simulation is non-negotiable in professional ROS2 development. It enables algorithm validation, regression testing, CI/CD integration, and synthetic data generation without hardware. The ROS2 ecosystem supports multiple simulators, each with distinct capabilities.

Simulator | Physics Engine | Rendering | ROS2 Integration | Best For
Gazebo Harmonic | DART, Bullet, TPE | Ogre2 (PBR) | Native (gz-ros2-control) | General-purpose, multi-robot
NVIDIA Isaac Sim | PhysX 5 | RTX ray tracing | Native bridge | AI training, digital twins
Webots | ODE | OpenGL / WREN | webots_ros2 driver | Education, quick prototyping
MuJoCo | Custom (Euler/RK4) | OpenGL | Community bridge | Contact-rich manipulation, RL
O3DE | PhysX 5 | Atom (PBR, RT) | ROS2 Gem | Large outdoor environments

8.1 Gazebo Harmonic (Recommended Default)

Gazebo Harmonic (gz-harmonic, released 2024) is the primary simulator for ROS2 development. It replaces the legacy Gazebo Classic with a modern, modular architecture featuring improved physics, rendering, and sensor simulation. Key capabilities: swappable physics engines (DART, Bullet, TPE), GPU-accelerated sensor simulation (LiDAR, depth and RGB cameras, IMU), physically based rendering via Ogre2, the ros_gz bridge for native topic interoperability, and a headless server mode suited to CI pipelines.

8.2 NVIDIA Isaac Sim

Isaac Sim provides photorealistic rendering with RTX ray tracing, GPU-accelerated PhysX 5 physics, and built-in tools for synthetic data generation (domain randomization, automatic labeling). Its ROS2 bridge publishes sensor data, tf transforms, and joint states directly onto the ROS2 network. Isaac Sim excels for training and validating perception models on synthetic data, building warehouse-scale digital twins, and GPU-parallelized reinforcement learning workflows.

8.3 URDF/Xacro Robot Description

Robot models in ROS2 are defined using URDF (Unified Robot Description Format) and its macro-enabled variant Xacro. A well-structured Xacro model parameterizes dimensions, mass properties, and sensor placement for easy reconfiguration.

<?xml version="1.0"?>
<!-- Xacro model for a differential-drive AMR with LiDAR and RGB-D camera -->
<robot xmlns:xacro="http://www.ros.org/wiki/xacro" name="seraphim_amr">

  <!-- Properties -->
  <xacro:property name="base_width" value="0.40"/>
  <xacro:property name="base_length" value="0.50"/>
  <xacro:property name="base_height" value="0.15"/>
  <xacro:property name="wheel_radius" value="0.075"/>
  <xacro:property name="wheel_width" value="0.04"/>
  <xacro:property name="wheel_separation" value="0.44"/>
  <xacro:property name="caster_radius" value="0.035"/>
  <xacro:property name="lidar_height" value="0.28"/>

  <!-- Base link -->
  <link name="base_link">
    <visual>
      <geometry><box size="${base_length} ${base_width} ${base_height}"/></geometry>
      <material name="dark_grey"><color rgba="0.2 0.2 0.2 1.0"/></material>
    </visual>
    <collision>
      <geometry><box size="${base_length} ${base_width} ${base_height}"/></geometry>
    </collision>
    <inertial>
      <mass value="15.0"/>
      <inertia ixx="0.29" ixy="0" ixz="0" iyy="0.39" iyz="0" izz="0.49"/>
    </inertial>
  </link>

  <!-- Wheel macro -->
  <xacro:macro name="drive_wheel" params="prefix y_offset">
    <link name="${prefix}_wheel_link">
      <visual>
        <geometry><cylinder radius="${wheel_radius}" length="${wheel_width}"/></geometry>
        <material name="black"><color rgba="0.1 0.1 0.1 1.0"/></material>
      </visual>
      <collision>
        <geometry><cylinder radius="${wheel_radius}" length="${wheel_width}"/></geometry>
      </collision>
      <inertial>
        <mass value="1.0"/>
        <inertia ixx="0.0015" ixy="0" ixz="0" iyy="0.0015" iyz="0" izz="0.0028"/>
      </inertial>
    </link>
    <joint name="${prefix}_wheel_joint" type="continuous">
      <parent link="base_link"/>
      <child link="${prefix}_wheel_link"/>
      <origin xyz="0 ${y_offset} 0" rpy="-1.5708 0 0"/>
      <axis xyz="0 0 1"/>
    </joint>
  </xacro:macro>

  <xacro:drive_wheel prefix="left" y_offset="${wheel_separation/2}"/>
  <xacro:drive_wheel prefix="right" y_offset="${-wheel_separation/2}"/>

  <!-- LiDAR mount -->
  <link name="lidar_link">
    <visual>
      <geometry><cylinder radius="0.04" length="0.05"/></geometry>
      <material name="green"><color rgba="0.0 0.8 0.0 1.0"/></material>
    </visual>
  </link>
  <joint name="lidar_joint" type="fixed">
    <parent link="base_link"/>
    <child link="lidar_link"/>
    <origin xyz="0 0 ${lidar_height}" rpy="0 0 0"/>
  </joint>

  <!-- RGB-D Camera -->
  <link name="camera_link">
    <visual>
      <geometry><box size="0.025 0.09 0.025"/></geometry>
      <material name="silver"><color rgba="0.7 0.7 0.7 1.0"/></material>
    </visual>
  </link>
  <joint name="camera_joint" type="fixed">
    <parent link="base_link"/>
    <child link="camera_link"/>
    <origin xyz="${base_length/2} 0 0.20" rpy="0 0.15 0"/>
  </joint>

  <!-- Gazebo Harmonic plugins -->
  <gazebo>
    <plugin filename="gz-sim-diff-drive-system" name="gz::sim::systems::DiffDrive">
      <left_joint>left_wheel_joint</left_joint>
      <right_joint>right_wheel_joint</right_joint>
      <wheel_separation>${wheel_separation}</wheel_separation>
      <wheel_radius>${wheel_radius}</wheel_radius>
      <odom_topic>odom</odom_topic>
      <tf_topic>tf</tf_topic>
      <frame_id>odom</frame_id>
      <child_frame_id>base_link</child_frame_id>
    </plugin>
  </gazebo>

  <gazebo reference="lidar_link">
    <sensor type="gpu_lidar" name="lidar">
      <update_rate>10</update_rate>
      <lidar>
        <scan><horizontal>
          <samples>720</samples>
          <resolution>1</resolution>
          <min_angle>-3.14159</min_angle>
          <max_angle>3.14159</max_angle>
        </horizontal></scan>
        <range><min>0.12</min><max>25.0</max></range>
      </lidar>
      <topic>scan</topic>
    </sensor>
  </gazebo>
</robot>

9. Deployment & DevOps -- Docker, CI/CD & OTA Updates

9.1 Containerized ROS2 Deployment

Docker containers have become the standard deployment mechanism for ROS2 applications. Containers provide reproducible builds, dependency isolation, and simplified fleet-wide updates. A well-structured ROS2 Docker image uses multi-stage builds to minimize image size and separate build-time dependencies from runtime.

# Dockerfile -- Multi-stage ROS2 Jazzy production image

# Build stage: compile ROS2 workspace
FROM ros:jazzy-ros-base AS builder
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && apt-get install -y --no-install-recommends \
    python3-colcon-common-extensions \
    python3-rosdep \
    ros-jazzy-nav2-bringup \
    ros-jazzy-slam-toolbox \
    ros-jazzy-robot-localization \
    ros-jazzy-ros2-control \
    ros-jazzy-ros2-controllers \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /ros2_ws
COPY src/ src/

# Install dependencies and build
RUN . /opt/ros/jazzy/setup.sh && \
    rosdep install --from-paths src --ignore-src -r -y && \
    colcon build \
      --cmake-args -DCMAKE_BUILD_TYPE=Release \
      --parallel-workers $(nproc) \
      --event-handlers console_cohesion+

# Runtime stage: minimal image with only runtime dependencies
FROM ros:jazzy-ros-core AS runtime
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && apt-get install -y --no-install-recommends \
    ros-jazzy-nav2-bringup \
    ros-jazzy-slam-toolbox \
    ros-jazzy-robot-localization \
    ros-jazzy-ros2-control \
    ros-jazzy-ros2-controllers \
    python3-yaml \
    && rm -rf /var/lib/apt/lists/*

# Copy built workspace from builder
COPY --from=builder /ros2_ws/install /ros2_ws/install
COPY config/ /ros2_ws/config/
COPY launch/ /ros2_ws/launch/

# Setup entrypoint
COPY docker-entrypoint.sh /
RUN chmod +x /docker-entrypoint.sh

ENV ROS_DOMAIN_ID=42
ENV RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
ENV CYCLONEDDS_URI=/ros2_ws/config/cyclonedds.xml

ENTRYPOINT ["/docker-entrypoint.sh"]
CMD ["ros2", "launch", "/ros2_ws/launch/robot_bringup.launch.py"]

9.2 CI/CD Pipeline

A robust CI/CD pipeline for ROS2 projects automates building, testing, and deploying robot software. We recommend GitHub Actions or GitLab CI with the following stages:

  1. Lint & Format: Run ament_lint_auto, ament_clang_format, ament_flake8, and ament_xmllint to enforce code style.
  2. Build: Compile the workspace with colcon build in a Docker container matching the target ROS2 distribution.
  3. Unit Tests: Execute colcon test for gtest (C++) and pytest (Python) test suites. Publish test results as artifacts.
  4. Integration Tests: Launch the full navigation stack in Gazebo headless mode, send navigation goals, and verify the robot reaches them within tolerance. Use launch_testing for orchestrated scenario testing.
  5. Build Docker Image: Build and push the production Docker image to a container registry (ECR, GCR, or self-hosted).
  6. Deploy: For staging, deploy to a test robot via docker compose pull. For production, trigger an OTA update through a fleet management platform (Balena, AWS IoT Greengrass, or custom).

9.3 OTA Updates

Over-the-air updates are critical for maintaining robot fleets in the field. The update mechanism must be atomic (rollback on failure), bandwidth-efficient (delta updates), and staged (canary deployments to a subset of robots before fleet-wide rollout). Options include fleet platforms such as Balena or AWS IoT Greengrass, as well as custom pipelines built on versioned container images.

Fleet Update Best Practice

Never push updates to your entire fleet simultaneously. Use a staged rollout: 1 robot (canary) -> 10% of fleet (early adopters) -> 50% -> 100%. Monitor key metrics (navigation success rate, battery consumption, error rates) at each stage. Automated rollback triggers should halt the rollout if any metric degrades beyond a configurable threshold.
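The staged-rollout gate described above is simple to encode. The sketch below is illustrative only -- the stage percentages, metric names, and degradation threshold are our assumptions, not taken from any specific fleet-management platform:

```python
# Hypothetical staged-rollout gate. Stage fractions and metric names are
# illustrative, not from a specific fleet-management product.
STAGES = [0.0, 0.10, 0.50, 1.00]  # canary -> 10% -> 50% -> 100%


def robots_in_stage(fleet_size: int, stage_idx: int) -> int:
    """Number of robots that should run the new version at this stage."""
    if stage_idx == 0:
        return 1  # a single canary robot
    return max(1, round(fleet_size * STAGES[stage_idx]))


def should_rollback(baseline: dict, current: dict,
                    max_degradation: float = 0.05) -> bool:
    """Halt the rollout if any 'higher is better' metric (e.g. navigation
    success rate) degrades beyond the configured threshold."""
    return any(current[k] < baseline[k] * (1.0 - max_degradation)
               for k in baseline)
```

With a fleet of 40 robots this yields stages of 1, 4, 20, and 40 robots, with `should_rollback` evaluated against canary metrics before each promotion.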

10. Cloud Robotics with ROS2

10.1 Architecture Patterns

Cloud robotics extends the ROS2 computation graph from the robot to the cloud, enabling capabilities that exceed onboard compute -- large-scale SLAM map management, fleet-wide path optimization, ML model training and deployment, and centralized monitoring dashboards. The key architectural decision is partitioning computation between edge (robot), fog (on-premise server), and cloud tiers.

10.2 AWS RoboMaker

AWS RoboMaker offered managed simulation and fleet deployment services designed for ROS/ROS2 robotics. Note, however, that AWS has discontinued RoboMaker (support ended September 2025); AWS now directs simulation workloads to AWS Batch and deployment workloads to AWS IoT Greengrass, so new cloud architectures should target those services directly.

10.3 Azure IoT + ROS2

Microsoft's Azure IoT ecosystem integrates with ROS2 through the azure_iot_hub_ros2 bridge, which relays selected ROS2 topics to Azure IoT Hub for telemetry, centralized monitoring, and device management.

10.4 Fog Computing for Multi-Robot Coordination

For latency-sensitive multi-robot coordination (fleet path planning, shared map updates, task allocation), a fog computing layer running on an on-premise server provides the best balance of performance and reliability. The fog server runs a ROS2 graph with fleet-level nodes:

#!/usr/bin/env python3
"""Fog-level fleet coordinator node -- allocates tasks to available robots"""
import rclpy
from rclpy.node import Node
from custom_msgs.msg import TaskRequest, TaskAssignment, RobotStatus
from scipy.optimize import linear_sum_assignment
import numpy as np


class FleetCoordinator(Node):
    def __init__(self):
        super().__init__('fleet_coordinator')
        self.declare_parameter('num_robots', 10)
        self.declare_parameter('assignment_interval', 2.0)  # seconds

        self.robot_states = {}   # robot_id -> RobotStatus
        self.pending_tasks = []  # list of TaskRequest

        # Subscribe to robot status from all robots
        self.create_subscription(
            RobotStatus, '/fleet/robot_status', self.robot_status_cb, 10)

        # Subscribe to incoming task requests from the WMS
        self.create_subscription(
            TaskRequest, '/fleet/task_requests', self.task_request_cb, 10)

        # Publish task assignments
        self.assignment_pub = self.create_publisher(
            TaskAssignment, '/fleet/task_assignments', 10)

        # Periodic task assignment using the Hungarian algorithm
        interval = self.get_parameter('assignment_interval').value
        self.create_timer(interval, self.assign_tasks)
        self.get_logger().info('Fleet coordinator initialized')

    def robot_status_cb(self, msg):
        self.robot_states[msg.robot_id] = msg

    def task_request_cb(self, msg):
        self.pending_tasks.append(msg)
        self.get_logger().info(
            f'Received task {msg.task_id} for zone {msg.target_zone}')

    def assign_tasks(self):
        # Filter available robots (idle and battery > 20%)
        available = {rid: state for rid, state in self.robot_states.items()
                     if state.status == 'idle' and state.battery_pct > 20}
        if not available or not self.pending_tasks:
            return

        robot_ids = list(available.keys())
        tasks = self.pending_tasks[:len(robot_ids)]  # Match count

        # Build a square cost matrix (distance from each robot to each task)
        n = min(len(robot_ids), len(tasks))
        cost_matrix = np.zeros((n, n))
        for i, rid in enumerate(robot_ids[:n]):
            for j, task in enumerate(tasks[:n]):
                rx = available[rid].pose.position.x
                ry = available[rid].pose.position.y
                tx = task.target_pose.pose.position.x
                ty = task.target_pose.pose.position.y
                cost_matrix[i, j] = np.hypot(tx - rx, ty - ry)

        # Hungarian algorithm for optimal (minimum-cost) assignment
        row_idx, col_idx = linear_sum_assignment(cost_matrix)
        for r, c in zip(row_idx, col_idx):
            assignment = TaskAssignment()
            assignment.robot_id = robot_ids[r]
            assignment.task_id = tasks[c].task_id
            assignment.target_pose = tasks[c].target_pose
            self.assignment_pub.publish(assignment)
            self.get_logger().info(
                f'Assigned task {tasks[c].task_id} to robot {robot_ids[r]} '
                f'(distance: {cost_matrix[r, c]:.1f}m)')

        # Remove assigned tasks (highest index first to keep indices valid)
        for idx in sorted(col_idx, reverse=True):
            self.pending_tasks.pop(idx)


def main(args=None):
    rclpy.init(args=args)
    node = FleetCoordinator()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
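The assignment step in the coordinator reduces to SciPy's linear_sum_assignment; it can be exercised standalone with a hand-built cost matrix (the distances below are made up for illustration):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Distances in metres from 3 robots (rows) to 3 tasks (columns)
cost = np.array([
    [5.0, 9.0, 1.0],
    [10.0, 3.0, 2.0],
    [8.0, 7.0, 3.0],
])

# Minimum-total-distance pairing of robots to tasks
rows, cols = linear_sum_assignment(cost)
pairs = list(zip(rows.tolist(), cols.tolist()))
total = cost[rows, cols].sum()
print(pairs, total)  # [(0, 0), (1, 1), (2, 2)] 11.0
```

Note that a greedy nearest-task heuristic would pair robot 0 with task 2 (cost 1 m) and end up with a total of 12 m; the Hungarian algorithm finds the globally optimal 11 m assignment.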

11. Complete Code Examples

11.1 Python ROS2 Node -- Sensor Fusion

This example demonstrates a complete ROS2 Python node implementing an Extended Kalman Filter for fusing wheel odometry with IMU data. This pattern is fundamental for all mobile robot applications.

#!/usr/bin/env python3
"""
ROS2 EKF-based odometry fusion node
Fuses wheel encoder odometry with IMU angular velocity
Publishes fused odometry on /odom_fused
"""
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy
from nav_msgs.msg import Odometry
from sensor_msgs.msg import Imu
from geometry_msgs.msg import TransformStamped
from tf2_ros import TransformBroadcaster
import numpy as np
from math import sin, cos


class OdometryFusion(Node):
    def __init__(self):
        super().__init__('odometry_fusion')
        # Parameters
        self.declare_parameter('odom_noise_x', 0.05)
        self.declare_parameter('odom_noise_y', 0.05)
        self.declare_parameter('odom_noise_theta', 0.1)
        self.declare_parameter('imu_noise_theta', 0.01)
        self.declare_parameter('publish_tf', True)

        # State: [x, y, theta]
        self.state = np.zeros(3)
        self.P = np.eye(3) * 0.1  # Covariance

        # Process noise
        ox = self.get_parameter('odom_noise_x').value
        oy = self.get_parameter('odom_noise_y').value
        ot = self.get_parameter('odom_noise_theta').value
        self.Q = np.diag([ox**2, oy**2, ot**2])

        # Measurement noise (IMU yaw rate)
        it = self.get_parameter('imu_noise_theta').value
        self.R_imu = np.array([[it**2]])

        self.last_time = None
        self.last_imu_yaw_rate = 0.0

        # QoS for sensor data
        sensor_qos = QoSProfile(
            reliability=ReliabilityPolicy.BEST_EFFORT,
            history=HistoryPolicy.KEEP_LAST,
            depth=5)

        # Subscribers
        self.create_subscription(Odometry, '/odom', self.odom_cb, sensor_qos)
        self.create_subscription(Imu, '/imu/data', self.imu_cb, sensor_qos)

        # Publisher
        self.odom_pub = self.create_publisher(Odometry, '/odom_fused', 10)

        # TF broadcaster
        if self.get_parameter('publish_tf').value:
            self.tf_broadcaster = TransformBroadcaster(self)
        else:
            self.tf_broadcaster = None

        self.get_logger().info('Odometry fusion node initialized')

    def imu_cb(self, msg):
        self.last_imu_yaw_rate = msg.angular_velocity.z

    def odom_cb(self, msg):
        current_time = self.get_clock().now()
        if self.last_time is None:
            self.last_time = current_time
            return
        dt = (current_time - self.last_time).nanoseconds / 1e9
        self.last_time = current_time
        if dt <= 0 or dt > 1.0:
            return

        # Extract velocities from wheel odometry
        vx = msg.twist.twist.linear.x
        vy = msg.twist.twist.linear.y
        wz = msg.twist.twist.angular.z

        # === PREDICTION (from wheel odometry) ===
        theta = self.state[2]
        dx = (vx * cos(theta) - vy * sin(theta)) * dt
        dy = (vx * sin(theta) + vy * cos(theta)) * dt
        dtheta = wz * dt
        self.state[0] += dx
        self.state[1] += dy
        self.state[2] += dtheta

        # Jacobian of motion model
        F = np.eye(3)
        F[0, 2] = (-vx * sin(theta) - vy * cos(theta)) * dt
        F[1, 2] = (vx * cos(theta) - vy * sin(theta)) * dt
        self.P = F @ self.P @ F.T + self.Q * dt

        # === UPDATE (from IMU yaw rate) ===
        H = np.array([[0.0, 0.0, 1.0]])  # Observe the heading increment
        z = np.array([self.last_imu_yaw_rate * dt])
        y_innov = z - H @ np.array([0, 0, dtheta])
        S = H @ self.P @ H.T + self.R_imu
        K = self.P @ H.T @ np.linalg.inv(S)
        self.state += (K @ y_innov).flatten()
        self.P = (np.eye(3) - K @ H) @ self.P

        # Normalize theta to [-pi, pi]
        self.state[2] = np.arctan2(sin(self.state[2]), cos(self.state[2]))

        # Publish fused odometry
        self._publish_odom(msg.header.stamp)

    def _publish_odom(self, stamp):
        odom = Odometry()
        odom.header.stamp = stamp
        odom.header.frame_id = 'odom'
        odom.child_frame_id = 'base_link'
        odom.pose.pose.position.x = self.state[0]
        odom.pose.pose.position.y = self.state[1]
        odom.pose.pose.orientation.z = sin(self.state[2] / 2)
        odom.pose.pose.orientation.w = cos(self.state[2] / 2)

        # Populate covariance (6x6; we fill x, y, yaw)
        cov = [0.0] * 36
        cov[0] = self.P[0, 0]    # x-x
        cov[7] = self.P[1, 1]    # y-y
        cov[35] = self.P[2, 2]   # yaw-yaw
        odom.pose.covariance = cov
        self.odom_pub.publish(odom)

        # Broadcast TF
        if self.tf_broadcaster:
            t = TransformStamped()
            t.header = odom.header
            t.child_frame_id = 'base_link'
            t.transform.translation.x = self.state[0]
            t.transform.translation.y = self.state[1]
            t.transform.rotation.z = sin(self.state[2] / 2)
            t.transform.rotation.w = cos(self.state[2] / 2)
            self.tf_broadcaster.sendTransform(t)


def main(args=None):
    rclpy.init(args=args)
    node = OdometryFusion()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
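The planar (yaw-only) quaternion conversion and the arctan2 angle wrap used by the node are easy to verify in isolation. A minimal sketch -- the function names here are ours, not a ROS API:

```python
import math


def yaw_to_quaternion(theta):
    """Yaw-only quaternion (x, y, z, w), as published by the fusion node."""
    return (0.0, 0.0, math.sin(theta / 2.0), math.cos(theta / 2.0))


def quaternion_to_yaw(z, w):
    """Inverse conversion for the planar (roll = pitch = 0) case."""
    return math.atan2(2.0 * w * z, 1.0 - 2.0 * z * z)


def wrap_angle(theta):
    """Normalize to [-pi, pi] -- the same arctan2 trick the node uses."""
    return math.atan2(math.sin(theta), math.cos(theta))


# Round trip: a heading survives conversion to quaternion and back
q = yaw_to_quaternion(1.25)
assert abs(quaternion_to_yaw(q[2], q[3]) - 1.25) < 1e-12
```

Full 3D orientation handling should go through a proper quaternion library (e.g. transforms3d); the shortcut above is only valid when roll and pitch are zero.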

11.2 C++ ROS2 Node -- Motor Controller

A minimal C++ ROS2 node demonstrating a hardware interface for differential drive motor control via CAN bus. This pattern is the foundation for the ros2_control hardware interface.

// differential_drive_hw.cpp -- ros2_control hardware interface
#include <hardware_interface/system_interface.hpp>
#include <hardware_interface/types/hardware_interface_type_values.hpp>
#include <rclcpp/rclcpp.hpp>
#include <string>
#include <vector>

namespace seraphim_hardware
{

class DiffDriveHardware : public hardware_interface::SystemInterface
{
public:
  hardware_interface::CallbackReturn on_init(
    const hardware_interface::HardwareInfo & info) override
  {
    if (hardware_interface::SystemInterface::on_init(info) !=
        hardware_interface::CallbackReturn::SUCCESS)
    {
      return hardware_interface::CallbackReturn::ERROR;
    }
    hw_positions_.resize(2, 0.0);
    hw_velocities_.resize(2, 0.0);
    hw_commands_.resize(2, 0.0);

    // Initialize CAN bus connection
    can_device_ = info_.hardware_parameters["can_device"];  // e.g., "can0"
    RCLCPP_INFO(rclcpp::get_logger("DiffDriveHW"),
                "Initialized on CAN device: %s", can_device_.c_str());
    return hardware_interface::CallbackReturn::SUCCESS;
  }

  std::vector<hardware_interface::StateInterface> export_state_interfaces() override
  {
    std::vector<hardware_interface::StateInterface> interfaces;
    interfaces.emplace_back("left_wheel_joint", "position", &hw_positions_[0]);
    interfaces.emplace_back("left_wheel_joint", "velocity", &hw_velocities_[0]);
    interfaces.emplace_back("right_wheel_joint", "position", &hw_positions_[1]);
    interfaces.emplace_back("right_wheel_joint", "velocity", &hw_velocities_[1]);
    return interfaces;
  }

  std::vector<hardware_interface::CommandInterface> export_command_interfaces() override
  {
    std::vector<hardware_interface::CommandInterface> interfaces;
    interfaces.emplace_back("left_wheel_joint", "velocity", &hw_commands_[0]);
    interfaces.emplace_back("right_wheel_joint", "velocity", &hw_commands_[1]);
    return interfaces;
  }

  hardware_interface::return_type read(
    const rclcpp::Time & /*time*/, const rclcpp::Duration & /*period*/) override
  {
    // Read encoder values from the CAN bus
    // can_read_encoders(hw_positions_.data(), hw_velocities_.data());
    return hardware_interface::return_type::OK;
  }

  hardware_interface::return_type write(
    const rclcpp::Time & /*time*/, const rclcpp::Duration & /*period*/) override
  {
    // Send velocity commands to the motor drivers via CAN
    // can_send_velocity(hw_commands_[0], hw_commands_[1]);
    return hardware_interface::return_type::OK;
  }

private:
  std::string can_device_;
  std::vector<double> hw_positions_, hw_velocities_, hw_commands_;
};

}  // namespace seraphim_hardware

#include <pluginlib/class_list_macros.hpp>
PLUGINLIB_EXPORT_CLASS(seraphim_hardware::DiffDriveHardware,
                       hardware_interface::SystemInterface)
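On top of this hardware interface, a controller such as diff_drive_controller converts a body twist into the per-wheel velocity commands that write() sends out. The inverse kinematics are worth seeing on their own; the wheel geometry values below are illustrative placeholders, not taken from the robot above:

```python
def diff_drive_ik(v, w, wheel_separation=0.35, wheel_radius=0.0762):
    """Body twist (v in m/s, w in rad/s) -> (left, right) wheel speeds in rad/s."""
    left = (v - w * wheel_separation / 2.0) / wheel_radius
    right = (v + w * wheel_separation / 2.0) / wheel_radius
    return left, right


# Driving straight: both wheels spin at the same rate
l, r = diff_drive_ik(0.5, 0.0)
assert l == r

# Turning in place: wheels spin at equal and opposite rates
l, r = diff_drive_ik(0.0, 1.0)
assert l == -r and r > 0
```

The same two equations, run in reverse over encoder deltas, produce the wheel odometry that the EKF fusion node in section 11.1 consumes.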

11.3 Complete Launch File

A production launch file that brings up the full navigation stack -- robot state publisher, sensor drivers, SLAM or localization, Nav2, and Rviz2 for visualization.

# robot_bringup.launch.py -- Full navigation stack launch
import os

import xacro
from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument, IncludeLaunchDescription
from launch.conditions import IfCondition, UnlessCondition
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node


def generate_launch_description():
    pkg_dir = get_package_share_directory('seraphim_amr')
    nav2_dir = get_package_share_directory('nav2_bringup')

    use_sim = LaunchConfiguration('use_sim_time', default='false')
    map_file = LaunchConfiguration(
        'map', default=os.path.join(pkg_dir, 'maps', 'warehouse.yaml'))
    params_file = LaunchConfiguration(
        'params', default=os.path.join(pkg_dir, 'config', 'nav2_params.yaml'))
    rviz_config = os.path.join(pkg_dir, 'rviz', 'navigation.rviz')

    # Process the xacro into a URDF string at launch time
    robot_description = xacro.process_file(
        os.path.join(pkg_dir, 'urdf', 'seraphim_amr.urdf.xacro')).toxml()

    return LaunchDescription([
        DeclareLaunchArgument('use_sim_time', default_value='false'),
        DeclareLaunchArgument('map', default_value=map_file),
        DeclareLaunchArgument('params', default_value=params_file),
        DeclareLaunchArgument('rviz', default_value='true'),

        # Robot state publisher (URDF -> tf)
        Node(
            package='robot_state_publisher',
            executable='robot_state_publisher',
            parameters=[{
                'robot_description': robot_description,
                'use_sim_time': use_sim,
            }],
        ),

        # LiDAR driver (hardware only -- skipped in simulation)
        Node(
            package='sllidar_ros2',
            executable='sllidar_node',
            parameters=[{
                'serial_port': '/dev/ttyUSB0',
                'serial_baudrate': 256000,
                'frame_id': 'lidar_link',
                'scan_mode': 'Standard',
            }],
            condition=UnlessCondition(use_sim),
        ),

        # Nav2 full stack
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource(
                os.path.join(nav2_dir, 'launch', 'bringup_launch.py')),
            launch_arguments={
                'use_sim_time': use_sim,
                'map': map_file,
                'params_file': params_file,
            }.items(),
        ),

        # Rviz2
        Node(
            package='rviz2',
            executable='rviz2',
            arguments=['-d', rviz_config],
            parameters=[{'use_sim_time': use_sim}],
            condition=IfCondition(LaunchConfiguration('rviz', default='true')),
        ),
    ])

12. APAC ROS2 Ecosystem & Community

12.1 Vietnam

Vietnam's ROS2 ecosystem is rapidly growing, driven by the country's expanding robotics manufacturing sector and its strong software engineering talent pool. Vietnam produces over 50,000 engineering graduates annually, and universities including VNU-HCM, Hanoi University of Science and Technology, and Da Nang University of Technology are incorporating ROS2 into their robotics curricula.

12.2 Japan & South Korea

Japan and South Korea are the largest ROS2 markets in APAC. Japan's TIER IV (Autoware) is the leading contributor to ROS2-based autonomous driving software, and SoftBank Robotics uses ROS2 for its commercial service robots. In South Korea, ROBOTIS (creators of TurtleBot and Dynamixel actuators) is a founding ROS2 ecosystem contributor, and Naver Labs operates one of Asia's largest indoor robot fleets running ROS2.

12.3 Singapore & Southeast Asia

Singapore's Advanced Remanufacturing and Technology Centre (ARTC) and A*STAR research labs actively contribute to ROS2 packages for industrial applications. The National Robotics Programme funds ROS2 development for healthcare, logistics, and facilities management. In Thailand, the EEC Smart Factory initiative is driving ROS2 adoption in automotive manufacturing, while Indonesian startups are building ROS2-based delivery robots for Jakarta's dense urban environment.

12.4 Contributing to the Ecosystem

The ROS2 ecosystem thrives on open-source contributions. Whether you are fixing a documentation bug, contributing a new Nav2 plugin, or sharing a hardware driver, the community welcomes contributions through the ROS2 GitHub repositories, ROS Discourse, and local ROS user groups.

600+ companies using ROS2 in production (Open Robotics)
45K+ active members on ROS Discourse
2,200+ GitHub contributors to ROS2 core
28 countries with active ROS user groups
Build Production Robotics with ROS2

Seraphim Vietnam provides end-to-end ROS2 development services -- from architecture consulting and Nav2 tuning through custom perception pipelines and fleet deployment. Our engineers have deployed ROS2-based systems in warehouses, factories, and outdoor environments across APAC. Schedule a technical consultation to discuss your ROS2 project requirements.

Get the ROS2 Development Assessment

Receive a customized architecture review including technology stack recommendations, migration roadmap, and deployment strategy for your robotics project.

© 2026 Seraphim Co., Ltd.