ROS 2 Jazzy Jalisco: Migration Guide from Humble (2026)

Humble EOL is May 2027—Plan Your Fleet Migration Now

ROS 2 Humble (May 2022, Ubuntu 22.04 LTS) has been the gold standard for deployed robotics fleets for the past four years. It still accounts for roughly 65% of production ROS 2 systems worldwide. But its end-of-life window—May 2027—is now less than 13 months away. If your fleet runs Humble, migration planning shifts from “eventually” to “urgent.”

Enter ROS 2 Jazzy Jalisco (May 2024, Ubuntu 24.04 LTS), the next long-term support release with a maintenance window extending to May 2029—a full five-year runway. Jazzy brings Python 3.12 baseline, revamped DDS configuration, rclcpp lifecycle improvements, and tighter integration with modern Linux toolchains. For fleets running safety-critical robots (AMRs, collaborative arms, autonomous vehicles), the upgrade is non-negotiable. For others, the sooner you migrate, the less technical debt you accumulate. Many enterprise teams are already in the migration pipeline as of Q2 2026, and delays now mean rushed, error-prone deployments later.

This post walks you through the complete migration workflow: pre-flight audits, step-by-step build procedures, DDS trade-offs, and a pragmatic fleet rollout timeline. Whether you’re running 1 robot or 500, you’ll find the checklist, code snippets, and architectural decisions needed to move to Jazzy confidently by Q4 2026.

What this post covers:
– Why Humble’s EOL should drive your 2026 roadmap
– What’s new in Jazzy (and what breaks)
– Pre-migration audit checklist
– The 9-step migration workflow with bash commands
– DDS choice: Cyclone vs. Fast vs. Zenoh-bridge
– Common pitfalls and remediation
– Fleet rollout phases and a practical timeline


Why Migrate Now

The case for Jazzy migration is stark, rooted in supply-chain and security realities.

Humble’s EOL timeline. Humble enters security-fix-only mode in May 2026 (approximately now). During this phase, critical CVEs get backported, but new features and non-critical improvements cease. After May 2027, no patches—not even critical CVEs—will be backported to Humble. If you’re shipping robots with Humble in 2027 or later, you’re shipping an OS with no vendor support. For enterprise and safety-critical deployments (autonomous shuttles, surgical robots, factory arms), this creates compliance and liability issues. Insurance underwriters increasingly require LTS coverage for production deployments.

Iron was the stepping stone, not the destination. The distro between Humble and Jazzy, Iron (May 2023 → Nov 2024), was labeled non-LTS from day one. It was designed as a rapid-iteration release to stress-test new features (lifecycle nodes, message_filters rewrites, rcl API stabilization) before baking them into Jazzy. Many teams ran Iron in CI/CD pipelines or research branches but deliberately skipped production deployments due to its 18-month window. Skipping Iron entirely—going straight from Humble to Jazzy—is the officially endorsed path. Iron is already EOL as of November 2024, and the ROS 2 maintainers recommend all production systems skip it.

Jazzy LTS gives you 5 full years. Humble had 5 years of support (May 2022 → May 2027); Jazzy extends that to May 2029. For fleet operators managing 3-year robot hardware lifecycles, Jazzy adoption by mid-2026 means your systems stay vendor-supported through their entire operational lifespan. A robot deployed in Q4 2026 will have support until Q4 2029—well past typical hardware refresh windows.

Ubuntu 24.04 LTS unlocks newer hardware. Humble rode Ubuntu 22.04 (Jammy), which is rock-solid but aging. Jazzy lands on Ubuntu 24.04 (Noble), offering Python 3.12 (vs. 3.10 in Humble), the GCC 13 compiler toolchain (with newer optimization passes), improved systemd performance, and critical driver updates for newer GPUs, edge accelerators, and lidar sensors. For on-robot compute (NVIDIA Jetson AGX, x86 industrial edge, ARM-based custom boards), the newer kernel (6.8+) and driver ecosystem often unlock better hardware integration and power efficiency.

New rcl/rclcpp features pay for the migration. Jazzy ships with refined lifecycle node patterns, Type Adaptation (zero-copy message passes for sensor pipelines), DDS profile isolation, and new ros2_control state-machine features. For vision stacks, multi-sensor fusion, and real-time control loops, these features can cut CPU usage by 20-40% and reduce tail latencies. The migration cost is high; the performance gains are measurable.


What’s New in Jazzy Jalisco

Jazzy’s release notes (May 2024) introduced changes across the stack. Let’s map the key upgrades and break-points so you know what to test.

Distro Timeline: Humble → Iron → Jazzy

rclcpp Lifecycle & Node Architecture

Humble behavior: Nodes inherited from rclcpp::Node with optional lifecycle callbacks (on_configure, on_activate, etc.) if they extended rclcpp_lifecycle::LifecycleNode. Mixing stateful and stateless nodes in the same executor sometimes caused scheduler surprises. Lifecycle transitions were asynchronous; you could call activate() and not know if the node had truly transitioned until the callback completed.

Jazzy changes:
– Lifecycle nodes are now the canonical pattern for long-lived services (drivers, sensors, bridges).
– The executor no longer silently de-prioritizes lifecycle nodes; instead, transitions are now explicit via the managed-node API and synchronous by default.
– rclcpp::Node::now() behavior changed to align with ROS time provider contracts—use_sim_time parameter is now checked at call time, not at node instantiation time.
– New rclcpp::spin_until_future_complete() provides cleaner shutdown semantics for executor-based patterns.

Migration step: If you have custom nodes that override on_configure() or on_activate(), test those callbacks end-to-end in Jazzy. No API rewrite needed in most cases, but timing guarantees differ. If you relied on asynchronous lifecycle transitions for performance, benchmark Jazzy’s synchronous behavior.
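The practical difference can be sketched without ROS at all. The toy managed node below (plain Python, not the rclcpp_lifecycle API) illustrates the guarantee synchronous transitions give callers: when activate() returns, the node really is active.

```python
# Toy managed-node state machine -- a conceptual sketch, NOT the
# rclcpp_lifecycle API. Shows why synchronous transitions let callers
# rely on the post-transition state.

class ManagedNode:
    UNCONFIGURED, INACTIVE, ACTIVE = "unconfigured", "inactive", "active"

    def __init__(self):
        self.state = self.UNCONFIGURED

    def configure(self):
        # Synchronous: on_configure() runs to completion before we return.
        assert self.state == self.UNCONFIGURED, "bad transition"
        self.on_configure()
        self.state = self.INACTIVE
        return self.state

    def activate(self):
        assert self.state == self.INACTIVE, "bad transition"
        self.on_activate()
        self.state = self.ACTIVE
        return self.state

    def on_configure(self):  # override: allocate resources, open devices
        pass

    def on_activate(self):   # override: start publishing / control loops
        pass

node = ManagedNode()
node.configure()
assert node.activate() == ManagedNode.ACTIVE  # guaranteed on return
```

With asynchronous transitions (the Humble behavior this section describes), that final assertion would be a race; benchmarking in Jazzy tells you what the stricter ordering costs you.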

Type Adaptation & Message Filters

Humble: Message-filter chains (e.g., ApproximateTime, ExactTime synchronizers) relied on copy-heavy per-filter callbacks. For multi-sensor pipelines (4x camera + IMU + LiDAR), this meant redundant serialization/deserialization overhead at each filter stage. A typical vision-fusion node would copy data 3-5 times before the control loop saw it.

Jazzy: Type Adaptation allows you to pass domain-specific sensor types (e.g., raw Eigen matrices, OpenCV cv::Mat objects, sensor_msgs::Image pre-wrapped) directly through the middleware without intermediate ROS message conversion. The middleware adapts your types on transmit/receive using zero-copy shared memory when possible.

Impact: For vision stacks or multi-sensor fusion nodes, Jazzy’s Type Adaptation can drop copy overhead by 40-60%, cut serialize/deserialize CPU by 30%, and reduce end-to-end latency by 10-20 ms on commodity hardware. You’ll need to annotate message types with rclcpp::TypeAdapter<> specializations, but the payoff is real for latency-sensitive code. This is especially valuable on embedded platforms (Jetson, RPI5) where memory bandwidth is a bottleneck.
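As a conceptual sketch (plain Python analogy, not the rclcpp::TypeAdapter API), the core idea is that serialization happens only when data actually leaves the process; intra-process subscribers receive the native object with no copies:

```python
# Conceptual sketch of Type Adaptation -- a plain Python analogy, NOT
# the rclcpp::TypeAdapter API. The adapter converts to a wire message
# only at the process boundary.

class ImageAdapter:
    """Hypothetical adapter between a native buffer and a wire message."""

    @staticmethod
    def to_msg(native):               # called only for inter-process transport
        return {"data": bytes(native)}

    @staticmethod
    def from_msg(msg):
        return bytearray(msg["data"])

def publish(native, subscribers, same_process):
    if same_process:
        # Zero-copy path: every subscriber sees the very same object.
        return [native for _ in subscribers]
    msg = ImageAdapter.to_msg(native)                  # one serialization
    return [ImageAdapter.from_msg(msg) for _ in subscribers]

frame = bytearray(8)                                   # stand-in for an image
local = publish(frame, ["viz", "fusion"], same_process=True)
assert all(buf is frame for buf in local)              # no copies were made
```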

DDS Configuration: XML Profiles Over Env Vars

Humble: DDS QoS configuration lived in environment variables (RMW_IMPLEMENTATION, CYCLONEDDS_URI, FASTRTPS_DEFAULT_PROFILES_FILE). For fleet-wide QoS tuning, ops teams had to embed large shell scripts into robot launch files. Debugging was painful: which shell script set which env var? Did the container inherit the right defaults?

Jazzy: ROS introduces DDS Profile support via the rmw layer. You can now write a single XML file defining multicast settings, latency budgets, and resource limits, then point the launch system to it:

export RMW_CONFIG_DIR=/etc/ros2/rmw_profiles
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
# Launch files now read /etc/ros2/rmw_profiles/cyclone_profile.xml automatically

For CycloneDDS and FastDDS, this eliminates configuration sprawl and makes DDS tuning repeatable and auditable. You can commit the XML profile to version control and track changes like code.

ROS 2 Launch & Deprecated XML Tags

Humble: Launch files in both .launch.py (recommended) and .launch.xml (deprecated but functional) syntax coexisted. The XML syntax was forgiving about edge cases.

Jazzy: A number of XML tags are now warnings or hard errors:
– <env name="..." value="..." /> inside <node> blocks (use <env_set> instead)
– <remap from="..." to="..." /> on arguments (remap goes on topics/services, not parameters)
– Implicit prefix attribute on groups (now explicit with <launch_prefix>)
– <group ns="..."> nesting behavior tightened; accidental namespace collisions now error.

Migration step: Run your old .launch.xml files through Jazzy’s launch validator early:

ros2 launch my_package my_launch.xml --print 2>&1 | grep -i deprecated

Most warnings are one-line fixes. Errors need refactoring; plan 1-2 hours per complex launch file.
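For a quick pre-flight pass before the validator, a small script can flag the patterns above. The tag names checked here are the ones listed in this section; extend the list for your own launch conventions:

```python
# Pre-flight scan for the deprecated launch-XML patterns listed above.
# Tag names are the ones from this section; extend for your own files.
import xml.etree.ElementTree as ET

def scan_launch_xml(text):
    issues = []
    root = ET.fromstring(text)
    for node in root.iter("node"):
        if node.find("env") is not None:
            issues.append("<env> inside <node>: use <env_set> instead")
    for group in root.iter("group"):
        # a <group ns=...> containing another namespaced group
        if "ns" in group.attrib and group.find(".//group[@ns]") is not None:
            issues.append("nested <group ns=...>: check namespace collisions")
    return issues

sample = """<launch>
  <node pkg="demo" exec="talker"><env name="X" value="1"/></node>
  <group ns="a"><group ns="a"><node pkg="demo" exec="listener"/></group></group>
</launch>"""
for issue in scan_launch_xml(sample):
    print(issue)
```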

ros2 CLI & Package Introspection

Humble: ros2 pkg was read-only; metadata lived in package.xml and setup.py.

Jazzy: The CLI gains structured query tools for workspace introspection:

ros2 pkg list --format=json > workspace.json
ros2 pkg show my_package --format=json

For CI/CD pipelines that validate workspace consistency (no duplicate packages, all dependencies declared), these commands replace custom Python scripts.
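A minimal CI gate over that JSON dump might look like this sketch (the "name" field is an assumption — verify it against the JSON your ros2 CLI actually emits):

```python
# CI consistency gate over the workspace JSON dump. The "name" field
# is an assumed schema -- check your ros2 CLI's actual output.
import json
from collections import Counter

def find_duplicates(workspace_json):
    names = [pkg["name"] for pkg in json.loads(workspace_json)]
    return sorted(n for n, c in Counter(names).items() if c > 1)

dump = json.dumps([
    {"name": "nav_stack", "path": "src/nav_stack"},
    {"name": "nav_stack", "path": "src/vendor/nav_stack"},  # duplicate!
    {"name": "my_driver", "path": "src/my_driver"},
])
assert find_duplicates(dump) == ["nav_stack"]
```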

ros2_control Improvements

Humble: Hardware interface plugins required explicit on_activate() and on_deactivate() calls, and state transitions weren’t always atomic. A controller could observe a half-activated hardware interface.

Jazzy:
– Hardware interfaces are now full lifecycle nodes by default, with atomic state transitions.
– Controller switching is now atomic—no risk of a half-activated controller poisoning your state.
– New StateInterface::get_value() overloads for concurrent reads without locks (for real-time safety).
– New predefined QoS profiles for control loops: CONTROL_NODE_BEST_EFFORT, CONTROL_NODE_RELIABLE.

Impact: If you’ve patched together a custom controller manager or built non-standard hardware plugins, Jazzy’s defaults may conflict. Test ros2_control nodes early.

MoveIt2 Alignment

For manipulation stacks, MoveIt2 2.x is the assumed version in Jazzy, and it carries breaking API changes from the 1.x line. If your fleet runs MoveIt2 1.x now, expect a two-phase migration: (1) port your application code to the MoveIt2 2.x APIs in a staging workspace, (2) then jump to Jazzy with the final MoveIt2 2.x release. Budget 3-4 weeks for the MoveIt2 1.x → 2.x port alone.


Pre-Migration Audit Checklist

Before you spin up a Jazzy build container, spend 1–2 days auditing your codebase and dependencies. A missed third-party package or hidden Python 3.10→3.12 incompatibility will derail your timeline by weeks.

Pre-Migration Audit Flowchart

1. Inventory All Packages

Action: List every package in your workspace and every external dependency.

cd ~/ros2_humble_ws
find src -name "package.xml" | wc -l  # Count your packages
vcs export src > src/workspace.repos  # Snapshot your VCS repos

Categorize:
– Internal packages (your own code)
– Third-party published (pulled from rosdep/apt/conan)
– Third-party pinned (git checkouts of forks or unpublished repos)
– ROS core (installed via apt install ros-humble-*)
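The bucketing itself is mechanical once you know each package's origin; a sketch (package names and origin sets below are illustrative placeholders):

```python
# Bucket packages for the audit spreadsheet. The names and origin sets
# here are illustrative placeholders, not real packages.
def categorize(packages, internal, pinned, ros_core):
    buckets = {"internal": [], "ros_core": [],
               "third_party_pinned": [], "third_party_published": []}
    for pkg in packages:
        if pkg in internal:
            buckets["internal"].append(pkg)
        elif pkg in ros_core:
            buckets["ros_core"].append(pkg)
        elif pkg in pinned:
            buckets["third_party_pinned"].append(pkg)
        else:
            buckets["third_party_published"].append(pkg)
    return buckets

b = categorize(
    ["fleet_nav", "rclcpp", "vendor_lidar", "image_pipeline"],
    internal={"fleet_nav"}, pinned={"vendor_lidar"}, ros_core={"rclcpp"})
assert b["internal"] == ["fleet_nav"]
assert b["third_party_published"] == ["image_pipeline"]
```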

2. Check Jazzy Availability on the ROS Buildfarm

Not every Humble package is auto-built for Jazzy. Some publishers pause maintenance; others migrate slowly. Use the ROS Buildfarm status page:

# Check each dependency on https://index.ros.org (select the "jazzy" distro),
# or grep the rosdistro index directly:
curl -s https://raw.githubusercontent.com/ros/rosdistro/master/jazzy/distribution.yaml | grep -A 2 "  my_package:"

If your core dependencies (e.g., a custom driver stack) are missing, contact the maintainer or plan for a 2-3 week interim on a pre-release branch.

3. Verify rosdep Keys & Resolve Missing Dependencies

rosdep update
cd ~/ros2_humble_ws
rosdep install -i --from-paths src --rosdistro jazzy --dry-run 2>&1 | grep -E "ERROR|WARN"

Common issues:
– Python packages pinned to 3.10-era APIs (plus PEP 668 externally-managed-environment restrictions on pip in Ubuntu 24.04).
– CUDA/TensorRT libraries not available for 24.04 (Nvidia often lags by 3-6 months).
– Missing -dev packages for external libraries.

Resolve these in your pre-Jazzy CI environment (Ubuntu 24.04 container) before touching your robot fleet.

4. Evaluate DDS Implementation Choice

Humble ships with FastDDS by default. Jazzy shifts the default to CycloneDDS, citing better latency and lower CPU overhead on embedded hardware.

Decision tree:
– Do you rely on FastDDS-specific plugins or custom QoS profiles? → Stay on FastDDS; Jazzy still supports it.
– Do you have vendor support contracts (e.g., eProsima)? → FastDDS.
– Are you operating in Kubernetes or multi-robot WAN scenarios? → Cyclone (better multicast handling) or Zenoh-bridge for cross-network.
– Are you latency-optimizing for real-time control? → Cyclone is typically faster; bench both in your environment.
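The same decision tree as a function, purely to make the branching order explicit (a sketch of the reasoning, not an official recommendation engine):

```python
# The DDS decision tree above as a function. Evaluated top to bottom;
# a sketch of the reasoning, not an official recommendation engine.
def pick_rmw(fastdds_plugins=False, vendor_contract=False,
             wan_or_multisite=False, realtime_latency=False):
    if fastdds_plugins or vendor_contract:
        return "rmw_fastrtps_cpp"
    if wan_or_multisite:
        return "rmw_cyclonedds_cpp + zenoh-bridge"
    if realtime_latency:
        return "rmw_cyclonedds_cpp"  # but benchmark both in your environment
    return "rmw_cyclonedds_cpp"

assert pick_rmw(vendor_contract=True) == "rmw_fastrtps_cpp"
assert pick_rmw(wan_or_multisite=True).endswith("zenoh-bridge")
```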

5. Python 3.12 Compatibility Check

Ubuntu 24.04 ships Python 3.12; Humble had 3.10. Common incompatibilities:

# ⚠️ Legacy in Python 3.12: typing.Union still works, but PEP 604 | is preferred
from typing import Union
def foo(x: Union[int, str]) -> None: ...

# ✅ Preferred: the | operator (Python 3.10+)
def foo(x: int | str) -> None: ...

# ❌ Broken: distutils (removed in 3.12)
from distutils.core import setup  # GONE

# ✅ Fixed: Use setuptools or importlib.metadata

Scan your Python packages:

grep -r "from distutils" src/
grep -r "Union\[" src/ --include="*.py"
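grep catches most cases, but a scan built on the ast module avoids false positives in comments and strings. This sketch flags the two patterns shown above:

```python
# AST-based scan for the two migration patterns above: distutils
# imports (removed in 3.12) and typing.Union subscripts (prefer X | Y).
import ast

def scan_source(source):
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ImportFrom) and node.module and \
                node.module.split(".")[0] == "distutils":
            findings.append(f"line {node.lineno}: distutils import (removed in 3.12)")
        elif isinstance(node, ast.Subscript):
            base = node.value
            name = (base.id if isinstance(base, ast.Name)
                    else base.attr if isinstance(base, ast.Attribute) else "")
            if name == "Union":
                findings.append(f"line {node.lineno}: Union[...] (prefer X | Y)")
    return findings

code = ("from distutils.core import setup\n"
        "from typing import Union\n"
        "def f(x: Union[int, str]): ...\n")
assert len(scan_source(code)) == 2
```

Run it over every .py file in src/ as a CI step; `import distutils` (the bare form) would need one more branch.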

6. Message & Service Definition Compatibility

Humble: IDL parser was stricter about nested message syntax.

Jazzy: More permissive, but watch for:
– Array bounds now enforced (e.g., publishing more than 10 elements to an int32[<=10] field is now an error).
– String encoding defaults changed from ISO-8859-1 to UTF-8 (almost always a win, but test binary protocols).

Run a quick check:

colcon build --packages-select your_interfaces --event-handlers console_direct+ 2>&1 | grep -iE "error|deprecat"

7. Hardware Driver & Ubuntu 24.04 Availability

Some sensor drivers (Velodyne, Intel RealSense, FLIR) are slow to test on Ubuntu 24.04. Verify with the vendor or community:

apt-cache search ros-jazzy | grep -i velodyne
# Or check index.ros.org / the buildfarm

If critical drivers are missing, plan a ~4-week buffer to either patch drivers locally or wait for vendor releases.


Step-by-Step Migration Workflow

Assuming your audit passed, here’s the hands-on workflow. Start in a dev container or staging VM, not on production robots.

Migration Workflow Swimlane

Phase 1: Environment Setup

Step 1: Spin up Ubuntu 24.04 container or image.

docker run -it --name ros2_jazzy_dev ubuntu:24.04 /bin/bash
# Or install Ubuntu 24.04 on a staging VM / RPI5 or Jetson Orin

Step 2: Bootstrap ROS 2 Jazzy.

# Add the ROS GPG key
sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -o /usr/share/keyrings/ros-archive-keyring.gpg

# Add the ROS 2 repository
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(. /etc/os-release && echo $UBUNTU_CODENAME) main" | sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null

sudo apt update
sudo apt install -y ros-jazzy-desktop

Step 3: Initialize your workspace.

mkdir -p ~/ros2_jazzy_ws/src
cd ~/ros2_jazzy_ws
source /opt/ros/jazzy/setup.bash

Phase 2: Import & Resolve Dependencies

Step 4: Import your VCS repositories.

Use the .repos file you captured earlier:

cd ~/ros2_jazzy_ws/src
vcs import < path/to/workspace.repos
# Or git clone each of your custom repos manually

Step 5: Run rosdep and lock down dependencies.

rosdep install -i --from-paths ~/ros2_jazzy_ws/src --rosdistro jazzy

If there are unresolved keys, do not force --skip-keys on a whim. Each unresolved key is a potential runtime failure.

Phase 3: Build & Test API Changes

Step 6: Attempt a colcon build.

cd ~/ros2_jazzy_ws
colcon build --symlink-install

Expected breakages:
– rclcpp::logging::get_logger() signature changed → use rclcpp::get_logger("node_name") instead.
– rclcpp::spin_some() no longer accepts rate-limiting; wrap in a loop or use rclcpp::executors::MultiThreadedExecutor with period.
– rclcpp_action::GoalHandle::execute() callback signature tightened; ensure std::shared_ptr<> are explicitly captured.

Action: Check the official Jazzy migration guide for your packages. Many are one-line regex fixes.

Step 7: Rewrite launch files.

Validate your .launch.xml and .launch.py files:

ros2 launch my_package my_robot.launch.xml --print 2>&1 | grep -i deprecated

Common fixes:
– <arg name="x" default="5" /> → <let name="x" value="5" />
– Remap tags must live at node or topic level, not argument level.

Phase 4: Regression Testing & Bag Replay

Step 8: Run your test suite against Jazzy builds.

colcon test --packages-select my_package

Critical: If you have bag files from Humble production runs, replay them in Jazzy and compare outputs:

ros2 bag play humble_production_run.bag &
ros2 launch my_package offline_processing.launch.py

# Validate outputs against golden data
diff golden_output.csv current_output.csv

This catches subtle message serialization or TF tree changes.
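A byte-exact diff is often too strict for replayed pipelines: floating-point jitter and timestamp fields produce harmless differences. A tolerance-aware comparison like this sketch (the tolerance value is an assumption; tune per topic) catches real regressions without false alarms:

```python
# Tolerance-aware golden-data comparison: numeric fields must agree
# within rel_tol; non-numeric fields must match exactly.
import csv, io, math

def compare_csv(golden_text, current_text, rel_tol=1e-6):
    mismatches = []
    golden = list(csv.reader(io.StringIO(golden_text)))
    current = list(csv.reader(io.StringIO(current_text)))
    if len(golden) != len(current):
        return [f"row count {len(golden)} != {len(current)}"]
    for i, (grow, crow) in enumerate(zip(golden, current)):
        for j, (g, c) in enumerate(zip(grow, crow)):
            try:
                ok = math.isclose(float(g), float(c), rel_tol=rel_tol)
            except ValueError:
                ok = (g == c)            # header / string column
            if not ok:
                mismatches.append(f"row {i}, col {j}: {g!r} != {c!r}")
    return mismatches

golden = "x,y\n1.000000,2.0\n"
current = "x,y\n1.0000001,2.0\n"         # float jitter within tolerance
assert compare_csv(golden, current) == []
assert compare_csv(golden, "x,y\n1.5,2.0\n") != []
```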

Phase 5: Canary Deployment

Step 9: Deploy to a pilot robot or simulator.

  • Build a Jazzy system image (if using fleet imaging).
  • SSH into a test robot; update launch configurations; start nodes.
  • Run real workloads (navigation, manipulation, sensor fusion) for 24–48 hours.
  • Monitor logs for crashes, latency spikes, or DDS connectivity issues.

# On the test robot
source /opt/ros/jazzy/setup.bash
cd ~/ros2_jazzy_ws
colcon build --symlink-install
ros2 launch my_fleet my_robot.launch.py 2>&1 | tee ~/jazzy_canary_$(date +%s).log

DDS Considerations: Cyclone, Fast, and Zenoh-Bridge

One of Jazzy’s biggest shifts is the default DDS middleware switch from FastDDS to CycloneDDS. This isn’t just a name change—it affects multicast, latency, and fleet deployments across your entire system.

DDS Choice Decision Tree

CycloneDDS: The New Default

Why Jazzy switched:
– Lower latency on real-time paths (microsecond-scale on loopback, sub-millisecond on LAN).
– Lower CPU overhead for multi-publisher scenarios (key for 10-50 node fleets).
– Better IPv6 and multicast handling (important for Kubernetes deployments).
– Eclipse IoT governance (permissive license, vendor-neutral).

Trade-offs:
– Smaller ecosystem than FastDDS (fewer commercial plugins).
– Less mature in non-LAN environments (Cyclone's multicast discovery generally won't cross routed networks; use Zenoh-bridge instead).

Migration: If you’re not relying on FastDDS-specific features, the switch is mostly transparent. Set:

export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp

FastDDS: Still Supported (But Plan Ahead)

If you have deep FastDDS integrations (custom QoS policies, monitoring dashboards), stay on FastDDS in Jazzy:

export RMW_IMPLEMENTATION=rmw_fastrtps_cpp

Note: FastDDS is maintained by eProsima as a separate product. For Jazzy LTS support, verify that eProsima commits to FastDDS updates through May 2029.

Zenoh-Bridge for ROS 2: Crossing WAN/Air Gaps

Emerging pattern: Zenoh is a lightweight pub/sub broker that sits between ROS 2 and the network. It excels at:
– Cross-WAN: Tunneling ROS messages over lossy or high-latency links (satellite uplinks, 4G fleets).
– Switching between DDS variants: One robot on Cyclone, another on FastDDS—Zenoh bridges them transparently.
– Fog computing: Edge nodes filter and re-publish to cloud without full ROS 2 install.

For single-site, LAN-only deployments, Zenoh adds latency and complexity. For distributed fleets (e.g., swarm robotics, federated research labs), it’s increasingly valuable.

DDS Profile XML Example (CycloneDDS)

Instead of shell env-var scripts, Jazzy’s profile system lets you commit QoS as code:

<?xml version="1.0"?>
<!-- /etc/ros2/cyclone_profile.xml -->
<CycloneDDS>
  <Domain>
    <General>
      <NetworkInterfaceAddress>eth0</NetworkInterfaceAddress>
      <AllowMulticast>true</AllowMulticast>
      <MaxMessageSize>65536</MaxMessageSize>
    </General>
    <Discovery>
      <ParticipantIndex>1</ParticipantIndex>
      <MaxAutoParticipantIndex>10</MaxAutoParticipantIndex>
    </Discovery>
  </Domain>
</CycloneDDS>

Then in your launch file:

from launch import LaunchDescription
from launch.actions import SetEnvironmentVariable

def generate_launch_description():
    return LaunchDescription([
        SetEnvironmentVariable('CYCLONEDDS_URI', 'file:///etc/ros2/cyclone_profile.xml'),
    ])

Trade-offs and Common Migration Pitfalls

Ubuntu 24.04 Driver Gaps

Not every sensor or FPGA card has been certified on Ubuntu 24.04. Check before production:

  • Velodyne LiDAR: Updated drivers exist, but older firmware may timeout.
  • Intel RealSense: Full Ubuntu 24.04 support as of firmware 5.15.x (released Q1 2025).
  • NVIDIA Jetson: Ubuntu 24.04 support comes with JetPack 6.x (released Q2 2025); if you’re on JetPack 5.x, staying on Ubuntu 22.04 + Humble is viable until late 2026.

Remediation: Test every hardware interface in a staging environment. Don’t assume a driver “should” work.

Third-Party Package Lag

Some packages you depend on (e.g., a niche CV library, a research framework) may not have pre-built Jazzy binaries for weeks or months. Your options:

  1. Wait (if non-critical).
  2. Build from source in your CI (adds 5–10 min per build).
  3. Submit a PR to the maintainer with Jazzy support.
  4. Vendor locally (copy source into your workspace; not ideal, but unblocks migration).

CycloneDDS vs. FastDDS QoS Divergences

If you switch to Cyclone, re-test your QoS assumptions. Example divergence:

Humble + FastDDS:

from rclpy.duration import Duration
from rclpy.qos import QoSProfile, ReliabilityPolicy, DurabilityPolicy

qos = QoSProfile(
    depth=10,
    reliability=ReliabilityPolicy.RELIABLE,
    durability=DurabilityPolicy.VOLATILE,
    deadline=Duration(seconds=1)
)

Jazzy + CycloneDDS:
The same profile may behave differently under loss or clock skew because Cyclone’s deadline-driven handling is stricter. Test under adversarial conditions (dropped packets, clock jitter) before fleet deployment.
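One way to make "test under adversarial conditions" concrete is an offline deadline audit over receive timestamps extracted from a test run. The rates and thresholds below are illustrative:

```python
# Offline deadline audit: count inter-arrival gaps exceeding the QoS
# deadline, given receive timestamps (e.g. extracted from a test bag).
def deadline_misses(timestamps, deadline_s):
    gaps = (b - a for a, b in zip(timestamps, timestamps[1:]))
    return sum(1 for g in gaps if g > deadline_s)

# Nominal 10 Hz stream with one dropped sample -> one 0.2 s gap.
ts = [0.0, 0.1, 0.2, 0.4, 0.5]
assert deadline_misses(ts, deadline_s=0.15) == 1
```

Replay the same timestamp trace with injected drops and clock jitter for both middlewares, and compare miss counts before committing the fleet.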

Conda/Mamba Incompatibilities

If you manage your ROS 2 environment via Conda (popular in academic labs), watch for:

  • PEP 668: Python 3.12 on Ubuntu 24.04 marks system Python as externally managed. Conda may refuse to install packages over system ROS.
  • Workaround: Use a containerized build (Docker) or a dedicated Python 3.12 venv for Jazzy; don’t mix Conda and system Python.

ament_python & setuptools Changes

Humble used ament_cmake_python for mixed C++/Python packages. Jazzy codified ament_python as the default.

If you have a custom setup.py: It must now list all dependencies in setup.cfg or pyproject.toml. The old pattern of reading from package.xml still works, but it’s being phased out. Update it:

# pyproject.toml
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "my_package"
version = "1.0.0"
dependencies = [
    "rclpy",
    "geometry-msgs",
]

Practical Recommendations + Fleet Rollout Plan

You’ve passed the audit, built a successful Jazzy image, and validated on a pilot robot. Now scale to your fleet confidently with phased stages.

Fleet Rollout Phases

Phased Rollout Timeline (Q2–Q4 2026)

Phase 1: Development & CI (Q2 2026, now through June)
– Jazzy in your staging CI/CD pipeline.
– Parallel Humble + Jazzy builds for 6–8 weeks (safety net if blockers emerge).
– Target: All internal packages passing tests on Jazzy.

Phase 2: Single Pilot Robot (Q2 2026, June)
– Deploy Jazzy system image to 1 production robot.
– Run real workloads (navigation, task execution) for 2–4 weeks.
– Monitor: latency, DDS packet loss, CPU/memory spikes, driver crashes.
– Log everything; create a “lessons learned” doc.

Phase 3: Canary Fleet (5%) (Q3 2026, July–August)
– Roll out to ~5% of your fleet (e.g., 5 robots if you have 100).
– Randomize canary selection to catch environment-specific bugs.
– Set up automated telemetry dashboards; alert on anomalies.
– Target: 2–4 week canary window with zero critical incidents.

Phase 4: Progressive Rollout (50% → 100%) (Q3–Q4 2026, September–October)
– Week 1: 50% of fleet.
– Week 2: 75% of fleet.
– Week 3: 100% of fleet.
– Maintain Humble images for emergency rollback (keep 2–3 robot units on Humble for 4 weeks post-migration).
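The wave sizes above can be generated mechanically, with randomized but reproducible robot selection as recommended for the canary phase. Robot IDs, percentages, and the seed are placeholders:

```python
# Wave planner for the phased rollout: cumulative 5% / 50% / 75% / 100%.
# Robot IDs and the seed are placeholders; shuffling catches
# environment-specific bugs, as recommended for the canary phase.
import random

def plan_waves(robot_ids, percents=(5, 50, 75, 100), seed=42):
    shuffled = list(robot_ids)
    random.Random(seed).shuffle(shuffled)     # reproducible selection
    waves, done = [], 0
    for pct in percents:
        target = round(len(shuffled) * pct / 100)
        waves.append(shuffled[done:target])
        done = target
    return waves

waves = plan_waves([f"robot-{i:03d}" for i in range(100)])
assert [len(w) for w in waves] == [5, 45, 25, 25]   # cumulative 5/50/75/100
```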

Phase 5: Decommission Humble Images (Q4 2026, November)
– Archive Humble system images; delete from CI/CD.
– Update runbooks and documentation.
– Close out Humble-related tickets.

Rollout Checklist

Before Phase 2 (pilot), confirm:

  • [ ] Audit checklist: 100% pass.
  • [ ] Colcon build: Clean build, no warnings.
  • [ ] Test suite: All tests passing.
  • [ ] Bag replay: Golden-data comparison within tolerance.
  • [ ] Canary robot: 48h continuous operation, zero crashes, latency < baseline + 10%.
  • [ ] Rollback plan: 1-click restore of Humble image, tested.
  • [ ] Team training: All ops and dev staff walked through Jazzy launch/debug workflow.
  • [ ] Monitoring: Dashboards set up; alerts configured.
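The pilot gate is easy to automate before Phase 2 sign-off; this sketch encodes the latency, crash, and soak thresholds from the checklist above:

```python
# Automated pilot gate: latency within baseline + 10%, zero crashes,
# and at least 48 h of continuous soak (thresholds from the checklist).
def canary_gate(p99_latency_ms, baseline_ms, crashes, soak_hours):
    return (p99_latency_ms <= baseline_ms * 1.10
            and crashes == 0
            and soak_hours >= 48)

assert canary_gate(p99_latency_ms=52.0, baseline_ms=50.0,
                   crashes=0, soak_hours=48)
assert not canary_gate(p99_latency_ms=60.0, baseline_ms=50.0,
                       crashes=0, soak_hours=72)
```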

Frequently Asked Questions

Q: When does ROS 2 Humble reach EOL?

A: May 2027 is the hard EOL date. Security patches will end; the distro will be effectively unmaintained. If you’re shipping robots after May 2027, they must be on Jazzy or later. Plan migration to complete by Q4 2026 at the latest.

Q: Is Jazzy LTS?

A: Yes. ROS 2 Jazzy Jalisco is an LTS release supported until May 2029. It’s the recommended long-term platform for production fleets until mid-2029.

Q: Should I skip Iron?

A: Almost certainly yes. Iron (May 2023 → Nov 2024) was the intermediate non-LTS release used to stress-test new APIs. It’s already EOL. Go straight from Humble to Jazzy; many teams do, and it’s the endorsed path.

Q: Which DDS should I pick—Cyclone, Fast, or Zenoh?

A: Default: Try CycloneDDS (Jazzy’s default). It’s faster and lighter for typical LAN deployments. Vendor support: If you have a FastDDS contract or deep FastDDS integrations, stay on FastDDS—still supported in Jazzy. WAN/Multi-site: If you’re bridging robots across WANs or cloud, use Zenoh-bridge; it adds a layer but decouples DDS variant choice.

Q: How long does migration take?

A: Audit: 1–2 days. Dev build + validation: 3–5 days. Canary robot: 2–4 weeks. Full fleet rollout: 4–6 weeks. Total: 2–3 months for a fleet of 10–100 robots, assuming no critical blockers (missing packages, driver failures). Budget 4–5 months if you have vendor dependencies or custom middleware.


Further Reading

Internal resources:
ROS 2 Navigation & Autonomous Mobile Robots: Warehouse Navigation with Nav2
Humanoid Robot Benchmark 2026: Figure, Optimus, Unitree, Digit
DDS (Data Distribution Service) Protocol: Complete Guide for IoT & Robotics

Official sources:
ROS 2 Jazzy Jalisco Release Notes — authoritative API changes and migration steps.
ROS 2 Iron → Jazzy Migration Guide — detailed walkthrough from the ROS 2 team.


Author: Riju | Published: 2026-04-27 | Read time: ~14 minutes
