# DexROSBridge
A Python bridge for republishing sensor data between Zenoh and ROS2 networks when using Dexmate Robots.
## Installation
### Prerequisites
- Ubuntu 22.04 or 24.04
- Python 3.10+
- ROS2 (Humble or Jazzy)
- [dexcontrol](https://github.com/dexmate-ai/dexcontrol) library installed and configured
**Deployment Flexibility:**
This package can run on either:
- **Local development machine** (recommended for development and testing)
- **Jetson onboard computer**
As long as `dexcontrol` can connect to the robot and Zenoh network, you can run this ROS bridge and other ROS applications on your local dev machine instead of the robot's Jetson.
**Note:**
We recommend using RoboStack and micromamba for the ROS2 and Python environments instead of the system-level ROS2 (at `/opt/ros`). [Please follow the instructions here](https://robostack.github.io/GettingStarted.html) to create a micromamba environment with ROS2 inside.
If you have ROS2 installed at the system level (i.e., at `/opt/ros/{ROS_DISTRO}`), please DO NOT source it before activating the micromamba environment, to avoid environment conflicts. For your system ROS2 and the mamba ROS2 to communicate successfully:
1. Use the same ROS2 distribution on both sides (e.g., communication between Jazzy and Humble is problematic).
2. Use the same `RMW_IMPLEMENTATION` (by default it is the same).
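As a quick sanity check (a minimal sketch, not part of this package), you can print the relevant variables in each environment and confirm they match:
```python
# quick_env_check.py -- run in both environments and compare the output
import os

print("ROS_DISTRO:        ", os.environ.get("ROS_DISTRO", "<not set>"))
# An unset RMW_IMPLEMENTATION means the distro's default middleware is used
print("RMW_IMPLEMENTATION:", os.environ.get("RMW_IMPLEMENTATION", "<distro default>"))
```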
### Install from PyPI or source
```bash
# Activate your micromamba environment with ROS2 installed first
# micromamba activate ros_env
pip install dexcontrol_rosbridge
# Or install from source code below
# git clone https://github.com/dexmate-ai/dexcontrol-rosbridge.git
# pip install -e ./dexcontrol-rosbridge/
```
## Usage
### Re-publish sensor streams from Zenoh topics
First, make sure you set the correct robot name; it is used as the prefix for Zenoh topics. You can find the robot name by running `echo $ROBOT_NAME` on the robot's onboard computer. Also ensure that the corresponding sensors (e.g., `head_camera` or `lidar`) have been started with `dexsensor launch --sensor`.
```bash
export ROBOT_NAME=[...]
```
The repository provides a unified script that handles sensor types through command-line arguments:
```bash
# Publish head cameras and IMU
python scripts/republish_sensors.py --sensors head
# Publish LIDAR only
python scripts/republish_sensors.py --sensors lidar
# Publish LIDAR and chassis IMU together
python scripts/republish_sensors.py --sensors lidar chassis_imu
```
**Available sensor groups:**
- `head` - Head cameras (left/right RGB) + Head IMU
- `lidar` - Base LIDAR sensor (PointCloud2)
- `chassis_imu` - Chassis IMU sensor (published only when explicitly requested)
- `wrist` - Wrist cameras (left/right RGB for manipulation, if present)
- `all` - Enable all available sensors
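To check that data is flowing, you can subscribe to one of the republished topics from any node in the same ROS2 environment. The sketch below assumes the default head-camera topic from the Configuration section and that the bridge publishes `sensor_msgs/msg/Image` there; adjust the topic name and message type to your setup:
```python
# verify_bridge.py -- minimal rclpy subscriber to confirm republished images arrive
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class BridgeCheck(Node):
    def __init__(self):
        super().__init__("bridge_check")
        # Topic name taken from the example config below; change it if you remapped it
        self.create_subscription(Image, "/head_camera/left/image", self.on_image, 10)

    def on_image(self, msg: Image):
        self.get_logger().info(
            f"got {msg.width}x{msg.height} image, frame_id={msg.header.frame_id}"
        )


def main():
    rclpy.init()
    rclpy.spin(BridgeCheck())


if __name__ == "__main__":
    main()
```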
### Wheel odometry
Run:
```bash
python scripts/publish_wheel_odometry.py
```
This publishes wheel odometry computed with the differential-drive vehicle model of Vega. The published topics are:
```
/odom # the wheel odometry
/left_wheel_velocity
/right_wheel_velocity
```
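For intuition, this is roughly what a differential-drive odometry update looks like; the wheel radius and track width below are placeholder values, not Vega's actual parameters:
```python
# diff_drive_odom.py -- illustrative differential-drive odometry integration
import math

WHEEL_RADIUS = 0.1  # meters, placeholder value
TRACK_WIDTH = 0.5   # meters, distance between the wheels, placeholder value


def integrate(x, y, theta, w_left, w_right, dt):
    """Advance the planar pose given left/right wheel angular velocities [rad/s]."""
    v_left = w_left * WHEEL_RADIUS
    v_right = w_right * WHEEL_RADIUS
    v = 0.5 * (v_left + v_right)              # forward velocity of the base
    omega = (v_right - v_left) / TRACK_WIDTH  # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```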
## Configuration
Sensor configurations can be customized in `src/dexcontrol_rosbridge/sensor_configs.py`:
```python
def get_head_sensor_configs():
    camera_configs = [
        {
            "name": "head_left_rgb",
            "zenoh_topic": "camera/head/left_rgb",
            "ros2_topic": "/head_camera/left/image",
            "frame_id": "head_camera_left_link",
            "compressed": True,
            "compression_format": "jpeg",
            "queue_size": 2,  # Small queue for low latency
        },
        # ... more cameras
    ]
    return camera_configs, imu_configs  # imu_configs is defined similarly (omitted here)
```
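For example, to remap the left head camera's ROS2 topic and allow a slightly larger queue, you could edit that entry in place (the remapped topic and queue size below are illustrative, not defaults):
```python
# Illustrative override of the entry shown above
head_left_rgb_config = {
    "name": "head_left_rgb",
    "zenoh_topic": "camera/head/left_rgb",
    "ros2_topic": "/perception/head/left/image",  # remapped ROS2 topic (example)
    "frame_id": "head_camera_left_link",
    "compressed": True,
    "compression_format": "jpeg",
    "queue_size": 3,  # slightly larger queue tolerates brief processing hiccups
}
```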
## Architecture
### Multi-Process Design
Each sensor stream runs in a dedicated process:
```
Main Process
├── Camera Process 1 (+ Zenoh Session + ROS2 Node)
├── Camera Process 2 (+ Zenoh Session + ROS2 Node)
├── IMU Process (+ Zenoh Session + ROS2 Node)
└── LIDAR Process (+ Zenoh Session + ROS2 Node)
```
This is primarily for GIL isolation: each stream runs in its own Python interpreter, so heavy work in one stream (e.g., image decompression) cannot stall the others.
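A minimal sketch of this layout, assuming a hypothetical `run_bridge(config)` entry point that opens its own Zenoh session and ROS2 node inside each child process:
```python
# process_layout.py -- sketch of the one-process-per-stream fan-out (run_bridge is hypothetical)
import multiprocessing as mp


def run_bridge(config: dict) -> None:
    # In the real bridge, each child opens its own Zenoh session and ROS2 node here,
    # then loops: receive -> convert -> publish until terminated.
    ...


def main(stream_configs: list[dict]) -> None:
    procs = [mp.Process(target=run_bridge, args=(cfg,)) for cfg in stream_configs]
    for p in procs:
        p.start()
    for p in procs:
        p.join()  # the main process only supervises the children
```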
### Data Flow
```
Zenoh Network → Zenoh Handler → Bounded Queue → Consumer Thread → Converter → ROS2 Publisher
```
1. **Zenoh Handler**: Fast callback that enqueues samples (<0.1ms) without blocking
2. **Bounded Queue**: Small FIFO queue (size: 1-3) for maintaining temporal ordering
3. **Consumer Thread**: Single thread that processes samples sequentially (preserves order)
4. **Converter**: Decodes and converts data to ROS2 message format
5. **ROS2 Publisher**: Publishes to ROS2 topics with proper timestamps and frame IDs
**Congestion Handling:**
- When processing can't keep up with the incoming rate, the **oldest frames are dropped**
- Drop statistics are logged
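The queue behavior can be illustrated with Python's standard `queue` module (a sketch of the drop-oldest policy, not the bridge's actual internals):
```python
# drop_oldest_queue.py -- sketch of the bounded-queue congestion policy
import queue

q = queue.Queue(maxsize=2)
dropped = 0


def on_sample(payload: bytes) -> None:
    """Fast producer-side handler: never blocks; evicts the oldest item when full."""
    global dropped
    try:
        q.put_nowait(payload)
    except queue.Full:
        try:
            q.get_nowait()  # evict the oldest sample
            dropped += 1    # counted and periodically logged in the real bridge
        except queue.Empty:
            pass
        q.put_nowait(payload)


def consume_forever() -> None:
    """Consumer thread: processes samples strictly in arrival order."""
    while True:
        sample = q.get()
        # decode -> convert to a ROS2 message -> publish (omitted)
```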
## License
GNU Affero General Public License v3.0 (AGPL-3.0)
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.