# Gym SO-ARM
A gymnasium environment for SO-ARM101 single-arm manipulation based on gym-aloha, featuring multi-camera support and advanced simulation capabilities.

## Features
- **SO-ARM101 6DOF Robotic Arm**: Complete simulation of the SO-ARM101 robotic manipulator with white color scheme
- **Multi-Camera System**: Three camera views with runtime switching:
  - Overview camera: top-down perspective
  - Front camera: side view of the workspace
  - Wrist camera: first-person view from the robot's gripper
- **Interactive GUI Viewer**: OpenCV-based viewer with keyboard controls
- **Grid-Based Object Placement**: 3×3 grid system for randomized object positioning
- **MP4 Video Recording**: Automatic recording of camera observations to timestamped MP4 files
- **6DOF Joint Control**: Direct control of all 6 joints including gripper via action space
- **Gymnasium Compatible**: Full OpenAI Gym/Gymnasium interface compliance
- **MuJoCo Physics**: High-fidelity physics simulation using dm-control
## Installation
gym-soarm requires Python 3.10 or newer.
### From Source
```bash
# Clone the repository
git clone https://github.com/your-org/gym-soarm.git
cd gym-soarm
# Install in development mode
pip install -e .
# Or install with development dependencies
pip install -e ".[dev,test]"
```
### Using pip
```bash
pip install gym-soarm
```
## Quick Start
### Basic Usage
```python
import gymnasium as gym
import gym_soarm
# Create environment with human rendering and camera configuration
env = gym.make('SoArm-v0', render_mode='human', obs_type='pixels_agent_pos', camera_config='front_wrist')
# Reset environment with specific cube position
obs, info = env.reset(options={'cube_grid_position': 4})
# The environment automatically records MP4 videos when using example.py
# Access joint positions and camera images
print(f"Joint positions: {obs['agent_pos']}") # 6 joint values including gripper
print(f"Available cameras: {list(obs['pixels'].keys())}") # front_camera, wrist_camera
# Run simulation with 6DOF joint control
for _ in range(200):
    action = env.action_space.sample()  # 6D action: [shoulder_pan, shoulder_lift, elbow_flex, wrist_flex, wrist_roll, gripper]
    obs, reward, terminated, truncated, info = env.step(action)

    if terminated or truncated:
        obs, info = env.reset()
env.close()
```
### Interactive Joint Control
For real-time joint manipulation using sliders, use the interactive control sample:
```bash
# Run the slider control sample
python examples/slider_control_final.py
```
**Features:**
- **Real-time Control**: Use trackbars to control each of the 6 joints (shoulder_pan, shoulder_lift, elbow_flex, wrist_flex, wrist_roll, gripper)
- **Visual Feedback**: Live display of joint angles in radians
- **Reset Functionality**: "Reset" trackbar (set it to 1) to return the robot to its initial position (all joints at 0.0 rad)
- **Keyboard Controls**:
  - **SPACE**: Step simulation forward
  - **ESC**: Exit application
  - **R**: Quick reset shortcut
**Usage Instructions:**
1. Adjust joint angles using the trackbars at the top of the control window
2. Press SPACE to step the simulation and see the robot move
3. Use the "Reset" trackbar (set to 1) to reset the environment
4. Press ESC to exit the application
This sample is perfect for:
- Understanding joint limits and robot kinematics
- Manual robot positioning and pose testing
- Interactive exploration of the workspace
- Educational demonstrations of robotic arm control
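OpenCV trackbars only report integer positions, so the slider sample has to map those positions onto joint angles in radians. A minimal sketch of that mapping — the function names, slider resolution, and joint limits here are illustrative assumptions, not the sample's actual code:

```python
def slider_to_radians(value: int, low: float, high: float, steps: int = 1000) -> float:
    """Map an integer trackbar position in [0, steps] to a joint angle in [low, high]."""
    value = max(0, min(steps, value))  # clamp defensively; trackbars stay in range anyway
    return low + (high - low) * value / steps

def radians_to_slider(angle: float, low: float, high: float, steps: int = 1000) -> int:
    """Inverse mapping, e.g. for initialising a trackbar at 0.0 rad."""
    return round((angle - low) / (high - low) * steps)
```

With a joint limited to ±1.8 rad, the slider midpoint (500 on a 0–1000 trackbar) maps back to exactly 0.0 rad, which matches the sample's all-zeros reset pose.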
### Grid Position Control
You can specify the initial position of the blue cube using a 3×3 grid system:
```python
import gymnasium as gym
import gym_soarm
env = gym.make('SoArm-v0', render_mode='human')
# Place cube at specific grid position (0-8)
obs, info = env.reset(options={'cube_grid_position': 4}) # Center position
# Use random position (default behavior)
obs, info = env.reset(options={'cube_grid_position': None})
```
**Grid Layout (positions 0-8):**
```
0: (-10cm, -7.5cm)   1: (-10cm,    0cm)   2: (-10cm, +7.5cm)
3: (  0cm, -7.5cm)   4: (  0cm,    0cm)   5: (  0cm, +7.5cm)
6: (+10cm, -7.5cm)   7: (+10cm,    0cm)   8: (+10cm, +7.5cm)
```
The cube will be placed at the specified grid position with a random rotation (0°, 30°, 45°, or 60°).
### Camera Configuration
You can configure which cameras are included in observations to optimize performance and focus on relevant viewpoints:
```python
import gymnasium as gym
import gym_soarm
# Front camera only (minimal, fastest)
env = gym.make('SoArm-v0', obs_type='pixels', camera_config='front_only')
# Front and wrist cameras (default, balanced)
env = gym.make('SoArm-v0', obs_type='pixels', camera_config='front_wrist')
# All cameras (comprehensive, slower)
env = gym.make('SoArm-v0', obs_type='pixels', camera_config='all')
obs, info = env.reset()
print(f"Available cameras: {list(obs.keys())}")
```
**Camera Configuration Options:**
- `front_only`: Only front camera (side view) - fastest, minimal observations
- `front_wrist`: Front camera + wrist camera (first-person view) - balanced performance
- `all`: All three cameras (overview + front + wrist) - comprehensive but slower
**Observation Structure by Configuration:**
```python
# front_only
obs = {
    'front_camera': np.ndarray(shape=(480, 640, 3))
}

# front_wrist
obs = {
    'front_camera': np.ndarray(shape=(480, 640, 3)),
    'wrist_camera': np.ndarray(shape=(480, 640, 3))
}

# all
obs = {
    'overview_camera': np.ndarray(shape=(480, 640, 3)),
    'front_camera': np.ndarray(shape=(480, 640, 3)),
    'wrist_camera': np.ndarray(shape=(480, 640, 3))
}
```
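Downstream code can sanity-check that an observation actually carries the cameras its configuration promises. A minimal helper based on the key sets above — the function itself is illustrative, not part of the package:

```python
# Camera keys per configuration, taken from the observation structures above.
EXPECTED_CAMERAS = {
    "front_only": {"front_camera"},
    "front_wrist": {"front_camera", "wrist_camera"},
    "all": {"overview_camera", "front_camera", "wrist_camera"},
}

def check_cameras(obs: dict, camera_config: str) -> bool:
    """True if the observation dict holds exactly the cameras that config promises."""
    return set(obs) == EXPECTED_CAMERAS[camera_config]
```

This catches, for instance, a pipeline trained on `front_wrist` being fed observations from a `front_only` environment.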
### MP4 Video Recording
The `example.py` script automatically records camera observations to MP4 videos:
```python
import gymnasium as gym
import gym_soarm
# Run the example script with video recording
env = gym.make('SoArm-v0', render_mode='human', obs_type='pixels_agent_pos', camera_config='front_wrist')
# Videos are automatically saved to videos/ directory with timestamps
# - front_camera_20250729_143022.mp4
# - wrist_camera_20250729_143022.mp4
# Manual video recording can be implemented using:
frames_storage = {}
obs, info = env.reset()
# Store frames from each camera
if "pixels" in obs:
    for camera_name, frame in obs['pixels'].items():
        if camera_name not in frames_storage:
            frames_storage[camera_name] = []
        frames_storage[camera_name].append(frame.copy())
# Use save_frames_to_mp4() function from example.py to save videos
```
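The storage pattern above can be factored into two small helpers; stacking each camera's frame list into a single `(T, H, W, 3)` array gives the layout most video writers expect. This is a sketch with our own helper names, not code from `example.py`:

```python
import numpy as np

def collect_frames(obs: dict, storage: dict) -> None:
    """Append a copy of every camera frame in one observation to per-camera lists."""
    for name, frame in obs.get("pixels", {}).items():
        storage.setdefault(name, []).append(np.asarray(frame).copy())

def stack_frames(storage: dict) -> dict:
    """Stack each camera's frame list into one (T, H, W, 3) array."""
    return {name: np.stack(frames) for name, frames in storage.items()}
```

After an episode, each stacked array can be handed to `save_frames_to_mp4()` from `example.py` (or any MP4 writer) one camera at a time.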
### Camera Switching
During simulation with `render_mode='human'`, use these keyboard controls:
- **'1'**: Switch to overview camera
- **'2'**: Switch to front camera
- **'3'**: Switch to wrist camera
- **'q'**: Quit simulation
## Environment Details
### Observation Space
The environment provides rich observations including:
- **Robot Joint Positions**: 6DOF joint positions including gripper (6-dimensional)
- **Camera Images**: RGB images from configured cameras (480×640×3 each)
- **Object Information**: Positions and orientations of manipulated objects
```python
# For obs_type='pixels_agent_pos'
import gymnasium as gym
import numpy as np

obs_space = gym.spaces.Dict({
    'agent_pos': gym.spaces.Box(-np.inf, np.inf, shape=(6,), dtype=np.float64),  # Joint positions
    'pixels': gym.spaces.Dict({
        'front_camera': gym.spaces.Box(0, 255, shape=(480, 640, 3), dtype=np.uint8),
        'wrist_camera': gym.spaces.Box(0, 255, shape=(480, 640, 3), dtype=np.uint8),
    })
})
```
### Action Space
6DOF joint position control for the SO-ARM101:
- **Dimensions**: 6 (shoulder_pan, shoulder_lift, elbow_flex, wrist_flex, wrist_roll, gripper)
- **Range**: Joint-specific limits based on hardware specifications
- **Control**: Direct joint position targets
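Since targets are direct joint positions, it is worth clamping them to the limits before calling `env.step()`. The numbers below are placeholder limits for illustration only — read the real ones from `env.action_space.low` / `env.action_space.high`:

```python
import numpy as np

# Placeholder limits in radians, NOT the real SO-ARM101 specification; order:
# shoulder_pan, shoulder_lift, elbow_flex, wrist_flex, wrist_roll, gripper.
LOW = np.array([-1.9, -1.7, -1.7, -1.8, -2.8, -0.2])
HIGH = np.array([1.9, 1.7, 1.7, 1.8, 2.8, 1.5])

def clip_action(action) -> np.ndarray:
    """Clamp a 6-D joint-position target element-wise to [LOW, HIGH]."""
    return np.clip(np.asarray(action, dtype=np.float64), LOW, HIGH)
```

Out-of-range commands from a policy then saturate at the limits instead of triggering unreachable targets in the controller.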
### Workspace Configuration
- **Table Size**: 64cm × 45cm
- **Object Grid**: 3×3 positioning system with ±10 cm (X) and ±7.5 cm (Y) spacing
- **Cube Size**: 3cm × 3cm × 3cm blue cubes
- **Robot Base**: Positioned at (0, 0.15, 0) with 90° rotation
- **Robot Color**: White color scheme with black servo motors for visual clarity
### Camera Specifications
| Camera | Position | Orientation | FOV | Description |
|--------|----------|-------------|-----|-------------|
| Overview | (0, 0.4, 0.8) | Top-down | 90° | Bird's eye view |
| Front | (0, 0.7, 0.25) | Angled forward | 120° | Side perspective |
| Wrist | (0, -0.04, 0) | 30° X-rotation | 110° | First-person view |
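For converting pixel coordinates into viewing rays, each FOV in the table translates to a pinhole focal length. A sketch assuming the listed FOV is horizontal — note that MuJoCo's camera `fovy` is the *vertical* field of view, so check the model XML before relying on this:

```python
import math

def focal_length_px(fov_deg: float, width_px: int = 640) -> float:
    """Pinhole focal length in pixels from a field of view spanning width_px pixels."""
    return (width_px / 2) / math.tan(math.radians(fov_deg) / 2)
```

With the table's values, the 90° overview camera works out to 320 px, while the wider 120° front camera gives a shorter focal length of roughly 185 px.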
## Development
### Project Structure
```
gym-soarm/
├── gym_soarm/                    # Main package
│   ├── __init__.py               # Package initialization
│   ├── env.py                    # Main environment class
│   ├── constants.py              # Environment constants
│   ├── assets/                   # Robot models and scenes
│   │   ├── so101_new_calib.xml   # SO-ARM101 robot model (white color)
│   │   ├── so_arm_main_new.xml   # Scene with table and objects
│   │   └── assets/               # STL mesh files
│   └── tasks/                    # Task implementations
│       ├── __init__.py
│       └── sim.py                # Manipulation tasks
├── examples/                     # Example scripts and demonstrations
│   ├── example.py                # Basic usage with MP4 recording
│   └── slider_control_final.py   # Interactive joint control with sliders
├── videos/                       # Auto-generated MP4 video outputs
├── setup.py                      # Package setup
├── pyproject.toml                # Poetry configuration
└── README.md                     # This file
```
### Running Tests
```bash
# Install test dependencies
pip install -e ".[test]"
# Run comprehensive test suite
pytest tests/ -v
# Run specific test categories
pytest tests/test_e2e.py -v # End-to-end tests
pytest tests/test_camera_config.py -v # Camera configuration tests
# Run basic functionality test
python examples/example.py
# Test interactive joint control
python examples/slider_control_final.py
# Test camera configuration features
python test_camera_features.py
```
### Code Style
The project uses Ruff for linting and formatting:
```bash
# Install development dependencies
pip install -e ".[dev]"
# Run linting
ruff check gym_soarm/
# Auto-format code
ruff format gym_soarm/
```
## System Requirements
- **Python**: ≥3.10
- **OpenGL**: Required for rendering
- **Memory**: ≥4GB RAM recommended
- **Storage**: ~500MB for assets and dependencies
## Troubleshooting
### Common Issues
1. **MuJoCo Installation**: Ensure MuJoCo ≥2.3.7 is properly installed
2. **OpenGL Context**: On headless systems, use `xvfb-run` for rendering
3. **Asset Loading**: Verify all `.stl` files are present in `assets/assets/`
### Platform-Specific Notes
- **macOS**: May require XQuartz for OpenGL support
- **Linux**: Ensure proper GPU drivers for hardware acceleration
- **Windows**: Use WSL2 for best compatibility
## Citation
If you use this environment in your research, please cite:
```bibtex
@software{gym_soarm,
  title={Gym SO-ARM: A Gymnasium Environment for SO-ARM101 Manipulation},
  author={SO-ARM Development Team},
  version={0.1.0},
  year={2024},
  url={https://github.com/your-org/gym-soarm}
}
```
## License
Apache 2.0 License - see [LICENSE](LICENSE) file for details.
## Contributing
Contributions are welcome! Please read our contributing guidelines and submit pull requests to our GitHub repository.
## Support
For questions and support:
- GitHub Issues: [Report bugs or request features](https://github.com/your-org/gym-soarm/issues)
- Discussions: [Community discussions](https://github.com/your-org/gym-soarm/discussions)