nao-bridge-client

Name: nao-bridge-client
Version: 0.1.5
Summary: Python client library for NAO Bridge HTTP API
Author: Dave Snowdon
Requires Python: >=3.8
Keywords: aldebaran, api-client, nao, robot, robotics
Uploaded: 2025-07-26 18:32:33
# NAO Bridge Client

This package provides a Python 3 client for the NAO Bridge HTTP API.

## Installation

Install the client package from PyPI:

```bash
pip install nao-bridge-client
```

## Quick Start

```python
from nao_bridge_client import NAOBridgeClient

# Create client instance
client = NAOBridgeClient("http://localhost:3000")

# Get robot status
status = client.get_status()
print(f"Robot connected: {status.data.robot_connected}")

# Enable stiffness and stand up
client.enable_stiffness()
client.stand()

# Make robot speak
client.say("Hello, I am a NAO robot!")

# Control LEDs
client.set_leds(leds={"eyes": "blue"})

# Sit down and disable stiffness
client.sit()
client.disable_stiffness()

# Close the client
client.close()
```

## Using Context Manager

The client supports context manager usage for automatic cleanup:

```python
with NAOBridgeClient("http://localhost:3000") as client:
    status = client.get_status()
    print(f"Robot connected: {status.data.robot_connected}")
    # Client automatically closed when exiting the context
```

## Error Handling

The client provides proper error handling with custom exceptions:

```python
from nao_bridge_client import NAOBridgeClient, NAOBridgeError

client = NAOBridgeClient("http://localhost:3000")

try:
    status = client.get_status()
    print("Success!")
except NAOBridgeError as e:
    print(f"API Error: {e.message}")
    print(f"Error Code: {e.code}")
    if e.details:
        print(f"Details: {e.details}")
except Exception as e:
    print(f"Unexpected error: {e}")
```

## Available Methods

### Status and Information
- `get_status()` - Get robot and API status
- `get_operations()` - List active operations
- `get_operation(operation_id)` - Get status of specific operation
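
For example, a minimal sketch of polling operations. The fields on the returned data are not documented here, so the code just prints the parsed models, and the operation id is a placeholder:

```python
# Assumes `client` is an open NAOBridgeClient, as in the Quick Start
ops = client.get_operations()
print(ops.data)  # parsed list of active operations

# Replace the placeholder id with one returned by get_operations()
op = client.get_operation("00000000-0000-0000-0000-000000000000")
print(op.data)
```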

### Robot Control
- `enable_stiffness(duration=None)` - Enable robot stiffness
- `disable_stiffness()` - Disable robot stiffness
- `put_in_rest()` - Put robot in rest mode
- `wake_up()` - Wake up robot from rest mode
- `set_autonomous_life_state(state)` - Set autonomous life state
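
A short sketch of a wake/rest cycle; the stiffness duration unit and the autonomous life state name are assumptions and may differ on your server:

```python
client.wake_up()
client.enable_stiffness(duration=1.0)         # duration in seconds is an assumption
client.set_autonomous_life_state("disabled")  # "disabled" is an assumed state name

# ... move the robot ...

client.disable_stiffness()
client.put_in_rest()
```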

### Posture Control
- `stand(speed=None, variant=None)` - Move to standing position
- `sit(speed=None, variant=None)` - Move to sitting position
- `crouch(speed=None)` - Move to crouching position
- `lie(speed=None, position=None)` - Move to lying position
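
For example (the speed values and the lying position name are illustrative assumptions):

```python
client.stand(speed=0.6)
client.crouch(speed=0.5)
client.lie(speed=0.5, position="back")  # "back" is an assumed position value
client.sit(speed=0.6)
```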

### Movement Control
- `move_arms_preset(position=None, duration=None, arms=None, offset=None)` - Control arms
- `control_hands(left_hand=None, right_hand=None, duration=None)` - Control hands
- `move_head(yaw=None, pitch=None, duration=None)` - Control head positioning
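
A minimal sketch; the angle units, hand open/close values, and arm preset names are assumptions rather than documented values:

```python
client.move_head(yaw=0.3, pitch=-0.1, duration=1.0)                       # radians assumed
client.control_hands(left_hand="open", right_hand="close", duration=0.5)  # value format assumed
client.move_arms_preset(position="up", arms="both", duration=1.5)         # preset name assumed
```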

### Speech and LEDs
- `say(text, blocking=None, animated=None)` - Make robot speak
- `set_leds(leds=None, duration=None)` - Control LED colors
- `turn_off_leds()` - Turn off all LEDs
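
For example, building on the Quick Start (the LED group name and colour format follow the earlier example; animated speech depends on server support):

```python
client.say("Waving hello!", animated=True)
client.set_leds(leds={"eyes": "red"}, duration=2.0)
client.turn_off_leds()
```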

### Walking
- `start_walking(x=None, y=None, theta=None, speed=None)` - Start walking
- `stop_walking()` - Stop walking
- `walk_preset(action=None, duration=None, speed=None)` - Preset walking patterns
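
A sketch of a short walk; the x/speed values mirror the async example further below, and the preset action name is a placeholder:

```python
import time

client.start_walking(x=0.1, y=0.0, theta=0.0, speed=0.5)
time.sleep(3)  # walk for a few seconds
client.stop_walking()

client.walk_preset(action="forward", duration=2.0, speed=0.5)  # "forward" is an assumed action name
```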

### Sensors
- `get_sonar()` - Get sonar sensor readings
- `get_joint_angles(chain)` - Get joint angles for chain
- `get_joint_names(chain)` - Get joint names for a specified chain
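
For example (the chain name "LArm" is an assumption; use `get_joint_names()` to see what your server exposes):

```python
sonar = client.get_sonar()
print(f"Sonar left: {sonar.data.left}, right: {sonar.data.right}")

names = client.get_joint_names("LArm")    # "LArm" is an assumed chain name
angles = client.get_joint_angles("LArm")
print(names.data)
print(angles.data)
```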

### Vision and Camera
- `get_camera_image_json(camera, resolution)` - Get camera image as JSON with base64 data
- `get_camera_image_bytes(camera, resolution)` - Get camera image as raw JPEG bytes
- `get_camera_resolutions()` - Get available camera resolutions
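
A sketch of grabbing a frame; the camera name "top" and the resolution value are assumptions, so check `get_camera_resolutions()` first:

```python
print(client.get_camera_resolutions().data)

jpeg = client.get_camera_image_bytes("top", 2)  # camera/resolution values assumed
with open("camera.jpg", "wb") as f:
    f.write(jpeg)
```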

### Animations
- `execute_animation(animation, parameters=None)` - Execute predefined animations
- `get_animations()` - Get list of available animations
- `execute_sequence(sequence, blocking=None)` - Execute movement sequences
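
For example (the animation name is a placeholder; list the real names with `get_animations()`):

```python
print(client.get_animations().data)

client.execute_animation("wave")  # "wave" is a placeholder animation name
```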

### Behaviors
- `execute_behaviour(behaviour, blocking=None)` - Execute a behavior on the robot
- `get_behaviours(behaviour_type)` - Get list of behaviours by type
- `set_behaviour_default(behaviour, default=True)` - Set a behaviour as default
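
A sketch where both the behaviour type and the behaviour name are placeholders; query `get_behaviours()` for real values:

```python
print(client.get_behaviours("installed").data)  # "installed" is an assumed type

client.execute_behaviour("animations/Stand/Gestures/Hey_1", blocking=True)  # placeholder name
```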

### Configuration
- `set_duration(duration)` - Set global movement duration
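
For example (the unit is assumed to be seconds):

```python
client.set_duration(2.0)  # global default movement duration, assumed to be in seconds
```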

## Async Usage

The client also supports async operations:

```python
import asyncio
from nao_bridge_client import NAOBridgeClient

async def main():
    async with NAOBridgeClient("http://localhost:3000") as client:
        # Get status asynchronously
        status = await client.async_get_status()
        print(f"Robot connected: {status.data.robot_connected}")
        
        # Make robot speak asynchronously
        await client.async_say("Hello from async!")
        
        # Start walking asynchronously
        await client.async_start_walking(x=0.1, speed=0.5)
        
        # Get sensor data asynchronously
        sonar = await client.async_get_sonar()
        print(f"Sonar left: {sonar.data.left}, right: {sonar.data.right}")

# Run the async function
asyncio.run(main())
```

### Available Async Methods
- `async_get_status()` - Get robot status (async)
- `async_say(text, blocking=None, animated=None)` - Make robot speak (async)
- `async_start_walking(x=None, y=None, theta=None, speed=None)` - Start walking (async)
- `async_stop_walking()` - Stop walking (async)
- `async_move_head(yaw=None, pitch=None, duration=None)` - Move robot head (async)
- `async_get_sonar()` - Get sonar readings (async)
- `async_get_joint_angles(chain)` - Get joint angles for chain (async)
- `async_get_camera_image_json(camera, resolution)` - Get camera image as JSON (async)

## Data Models

The client uses Pydantic models for type-safe request and response handling:

### Core Data Models
- `StatusData` - Robot status information
- `SonarData` - Sonar sensor readings
- `VisionData` - Camera image metadata
- `JointAnglesData` - Joint angle information

### Response Models
- `BaseResponse` - Base response structure
- `StatusResponse` - Robot status information
- `SuccessResponse` - Successful operation responses
- `SonarResponse` - Sonar sensor data
- `VisionResponse` - Camera image data
- `JointAnglesResponse` - Joint angles data
- `AnimationsListResponse` - Available animations list
- `OperationsResponse` - Active operations list
- `OperationResponse` - Single operation status
- `BehaviourResponse` - Behavior execution response
- `BehavioursListResponse` - Available behaviors list

### Request Models
- `DurationRequest` - Duration-based operations
- `PostureRequest` - Posture change requests
- `SpeechRequest` - Speech commands
- `WalkRequest` - Walking commands
- `HeadPositionRequest` - Head positioning
- `LEDsRequest` - LED control
- `AnimationExecuteRequest` - Animation execution
- `SequenceRequest` - Movement sequences
- `BehaviourExecuteRequest` - Behavior execution
- `ArmsPresetRequest` - Arms preset positions
- `HandsRequest` - Hand control
- `LieRequest` - Lie posture
- `AutonomousLifeRequest` - Autonomous life state
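
As a sketch of working with the typed responses (this assumes the model classes are importable from the package root, which may not be the case):

```python
from nao_bridge_client import NAOBridgeClient, StatusResponse

with NAOBridgeClient("http://localhost:3000") as client:
    status = client.get_status()
    assert isinstance(status, StatusResponse)  # responses are parsed into Pydantic models
    print(status.data.robot_connected)
```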

## Exception Types

- `NAOBridgeError` - Base exception for all NAO Bridge errors

## Example Usage

See `example_usage.py` for comprehensive examples demonstrating:

- Basic robot control
- Movement and positioning
- Speech and LED control
- Sensor reading
- Animation execution
- Walking control
- Sequence execution
- Error handling
- Context manager usage
- Async operations

## Running the Example

```bash
python example_usage.py
```

Make sure the NAO Bridge server is running on `http://localhost:3000` and a NAO robot is connected before running the example.

## API Documentation

The client is based on the OpenAPI/Swagger specification available at:
- Swagger UI: `http://localhost:3000/swagger`
- OpenAPI JSON: `http://localhost:3000/api/v1/swagger.json`

## Requirements

- Python 3.8+
- httpx>=0.24.0
- pydantic>=2.0.0
- pillow>=8.0.0
- typing-extensions>=4.0.0 (for Python < 3.11)

## License

This client is part of the NAO Bridge project and follows the same license terms. 

## Installing from test.pypi.org

Use the main PyPI index as a fallback so dependencies can still be resolved:

```bash
pip install -i https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ nao-bridge-client
```

            
