# AWS Bedrock Data Automation MCP Server
A Model Context Protocol (MCP) server for Amazon Bedrock Data Automation that enables AI assistants to analyze documents, images, videos, and audio files using Amazon Bedrock Data Automation projects.
## Features
- **Project Management**: List and get details about Bedrock Data Automation projects
- **Asset Analysis**: Extract insights from unstructured content using Bedrock Data Automation
- **Support for Multiple Content Types**: Process documents, images, videos, and audio files
- **Integration with Amazon S3**: Seamlessly upload and download assets and results
## Prerequisites
1. Install `uv` from [Astral](https://docs.astral.sh/uv/getting-started/installation/) or the [GitHub README](https://github.com/astral-sh/uv#installation)
2. Install Python using `uv python install 3.10`
3. Set up AWS credentials with access to Amazon Bedrock Data Automation
- You need an AWS account with Amazon Bedrock Data Automation enabled
- Configure AWS credentials with `aws configure` or environment variables
- Ensure your IAM role/user has permissions to use Amazon Bedrock Data Automation
4. Create an AWS S3 Bucket
   - Example AWS CLI command to create the bucket:

     ```bash
     aws s3 mb s3://<bucket-name>
     ```
## Installation
[Install in Cursor](https://cursor.com/install-mcp?name=bedrock-data-automation-mcp-server&config=eyJjb21tYW5kIjoidXZ4IGF3c2xhYnMuYXdzLWJlZHJvY2stZGF0YS1hdXRvbWF0aW9uLW1jcC1zZXJ2ZXJAbGF0ZXN0IiwiZW52Ijp7IkFXU19QUk9GSUxFIjoieW91ci1hd3MtcHJvZmlsZSIsIkFXU19SRUdJT04iOiJ1cy1lYXN0LTEiLCJBV1NfQlVDS0VUX05BTUUiOiJ5b3VyLXMzLWJ1Y2tldC1uYW1lIiwiQkFTRV9ESVIiOiIvcGF0aC90by9iYXNlL2RpcmVjdG9yeSIsIkZBU1RNQ1BfTE9HX0xFVkVMIjoiRVJST1IifSwiZGlzYWJsZWQiOmZhbHNlLCJhdXRvQXBwcm92ZSI6W119)
Configure the MCP server in your MCP client configuration (e.g., for Amazon Q Developer CLI, edit `~/.aws/amazonq/mcp.json`):
```json
{
"mcpServers": {
"bedrock-data-automation-mcp-server": {
"command": "uvx",
"args": ["awslabs.aws-bedrock-data-automation-mcp-server@latest"],
"env": {
"AWS_PROFILE": "your-aws-profile",
"AWS_REGION": "us-east-1",
"AWS_BUCKET_NAME": "your-s3-bucket-name",
"BASE_DIR": "/path/to/base/directory",
"FASTMCP_LOG_LEVEL": "ERROR"
},
"disabled": false,
"autoApprove": []
}
}
}
```
Alternatively, use Docker after a successful `docker build -t awslabs/aws-bedrock-data-automation-mcp-server .`:
```json
{
"mcpServers": {
"bedrock-data-automation-mcp-server": {
"command": "docker",
"args": [
"run",
"--rm",
"--interactive",
"--env",
"AWS_PROFILE",
"--env",
"AWS_REGION",
"--env",
"AWS_BUCKET_NAME",
"--env",
"BASE_DIR",
"--env",
"FASTMCP_LOG_LEVEL",
"awslabs/aws-bedrock-data-automation-mcp-server:latest"
],
"env": {
"AWS_PROFILE": "your-aws-profile",
"AWS_REGION": "us-east-1",
"AWS_BUCKET_NAME": "your-s3-bucket-name",
"BASE_DIR": "/path/to/base/directory",
"FASTMCP_LOG_LEVEL": "ERROR"
},
"disabled": false,
"autoApprove": []
}
}
}
```
## Environment Variables
- `AWS_PROFILE`: AWS CLI profile to use for credentials
- `AWS_REGION`: AWS region to use (default: us-east-1)
- `AWS_BUCKET_NAME`: S3 bucket name for storing assets and results
- `BASE_DIR`: Base directory for file operations (optional)
- `FASTMCP_LOG_LEVEL`: Logging level (ERROR, WARNING, INFO, DEBUG)
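The variables above can be validated before the server starts. The helper below is a minimal sketch (not the server's actual startup code) that treats `AWS_BUCKET_NAME` as required and fills in the documented defaults for the optional variables:

```python
# Sketch: resolve the server's configuration from environment variables.
# Hypothetical helper for illustration; the real server may validate differently.

REQUIRED = ("AWS_BUCKET_NAME",)
OPTIONAL_DEFAULTS = {"AWS_REGION": "us-east-1", "FASTMCP_LOG_LEVEL": "ERROR"}

def resolve_config(env: dict) -> dict:
    """Merge documented defaults with the provided environment, failing fast on gaps."""
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise RuntimeError(f"Missing required environment variables: {missing}")
    # Explicit environment values override the defaults.
    return {**OPTIONAL_DEFAULTS, **{k: v for k, v in env.items() if v}}
```

In practice you would call `resolve_config(dict(os.environ))` at startup so misconfiguration surfaces immediately rather than on the first tool call.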
## AWS Authentication
The server uses the AWS profile specified in the `AWS_PROFILE` environment variable. If `AWS_PROFILE` is not set, it falls back to the default credential provider chain.
```json
"env": {
"AWS_PROFILE": "your-aws-profile",
"AWS_REGION": "us-east-1"
}
```
Make sure the AWS profile has permissions to access Amazon Bedrock Data Automation. The MCP server creates a boto3 session using the specified profile to authenticate with AWS services. Amazon Bedrock Data Automation is currently available only in the us-east-1 and us-west-2 regions.
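The profile-or-fallback behavior can be sketched as a small helper that builds the keyword arguments for `boto3.Session`. This is illustrative, not the server's actual code; omitting `profile_name` is what makes boto3 fall back to the default credential provider chain:

```python
# Sketch: derive boto3.Session keyword arguments from the environment.
# Hypothetical helper for illustration only.

def session_kwargs(env: dict) -> dict:
    """Build keyword arguments for boto3.Session from environment variables.

    When AWS_PROFILE is unset, profile_name is omitted, so boto3 uses the
    default credential provider chain (env vars, shared config, instance role).
    """
    kwargs = {"region_name": env.get("AWS_REGION", "us-east-1")}
    profile = env.get("AWS_PROFILE")
    if profile:
        kwargs["profile_name"] = profile
    return kwargs

# A session would then be created with:
#   boto3.Session(**session_kwargs(os.environ))
```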
## Tools
### getprojects
Get a list of data automation projects.
```python
getprojects() -> list
```
Returns a list of available Bedrock Data Automation projects.
### getprojectdetails
Get details of a specific data automation project.
```python
getprojectdetails(projectArn: str) -> dict
```
Returns detailed information about a specific Bedrock Data Automation project.
### analyzeasset
Analyze an asset using a data automation project.
```python
analyzeasset(assetPath: str, projectArn: Optional[str] = None) -> dict
```
Extracts insights from unstructured content (documents, images, videos, audio) using Amazon Bedrock Data Automation.
- `assetPath`: Path to the asset file to analyze
- `projectArn`: ARN of the Bedrock Data Automation project to use (optional, uses default public project if not provided)
## Example Usage
```python
# List available projects
projects = await getprojects()
# Get details of a specific project
project_details = await getprojectdetails(projectArn="arn:aws:bedrock:us-east-1:123456789012:data-automation-project/my-project")
# Analyze a document
results = await analyzeasset(assetPath="/path/to/document.pdf")
# Analyze an image with a specific project
results = await analyzeasset(
assetPath="/path/to/image.jpg",
projectArn="arn:aws:bedrock:us-east-1:123456789012:data-automation-project/my-project"
)
```
## Security Considerations
- Use AWS IAM roles with appropriate permissions
- Store credentials securely
- Use temporary credentials when possible
- Ensure S3 bucket permissions are properly configured
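As a starting point for scoping permissions, the policy fragment below sketches the kind of access the server plausibly needs: read/write on the configured S3 bucket plus the Bedrock Data Automation operations. The `bedrock:*` action names shown are assumptions based on the service's API operation names; verify them against the current IAM service authorization reference before use.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BucketAccess",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::your-s3-bucket-name",
        "arn:aws:s3:::your-s3-bucket-name/*"
      ]
    },
    {
      "Sid": "DataAutomationAccess",
      "Effect": "Allow",
      "Action": [
        "bedrock:ListDataAutomationProjects",
        "bedrock:GetDataAutomationProject",
        "bedrock:InvokeDataAutomationAsync",
        "bedrock:GetDataAutomationStatus"
      ],
      "Resource": "*"
    }
  ]
}
```

Tighten the `Resource` entries to specific project ARNs where possible.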
## License
This project is licensed under the Apache License, Version 2.0. See the [LICENSE](https://github.com/awslabs/mcp/blob/main/src/aws-bedrock-data-automation-mcp-server/LICENSE) file for details.