google-cloud-mcp


Name: google-cloud-mcp
Version: 1.0.0
home_page: None
Summary: A comprehensive Model Context Protocol (MCP) server for Google Cloud Platform services including BigQuery, Cloud Logging, Cloud Storage, and Compute Engine.
upload_time: 2025-07-31 10:21:46
maintainer: lockon-n
docs_url: None
author: lockon-n
requires_python: >=3.8
license: MIT
keywords: ai, bigquery, cloud-logging, cloud-storage, compute-engine, fastmcp, gcp, google-cloud, mcp, model-context-protocol
requirements: No requirements were recorded.
# Google Cloud MCP Server

A comprehensive Model Context Protocol (MCP) server for Google Cloud Platform services. This project provides Python wrapper modules and MCP tools for managing various GCP services including BigQuery, Cloud Logging, Cloud Storage, and Compute Engine.

## 🚀 Features

- **BigQuery**: Data warehouse operations including query execution, data loading/exporting, job management, and cost estimation
- **Cloud Logging**: Comprehensive log management with reading/writing logs, bucket management, sinks, exclusions, and metrics
- **Cloud Storage**: Complete bucket and object management with lifecycle policies and batch operations
- **Compute Engine**: Virtual machine lifecycle management and zone operations
- **MCP Integration**: Full MCP server implementation for AI model interactions
- **Flexible Authentication**: Support for both service account files and default credentials

## 📦 Installation

### From PyPI
```bash
pip install google-cloud-mcp
```

### From Source
```bash
git clone https://github.com/lockon-n/google-cloud-mcp.git
cd google-cloud-mcp
pip install -e .
```

## 🔧 Setup

### Authentication

#### Option 1: Service Account (Recommended)
1. Create a service account in Google Cloud Console
2. Download the JSON key file
3. Place it as `service-account-key.json` in your project root (an optional load check is sketched below)
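
Before starting the server, you can optionally confirm the key file is readable. This is a minimal sketch, not part of this package, using the standard `google-auth` library (installed alongside the Google Cloud client libraries); the filename matches the default location described above:

```python
# Optional sanity check: confirm the service-account key file loads.
# Assumes the key was saved as service-account-key.json in the project root.
from google.oauth2 import service_account

creds = service_account.Credentials.from_service_account_file(
    "service-account-key.json"
)
print(f"Loaded credentials for: {creds.service_account_email}")
print(f"Project: {creds.project_id}")
```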

#### Option 2: Default Credentials
```bash
gcloud auth application-default login
```

### Running the MCP Server
```bash
python main.py
```
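
Once the server process is up, an MCP-capable client can attach to it. The sketch below is illustrative and not taken from this project: it assumes the server speaks the stdio transport (typical for MCP servers launched as a plain Python process) and uses the official `mcp` Python SDK on the client side. The tool name passed to `call_tool` is one of the tools documented below.

```python
# Hedged sketch: connect an MCP client to the server over stdio.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server exactly as in the README: `python main.py`.
    params = StdioServerParameters(command="python", args=["main.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Call a parameterless tool documented below.
            result = await session.call_tool(
                "google-cloud-storage_list_buckets", arguments={}
            )
            print(result)

asyncio.run(main())
```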

## ๐Ÿ› ๏ธ Available Tools

### BigQuery Tools

#### `google-cloud-bigquery_execute_query`
Execute SQL queries with optional cost estimation.
```json
{
  "query": "SELECT * FROM dataset.table LIMIT 10",
  "dry_run": false,
  "max_results": 1000
}
```

#### `google-cloud-bigquery_create_dataset`
Create a new BigQuery dataset.
```json
{
  "dataset_id": "my_dataset",
  "description": "My dataset description",
  "location": "US"
}
```

#### `google-cloud-bigquery_create_table`
Create a new table with a specified schema.
```json
{
  "dataset_id": "my_dataset",
  "table_id": "my_table",
  "schema": [
    {"name": "id", "type": "INTEGER", "mode": "REQUIRED"},
    {"name": "name", "type": "STRING", "mode": "NULLABLE"}
  ]
}
```

#### `google-cloud-bigquery_load_data_from_csv`
Load data from a CSV file into a BigQuery table.
```json
{
  "dataset_id": "my_dataset",
  "table_id": "my_table",
  "csv_file_path": "/path/to/data.csv",
  "write_disposition": "WRITE_APPEND"
}
```

#### `google-cloud-bigquery_export_to_storage`
Export a BigQuery table to Cloud Storage.
```json
{
  "dataset_id": "my_dataset",
  "table_id": "my_table",
  "bucket_name": "my-bucket",
  "file_path": "exports/data.csv"
}
```

#### `google-cloud-bigquery_list_datasets`
List all datasets in the project.

#### `google-cloud-bigquery_list_tables`
List all tables in a dataset.
```json
{
  "dataset_id": "my_dataset"
}
```

#### `google-cloud-bigquery_get_job_status`
Get the status of a BigQuery job.
```json
{
  "job_id": "job_12345"
}
```

### Cloud Logging Tools

#### `google-cloud-logging_write_log`
Write log entries (text or structured).
```json
{
  "log_name": "my-application",
  "message": "Application started successfully",
  "severity": "INFO",
  "labels": {"component": "backend"}
}
```

#### `google-cloud-logging_read_logs`
Read and filter log entries.
```json
{
  "filter_string": "severity>=ERROR",
  "max_results": 100,
  "time_range_hours": 24
}
```

#### `google-cloud-logging_list_logs`
List all log names in the project.

#### `google-cloud-logging_delete_log`
Delete all entries in a specific log.
```json
{
  "log_name": "old-application"
}
```

#### `google-cloud-logging_create_log_bucket`
Create a new log bucket for retention management.
```json
{
  "bucket_id": "my-log-bucket",
  "retention_days": 90,
  "description": "Application logs bucket",
  "locked": false
}
```

#### `google-cloud-logging_update_log_bucket`
Update log bucket configuration.
```json
{
  "bucket_id": "my-log-bucket",
  "retention_days": 120,
  "description": "Updated description"
}
```

#### `google-cloud-logging_delete_log_bucket`
Delete a log bucket.
```json
{
  "bucket_id": "my-log-bucket"
}
```

#### `google-cloud-logging_clear_log_bucket`
Clear all logs from a bucket while keeping the bucket.
```json
{
  "bucket_id": "my-log-bucket"
}
```

#### `google-cloud-logging_list_log_buckets`
List all log buckets in the project.

#### `google-cloud-logging_create_log_sink`
Create a log sink for exporting logs.
```json
{
  "sink_name": "my-sink",
  "destination": "storage.googleapis.com/my-export-bucket",
  "filter_string": "severity>=WARNING"
}
```

#### `google-cloud-logging_list_log_sinks`
List all log sinks in the project.

#### `google-cloud-logging_delete_log_sink`
Delete a log sink.
```json
{
  "sink_name": "my-sink"
}
```

#### `google-cloud-logging_create_exclusion`
Create a log exclusion filter.
```json
{
  "exclusion_name": "debug-exclusion",
  "filter_string": "severity<INFO"
}
```

#### `google-cloud-logging_list_exclusions`
List all log exclusions.

#### `google-cloud-logging_delete_exclusion`
Delete a log exclusion.
```json
{
  "exclusion_name": "debug-exclusion"
}
```

#### `google-cloud-logging_search_logs`
Search logs with simplified parameters.
```json
{
  "search_query": "error occurred",
  "time_range_hours": 12,
  "severity_levels": ["ERROR", "CRITICAL"]
}
```

#### `google-cloud-logging_export_logs_to_storage`
Create a sink to export logs to Cloud Storage.
```json
{
  "sink_name": "storage-export",
  "bucket_name": "my-log-exports"
}
```

#### `google-cloud-logging_export_logs_to_bigquery`
Create a sink to export logs to BigQuery.
```json
{
  "sink_name": "bigquery-export",
  "dataset_id": "log_analysis"
}
```

### Cloud Storage Tools

#### `google-cloud-storage_create_bucket`
Create a new Cloud Storage bucket.
```json
{
  "bucket_name": "my-new-bucket",
  "location": "US",
  "storage_class": "STANDARD"
}
```

#### `google-cloud-storage_delete_bucket`
Delete a bucket.
```json
{
  "bucket_name": "my-bucket"
}
```

#### `google-cloud-storage_list_buckets`
List all buckets in the project.

#### `google-cloud-storage_upload_file`
Upload a file to a bucket.
```json
{
  "bucket_name": "my-bucket",
  "local_file_path": "/path/to/file.txt",
  "blob_name": "uploads/file.txt"
}
```

#### `google-cloud-storage_download_file`
Download a file from a bucket.
```json
{
  "bucket_name": "my-bucket",
  "blob_name": "uploads/file.txt",
  "local_file_path": "/path/to/download/file.txt"
}
```

#### `google-cloud-storage_delete_file`
Delete a file from a bucket.
```json
{
  "bucket_name": "my-bucket",
  "blob_name": "uploads/file.txt"
}
```

#### `google-cloud-storage_list_files`
List files in a bucket.
```json
{
  "bucket_name": "my-bucket",
  "prefix": "uploads/",
  "max_results": 100
}
```

#### `google-cloud-storage_copy_file`
Copy a file within or between buckets.
```json
{
  "source_bucket": "source-bucket",
  "source_blob": "file.txt",
  "destination_bucket": "dest-bucket",
  "destination_blob": "backup/file.txt"
}
```

#### `google-cloud-storage_move_file`
Move a file within or between buckets.
```json
{
  "source_bucket": "source-bucket",
  "source_blob": "file.txt",
  "destination_bucket": "dest-bucket",
  "destination_blob": "moved/file.txt"
}
```

#### `google-cloud-storage_generate_signed_url`
Generate a signed URL for temporary access.
```json
{
  "bucket_name": "my-bucket",
  "blob_name": "private/file.txt",
  "expiration_hours": 24,
  "method": "GET"
}
```

#### `google-cloud-storage_set_bucket_lifecycle`
Set lifecycle management policies.
```json
{
  "bucket_name": "my-bucket",
  "rules": [
    {
      "action": "Delete",
      "conditions": {"age": 365}
    }
  ]
}
```

#### `google-cloud-storage_batch_upload`
Upload multiple files to a bucket.
```json
{
  "bucket_name": "my-bucket",
  "file_mappings": [
    {
      "local_path": "/path/file1.txt",
      "blob_name": "uploads/file1.txt"
    }
  ]
}
```

#### `google-cloud-storage_batch_download`
Download multiple files from a bucket.
```json
{
  "bucket_name": "my-bucket",
  "file_mappings": [
    {
      "blob_name": "uploads/file1.txt",
      "local_path": "/path/download/file1.txt"
    }
  ]
}
```

#### `google-cloud-storage_search_files`
Search for files using patterns.
```json
{
  "bucket_name": "my-bucket",
  "name_pattern": "*.log",
  "size_range": {"min_bytes": 1024, "max_bytes": 1048576}
}
```

### Compute Engine Tools

#### `google-cloud-compute_create_instance`
Create a new VM instance.
```json
{
  "instance_name": "my-vm",
  "zone": "us-central1-a",
  "machine_type": "e2-medium",
  "image_family": "ubuntu-2004-lts",
  "image_project": "ubuntu-os-cloud"
}
```

#### `google-cloud-compute_delete_instance`
Delete a VM instance.
```json
{
  "instance_name": "my-vm",
  "zone": "us-central1-a"
}
```

#### `google-cloud-compute_start_instance`
Start a stopped VM instance.
```json
{
  "instance_name": "my-vm",
  "zone": "us-central1-a"
}
```

#### `google-cloud-compute_stop_instance`
Stop a running VM instance.
```json
{
  "instance_name": "my-vm",
  "zone": "us-central1-a"
}
```

#### `google-cloud-compute_restart_instance`
Restart a VM instance.
```json
{
  "instance_name": "my-vm",
  "zone": "us-central1-a"
}
```

#### `google-cloud-compute_list_instances`
List all VM instances in a zone.
```json
{
  "zone": "us-central1-a"
}
```

#### `google-cloud-compute_get_instance`
Get detailed information about a VM instance.
```json
{
  "instance_name": "my-vm",
  "zone": "us-central1-a"
}
```

#### `google-cloud-compute_list_zones`
List all available zones in the project.

#### `google-cloud-compute_get_operation_status`
Get the status of a Compute Engine operation.
```json
{
  "operation_name": "operation-123456",
  "zone": "us-central1-a"
}
```

## ๐Ÿ—๏ธ Architecture

The project follows a modular architecture with separate managers for each GCP service:

- **BigQuery** (`src/big_query.py`): Data warehouse operations including query execution, data loading/exporting, job management, and cost estimation
- **Cloud Logging** (`src/cloud_logging.py`): Log management operations including reading/writing logs, managing buckets, sinks, exclusions, and metrics
- **Cloud Storage** (`src/cloud_storage.py`): Bucket and object management including CRUD operations, lifecycle management, and batch operations
- **Compute Engine** (`src/compute_engine.py`): Virtual machine management including instance lifecycle operations and zone management

Each manager class follows a consistent pattern (see the usage sketch after this list):
- Constructor takes `project_id` and optional `service_account_path`
- Methods return dictionaries with operation results
- Comprehensive error handling and logging
- Support for both service account and default credential authentication
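
For orientation, here is a minimal usage sketch of that pattern. The class name `CloudStorageManager` and method name `list_buckets` are illustrative assumptions, not the package's confirmed API; only the constructor signature (`project_id`, optional `service_account_path`) and the dict return values come from this README.

```python
# Illustrative only: class and method names are assumptions based on the
# pattern described above, not confirmed API.
import json

from src.cloud_storage import CloudStorageManager  # hypothetical class name

manager = CloudStorageManager(
    project_id="my-gcp-project",                      # required by every manager
    service_account_path="service-account-key.json",  # optional; omit for default credentials
)

result = manager.list_buckets()  # hypothetical method; managers return plain dicts
print(json.dumps(result, indent=2))
```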

## 🧪 Testing

Run the test server:
```bash
python test_server.py
```

This will test all available MCP tools and verify their functionality.

## 🔧 Development

### Project Structure
```
google-cloud-mcp/
├── src/
│   ├── __init__.py
│   ├── server.py          # MCP server implementation
│   ├── big_query.py       # BigQuery manager
│   ├── cloud_logging.py   # Cloud Logging manager
│   ├── cloud_storage.py   # Cloud Storage manager
│   └── compute_engine.py  # Compute Engine manager
├── main.py                # Entry point
├── test_server.py         # Test runner
├── pyproject.toml         # Package configuration
├── CLAUDE.md              # Development guidelines
└── README.md              # This file
```

### Key Design Patterns

1. **Consistent Return Types**: All methods return dictionaries for easy JSON serialization
2. **Flexible Authentication**: Support for both service account files and default credentials  
3. **Comprehensive Error Handling**: Specific exception handling for different GCP error types (sketched after this list)
4. **Extensive Logging**: Detailed logging throughout for debugging and monitoring
5. **MCP Integration**: Full compliance with MCP protocol specifications
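
As a concrete illustration of patterns 1 and 3, the sketch below shows how a manager method might wrap a GCP call. The helper name and result fields are assumptions for illustration; `google.api_core.exceptions` is, however, the real exception module raised by the Google Cloud Python clients.

```python
# Sketch of the dict-return + specific-exception pattern described above.
# Helper name and result fields are illustrative assumptions.
import logging

from google.api_core import exceptions as gcp_exceptions
from google.cloud import storage

logger = logging.getLogger(__name__)

def delete_blob(bucket_name: str, blob_name: str) -> dict:
    """Delete one object and report the outcome as a JSON-serializable dict."""
    client = storage.Client()
    try:
        client.bucket(bucket_name).blob(blob_name).delete()
        return {"success": True, "bucket": bucket_name, "blob": blob_name}
    except gcp_exceptions.NotFound:
        logger.warning("Blob %s not found in %s", blob_name, bucket_name)
        return {"success": False, "error": "not_found"}
    except gcp_exceptions.Forbidden as exc:
        logger.error("Permission denied: %s", exc)
        return {"success": False, "error": "forbidden"}
```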

## 📋 Requirements

- Python 3.8+
- Google Cloud SDK (optional, for default credentials)
- Required Python packages (automatically installed):
  - `google-cloud-bigquery`
  - `google-cloud-logging` 
  - `google-cloud-storage`
  - `google-cloud-compute`
  - `mcp`

## ๐Ÿค Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

## 📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

## ๐Ÿ™ Acknowledgments

- Google Cloud Platform for comprehensive APIs
- MCP (Model Context Protocol) for the integration framework
- The Python community for excellent cloud libraries

---

**Note**: This project is designed to work seamlessly with AI models through the MCP protocol, providing a comprehensive interface for Google Cloud Platform operations.
            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "google-cloud-mcp",
    "maintainer": "lockon-n",
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": "ai, bigquery, cloud-logging, cloud-storage, compute-engine, fastmcp, gcp, google-cloud, mcp, model-context-protocol",
    "author": "lockon-n",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/23/f2/a3d1a04c0aa694d6ced8b3e5c0c7f0b5055567bc6f80bdd45402cc0c4fe2/google_cloud_mcp-1.0.0.tar.gz",
    "platform": null,
    "description": "# Google Cloud MCP Server\n\nA comprehensive Model Context Protocol (MCP) server for Google Cloud Platform services. This project provides Python wrapper modules and MCP tools for managing various GCP services including BigQuery, Cloud Logging, Cloud Storage, and Compute Engine.\n\n## \ud83d\ude80 Features\n\n- **BigQuery**: Data warehouse operations including query execution, data loading/exporting, job management, and cost estimation\n- **Cloud Logging**: Comprehensive log management with reading/writing logs, bucket management, sinks, exclusions, and metrics\n- **Cloud Storage**: Complete bucket and object management with lifecycle policies and batch operations\n- **Compute Engine**: Virtual machine lifecycle management and zone operations\n- **MCP Integration**: Full MCP server implementation for AI model interactions\n- **Flexible Authentication**: Support for both service account files and default credentials\n\n## \ud83d\udce6 Installation\n\n### From PyPI\n```bash\npip install google-cloud-mcp\n```\n\n### From Source\n```bash\ngit clone https://github.com/lockon-n/google-cloud-mcp.git\ncd google-cloud-mcp\npip install -e .\n```\n\n## \ud83d\udd27 Setup\n\n### Authentication\n\n#### Option 1: Service Account (Recommended)\n1. Create a service account in Google Cloud Console\n2. Download the JSON key file\n3. Place it as `service-account-key.json` in your project root\n\n#### Option 2: Default Credentials\n```bash\ngcloud auth application-default login\n```\n\n### Running the MCP Server\n```bash\npython main.py\n```\n\n## \ud83d\udee0\ufe0f Available Tools\n\n### BigQuery Tools\n\n#### `google-cloud-bigquery_execute_query`\nExecute SQL queries with optional cost estimation.\n```json\n{\n  \"query\": \"SELECT * FROM dataset.table LIMIT 10\",\n  \"dry_run\": false,\n  \"max_results\": 1000\n}\n```\n\n#### `google-cloud-bigquery_create_dataset`\nCreate a new BigQuery dataset.\n```json\n{\n  \"dataset_id\": \"my_dataset\",\n  \"description\": \"My dataset description\",\n  \"location\": \"US\"\n}\n```\n\n#### `google-cloud-bigquery_create_table`\nCreate a new table with schema.\n```json\n{\n  \"dataset_id\": \"my_dataset\",\n  \"table_id\": \"my_table\",\n  \"schema\": [\n    {\"name\": \"id\", \"type\": \"INTEGER\", \"mode\": \"REQUIRED\"},\n    {\"name\": \"name\", \"type\": \"STRING\", \"mode\": \"NULLABLE\"}\n  ]\n}\n```\n\n#### `google-cloud-bigquery_load_data_from_csv`\nLoad data from CSV file into BigQuery table.\n```json\n{\n  \"dataset_id\": \"my_dataset\",\n  \"table_id\": \"my_table\",\n  \"csv_file_path\": \"/path/to/data.csv\",\n  \"write_disposition\": \"WRITE_APPEND\"\n}\n```\n\n#### `google-cloud-bigquery_export_to_storage`\nExport BigQuery table to Cloud Storage.\n```json\n{\n  \"dataset_id\": \"my_dataset\",\n  \"table_id\": \"my_table\",\n  \"bucket_name\": \"my-bucket\",\n  \"file_path\": \"exports/data.csv\"\n}\n```\n\n#### `google-cloud-bigquery_list_datasets`\nList all datasets in the project.\n\n#### `google-cloud-bigquery_list_tables`\nList all tables in a dataset.\n```json\n{\n  \"dataset_id\": \"my_dataset\"\n}\n```\n\n#### `google-cloud-bigquery_get_job_status`\nGet the status of a BigQuery job.\n```json\n{\n  \"job_id\": \"job_12345\"\n}\n```\n\n### Cloud Logging Tools\n\n#### `google-cloud-logging_write_log`\nWrite log entries (text or structured).\n```json\n{\n  \"log_name\": \"my-application\",\n  \"message\": \"Application started successfully\",\n  \"severity\": \"INFO\",\n  \"labels\": {\"component\": \"backend\"}\n}\n```\n\n#### 
`google-cloud-logging_read_logs`\nRead and filter log entries.\n```json\n{\n  \"filter_string\": \"severity>=ERROR\",\n  \"max_results\": 100,\n  \"time_range_hours\": 24\n}\n```\n\n#### `google-cloud-logging_list_logs`\nList all log names in the project.\n\n#### `google-cloud-logging_delete_log`\nDelete all entries in a specific log.\n```json\n{\n  \"log_name\": \"old-application\"\n}\n```\n\n#### `google-cloud-logging_create_log_bucket`\nCreate a new log bucket for retention management.\n```json\n{\n  \"bucket_id\": \"my-log-bucket\",\n  \"retention_days\": 90,\n  \"description\": \"Application logs bucket\",\n  \"locked\": false\n}\n```\n\n#### `google-cloud-logging_update_log_bucket`\nUpdate log bucket configuration.\n```json\n{\n  \"bucket_id\": \"my-log-bucket\",\n  \"retention_days\": 120,\n  \"description\": \"Updated description\"\n}\n```\n\n#### `google-cloud-logging_delete_log_bucket`\nDelete a log bucket.\n```json\n{\n  \"bucket_id\": \"my-log-bucket\"\n}\n```\n\n#### `google-cloud-logging_clear_log_bucket`\nClear all logs from a bucket while keeping the bucket.\n```json\n{\n  \"bucket_id\": \"my-log-bucket\"\n}\n```\n\n#### `google-cloud-logging_list_log_buckets`\nList all log buckets in the project.\n\n#### `google-cloud-logging_create_log_sink`\nCreate a log sink for exporting logs.\n```json\n{\n  \"sink_name\": \"my-sink\",\n  \"destination\": \"storage.googleapis.com/my-export-bucket\",\n  \"filter_string\": \"severity>=WARNING\"\n}\n```\n\n#### `google-cloud-logging_list_log_sinks`\nList all log sinks in the project.\n\n#### `google-cloud-logging_delete_log_sink`\nDelete a log sink.\n```json\n{\n  \"sink_name\": \"my-sink\"\n}\n```\n\n#### `google-cloud-logging_create_exclusion`\nCreate a log exclusion filter.\n```json\n{\n  \"exclusion_name\": \"debug-exclusion\",\n  \"filter_string\": \"severity<INFO\"\n}\n```\n\n#### `google-cloud-logging_list_exclusions`\nList all log exclusions.\n\n#### `google-cloud-logging_delete_exclusion`\nDelete a log exclusion.\n```json\n{\n  \"exclusion_name\": \"debug-exclusion\"\n}\n```\n\n#### `google-cloud-logging_search_logs`\nSearch logs with simplified parameters.\n```json\n{\n  \"search_query\": \"error occurred\",\n  \"time_range_hours\": 12,\n  \"severity_levels\": [\"ERROR\", \"CRITICAL\"]\n}\n```\n\n#### `google-cloud-logging_export_logs_to_storage`\nCreate a sink to export logs to Cloud Storage.\n```json\n{\n  \"sink_name\": \"storage-export\",\n  \"bucket_name\": \"my-log-exports\"\n}\n```\n\n#### `google-cloud-logging_export_logs_to_bigquery`\nCreate a sink to export logs to BigQuery.\n```json\n{\n  \"sink_name\": \"bigquery-export\",\n  \"dataset_id\": \"log_analysis\"\n}\n```\n\n### Cloud Storage Tools\n\n#### `google-cloud-storage_create_bucket`\nCreate a new Cloud Storage bucket.\n```json\n{\n  \"bucket_name\": \"my-new-bucket\",\n  \"location\": \"US\",\n  \"storage_class\": \"STANDARD\"\n}\n```\n\n#### `google-cloud-storage_delete_bucket`\nDelete a bucket.\n```json\n{\n  \"bucket_name\": \"my-bucket\"\n}\n```\n\n#### `google-cloud-storage_list_buckets`\nList all buckets in the project.\n\n#### `google-cloud-storage_upload_file`\nUpload a file to a bucket.\n```json\n{\n  \"bucket_name\": \"my-bucket\",\n  \"local_file_path\": \"/path/to/file.txt\",\n  \"blob_name\": \"uploads/file.txt\"\n}\n```\n\n#### `google-cloud-storage_download_file`\nDownload a file from a bucket.\n```json\n{\n  \"bucket_name\": \"my-bucket\",\n  \"blob_name\": \"uploads/file.txt\",\n  \"local_file_path\": 
\"/path/to/download/file.txt\"\n}\n```\n\n#### `google-cloud-storage_delete_file`\nDelete a file from a bucket.\n```json\n{\n  \"bucket_name\": \"my-bucket\",\n  \"blob_name\": \"uploads/file.txt\"\n}\n```\n\n#### `google-cloud-storage_list_files`\nList files in a bucket.\n```json\n{\n  \"bucket_name\": \"my-bucket\",\n  \"prefix\": \"uploads/\",\n  \"max_results\": 100\n}\n```\n\n#### `google-cloud-storage_copy_file`\nCopy a file within or between buckets.\n```json\n{\n  \"source_bucket\": \"source-bucket\",\n  \"source_blob\": \"file.txt\",\n  \"destination_bucket\": \"dest-bucket\",\n  \"destination_blob\": \"backup/file.txt\"\n}\n```\n\n#### `google-cloud-storage_move_file`\nMove a file within or between buckets.\n```json\n{\n  \"source_bucket\": \"source-bucket\",\n  \"source_blob\": \"file.txt\",\n  \"destination_bucket\": \"dest-bucket\",\n  \"destination_blob\": \"moved/file.txt\"\n}\n```\n\n#### `google-cloud-storage_generate_signed_url`\nGenerate a signed URL for temporary access.\n```json\n{\n  \"bucket_name\": \"my-bucket\",\n  \"blob_name\": \"private/file.txt\",\n  \"expiration_hours\": 24,\n  \"method\": \"GET\"\n}\n```\n\n#### `google-cloud-storage_set_bucket_lifecycle`\nSet lifecycle management policies.\n```json\n{\n  \"bucket_name\": \"my-bucket\",\n  \"rules\": [\n    {\n      \"action\": \"Delete\",\n      \"conditions\": {\"age\": 365}\n    }\n  ]\n}\n```\n\n#### `google-cloud-storage_batch_upload`\nUpload multiple files to a bucket.\n```json\n{\n  \"bucket_name\": \"my-bucket\",\n  \"file_mappings\": [\n    {\n      \"local_path\": \"/path/file1.txt\",\n      \"blob_name\": \"uploads/file1.txt\"\n    }\n  ]\n}\n```\n\n#### `google-cloud-storage_batch_download`\nDownload multiple files from a bucket.\n```json\n{\n  \"bucket_name\": \"my-bucket\",\n  \"file_mappings\": [\n    {\n      \"blob_name\": \"uploads/file1.txt\",\n      \"local_path\": \"/path/download/file1.txt\"\n    }\n  ]\n}\n```\n\n#### `google-cloud-storage_search_files`\nSearch for files using patterns.\n```json\n{\n  \"bucket_name\": \"my-bucket\",\n  \"name_pattern\": \"*.log\",\n  \"size_range\": {\"min_bytes\": 1024, \"max_bytes\": 1048576}\n}\n```\n\n### Compute Engine Tools\n\n#### `google-cloud-compute_create_instance`\nCreate a new VM instance.\n```json\n{\n  \"instance_name\": \"my-vm\",\n  \"zone\": \"us-central1-a\",\n  \"machine_type\": \"e2-medium\",\n  \"image_family\": \"ubuntu-2004-lts\",\n  \"image_project\": \"ubuntu-os-cloud\"\n}\n```\n\n#### `google-cloud-compute_delete_instance`\nDelete a VM instance.\n```json\n{\n  \"instance_name\": \"my-vm\",\n  \"zone\": \"us-central1-a\"\n}\n```\n\n#### `google-cloud-compute_start_instance`\nStart a stopped VM instance.\n```json\n{\n  \"instance_name\": \"my-vm\",\n  \"zone\": \"us-central1-a\"\n}\n```\n\n#### `google-cloud-compute_stop_instance`\nStop a running VM instance.\n```json\n{\n  \"instance_name\": \"my-vm\",\n  \"zone\": \"us-central1-a\"\n}\n```\n\n#### `google-cloud-compute_restart_instance`\nRestart a VM instance.\n```json\n{\n  \"instance_name\": \"my-vm\",\n  \"zone\": \"us-central1-a\"\n}\n```\n\n#### `google-cloud-compute_list_instances`\nList all VM instances in a zone.\n```json\n{\n  \"zone\": \"us-central1-a\"\n}\n```\n\n#### `google-cloud-compute_get_instance`\nGet detailed information about a VM instance.\n```json\n{\n  \"instance_name\": \"my-vm\",\n  \"zone\": \"us-central1-a\"\n}\n```\n\n#### `google-cloud-compute_list_zones`\nList all available zones in the project.\n\n#### 
`google-cloud-compute_get_operation_status`\nGet the status of a Compute Engine operation.\n```json\n{\n  \"operation_name\": \"operation-123456\",\n  \"zone\": \"us-central1-a\"\n}\n```\n\n## \ud83c\udfd7\ufe0f Architecture\n\nThe project follows a modular architecture with separate managers for each GCP service:\n\n- **BigQuery** (`src/big_query.py`): Data warehouse operations including query execution, data loading/exporting, job management, and cost estimation\n- **Cloud Logging** (`src/cloud_logging.py`): Log management operations including reading/writing logs, managing buckets, sinks, exclusions, and metrics\n- **Cloud Storage** (`src/cloud_storage.py`): Bucket and object management including CRUD operations, lifecycle management, and batch operations\n- **Compute Engine** (`src/compute_engine.py`): Virtual machine management including instance lifecycle operations and zone management\n\nEach manager class follows a consistent pattern:\n- Constructor takes `project_id` and optional `service_account_path`\n- Methods return dictionaries with operation results\n- Comprehensive error handling and logging\n- Support for both service account and default credential authentication\n\n## \ud83e\uddea Testing\n\nRun the test server:\n```bash\npython test_server.py\n```\n\nThis will test all available MCP tools and verify their functionality.\n\n## \ud83d\udd27 Development\n\n### Project Structure\n```\ngoogle-cloud-mcp/\n\u251c\u2500\u2500 src/\n\u2502   \u251c\u2500\u2500 __init__.py\n\u2502   \u251c\u2500\u2500 server.py          # MCP server implementation\n\u2502   \u251c\u2500\u2500 big_query.py       # BigQuery manager\n\u2502   \u251c\u2500\u2500 cloud_logging.py   # Cloud Logging manager  \n\u2502   \u251c\u2500\u2500 cloud_storage.py   # Cloud Storage manager\n\u2502   \u2514\u2500\u2500 compute_engine.py  # Compute Engine manager\n\u251c\u2500\u2500 main.py                # Entry point\n\u251c\u2500\u2500 test_server.py         # Test runner\n\u251c\u2500\u2500 pyproject.toml         # Package configuration\n\u251c\u2500\u2500 CLAUDE.md             # Development guidelines\n\u2514\u2500\u2500 README.md             # This file\n```\n\n### Key Design Patterns\n\n1. **Consistent Return Types**: All methods return dictionaries for easy JSON serialization\n2. **Flexible Authentication**: Support for both service account files and default credentials  \n3. **Comprehensive Error Handling**: Specific exception handling for different GCP error types\n4. **Extensive Logging**: Detailed logging throughout for debugging and monitoring\n5. **MCP Integration**: Full compliance with MCP protocol specifications\n\n## \ud83d\udccb Requirements\n\n- Python 3.8+\n- Google Cloud SDK (optional, for default credentials)\n- Required Python packages (automatically installed):\n  - `google-cloud-bigquery`\n  - `google-cloud-logging` \n  - `google-cloud-storage`\n  - `google-cloud-compute`\n  - `mcp`\n\n## \ud83e\udd1d Contributing\n\n1. Fork the repository\n2. Create a feature branch (`git checkout -b feature/amazing-feature`)\n3. Commit your changes (`git commit -m 'Add amazing feature'`)\n4. Push to the branch (`git push origin feature/amazing-feature`)\n5. 
Open a Pull Request\n\n## \ud83d\udcc4 License\n\nThis project is licensed under the MIT License - see the LICENSE file for details.\n\n## \ud83d\ude4f Acknowledgments\n\n- Google Cloud Platform for comprehensive APIs\n- MCP (Model Context Protocol) for the integration framework\n- The Python community for excellent cloud libraries\n\n---\n\n**Note**: This project is designed to work seamlessly with AI models through the MCP protocol, providing a comprehensive interface for Google Cloud Platform operations.",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "A comprehensive Model Context Protocol (MCP) server for Google Cloud Platform services including BigQuery, Cloud Logging, Cloud Storage, and Compute Engine.",
    "version": "1.0.0",
    "project_urls": {
        "Documentation": "https://github.com/lockon-n/google-cloud-mcp#readme",
        "Homepage": "https://github.com/lockon-n/google-cloud-mcp",
        "Issues": "https://github.com/lockon-n/google-cloud-mcp/issues",
        "Repository": "https://github.com/lockon-n/google-cloud-mcp"
    },
    "split_keywords": [
        "ai",
        " bigquery",
        " cloud-logging",
        " cloud-storage",
        " compute-engine",
        " fastmcp",
        " gcp",
        " google-cloud",
        " mcp",
        " model-context-protocol"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "7b0d39fe864af418285f3d838ad529d6bd51f33121246d7921ba7b5e4245e99f",
                "md5": "d3c101d1db60ab6914f71c17edaf4341",
                "sha256": "1c028702401a0c78b15a29bba65eb0c3d2eae9c9371392d8296df9b02ca23b50"
            },
            "downloads": -1,
            "filename": "google_cloud_mcp-1.0.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "d3c101d1db60ab6914f71c17edaf4341",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 29947,
            "upload_time": "2025-07-31T10:21:45",
            "upload_time_iso_8601": "2025-07-31T10:21:45.298905Z",
            "url": "https://files.pythonhosted.org/packages/7b/0d/39fe864af418285f3d838ad529d6bd51f33121246d7921ba7b5e4245e99f/google_cloud_mcp-1.0.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "23f2a3d1a04c0aa694d6ced8b3e5c0c7f0b5055567bc6f80bdd45402cc0c4fe2",
                "md5": "a293848890f76bfa128162ab9daefbf1",
                "sha256": "fd3b0abc99dafbe616bfa0de47b230e4953ff9d29893fdb5c47e98f635eff4cd"
            },
            "downloads": -1,
            "filename": "google_cloud_mcp-1.0.0.tar.gz",
            "has_sig": false,
            "md5_digest": "a293848890f76bfa128162ab9daefbf1",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 79096,
            "upload_time": "2025-07-31T10:21:46",
            "upload_time_iso_8601": "2025-07-31T10:21:46.997745Z",
            "url": "https://files.pythonhosted.org/packages/23/f2/a3d1a04c0aa694d6ced8b3e5c0c7f0b5055567bc6f80bdd45402cc0c4fe2/google_cloud_mcp-1.0.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-07-31 10:21:46",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "lockon-n",
    "github_project": "google-cloud-mcp#readme",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "google-cloud-mcp"
}
        