# cloudshovel

- Name: cloudshovel
- Version: 1.0.2
- Summary: A tool for digging secrets in public AMIs
- Home page: https://github.com/saw-your-packet/CloudShovel
- Author: Eduard Agavriloae
- Requires Python: >=3.6
- License: not specified
- Keywords: aws, ami, secrets, cloudshovel, cloudquarry
- Uploaded: 2024-11-13 19:37:42

## Introduction

CloudShovel is a tool designed to search for sensitive information within public or private Amazon Machine Images (AMIs). It automates the process of launching instances from target AMIs, mounting their volumes, and scanning for potential secrets or sensitive data.

The tool is a modified version of the one used for the research [AWS CloudQuarry: Digging for Secrets in Public AMIs](https://securitycafe.ro/2024/05/08/aws-cloudquarry-digging-for-secrets-in-public-amis/).

Authors:
- [Eduard Agavriloae](https://www.linkedin.com/in/eduard-k-agavriloae/) - [hacktodef.com](https://hacktodef.com)
- [Matei Josephs](https://www.linkedin.com/in/l31/) - [hivehack.tech](https://hivehack.tech)


**Disclaimer**

> This tool will not fit every scenario. Errors can occur when starting an EC2 instance from the target AMI or when mounting its volumes. We accept pull requests that cover more cases, but our recommendation is to handle such cases manually.

Table of Contents:

- [Introduction](#introduction)
- [Prerequisites](#prerequisites)
- [Installation](#installation)
  - [Using pip](#using-pip)
  - [Manually](#manually)
- [Usage](#usage)
- [How It Works](#how-it-works)
- [Customizing scanning](#customizing-scanning)
- [Resources Created](#resources-created)
- [Required Permissions](#required-permissions)
- [Cleaning Up](#cleaning-up)
- [Troubleshooting](#troubleshooting)


## Prerequisites

Before using CloudShovel, ensure you have the following:

- Python 3.6 or higher
- Access to your own AWS account and IAM identity
- Python libraries (installed automatically when installing via pip)
  - Boto3 library installed (`pip install boto3`)
  - Colorama library installed (`pip install colorama`)

## Installation

### Using pip
From terminal:

   ```bash
   python3 -m pip install cloudshovel
   ```

### Manually

1. Clone the CloudShovel repository:
   ```
   git clone https://github.com/saw-your-packet/CloudShovel.git
   cd CloudShovel
   ```

2. Install the required Python libraries:
   ```
   pip install -r requirements.txt
   ```

## Usage

To use CloudShovel, run the `cloudshovel` command with the following syntax:

```
cloudshovel <ami_id> --bucket <s3_bucket_name> [--profile <aws_profile> | --access-key <access_key> --secret-key <secret_key> (--session-token <session_token>)] [--region <aws_region>]
```

Arguments:
- `ami_id`: The ID of the AMI you want to scan (required)
- `--bucket`: The name of the S3 bucket to store results (required)
- Authentication with a configured AWS CLI profile:
  - `--profile`: AWS CLI profile name (default is 'default')
- Authentication with access keys:
  - `--access-key`: AWS Access Key ID
  - `--secret-key`: AWS Secret Access Key
  - `--session-token`: AWS Session Token (optional)
- `--region`: AWS region (default is 'us-east-1')

If you don't specify any authentication argument, the tool automatically falls back to the `default` profile.

Example:
```
cloudshovel ami-1234567890abcdef --bucket my-cloudshovel-results --profile my-aws-profile --region us-west-2
```

## How It Works

CloudShovel operates through the following steps:

1. **Initialization**: 
   - Parses command-line arguments and creates an AWS session.
   - Validates the target AMI's existence.

2. **Setup**:
   - Creates or verifies the existence of the specified S3 bucket.
   - Creates an IAM role and instance profile for the "secret searcher" EC2 instance.
   - Uploads necessary scripts to the S3 bucket.

3. **Secret Searcher Instance**:
   - Launches an EC2 instance (the "secret searcher") based on the latest Amazon Linux 202* AMI.
   - Installs required tools on the secret searcher instance.

4. **Target AMI Processing**:
   - Launches an EC2 instance from the target AMI.
   - Stops the instance and detaches its volumes.
   - Attaches these volumes to the secret searcher instance (a rough CLI equivalent of this volume swap is sketched after this list).

5. **Scanning**:
   - Mounts the attached volumes on the secret searcher instance.
   - Executes the `mount_and_dig.sh` script to search for potential secrets.
   - The script looks for specific file names and patterns that might indicate sensitive information.

6. **Results**:
   - Uploads the scanning results to the specified S3 bucket.

7. **Cleanup**:
   - Detaches and deletes the volumes from the target AMI.
   - Terminates instances and removes created IAM resources.
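
To make the volume swap in step 4 concrete, the sketch below shows a rough AWS CLI equivalent. CloudShovel performs these calls through boto3; the AMI and instance IDs, the instance type, and the device naming are illustrative assumptions, not values the tool hard-codes.

```bash
# Illustrative AWS CLI equivalent of step 4 (CloudShovel does this via boto3).
TARGET_AMI="ami-1234567890abcdef"          # the AMI being scanned (example ID)
SEARCHER_INSTANCE="i-0123456789abcdef0"    # the already-running secret searcher (example ID)

# Launch a temporary instance from the target AMI and wait for it to come up
TARGET_INSTANCE=$(aws ec2 run-instances --image-id "$TARGET_AMI" --instance-type t3.micro \
    --query 'Instances[0].InstanceId' --output text)
aws ec2 wait instance-running --instance-ids "$TARGET_INSTANCE"

# Stop it so its volumes can be detached
aws ec2 stop-instances --instance-ids "$TARGET_INSTANCE"
aws ec2 wait instance-stopped --instance-ids "$TARGET_INSTANCE"

# Move every volume over to the secret searcher, one device letter at a time
DEV=f
for VOL in $(aws ec2 describe-volumes \
        --filters "Name=attachment.instance-id,Values=$TARGET_INSTANCE" \
        --query 'Volumes[].VolumeId' --output text); do
    aws ec2 detach-volume --volume-id "$VOL"
    aws ec2 wait volume-available --volume-ids "$VOL"
    aws ec2 attach-volume --volume-id "$VOL" --instance-id "$SEARCHER_INSTANCE" --device "/dev/sd${DEV}"
    DEV=$(echo "$DEV" | tr 'a-y' 'b-z')    # next device letter for the next volume
done
```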

## Customizing scanning

By default, the tool executes the following command to search for files and folders that might be of interest:

```bash
for item in $(find . \( ! -path "./Windows/*" -a ! -path "./Program Files/*" -a ! -path "./Program Files (x86)/*" \) -size -25M \
            \( -name ".aws" -o -name ".ssh" -o -name "credentials.xml" \
            -o -name "secrets.yml" -o -name "config.php" -o -name "_history" \
            -o -name "autologin.conf" -o -name "web.config" -o -name ".env" \
            -o -name ".git" \) -not -empty)
   do
      echo "[+] Found $item. Copying to output..."
      save_name_item=${item:1}                 # drop the leading "." from the find path
      save_name_item=${save_name_item////\\}   # turn remaining "/" into "\" so the copy gets a flat name
      cp -r $item /home/ec2-user/OUTPUT/$counter/${save_name_item}
   done
```

The full script is `src/cloudshovel/utils/bash_scripts/mount_and_dig.sh`. Feel free to modify the search: look for other files or folders, raise the file size limit, or exclude additional directories.

You can also replace `find` with a scanner such as `trufflehog` or `linpeas`. The trade-off is speed: the `find` pass takes about a minute, whereas a full content scan can take tens of minutes or hours depending on volume size.
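
For example, a content-scanning variant could hand the mounted volume to TruffleHog instead of copying files. This is only a sketch: it assumes `trufflehog` (v3) has been installed on the secret searcher instance, and `$counter` comes from the surrounding `mount_and_dig.sh` loop.

```bash
# Hypothetical replacement for the copy loop: scan the mounted volume in place.
# Requires trufflehog v3 on the secret searcher instance; expect this to take
# significantly longer than the default find-based search.
trufflehog filesystem . > /home/ec2-user/OUTPUT/$counter/trufflehog.txt
```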

## Resources Created

CloudShovel creates the following AWS resources during its operation:

1. **S3 Bucket**: Stores scanning scripts and results.
2. **IAM Role and Instance Profile**: Named "minimal-ssm", used by the secret searcher instance (sketched after this list).
3. **EC2 Instances**:
   - A "secret searcher" instance based on Amazon Linux 2023.
   - A temporary instance launched from the target AMI (terminated after volume detachment).
4. **EBS Volumes**: Temporary attachments to the secret searcher instance (deleted after scanning).
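
For reference, the "minimal-ssm" role and instance profile boil down to roughly the AWS CLI calls below. The attached `AmazonSSMManagedInstanceCore` policy is an assumption based on the role's purpose (the instance is driven via SSM); the tool may grant additional permissions, for example to upload results to S3.

```bash
# Rough equivalent of the IAM setup CloudShovel performs. Role/profile names come
# from this README; the attached managed policy is an assumption, not confirmed code.
cat > trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

aws iam create-role --role-name minimal-ssm --assume-role-policy-document file://trust-policy.json
aws iam attach-role-policy --role-name minimal-ssm \
    --policy-arn arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore
aws iam create-instance-profile --instance-profile-name minimal-ssm
aws iam add-role-to-instance-profile --instance-profile-name minimal-ssm --role-name minimal-ssm
```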

## Required Permissions

To run CloudShovel, your AWS account or IAM identity needs the following permissions:

- EC2:
  - Describe, run, stop, and terminate instances
  - Describe, create, attach, detach, and delete volumes
  - Describe and create tags
- IAM:
  - Create, delete, and manage roles and instance profiles
  - Attach and detach role policies
- S3:
  - Create buckets
  - Put, get, and delete objects
- SSM:
  - Send commands to EC2 instances
  - Get command invocation results

It's recommended to use the principle of least privilege and create a specific IAM user or role for running CloudShovel with only the necessary permissions.
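
As a starting point, a scoped policy for the identity running CloudShovel could look like the sketch below. The action list is derived from the bullets above, plus `iam:PassRole` (needed to launch an instance with an instance profile). The user name and the broad `Resource` are placeholders to tighten, not something the tool ships with.

```bash
# Hypothetical policy for the identity that runs CloudShovel. Actions mirror the
# permission groups listed above; narrow Resource where possible.
cat > cloudshovel-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:Describe*",
        "ec2:RunInstances",
        "ec2:StopInstances",
        "ec2:TerminateInstances",
        "ec2:CreateVolume",
        "ec2:AttachVolume",
        "ec2:DetachVolume",
        "ec2:DeleteVolume",
        "ec2:CreateTags",
        "iam:CreateRole",
        "iam:DeleteRole",
        "iam:CreateInstanceProfile",
        "iam:DeleteInstanceProfile",
        "iam:AddRoleToInstanceProfile",
        "iam:RemoveRoleFromInstanceProfile",
        "iam:AttachRolePolicy",
        "iam:DetachRolePolicy",
        "iam:PassRole",
        "s3:CreateBucket",
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "ssm:SendCommand",
        "ssm:GetCommandInvocation"
      ],
      "Resource": "*"
    }
  ]
}
EOF

# "cloudshovel-runner" is an example user name
aws iam put-user-policy --user-name cloudshovel-runner \
    --policy-name cloudshovel --policy-document file://cloudshovel-policy.json
```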

## Cleaning Up

CloudShovel attempts to clean up all created resources after completion or in case of errors. However, it's good practice to verify that all resources have been properly removed, especially:

- Check the EC2 console for any running instances tagged with "usage: CloudQuarry" or "usage: SecretSearcher".
- Verify that the IAM role and instance profile "minimal-ssm" have been deleted.
- The S3 bucket **is not** automatically deleted to preserve results. Delete it manually if no longer needed.
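
A few AWS CLI commands can double-check this. The tag filter assumes the tags are literally keyed `usage`, and the bucket name is the one from the earlier usage example.

```bash
# Look for leftover instances tagged by the tool (tag key/values taken from this README)
aws ec2 describe-instances \
    --filters "Name=tag:usage,Values=CloudQuarry,SecretSearcher" \
              "Name=instance-state-name,Values=running,stopped" \
    --query 'Reservations[].Instances[].InstanceId'

# Confirm the IAM role and instance profile are gone (both calls should return NoSuchEntity)
aws iam get-role --role-name minimal-ssm
aws iam get-instance-profile --instance-profile-name minimal-ssm

# Review the results bucket before deciding whether to delete it
aws s3 ls s3://my-cloudshovel-results --recursive
```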

## Troubleshooting

- If the script fails to mount volumes, ensure that the necessary filesystem tools (e.g., ntfs-3g for NTFS volumes) are installed on the secret searcher instance; a manual mount check is sketched after this list.
- For permission-related errors, verify that your AWS credentials have all the required permissions listed above.
- If experiencing issues with specific AMIs, check their requirements (e.g., virtualization type, ENA support) and adjust the instance types in the script accordingly.
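
When a mount fails, inspecting and mounting the attached volume by hand on the secret searcher instance usually reveals the cause. The device and mount-point names below are illustrative; use whatever `lsblk` reports.

```bash
# On the secret searcher instance: show attached block devices and detected filesystems
lsblk -f

# Try mounting an NTFS partition manually (needs the ntfs-3g package installed);
# replace /dev/xvdf1 with the device lsblk shows for the attached target volume
sudo mkdir -p /mnt/target
sudo mount -t ntfs-3g /dev/xvdf1 /mnt/target
```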

Use this tool responsibly and ensure you have the right to scan the AMIs you're targeting.

            
