aws_crawler

Name: aws_crawler
Version: 1.2.4
Home page: https://gitlab.com/fer1035_python/modules/pypi-aws_crawler
Summary: Crawl through AWS accounts in an organization using master assumed role.
Upload time: 2024-02-10 17:05:21
Author: Ahmad Ferdaus Abd Razak
Requires Python: >=3.9,<4.0
License: GPL-2.0-only
Keywords: aws, crawler, accounts, organization
===============
**aws_crawler**
===============

Overview
--------

Crawl through the AWS accounts in an organization by assuming a cross-account role from the organization's management (master) account. You can specify a comma-separated string of account IDs to target specific accounts, an Organizational Unit ID to crawl every account in that OU, or a comma-separated string of account statuses to crawl all matching accounts in the organization.

Crawling precedence (each mode corresponds to an account-listing helper, sketched after this list):

1. Specific accounts
2. Organizational Unit
3. All matching accounts in the organization
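
A minimal sketch of the three selection modes, reusing the calls from the full example further down (the SSO settings, account IDs, OU ID, and statuses are placeholders):

.. code-block:: PYTHON

   import aws_crawler
   from aws_authenticator import AWSAuthenticator as awsauth

   # Authenticate once through SSO (placeholder values, as in the full example below).
   auth = awsauth(
      sso_url='https://myorg.awsapps.com/start/#',
      sso_role_name='AWSViewOnlyAccess',
      sso_account_id='123456789012'
   )
   session = auth.sso()

   # 1. Specific accounts: comma-separated string of account IDs.
   account_ids = aws_crawler.create_account_list(
      session,
      '123456789012, 234567890123, 345678901234'
   )

   # 2. Organizational Unit: every account under the given OU ID.
   account_ids = aws_crawler.list_ou_accounts(
      session,
      'ou-abc123-asgh39'
   )

   # 3. Entire organization: every account matching the given statuses.
   account_ids = aws_crawler.list_accounts(
      session,
      'ACTIVE,SUSPENDED'
   )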

Usage
-----

Installation (either command works):

.. code-block:: BASH

    pip3 install aws_crawler
    python3 -m pip install aws_crawler

Example:

- Get STS caller identities
- Also featuring (installed with aws_crawler):

   - `Automated authentication <https://pypi.org/project/aws-authenticator/>`_
   - `Multithreading <https://pypi.org/project/multithreader/>`_

.. code-block:: PYTHON

   import aws_crawler
   import boto3
   from multithreader import threads
   from aws_authenticator import AWSAuthenticator as awsauth
   from pprint import pprint as pp


   def get_caller_identity(
      account_id: str,
      items: dict
   ) -> dict:
      """Get AWS STS caller identities from accounts."""
      print(f'Working on {account_id}...')

      try:
         # Get auth credential for each account.
         credentials = aws_crawler.get_credentials(
               items['session'],
               f'arn:aws:iam::{account_id}:role/{items["assumed_role_name"]}',
               items['external_id']
         )

         # Get STS caller identity.
         client = boto3.client(
               'sts',
               aws_access_key_id=credentials['aws_access_key_id'],
               aws_secret_access_key=credentials['aws_secret_access_key'],
               aws_session_token=credentials['aws_session_token'],
               region_name=items['region']
         )
         response = client.get_caller_identity()['UserId']

      except Exception as e:
         response = str(e)

      # Return result.
      return {
         'account_id': account_id,
         'details': response
      }


   if __name__ == '__main__':
      # Login to AWS through SSO.
      auth = awsauth(
         sso_url='https://myorg.awsapps.com/start/#',
         sso_role_name='AWSViewOnlyAccess',
         sso_account_id='123456789012'
      )
      session = auth.sso()

      # # Create account list from comma-separated string of IDs.
      # account_ids = aws_crawler.create_account_list(
      #    session,
      #    '123456789012, 234567890123, 345678901234'
      # )
      # Get account list for an Organizational Unit.
      account_ids = aws_crawler.list_ou_accounts(
         session,
         'ou-abc123-asgh39'
      )
      # # Get matching account list for the entire organization.
      # account_ids = aws_crawler.list_accounts(
      #    session,
      #    'ACTIVE,SUSPENDED'
      # )

      # Execute task with multithreading.
      items = {
         'session': session,
         'assumed_role_name': 'MyOrgCrossAccountAccess',
         'external_id': 'lkasf987923ljkf2;lkjf298fj2',
         'region': 'us-east-1'
      }
      results = threads(
         get_caller_identity,
         account_ids,
         items,
         thread_num=5
      )

      # Print results.
      pp(results)
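
Each worker returns a dictionary with the crawled ``account_id`` and a ``details`` field (the STS ``UserId`` on success, or the exception text on failure), so ``results`` collects one such dictionary per account. An illustrative shape, with placeholder values:

.. code-block:: PYTHON

   # Placeholder values for illustration only.
   [
      {'account_id': '111111111111', 'details': 'AROAEXAMPLEID:botocore-session-1700000000'},
      {'account_id': '222222222222', 'details': 'An error occurred (AccessDenied) when calling the AssumeRole operation: ...'}
   ]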

            
