mwaa-dr

Name: mwaa-dr
Version: 2.0.1
Summary: DR Solution for Amazon Managed Workflows for Apache Airflow (MWAA)
Upload time: 2024-08-22 13:48:54
Requires Python: >=3.7
Keywords: MWAA, airflow, disaster, recovery, DR
License: Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
# MWAA Disaster Recovery

![MWAA](https://img.shields.io/badge/MWAA-2.8.1_|_2.7.2_|_2.6.3_|_2.5.1_|_2.4.3-blue)
![Python](https://img.shields.io/badge/Python-3.7+-blue)
[![Black](https://img.shields.io/badge/Code%20Style-Black-000000.svg)](https://github.com/psf/black)
[![CodeCoverage](https://raw.githubusercontent.com/aws-samples/mwaa-disaster-recovery/python-coverage-comment-action-data/badge.svg)](https://htmlpreview.github.io/?https://github.com/aws-samples/mwaa-disaster-recovery/blob/python-coverage-comment-action-data/htmlcov/index.html)
[![PyPI version](https://badge.fury.io/py/mwaa-dr.svg)](https://badge.fury.io/py/mwaa-dr)

The `mwaa-dr` PyPI package is part of the larger [MWAA DR solution](https://github.com/aws-samples/mwaa-disaster-recovery) and targets the use case where exporting and importing the metadata store needs to happen independently of the broader DR solution. As of this writing, the MWAA metadata store is accessible only through DAGs. This package simplifies the creation of backup and restore DAGs by providing a reusable Python library.

## Installation

You can install the `mwaa-dr` package by including the latest version in your MWAA `requirements.txt` file.
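For example, a single pinned line in `requirements.txt` is sufficient (the version shown is illustrative; use the latest release from PyPI):

```
mwaa-dr==2.0.1
```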

## Simple Use Case

Let's look at creating metadata backup and restore DAGs:

### Metadata Backup DAG

Let's assume your environment version is `2.8.1`. You can create a metadata backup DAG by adding a Python file to your MWAA `dags` folder as follows:

**backup_metadata.py**:
```python
# Importing DAG is necessary for DAG detection
from airflow import DAG
from mwaa_dr.v_2_8.dr_factory import DRFactory_2_8

factory = DRFactory_2_8(
    dag_id='backup',
    path_prefix='data',
    storage_type='S3'
)

# Assigning the returned dag to a global variable is necessary for DAG detection
dag: DAG = factory.create_backup_dag()
```

To run backup and restore on your Amazon MWAA environment on AWS, you need to do the following:
1. Ensure you have an S3 bucket created to store the backup.
2. Ensure that your MWAA execution role has read and write permissions on the bucket.
3. Create an Airflow variable with the key named `DR_BACKUP_BUCKET` and the value containing the **name** (not ARN) of the S3 bucket.
4. You are all set to manually trigger the backup and restore DAGs at any point. The metadata backup will be stored in `<backup S3 bucket>/<path_prefix>`.
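Step 2 can be satisfied with an IAM policy along these lines attached to the MWAA execution role (a minimal sketch; `my-mwaa-backup-bucket` is a placeholder name, and the exact set of actions your environment needs may differ):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::my-mwaa-backup-bucket",
        "arn:aws:s3:::my-mwaa-backup-bucket/*"
      ]
    }
  ]
}
```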

If you want to use the solution with [aws-mwaa-local-runner](https://github.com/aws/aws-mwaa-local-runner), change the `storage_type` argument from `S3` to `LOCAL_FS`. The backup will be located in the `dags/data` folder, or more generally the `dags/<path_prefix>` folder, of the local runner project.

Here is a sample run of the backup workflow:

![Backup Workflow](https://github.com/aws-samples/mwaa-disaster-recovery/blob/main/design/BackupRun.png?raw=true)


### Metadata Restore DAG

You can create a metadata restore DAG by adding a Python file to your MWAA `dags` folder as follows:

**restore_metadata.py**:
```python
from airflow import DAG
from mwaa_dr.v_2_8.dr_factory import DRFactory_2_8

factory = DRFactory_2_8(
    dag_id='restore',
    path_prefix='data',
    storage_type='S3'
)

dag: DAG = factory.create_restore_dag()
```

Here is a sample run of the restore workflow:

![Restore Workflow](https://github.com/aws-samples/mwaa-disaster-recovery/blob/main/design/RestoreRun.png?raw=true)

Please note that the `variable` and `connection` tables are handled specially during the restore process. You can specify a restore strategy for these two tables by setting the `DR_VARIABLE_RESTORE_STRATEGY` and `DR_CONNECTION_RESTORE_STRATEGY` Airflow variables. These variables can take one of the following values:

1. **DO_NOTHING**: As the name suggests, this strategy will not restore the variable and connection tables from the backup. This strategy is particularly useful if your MWAA environments have been configured to use [AWS Secrets Manager](https://docs.aws.amazon.com/mwaa/latest/userguide/connections-secrets-manager.html) for storing variables and connections.

2. **APPEND**: With this strategy, the restore workflow will not overwrite existing entries of the variable and connection tables and only add missing entries from the backup.

3. **REPLACE**: This strategy will overwrite existing variables and connections with entries from the backup.

Note that these two Airflow variables are themselves treated specially and are unaffected by the restore of the `variable` table. In the absence of these variables, the default value of `APPEND` is used for both the `variable` and `connection` restores.
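For example, to overwrite variables from the backup while leaving connections untouched, you could define the two Airflow variables (via the Airflow UI under Admin → Variables, or any other mechanism your environment supports) as:

```
DR_VARIABLE_RESTORE_STRATEGY   = REPLACE
DR_CONNECTION_RESTORE_STRATEGY = DO_NOTHING
```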

Note that you will need an empty database for the restore to work. To clean up the database before a restore, please use the `cleanup_metadata` DAG discussed next.

### Metadata Cleanup DAG

You can create a metadata cleanup DAG by adding a Python file to your MWAA `dags` folder as follows:

**cleanup_metadata.py**:
```python
from airflow import DAG
from mwaa_dr.v_2_8.dr_factory import DRFactory_2_8

factory = DRFactory_2_8(
    dag_id='cleanup',
    path_prefix='data',
    storage_type='S3'
)

dag: DAG = factory.create_cleanup_dag()
```


## Advanced Use Cases

You may have advanced use cases, such as a need to exclude tables from, or include additional tables in, your backup, or to update the SQL scripts for specific tables. Note that, by default, the solution backs up only the `variable`, `connection`, `slot_pool`, `log`, `job`, `dag_run`, `trigger`, `task_instance`, `task_fail`, and `xcom` tables. The majority of the other tables are auto-generated by the scheduler or the webserver and are therefore excluded from the backup.

To add or remove tables, or to customize any other aspect of the solution, derive from the appropriate factory class and override its methods. To make this concrete, let's assume you want to exclude the `variable` and `connection` tables from the backup and restore operations. The implementation guidelines follow.

First, find the appropriate factory class for your environment among the supported versions in the [mwaa_dr](https://github.com/aws-samples/mwaa-disaster-recovery/tree/main/assets/dags/mwaa_dr) package. Let's assume your environment version is `2.7.2`, so you will pick [DRFactory_2_7](https://github.com/aws-samples/mwaa-disaster-recovery/blob/main/assets/dags/mwaa_dr/v_2_7/dr_factory.py) as your base class and override its `setup_tables` method.

Note that the majority of the DR framework's functionality is implemented either in [BaseDRFactory](https://github.com/aws-samples/mwaa-disaster-recovery/blob/main/assets/dags/mwaa_dr/v_2_7/dr_factory.py) or in [DRFactory_2_5](https://github.com/aws-samples/mwaa-disaster-recovery/blob/main/assets/dags/mwaa_dr/v_2_5/dr_factory.py). The other factories form a chain of inheritance from these classes and override specific methods to implement version differences.

Here is a sample implementation of your derived class, which you will need to create in your `dags` folder; let's name the file `custom_dr_factory_2_7.py`:

**custom_dr_factory_2_7.py**:
```python
from mwaa_dr.framework.model.base_table import BaseTable
from mwaa_dr.framework.model.dependency_model import DependencyModel
from mwaa_dr.v_2_7.dr_factory import DRFactory_2_7

class CustomDRFactory_2_7(DRFactory_2_7):
    def setup_tables(self, model: DependencyModel[BaseTable]) -> list[BaseTable]:
        # Create needed tables, all extend from the BaseTable class
        active_dag = self.active_dag(model)

        # Comment out variable and connection from the previous implementation
        # variable = self.variable(model)
        # connection = self.connection(model)
        slot_pool = self.slot_pool(model)

        log = self.log(model)
        job = self.job(model)
        dag_run = self.dag_run(model)
        trigger = self.trigger(model)

        task_instance = self.task_instance(model)
        task_fail = self.task_fail(model)

        # Specify dependencies
        task_instance << [job, trigger, dag_run]
        task_fail << [task_instance, dag_run]
        active_dag << [
            # variable,
            # connection,
            slot_pool,
            log,
            job,
            dag_run,
            trigger,
            task_instance,
            task_fail,
        ]

        # Return the list of tables to be included in backup and restore
        return [
            # variable,
            # connection,
            slot_pool,
            log,
            job,
            dag_run,
            trigger,
            task_instance,
            task_fail,
            active_dag,
        ]
```

Here is your metadata backup DAG that uses your custom factory (also in the `dags` folder):

**backup.py**:
```python
from airflow import DAG
from custom_dr_factory_2_7 import CustomDRFactory_2_7

factory = CustomDRFactory_2_7(
    dag_id='backup',
    path_prefix='data',
    storage_type='S3'
)

dag: DAG = factory.create_backup_dag()
```

And finally, here is your metadata restore DAG (also in the `dags` folder):

**restore.py**:
```python
from airflow import DAG
from custom_dr_factory_2_7 import CustomDRFactory_2_7

factory = CustomDRFactory_2_7(
    dag_id='restore',
    path_prefix='data',
    storage_type='S3'
)

dag: DAG = factory.create_restore_dag()
```

For additional details, please visit the project [homepage](https://github.com/aws-samples/mwaa-disaster-recovery).
