# AWS Backup Construct Library
<!--BEGIN STABILITY BANNER-->---

![cfn-resources: Stable](https://img.shields.io/badge/cfn--resources-stable-success.svg?style=for-the-badge)

![cdk-constructs: Stable](https://img.shields.io/badge/cdk--constructs-stable-success.svg?style=for-the-badge)

---
<!--END STABILITY BANNER-->
AWS Backup is a fully managed backup service that makes it easy to centralize and automate the
backup of data across AWS services in the cloud and on premises. Using AWS Backup, you can
configure backup policies and monitor backup activity for your AWS resources in one place.
## Backup plan and selection
In AWS Backup, a *backup plan* is a policy expression that defines when and how you want to back up
your AWS resources, such as Amazon DynamoDB tables or Amazon Elastic File System (Amazon EFS) file
systems. You can assign resources to backup plans, and AWS Backup automatically backs up and retains
backups for those resources according to the backup plan. You can create multiple backup plans if you
have workloads with different backup requirements.
This module provides ready-made backup plans (similar to the console experience):
```python
# Daily, weekly and monthly with 5 year retention
plan = backup.BackupPlan.daily_weekly_monthly5_year_retention(self, "Plan")
```
Assigning resources to a plan can be done with `addSelection()`:
```python
# plan: backup.BackupPlan

my_table = dynamodb.Table.from_table_name(self, "Table", "myTableName")
my_cool_construct = Construct(self, "MyCoolConstruct")

plan.add_selection("Selection",
    resources=[
        backup.BackupResource.from_dynamo_db_table(my_table),  # A DynamoDB table
        backup.BackupResource.from_tag("stage", "prod"),  # All resources that are tagged stage=prod in the region/account
        backup.BackupResource.from_construct(my_cool_construct)
    ]
)
```
If a role is not specified, a new IAM role with a managed policy for backup is
created for the selection. The `BackupSelection` implements `IGrantable`, so permissions can be granted to it directly.
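A custom role can also be supplied for the selection instead of relying on the auto-created one. Below is a minimal sketch assuming the `role` option of `addSelection()`; the role and its construct ID are illustrative:

```python
# A minimal sketch, assuming the `role` option on `addSelection()`;
# the role created here is illustrative.
# plan: backup.BackupPlan

backup_role = iam.Role(self, "BackupRole",
    assumed_by=iam.ServicePrincipal("backup.amazonaws.com")
)

plan.add_selection("SelectionWithRole",
    resources=[backup.BackupResource.from_tag("stage", "prod")],
    role=backup_role  # Used instead of the auto-created role
)
```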
To add rules to a plan, use `addRule()`:
```python
# plan: backup.BackupPlan

plan.add_rule(backup.BackupPlanRule(
    completion_window=Duration.hours(2),
    start_window=Duration.hours(1),
    schedule_expression=events.Schedule.cron(  # Only cron expressions are supported
        day="15",
        hour="3",
        minute="30"),
    move_to_cold_storage_after=Duration.days(30)
))
```
Continuous backup and point-in-time restores (PITR) can be configured.
The `deleteAfter` property defines the retention period for the backup and is mandatory when PITR is enabled;
if no value is specified, the retention period is set to 35 days, which is the maximum retention period supported by PITR.
The `moveToColdStorageAfter` property must not be specified because PITR does not support this option.
This example defines an AWS Backup rule with PITR and a retention period of 14 days:
```python
# plan: backup.BackupPlan

plan.add_rule(backup.BackupPlanRule(
    enable_continuous_backup=True,
    delete_after=Duration.days(14)
))
```
Ready-made rules are also available:
```python
# plan: backup.BackupPlan
plan.add_rule(backup.BackupPlanRule.daily())
plan.add_rule(backup.BackupPlanRule.weekly())
```
By default, a new [vault](#backup-vault) is created when creating a plan.
It is also possible to specify a vault either at the plan level or at the
rule level.
```python
my_vault = backup.BackupVault.from_backup_vault_name(self, "Vault1", "myVault")
other_vault = backup.BackupVault.from_backup_vault_name(self, "Vault2", "otherVault")
plan = backup.BackupPlan.daily35_day_retention(self, "Plan", my_vault) # Use `myVault` for all plan rules
plan.add_rule(backup.BackupPlanRule.monthly1_year(other_vault))
```
You can [back up](https://docs.aws.amazon.com/aws-backup/latest/devguide/windows-backups.html)
VSS-enabled Windows applications running on Amazon EC2 instances by setting the `windowsVss`
parameter to `true`. If the application has a VSS writer registered with Windows VSS,
then AWS Backup creates a snapshot that is consistent for that application.
```python
plan = backup.BackupPlan(self, "Plan",
    windows_vss=True
)
```
## Backup vault
In AWS Backup, a *backup vault* is a container that you organize your backups in. You can use backup
vaults to set the AWS Key Management Service (AWS KMS) encryption key that is used to encrypt backups
in the backup vault and to control access to the backups in the backup vault. If you require different
encryption keys or access policies for different groups of backups, you can optionally create multiple
backup vaults.
```python
my_key = kms.Key.from_key_arn(self, "MyKey", "aaa")
my_topic = sns.Topic.from_topic_arn(self, "MyTopic", "bbb")
vault = backup.BackupVault(self, "Vault",
    encryption_key=my_key,  # Custom encryption key
    notification_topic=my_topic
)
```
A vault has a default `RemovalPolicy` set to `RETAIN`. Note that removing a vault
that contains recovery points will fail.
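If a non-production vault should be deleted together with its stack, the default policy can be overridden with the standard `applyRemovalPolicy()` method available on CDK resources; a minimal sketch (the vault here is illustrative, and it must be empty for deletion to succeed):

```python
# A minimal sketch: opt a development vault out of the default RETAIN policy.
# Deletion will still fail if the vault contains recovery points.
dev_vault = backup.BackupVault(self, "DevVault")
dev_vault.apply_removal_policy(RemovalPolicy.DESTROY)
```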
You can assign policies to backup vaults and the resources they contain. Assigning policies allows
you, for example, to grant users access to create backup plans and on-demand backups while limiting
their ability to delete recovery points after they're created.
Use the `accessPolicy` property to create a backup vault policy:
```python
vault = backup.BackupVault(self, "Vault",
    access_policy=iam.PolicyDocument(
        statements=[
            iam.PolicyStatement(
                effect=iam.Effect.DENY,
                principals=[iam.AnyPrincipal()],
                actions=["backup:DeleteRecoveryPoint"],
                resources=["*"],
                conditions={
                    "StringNotLike": {
                        "aws:userId": ["user1", "user2"]
                    }
                }
            )
        ]
    )
)
```
Alternatively, statements can be added to the vault policy using `addToAccessPolicy()`.
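For example, the deny statement from the previous snippet could instead be attached after the vault has been created; a minimal sketch:

```python
# A minimal sketch: attach a statement to an existing vault's access policy.
# vault: backup.BackupVault

vault.add_to_access_policy(iam.PolicyStatement(
    effect=iam.Effect.DENY,
    principals=[iam.AnyPrincipal()],
    actions=["backup:DeleteRecoveryPoint"],
    resources=["*"]
))
```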
Use the `blockRecoveryPointDeletion` property or the `blockRecoveryPointDeletion()` method to add
a statement to the vault access policy that prevents recovery point deletions in your vault:
```python
# backup_vault: backup.BackupVault

backup.BackupVault(self, "Vault",
    block_recovery_point_deletion=True
)
backup_vault.block_recovery_point_deletion()
```
By default, access to the vault is not restricted.
## Importing existing backup vault
To import an existing backup vault into your CDK application, use the `BackupVault.fromBackupVaultArn` or `BackupVault.fromBackupVaultName`
static method. Here is an example of giving an IAM Role permission to start a backup job:
```python
imported_vault = backup.BackupVault.from_backup_vault_name(self, "Vault", "myVaultName")

role = iam.Role(self, "Access Role", assumed_by=iam.ServicePrincipal("lambda.amazonaws.com"))

imported_vault.grant(role, "backup:StartBackupJob")
```