# Kinesis Analytics Flink
<!--BEGIN STABILITY BANNER-->---
![cdk-constructs: Experimental](https://img.shields.io/badge/cdk--constructs-experimental-important.svg?style=for-the-badge)
> The APIs of higher level constructs in this module are experimental and under active development.
> They are subject to non-backward compatible changes or removal in any future version. These are
> not subject to the [Semantic Versioning](https://semver.org/) model and breaking changes will be
> announced in the release notes. This means that while you may use them, you may need to update
> your source code when upgrading to a newer version of this package.
---
<!--END STABILITY BANNER-->
This package provides constructs for creating Kinesis Analytics Flink
applications. To learn more about using managed Flink applications, see
the [AWS developer
guide](https://docs.aws.amazon.com/kinesisanalytics/latest/java/).
## Creating Flink Applications
To create a new Flink application, use the `Application` construct:
```python
import os.path
import aws_cdk.aws_cloudwatch as cloudwatch
import aws_cdk as core
import aws_cdk.integ_tests_alpha as integ
import aws_cdk.aws_kinesisanalytics_flink_alpha as flink

app = core.App()
stack = core.Stack(app, "FlinkAppTest")

flink_runtimes = [
    flink.Runtime.FLINK_1_6, flink.Runtime.FLINK_1_8, flink.Runtime.FLINK_1_11,
    flink.Runtime.FLINK_1_13, flink.Runtime.FLINK_1_15, flink.Runtime.FLINK_1_18,
    flink.Runtime.FLINK_1_19, flink.Runtime.FLINK_1_20,
]

# Create one application per supported runtime, each with an alarm on full restarts
for runtime in flink_runtimes:
    flink_app = flink.Application(stack, f"App-{runtime.value}",
        code=flink.ApplicationCode.from_asset(os.path.join(os.path.dirname(__file__), "code-asset")),
        runtime=runtime
    )

    cloudwatch.Alarm(stack, f"Alarm-{runtime.value}",
        metric=flink_app.metric_full_restarts(),
        evaluation_periods=1,
        threshold=3
    )

integ.IntegTest(app, "ApplicationTest",
    test_cases=[stack]
)
```
The `code` property can use `fromAsset` as shown above to reference a local jar
file, or `fromBucket` to reference a file in S3.
```python
import os.path
import aws_cdk.aws_s3_assets as assets
import aws_cdk as core
import aws_cdk.aws_kinesisanalytics_flink_alpha as flink
app = core.App()
stack = core.Stack(app, "FlinkAppCodeFromBucketTest")
asset = assets.Asset(stack, "CodeAsset",
    path=os.path.join(os.path.dirname(__file__), "code-asset")
)
bucket = asset.bucket
file_key = asset.s3_object_key

flink.Application(stack, "App",
    code=flink.ApplicationCode.from_bucket(bucket, file_key),
    runtime=flink.Runtime.FLINK_1_19
)
app.synth()
```
The `propertyGroups` property provides a way of passing arbitrary runtime
properties to your Flink application. You can use the
aws-kinesisanalytics-runtime library to [retrieve these
properties](https://docs.aws.amazon.com/kinesisanalytics/latest/java/how-properties.html#how-properties-access).
```python
# bucket: s3.Bucket
flink_app = flink.Application(self, "Application",
    property_groups={
        "FlinkApplicationProperties": {
            "input_stream_name": "my-input-kinesis-stream",
            "output_stream_name": "my-output-kinesis-stream"
        }
    },
    # ...
    runtime=flink.Runtime.FLINK_1_20,
    code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar")
)
```
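On the application side, the property group is delivered as runtime properties. A Java application would retrieve them with the aws-kinesisanalytics-runtime library linked above; for a Python (PyFlink) job, the sketch below follows the pattern used in AWS's published Python examples, which read the properties from a JSON file on the job's filesystem. The file path and JSON layout here are assumptions to verify against the developer guide, not part of this construct library.

```python
# Minimal sketch of reading the property group from inside a PyFlink job.
# The file path and JSON layout are assumptions based on AWS's Python examples.
import json

APPLICATION_PROPERTIES_PATH = "/etc/flink/application_properties.json"

def get_property_map(group_id):
    with open(APPLICATION_PROPERTIES_PATH) as f:
        groups = json.load(f)
    # Each entry has a PropertyGroupId and a PropertyMap of key/value pairs
    for group in groups:
        if group["PropertyGroupId"] == group_id:
            return group["PropertyMap"]
    return {}

props = get_property_map("FlinkApplicationProperties")
input_stream_name = props.get("input_stream_name")
output_stream_name = props.get("output_stream_name")
```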
Flink applications also have specific configuration for passing parameters
when the Flink job starts. These include parameters for checkpointing,
snapshotting, monitoring, and parallelism.
```python
# bucket: s3.Bucket
flink_app = flink.Application(self, "Application",
    code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar"),
    runtime=flink.Runtime.FLINK_1_20,
    checkpointing_enabled=True,  # default is true
    checkpoint_interval=Duration.seconds(30),  # default is 1 minute
    min_pause_between_checkpoints=Duration.seconds(10),  # default is 5 seconds
    log_level=flink.LogLevel.ERROR,  # default is INFO
    metrics_level=flink.MetricsLevel.PARALLELISM,  # default is APPLICATION
    auto_scaling_enabled=False,  # default is true
    parallelism=32,  # default is 1
    parallelism_per_kpu=2,  # default is 1
    snapshots_enabled=False,  # default is true
    log_group=logs.LogGroup(self, "LogGroup")
)
```
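You will usually want to watch the checkpointing behavior you configure. The construct exposes CloudWatch metric helpers such as `metricFullRestarts()` (used in the first example); as a small sketch, assuming the companion `metricNumberOfFailedCheckpoints()` helper is available in the same way, you could alarm on failed checkpoints:

```python
import aws_cdk.aws_cloudwatch as cloudwatch

# Alarm as soon as a checkpoint fails within an evaluation period.
# metric_number_of_failed_checkpoints() is assumed to exist alongside
# metric_full_restarts(); verify against the module's API reference.
cloudwatch.Alarm(self, "FailedCheckpointsAlarm",
    metric=flink_app.metric_number_of_failed_checkpoints(),
    evaluation_periods=1,
    threshold=1
)
```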
Flink applications can optionally be deployed in a VPC:
```python
# bucket: s3.Bucket
# vpc: ec2.Vpc
flink_app = flink.Application(self, "Application",
    code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar"),
    runtime=flink.Runtime.FLINK_1_20,
    vpc=vpc
)
</flink_app>
```
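The property-group example above names Kinesis input and output streams; the application's execution role also needs permission to use them. A minimal sketch of wiring that up, assuming the `Application` construct can be passed directly as an IAM grantee (it exposes the job's execution role as its grant principal); the stream names here are hypothetical and match the earlier example:

```python
import aws_cdk.aws_kinesis as kinesis

# Hypothetical streams matching the property group in the earlier example.
input_stream = kinesis.Stream(self, "InputStream", stream_name="my-input-kinesis-stream")
output_stream = kinesis.Stream(self, "OutputStream", stream_name="my-output-kinesis-stream")

# Grant the Flink application's execution role access to the streams.
input_stream.grant_read(flink_app)
output_stream.grant_write(flink_app)
```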