
Hotfix/remove references to kda #18

Merged · 5 commits · Aug 24, 2023
16 changes: 8 additions & 8 deletions apps/java-datastream/kds-to-s3-datastream-java/README.md
@@ -1,6 +1,6 @@
# KDS to S3 (Java Datastream API)

This blueprint deploys a KDA app that reads from Kinesis Data Streams (KDS) using IAM auth and writes to S3 using the Java DataStream API:
This blueprint deploys an MSF app that reads from Kinesis Data Streams (KDS) using IAM auth and writes to S3 using the Java DataStream API:

![Arch diagram](img/kds-kda-s3.png)

@@ -17,15 +17,15 @@ This blueprint deploys a KDA app that reads from Kinesis Data Streams (KDS) usin
## High-level deployment steps

1. Build app and copy resulting JAR to S3 location
2. Deploy associated infra (KDS and KDA) using CDK script
- If using existing resources, you can simply update app properties in KDA.
2. Deploy associated infra (KDS and MSF) using CDK script
- If using existing resources, you can simply update app properties in MSF.
3. Perform data generation

## Prerequisites

1. Maven
2. AWS SDK v2
2. AWS CDK v2 - for deploying associated infra (KDS Stream and KDA app)
2. AWS CDK v2 - for deploying associated infra (KDS Stream and MSF app)

## Step-by-step deployment walkthrough

@@ -52,11 +52,11 @@ mvn clean package
aws s3 cp target/<<your generated jar>> ${S3_BUCKET}/${S3_FILE_KEY}
```

4. Follow instructions in the [`cdk-infra`](cdk-infra/README.md) folder to deploy the infrastructure associated with this app - such as the source KDS stream and the Kinesis Data Analytics application.
4. Follow instructions in the [`cdk-infra`](cdk-infra/README.md) folder to deploy the infrastructure associated with this app - such as the source KDS stream and the Managed Service for Apache Flink application.

5. Follow instructions in [orders-datagen](../../../datagen/orders-datagen/README.md) to create topic and generate data into the source KDS stream.

6. Start your Kinesis Data Analytics application from the AWS console.
6. Start your Managed Service for Apache Flink application from the AWS console.

7. Do a Flink query or S3 Select Query against S3 to view data written to S3.
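
One lightweight way to do the S3 check in step 7 is S3 Select from the CLI. This is only a sketch, not part of the blueprint: the bucket name and object key below are placeholders, and the query assumes the sink files are Parquet as produced by this app.

```bash
# Sketch: preview rows from one Parquet part file the app has written.
# Replace <sink-bucket> and <part-file-key> with values from your deployment.
aws s3api select-object-content \
  --bucket <sink-bucket> \
  --key <part-file-key> \
  --expression "SELECT * FROM S3Object s LIMIT 10" \
  --expression-type SQL \
  --input-serialization '{"Parquet": {}}' \
  --output-serialization '{"JSON": {}}' \
  /dev/stdout
```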

@@ -79,8 +79,8 @@ export BootstrapStackName=bootstrap-my-account-${timestampToLetters}-stack
export BlueprintStackName=kds-to-s3-blueprint-${timestampToLetters}-stack
export AppName=kds-to-s3-demo-${timestampToLetters}-app
export StreamName=kds-to-s3-demo-${timestampToLetters}-stream
export CloudWatchLogGroupName=blueprints/kinesis-analytics/${AppName}
export CloudWatchLogStreamName=kinesis-analytics-log-stream
export CloudWatchLogGroupName=blueprints/managed-flink/${AppName}
export CloudWatchLogStreamName=managed-flink-log-stream
export RoleName=kds-to-s3-demo-${timestampToLetters}-role

aws cloudformation create-stack --template-body file://./bootstrap-cdk/cdk.out/BootstrapCdkStack.template.json --stack-name ${BootstrapStackName} --parameters ParameterKey=AssetBucket,ParameterValue=$BucketName ParameterKey=AssetList,ParameterValue="https://data-streaming-labs.s3.amazonaws.com/blueprint-test/kds-to-s3-datastream-java-1.0.1.jar\,https://data-streaming-labs.s3.amazonaws.com/blueprint-test/kds-to-s3-datastream-java.json" --capabilities CAPABILITY_IAM
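
After the `create-stack` call above, you can wait for CloudFormation to finish and confirm the result before moving on. A minimal sketch, assuming `${BootstrapStackName}` is still exported from the earlier step:

```bash
# Block until the bootstrap stack finishes creating, then print its status.
aws cloudformation wait stack-create-complete --stack-name ${BootstrapStackName}
aws cloudformation describe-stacks \
  --stack-name ${BootstrapStackName} \
  --query "Stacks[0].StackStatus" \
  --output text
```
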
20 changes: 10 additions & 10 deletions apps/java-datastream/kds-to-s3-datastream-java/cdk-infra/README.md
@@ -1,11 +1,11 @@
# CDK Infrastructure associated with MSK Serverless to S3 KDA blueprint (Java)
# CDK Infrastructure associated with MSK Serverless to S3 MSF blueprint (Java)

This CDK script deploys the following components:

1. VPC for MSK Serverless and Kinesis Data Analytics application.
1. VPC for MSK Serverless and Managed Service for Apache Flink application.
2. MSK Serverless.
3. Kinesis Data Analytics Java DataStream API application.
4. IAM permissions for the role associated with the Kinesis Data Analytics application.
3. Managed Service for Apache Flink Java DataStream API application.
4. IAM permissions for the role associated with the Managed Service for Apache Flink application.

This CDK script expects you to supply the following *existing* resources:

@@ -19,14 +19,14 @@ Open up `cdk.json` and fill in appropriate values for each of these CDK context

| Context value name | Purpose | Notes
| --- | --- | --- |
| `kdaAppName` | The name of the Kinesis Data Analytics application | KDA app *will be created* |
| `msfAppName` | The name of the Managed Service for Apache Flink application | MSF app *will be created* |
| `appBucket` | The S3 bucket where the application payload will be stored | *Must be pre-existing* |
| `appSinkBucket` | The bucket to which the MSK to S3 Flink app will write output files (in Parquet) | *Must be pre-existing* |
| `runtimeEnvironment` | The Kinesis Data Analytics runtime environment | For instance, `FLINK-1_15` |
| `deployDataGen` | `true` if you want Zeppelin-based interactive KDA for data generation to be deployed; `false` otherwise | N/A |
| `glueDatabaseName` | The AWS Glue database that will be used by KDA Studio datagen app | *Must be pre-existing* |
| `kdaLogGroup` | The name for the CloudWatch Log Group that will be linked to the KDA Flink app | Log group *will be created* |
| `kdaLogStream` | The name for the CloudWatch Log Stream that will be linked to the KDA Flink app | Log stream *will be created* |
| `runtimeEnvironment` | The Managed Service for Apache Flink runtime environment | For instance, `FLINK-1_15` |
| `deployDataGen` | `true` if you want Zeppelin-based interactive MSF for data generation to be deployed; `false` otherwise | N/A |
| `glueDatabaseName` | The AWS Glue database that will be used by MSF Studio datagen app | *Must be pre-existing* |
| `msfLogGroup` | The name for the CloudWatch Log Group that will be linked to the MSF Flink app | Log group *will be created* |
| `msfLogStream` | The name for the CloudWatch Log Stream that will be linked to the MSF Flink app | Log stream *will be created* |
| `sourceMskClusterName` | The name for the source MSK Serverless cluster | MSK Serverless cluster *will be created* |

For more information on CDK Runtime Context, please see [Runtime Context](https://docs.aws.amazon.com/cdk/v2/guide/context.html).
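
Context values like these can also be passed on the command line instead of editing `cdk.json`. A sketch using keys from the table above (the values shown are placeholders, not ones the blueprint prescribes):

```bash
# Override selected CDK context values at deploy time.
cdk deploy \
  -c msfAppName=my-kds-to-s3-app \
  -c msfLogGroup=blueprints/managed-flink/my-kds-to-s3-app \
  -c msfLogStream=managed-flink-log-stream
```
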
@@ -28,7 +28,7 @@ const app = new cdk.App();
// expect there to be a pre-existing bucket. You can modify this stack
// to also create a bucket instead.
// Same goes for the bucket that this app will be writing to.
new CdkInfraKdsToS3Stack(app, 'CdkInfraKdaKdsToS3Stack', {
new CdkInfraKdsToS3Stack(app, 'CdkInfraMSFKdsToS3Stack', {
synthesizer: new BootstraplessStackSynthesizer({
templateBucketName: 'cfn-template-bucket',

@@ -38,17 +38,17 @@
"aws",
"aws-cn"
],
"kdaAppName": "",
"msfAppName": "",
"appBucket": "",
"appFileKeyOnS3": "",
"appSinkBucket": "",
"runtimeEnvironment": "",
"glueDatabaseName": "",
"flinkVersion": "1.15.2",
"zepFlinkVersion": "1.13.2",
"RuntimeEnvironment": "1.13.2",
"deployDataGen": "false",
"kdaLogGroup": "",
"kdaLogStream": "",
"msfLogGroup": "",
"msfLogStream": "",
"sourceKinesisStreamName": ""
}
}
@@ -1,8 +1,8 @@
{
"name": "cdk-infra-kda-kafka-to-s3",
"name": "cdk-infra-msf-kafka-to-s3",
"version": "0.1.0",
"bin": {
"cdk-infra-kda-kafka-to-s3": "bin/cdk-infra-kda-kafka-to-s3.js"
"cdk-infra-msf-kafka-to-s3": "bin/cdk-infra-msf-kafka-to-s3.js"
},
"scripts": {
"build": "tsc",
@@ -1,13 +1,11 @@
// import * as cdk from 'aws-cdk-lib';
// import { Template } from 'aws-cdk-lib/assertions';
// import * as CdkInfraKdaKafkaToS3 from '../lib/cdk-infra-kda-kafka-to-s3-stack';

// example test. To run these tests, uncomment this file along with the
// example resource in lib/cdk-infra-kda-kafka-to-s3-stack.ts
// example resource in lib/cdk-infra-msf-kafka-to-s3-stack.ts
test('SQS Queue Created', () => {
// const app = new cdk.App();
// // WHEN
// const stack = new CdkInfraKdaKafkaToS3.CdkInfraKdaKafkaToS3Stack(app, 'MyTestStack');
// // THEN
// const template = Template.fromStack(stack);

@@ -10,7 +10,7 @@
<mxCell id="5bTRFqQEF1BuItB7qnC4-6" style="edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;" parent="1" source="5bTRFqQEF1BuItB7qnC4-2" target="5bTRFqQEF1BuItB7qnC4-4" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="5bTRFqQEF1BuItB7qnC4-2" value="Kinesis Data Analytics&lt;br&gt;Java / DataStream API" style="sketch=0;points=[[0,0,0],[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0,0],[0,1,0],[0.25,1,0],[0.5,1,0],[0.75,1,0],[1,1,0],[0,0.25,0],[0,0.5,0],[0,0.75,0],[1,0.25,0],[1,0.5,0],[1,0.75,0]];outlineConnect=0;fontColor=#232F3E;gradientColor=#945DF2;gradientDirection=north;fillColor=#5A30B5;strokeColor=#ffffff;dashed=0;verticalLabelPosition=bottom;verticalAlign=top;align=center;html=1;fontSize=12;fontStyle=0;aspect=fixed;shape=mxgraph.aws4.resourceIcon;resIcon=mxgraph.aws4.kinesis_data_analytics;" parent="1" vertex="1">
<mxCell id="5bTRFqQEF1BuItB7qnC4-2" value="Managed Service for Apache Flink&lt;br&gt;Java / DataStream API" style="sketch=0;points=[[0,0,0],[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0,0],[0,1,0],[0.25,1,0],[0.5,1,0],[0.75,1,0],[1,1,0],[0,0.25,0],[0,0.5,0],[0,0.75,0],[1,0.25,0],[1,0.5,0],[1,0.75,0]];outlineConnect=0;fontColor=#232F3E;gradientColor=#945DF2;gradientDirection=north;fillColor=#5A30B5;strokeColor=#ffffff;dashed=0;verticalLabelPosition=bottom;verticalAlign=top;align=center;html=1;fontSize=12;fontStyle=0;aspect=fixed;shape=mxgraph.aws4.resourceIcon;resIcon=mxgraph.aws4.kinesis_data_analytics;" parent="1" vertex="1">
<mxGeometry x="386" y="160" width="78" height="78" as="geometry" />
</mxCell>
<mxCell id="5bTRFqQEF1BuItB7qnC4-5" style="edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;" parent="1" target="5bTRFqQEF1BuItB7qnC4-2" edge="1">
@@ -7,7 +7,7 @@
<mxCell id="5bTRFqQEF1BuItB7qnC4-6" style="edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;" parent="1" source="5bTRFqQEF1BuItB7qnC4-2" target="5bTRFqQEF1BuItB7qnC4-4" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="5bTRFqQEF1BuItB7qnC4-2" value="Kinesis Data Analytics&lt;br&gt;Java / DataStream API" style="sketch=0;points=[[0,0,0],[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0,0],[0,1,0],[0.25,1,0],[0.5,1,0],[0.75,1,0],[1,1,0],[0,0.25,0],[0,0.5,0],[0,0.75,0],[1,0.25,0],[1,0.5,0],[1,0.75,0]];outlineConnect=0;fontColor=#232F3E;gradientColor=#945DF2;gradientDirection=north;fillColor=#5A30B5;strokeColor=#ffffff;dashed=0;verticalLabelPosition=bottom;verticalAlign=top;align=center;html=1;fontSize=12;fontStyle=0;aspect=fixed;shape=mxgraph.aws4.resourceIcon;resIcon=mxgraph.aws4.kinesis_data_analytics;" parent="1" vertex="1">
<mxCell id="5bTRFqQEF1BuItB7qnC4-2" value="Managed Service for Apache Flink&lt;br&gt;Java / DataStream API" style="sketch=0;points=[[0,0,0],[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0,0],[0,1,0],[0.25,1,0],[0.5,1,0],[0.75,1,0],[1,1,0],[0,0.25,0],[0,0.5,0],[0,0.75,0],[1,0.25,0],[1,0.5,0],[1,0.75,0]];outlineConnect=0;fontColor=#232F3E;gradientColor=#945DF2;gradientDirection=north;fillColor=#5A30B5;strokeColor=#ffffff;dashed=0;verticalLabelPosition=bottom;verticalAlign=top;align=center;html=1;fontSize=12;fontStyle=0;aspect=fixed;shape=mxgraph.aws4.resourceIcon;resIcon=mxgraph.aws4.kinesis_data_analytics;" parent="1" vertex="1">
<mxGeometry x="386" y="160" width="78" height="78" as="geometry" />
</mxCell>
<mxCell id="5bTRFqQEF1BuItB7qnC4-5" style="edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;" parent="1" target="5bTRFqQEF1BuItB7qnC4-2" edge="1">
@@ -7,7 +7,7 @@
<mxCell id="5bTRFqQEF1BuItB7qnC4-6" style="edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;" parent="1" source="5bTRFqQEF1BuItB7qnC4-2" target="5bTRFqQEF1BuItB7qnC4-4" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="5bTRFqQEF1BuItB7qnC4-2" value="Kinesis Data Analytics&lt;br&gt;Java / DataStream API" style="sketch=0;points=[[0,0,0],[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0,0],[0,1,0],[0.25,1,0],[0.5,1,0],[0.75,1,0],[1,1,0],[0,0.25,0],[0,0.5,0],[0,0.75,0],[1,0.25,0],[1,0.5,0],[1,0.75,0]];outlineConnect=0;fontColor=#232F3E;gradientColor=#945DF2;gradientDirection=north;fillColor=#5A30B5;strokeColor=#ffffff;dashed=0;verticalLabelPosition=bottom;verticalAlign=top;align=center;html=1;fontSize=12;fontStyle=0;aspect=fixed;shape=mxgraph.aws4.resourceIcon;resIcon=mxgraph.aws4.kinesis_data_analytics;" parent="1" vertex="1">
<mxCell id="5bTRFqQEF1BuItB7qnC4-2" value="Managed Service for Apache Flink&lt;br&gt;Java / DataStream API" style="sketch=0;points=[[0,0,0],[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0,0],[0,1,0],[0.25,1,0],[0.5,1,0],[0.75,1,0],[1,1,0],[0,0.25,0],[0,0.5,0],[0,0.75,0],[1,0.25,0],[1,0.5,0],[1,0.75,0]];outlineConnect=0;fontColor=#232F3E;gradientColor=#945DF2;gradientDirection=north;fillColor=#5A30B5;strokeColor=#ffffff;dashed=0;verticalLabelPosition=bottom;verticalAlign=top;align=center;html=1;fontSize=12;fontStyle=0;aspect=fixed;shape=mxgraph.aws4.resourceIcon;resIcon=mxgraph.aws4.kinesis_data_analytics;" parent="1" vertex="1">
<mxGeometry x="386" y="160" width="78" height="78" as="geometry" />
</mxCell>
<mxCell id="5bTRFqQEF1BuItB7qnC4-5" style="edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;" parent="1" target="5bTRFqQEF1BuItB7qnC4-2" edge="1">
@@ -140,7 +140,7 @@ private static FileSink<Stock> getFileSink(StreamExecutionEnvironment env,

Path path = new Path(outputPath);

String prefix = String.format("%sjob_start=%s/", "app-kda-kafka-to-s3", System.currentTimeMillis());
String prefix = String.format("%sjob_start=%s/", "app-msf-kafka-to-s3", System.currentTimeMillis());

final FileSink<Stock> sink = FileSink
.forBulkFormat(path, ParquetAvroWriters.forReflectRecord(Stock.class))
@@ -186,7 +186,7 @@ public static void main(String[] args) throws Exception {
env.setRuntimeMode(RuntimeExecutionMode.STREAMING);

// Only for local
// Configure via KDA when running in cloud
// Configure via MSF when running in cloud
if(isLocal(env)) {
env.enableCheckpointing(2000);

@@ -4,7 +4,7 @@ set -x

AWS_ACCOUNT_ID=$(aws sts get-caller-identity | jq -r ".Account")
AWS_REGION=$(aws configure get region)
BUCKET_NAME="kda-blueprints-kds-to-s3-${AWS_ACCOUNT_ID}-${AWS_REGION}"
BUCKET_NAME="msf-blueprints-kds-to-s3-${AWS_ACCOUNT_ID}-${AWS_REGION}"
APP_NAME=kds-to-s3-datastream-java
JAR_FILE=$APP_NAME-1.0.1.jar

12 changes: 6 additions & 6 deletions apps/studio/msk-to-studio/README.md
@@ -15,14 +15,14 @@ This blueprint deploys a Studio app that reads from MSK Serverless using IAM aut

## High-level deployment steps

1. Deploy associated infra (MSK and KDA Studio) using CDK script
1. Deploy associated infra (MSK and MSF Studio) using CDK script
2. Run Studio query to read from MSK topic

## Prerequisites

1. Maven
2. AWS SDK v2
2. AWS CDK v2 - for deploying associated infra (MSK and KDA app)
2. AWS CDK v2 - for deploying associated infra (MSK and MSF app)

## Step-by-step deployment walkthrough

@@ -33,9 +33,9 @@
export APP_NAME=<<name-of-your-app>>
```

2. Follow instructions in the [`cdk-infra`](cdk-infra/README.md) folder to *deploy* the infrastructure associated with this app - such as MSK Serverless and the Kinesis Data Analytics Studio application.
2. Follow instructions in the [`cdk-infra`](cdk-infra/README.md) folder to *deploy* the infrastructure associated with this app - such as MSK Serverless and the Managed Service for Apache Flink Studio application.

3. Start your Kinesis Data Analytics Studio application from the AWS console.
3. Start your Managed Service for Apache Flink Studio application from the AWS console.

4. Run Flink SQL query in Studio notebook to read from MSK topic.
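
Step 3 starts the Studio notebook from the console; the same can usually be done from the CLI via the `kinesisanalyticsv2` API that backs Managed Service for Apache Flink. A sketch, assuming `$AppName` matches the application created by the CDK/CloudFormation stack:

```bash
# Start the Studio application and check that it reaches RUNNING.
# Depending on CLI version, start-application may also need --run-configuration '{}'.
aws kinesisanalyticsv2 start-application --application-name ${AppName}
aws kinesisanalyticsv2 describe-application \
  --application-name ${AppName} \
  --query "ApplicationDetail.ApplicationStatus" \
  --output text
```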

@@ -59,8 +59,8 @@ export BlueprintStackName=studio-demo-msk-studio-blueprint-${timestampToLetters}
export AppName=studio-demo-${timestampToLetters}-app
export ClusterName=studio-demo-${timestampToLetters}-cluster
export GlueDatabaseName=studio_demo_${timestampToLetters}_db
export CloudWatchLogGroupName=blueprints/kinesis-analytics/${AppName}
export CloudWatchLogStreamName=kinesis-analytics-log-stream
export CloudWatchLogGroupName=blueprints/managed-flink/${AppName}
export CloudWatchLogStreamName=managed-flink-log-stream
export RoleName=studio-demo-${timestampToLetters}-role
export RuntimeEnvironment=ZEPPELIN-FLINK-3_0

20 changes: 10 additions & 10 deletions apps/studio/msk-to-studio/cdk-infra/README.md
@@ -1,11 +1,11 @@
# CDK Infrastructure associated with MSK Serverless to S3 KDA blueprint (Java)
# CDK Infrastructure associated with MSK Serverless to S3 MSF blueprint (Java)

This CDK script deploys the following components:

1. VPC for MSK Serverless and Kinesis Data Analytics application.
1. VPC for MSK Serverless and Managed Service for Apache Flink application.
2. MSK Serverless.
3. Kinesis Data Analytics Java DataStream API application.
4. IAM permissions for the role associated with the Kinesis Data Analytics application.
3. Managed Service for Apache Flink Java DataStream API application.
4. IAM permissions for the role associated with the Managed Service for Apache Flink application.

This CDK script expects you to supply the following *existing* resources:

@@ -19,14 +19,14 @@ Open up `cdk.json` and fill in appropriate values for each of these CDK context

| Context value name | Purpose | Notes
| --- | --- | --- |
| `kdaAppName` | The name of the Kinesis Data Analytics application | KDA app *will be created* |
| `msfAppName` | The name of the Managed Service for Apache Flink application | MSF app *will be created* |
| `appBucket` | The S3 bucket where the application payload will be stored | *Must be pre-existing* |
| `appSinkBucket` | The bucket to which the MSK to S3 Flink app will write output files (in Parquet) | *Must be pre-existing* |
| `runtimeEnvironment` | The Kinesis Data Analytics runtime environment | For instance, `FLINK-1_15` |
| `deployDataGen` | `true` if you want Zeppelin-based interactive KDA for data generation to be deployed; `false` otherwise | N/A |
| `glueDatabaseName` | The AWS Glue database that will be used by KDA Studio datagen app | *Must be pre-existing* |
| `kdaLogGroup` | The name for the CloudWatch Log Group that will be linked to the KDA Flink app | Log group *will be created* |
| `kdaLogStream` | The name for the CloudWatch Log Stream that will be linked to the KDA Flink app | Log stream *will be created* |
| `runtimeEnvironment` | The Managed Service for Apache Flink runtime environment | For instance, `FLINK-1_15` |
| `deployDataGen` | `true` if you want Zeppelin-based interactive MSF for data generation to be deployed; `false` otherwise | N/A |
| `glueDatabaseName` | The AWS Glue database that will be used by MSF Studio datagen app | *Must be pre-existing* |
| `msfLogGroup` | The name for the CloudWatch Log Group that will be linked to the MSF Flink app | Log group *will be created* |
| `msfLogStream` | The name for the CloudWatch Log Stream that will be linked to the MSF Flink app | Log stream *will be created* |
| `sourceMskClusterName` | The name for the source MSK Serverless cluster | MSK Serverless cluster *will be created* |

For more information on CDK Runtime Context, please see [Runtime Context](https://docs.aws.amazon.com/cdk/v2/guide/context.html).