
CDK constructs for defining an interaction between an Amazon Kinesis Data Firehose delivery stream and (1) an Amazon S3 bucket, and (2) an Amazon Kinesis Data Analytics application.

Project description

aws-kinesisfirehose-s3-and-kinesisanalytics module

---

Stability: Experimental

All classes are under active development and subject to non-backward compatible changes or removal in any future version. These are not subject to the Semantic Versioning model. This means that while you may use them, you may need to update your source code when upgrading to a newer version of this package.


Reference Documentation: https://docs.aws.amazon.com/solutions/latest/constructs/
Language Package
Python aws_solutions_constructs.aws_kinesisfirehose_s3_and_kinesisanalytics
TypeScript @aws-solutions-constructs/aws-kinesisfirehose-s3-and-kinesisanalytics
Java software.amazon.awsconstructs.services.kinesisfirehoses3kinesisanalytics

This AWS Solutions Construct implements an Amazon Kinesis Data Firehose delivery stream connected to an Amazon S3 bucket and an Amazon Kinesis Data Analytics application.

Here is a minimal deployable pattern definition in TypeScript:

import { KinesisFirehoseToAnalyticsAndS3 } from '@aws-solutions-constructs/aws-kinesisfirehose-s3-and-kinesisanalytics';

new KinesisFirehoseToAnalyticsAndS3(this, 'FirehoseToS3AndAnalyticsPattern', {
    kinesisAnalyticsProps: {
        inputs: [{
            inputSchema: {
                recordColumns: [{
                    name: 'ticker_symbol',
                    sqlType: 'VARCHAR(4)',
                    mapping: '$.ticker_symbol'
                }, {
                    name: 'sector',
                    sqlType: 'VARCHAR(16)',
                    mapping: '$.sector'
                }, {
                    name: 'change',
                    sqlType: 'REAL',
                    mapping: '$.change'
                }, {
                    name: 'price',
                    sqlType: 'REAL',
                    mapping: '$.price'
                }],
                recordFormat: {
                    recordFormatType: 'JSON'
                },
                recordEncoding: 'UTF-8'
            },
            namePrefix: 'SOURCE_SQL_STREAM'
        }]
    }
});

Initializer

new KinesisFirehoseToAnalyticsAndS3(scope: Construct, id: string, props: KinesisFirehoseToAnalyticsAndS3Props);

Parameters

Pattern Construct Props

Name Type Description
kinesisFirehoseProps? kinesisFirehose.CfnDeliveryStreamProps Optional user-provided props to override the default props for the Kinesis Firehose delivery stream.
kinesisAnalyticsProps? kinesisAnalytics.CfnApplicationProps Optional user-provided props to override the default props for the Kinesis Analytics application.
existingBucketObj? s3.IBucket Optional existing instance of an S3 Bucket object; if this is set, bucketProps is ignored.
bucketProps? s3.BucketProps Optional user-provided props to override the default props for the S3 Bucket.
logGroupProps? logs.LogGroupProps Optional user-provided props to override the default props for the CloudWatch Logs LogGroup.
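
As a sketch of how these props compose (the override values below are illustrative assumptions, not defaults shipped by this package), optional overrides are passed alongside the required analytics props:

```typescript
// Props fragment (illustrative values): override the default S3 bucket
// and Firehose configuration while keeping the remaining defaults.
const props = {
    kinesisAnalyticsProps: { /* inputs as in the example above */ },
    bucketProps: {
        versioned: false,               // the construct defaults to a versioned bucket
    },
    kinesisFirehoseProps: {
        deliveryStreamType: 'DirectPut',
    },
};
```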

Pattern Properties

Name Type Description
kinesisAnalytics kinesisAnalytics.CfnApplication Returns an instance of the Kinesis Analytics application created by the pattern.
kinesisFirehose kinesisFirehose.CfnDeliveryStream Returns an instance of the Kinesis Firehose delivery stream created by the pattern.
kinesisFirehoseRole iam.Role Returns an instance of the iam.Role created by the construct for the Kinesis Data Firehose delivery stream.
kinesisFirehoseLogGroup logs.LogGroup Returns an instance of the logs.LogGroup created by the construct for the Kinesis Data Firehose delivery stream.
s3Bucket? s3.Bucket Returns an instance of the S3 bucket created by the pattern.
s3LoggingBucket? s3.Bucket Returns an instance of s3.Bucket created by the construct as the logging bucket for the primary bucket.
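
Because the pattern exposes the resources it creates, further configuration can be layered on after instantiation. A minimal sketch, assuming a surrounding CDK Stack and aws-cdk-lib (illustrative only; it will not run standalone):

```typescript
// Inside a Stack's constructor; `this` is the Stack scope.
const pattern = new KinesisFirehoseToAnalyticsAndS3(this, 'Pattern', {
    kinesisAnalyticsProps: { /* inputs as in the example above */ },
});

// e.g. export the delivery stream ARN for consumers outside the stack.
new cdk.CfnOutput(this, 'DeliveryStreamArn', {
    value: pattern.kinesisFirehose.attrArn,
});
```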

Default settings

An out-of-the-box implementation of the construct without any overrides will set the following defaults:

Amazon Kinesis Firehose

  • Enable CloudWatch logging for Kinesis Firehose
  • Configure a least-privilege IAM role for Amazon Kinesis Firehose

Amazon S3 Bucket

  • Configure access logging for the S3 Bucket
  • Enable server-side encryption for the S3 Bucket using an AWS managed KMS key
  • Enforce encryption of data in transit
  • Turn on versioning for the S3 Bucket
  • Don't allow public access for the S3 Bucket
  • Retain the S3 Bucket when deleting the CloudFormation stack
  • Apply a lifecycle rule to move noncurrent object versions to Glacier storage after 90 days

Amazon Kinesis Data Analytics

  • Configure a least-privilege IAM role for Amazon Kinesis Analytics

Architecture

Architecture Diagram


© Copyright 2021 Amazon.com, Inc. or its affiliates. All Rights Reserved.

Project details




File details

Details for the file aws-solutions-constructs.aws-kinesis-firehose-s3-kinesis-analytics-1.97.0.tar.gz.

File metadata

File hashes

Hashes for aws-solutions-constructs.aws-kinesis-firehose-s3-kinesis-analytics-1.97.0.tar.gz
Algorithm Hash digest
SHA256 5c2d8788e86844509e487dfe1025500829f7abc19e317e510b8f199fa9d2a138
MD5 e462d748cdc350ca24559ddde3838cf5
BLAKE2b-256 d7a3c2bcda755eaf7ea46fb45a900b1018d3ea8cf5cca9e747a58da30c169cb4


File details

Details for the file aws_solutions_constructs.aws_kinesis_firehose_s3_kinesis_analytics-1.97.0-py3-none-any.whl.

File metadata

File hashes

Hashes for aws_solutions_constructs.aws_kinesis_firehose_s3_kinesis_analytics-1.97.0-py3-none-any.whl
Algorithm Hash digest
SHA256 dc947be916c73549c5b2a945cc8dd28c26958164427edb5e15e4ae3e1f6aeb14
MD5 3bb5462bc9c8f5356b95efbeceeaf7e2
BLAKE2b-256 40716d67eacda87ec7ad0e846a97a5fed85a0ea55c6c8fb175538de294cfa85a

