Create a Kinesis data stream with a Lambda producer that generates stock trades and a Lambda consumer that stores them in DynamoDB.
https://docs.aws.amazon.com/streams/latest/dev/tutorial-stock-data-kplkcl2.html
- ID: kinesis/getting-started
- Phase: create
- Complexity: intermediate
- Core actions: kinesis:CreateStream, kinesis:PutRecord, lambda:CreateEventSourceMapping
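The three core actions above map to boto3 calls roughly like this — a minimal sketch, not the script itself; the stream and function names are placeholders:

```python
def wire_up(stream_name="StockTradeStream", consumer_fn="stock-trade-consumer"):
    """Sketch of the three core API actions: create the stream, put one
    record, and connect the consumer Lambda via an event source mapping.
    Names are hypothetical; requires configured AWS credentials to run."""
    import boto3  # deferred import so the sketch can be inspected without AWS deps

    kinesis = boto3.client("kinesis")
    lam = boto3.client("lambda")

    # kinesis:CreateStream -- one shard is plenty for tutorial volume
    kinesis.create_stream(StreamName=stream_name, ShardCount=1)
    kinesis.get_waiter("stream_exists").wait(StreamName=stream_name)

    # kinesis:PutRecord -- records with the same partition key land on the same shard
    kinesis.put_record(
        StreamName=stream_name,
        Data=b'{"ticker": "AAPL", "price": 142.5}',
        PartitionKey="AAPL",
    )

    # lambda:CreateEventSourceMapping -- Lambda polls the stream and invokes
    # the consumer with batches of records
    stream_arn = kinesis.describe_stream(StreamName=stream_name)[
        "StreamDescription"
    ]["StreamARN"]
    lam.create_event_source_mapping(
        EventSourceArn=stream_arn,
        FunctionName=consumer_fn,
        StartingPosition="TRIM_HORIZON",
        BatchSize=10,
    )
```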
- Creates a Kinesis data stream (1 shard)
- Creates an IAM role with Kinesis, Lambda, and DynamoDB permissions
- Creates a Python producer Lambda that generates random stock trades
- Creates a Python consumer Lambda that writes trades to DynamoDB
- Creates a DynamoDB table (on-demand billing)
- Connects the stream to the consumer via event source mapping
- Produces 10 stock trades and verifies they land in DynamoDB
- Cleans up all resources
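The producer Lambda's trade generator can be sketched as follows. The field names, tickers, and stream name are assumptions for illustration, not necessarily what the script uses:

```python
import json
import random

TICKERS = ["AAPL", "AMZN", "MSFT", "INTC", "TBV"]  # assumed sample tickers

def make_trade():
    """Build one random stock trade as a JSON-serializable dict."""
    return {
        "ticker": random.choice(TICKERS),
        "tradeType": random.choice(["BUY", "SELL"]),
        "price": round(random.uniform(1, 100), 2),
        "quantity": random.randint(1, 100),
    }

def lambda_handler(event, context):
    """Producer entry point: push 10 trades onto the stream."""
    import boto3  # deferred import so make_trade() is testable without AWS

    kinesis = boto3.client("kinesis")
    for _ in range(10):
        trade = make_trade()
        kinesis.put_record(
            StreamName="StockTradeStream",  # assumed stream name
            Data=json.dumps(trade),
            # partitioning by ticker keeps each ticker's trades in shard order
            PartitionKey=trade["ticker"],
        )
    return {"trades_sent": 10}
```

Partitioning by ticker is the conventional choice here: Kinesis preserves ordering per shard, so all trades for one symbol arrive at the consumer in the order they were produced.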
To run:

```bash
bash kinesis-data-streams.sh
```

To auto-run with cleanup:

```bash
echo 'y' | bash kinesis-data-streams.sh
```

Resources created:

- Kinesis data stream (1 shard)
- IAM role (with Lambda, Kinesis, and DynamoDB policies)
- 2 Lambda functions (Python 3.12): producer and consumer
- DynamoDB table (on-demand)
- Event source mapping
- 2 CloudWatch log groups (created automatically by Lambda)
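On the consumer side, Lambda delivers Kinesis records base64-encoded inside the event payload. A sketch of the decode-and-store step — the table and attribute names are assumptions:

```python
import base64
import json
from decimal import Decimal

def parse_record(record):
    """Decode one Kinesis event record back into a trade dict."""
    payload = base64.b64decode(record["kinesis"]["data"])
    return json.loads(payload)

def lambda_handler(event, context):
    """Consumer entry point: write each trade in the batch to DynamoDB."""
    import boto3  # deferred import so parse_record() is testable without AWS

    table = boto3.resource("dynamodb").Table("StockTrades")  # assumed table name
    for record in event["Records"]:
        trade = parse_record(record)
        table.put_item(Item={
            "ticker": trade["ticker"],
            # the Kinesis sequence number is unique per record, so it makes
            # a convenient sort key
            "tradeId": record["kinesis"]["sequenceNumber"],
            "tradeType": trade["tradeType"],
            # DynamoDB rejects Python floats; store the price as a Decimal
            "price": Decimal(str(trade["price"])),
            "quantity": trade["quantity"],
        })
```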
- Run: ~2.5 minutes (stream creation takes ~30s, event source mapping activation ~60s)
- Cleanup: ~10 seconds
Kinesis: $0.015 per shard-hour. DynamoDB: on-demand pricing. Both are negligible for a single tutorial run, but clean up promptly to avoid ongoing Kinesis charges, which accrue per shard-hour whether or not data is flowing.