# Process real-time data with Amazon Kinesis Data Streams

This tutorial shows you how to process real-time stock trade data using Amazon Kinesis Data Streams. You create a data stream, set up a Lambda producer to generate trades, connect a Lambda consumer to process them, and store results in DynamoDB.

## Prerequisites

- AWS CLI configured with credentials and a default region
- Permissions to create Kinesis streams, Lambda functions, IAM roles, and DynamoDB tables

## Step 1: Create a Kinesis data stream

```bash
aws kinesis create-stream --stream-name stock-stream --shard-count 1
aws kinesis wait stream-exists --stream-name stock-stream
```

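One shard is plenty here: each shard accepts up to 1 MB/s or 1,000 records/s of writes. As a rough sizing sketch (the workload numbers below are illustrative assumptions, not measurements):

```python
import math

# Per-shard write limits for Kinesis Data Streams.
MAX_RECORDS_PER_SEC = 1_000
MAX_BYTES_PER_SEC = 1_000_000  # ~1 MB

def shards_needed(records_per_sec: float, avg_record_bytes: float) -> int:
    """Return the minimum shard count for a target write throughput."""
    by_records = records_per_sec / MAX_RECORDS_PER_SEC
    by_bytes = (records_per_sec * avg_record_bytes) / MAX_BYTES_PER_SEC
    return max(1, math.ceil(max(by_records, by_bytes)))

# This tutorial's 10 small trades per invocation fit easily in one shard.
print(shards_needed(10, 150))       # prints 1
print(shards_needed(5_000, 2_000))  # a heavier hypothetical workload: prints 10
```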
## Step 2: Create an execution role

Create a single role that both functions will use. Attach the managed policies for basic Lambda logging and Kinesis reads:

```bash
aws iam create-role --role-name kinesis-tutorial-role \
  --assume-role-policy-document '{
    "Version":"2012-10-17",
    "Statement":[{"Effect":"Allow","Principal":{"Service":"lambda.amazonaws.com"},"Action":"sts:AssumeRole"}]
  }'

aws iam attach-role-policy --role-name kinesis-tutorial-role \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
aws iam attach-role-policy --role-name kinesis-tutorial-role \
  --policy-arn arn:aws:iam::aws:policy/AmazonKinesisReadOnlyAccess
```

Add an inline policy for Kinesis writes and DynamoDB access:

```bash
aws iam put-role-policy --role-name kinesis-tutorial-role --policy-name kinesis-dynamodb \
  --policy-document '{
    "Version":"2012-10-17",
    "Statement":[
      {"Effect":"Allow","Action":["kinesis:PutRecord","kinesis:PutRecords"],"Resource":"*"},
      {"Effect":"Allow","Action":["dynamodb:PutItem","dynamodb:CreateTable","dynamodb:DescribeTable"],"Resource":"*"}
    ]
  }'
```

## Step 3: Create the producer function

The producer generates random stock trades and writes them to the Kinesis stream, using the ticker symbol as the partition key.

```python
# producer.py
import boto3, json, random, time, os

def lambda_handler(event, context):
    kinesis = boto3.client('kinesis')
    stream = os.environ['STREAM_NAME']
    tickers = ['AAPL', 'AMZN', 'MSFT', 'GOOGL', 'TSLA', 'NFLX', 'NVDA', 'META']
    for _ in range(10):
        ticker = random.choice(tickers)
        trade = {'ticker': ticker, 'type': random.choice(['BUY', 'SELL']),
                 'price': round(random.uniform(50, 500), 2),
                 'quantity': random.randint(1, 100),
                 'timestamp': int(time.time() * 1000)}
        # Records that share a partition key always land on the same shard.
        kinesis.put_record(StreamName=stream, Data=json.dumps(trade), PartitionKey=ticker)
    return {'statusCode': 200, 'body': '10 trades sent'}
```

Deploy the function, substituting the ARN of the role from Step 2:

```bash
zip producer.zip producer.py
aws lambda create-function --function-name stock-producer \
  --zip-file fileb://producer.zip --handler producer.lambda_handler \
  --runtime python3.12 --role <role-arn> \
  --environment Variables={STREAM_NAME=stock-stream}
```
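
The choice of `PartitionKey` matters: Kinesis takes the MD5 of the key as a 128-bit integer and routes the record to the shard whose hash-key range contains it, so every record for a given ticker stays on one shard in order. A local sketch of that routing (assuming evenly split hash-key ranges, i.e. no resharding has occurred):

```python
import hashlib

def shard_for_key(partition_key: str, shard_count: int) -> int:
    """Approximate Kinesis routing: MD5 the partition key into a 128-bit
    integer, then map it onto evenly split shard hash-key ranges."""
    h = int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)
    return h * shard_count // 2**128

# The mapping is deterministic: one ticker, one shard, per-ticker ordering.
for ticker in ["AAPL", "AMZN", "MSFT"]:
    print(ticker, shard_for_key(ticker, 2))
```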

## Step 4: Create the consumer function

The consumer reads batches of trades from the stream and stores each one in DynamoDB.

```python
# consumer.py
import boto3, json, base64, os

def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table(os.environ['TABLE_NAME'])
    for record in event['Records']:
        # Kinesis record payloads arrive base64-encoded.
        payload = base64.b64decode(record['kinesis']['data']).decode()
        trade = json.loads(payload)
        table.put_item(Item={
            'TradeId': f"{trade['timestamp']}-{trade['ticker']}",
            'Ticker': trade['ticker'], 'Type': trade['type'],
            'Price': str(trade['price']), 'Quantity': trade['quantity']})
    return {'statusCode': 200}
```
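
Step 6 and the cleanup assume a deployed function named `stock-consumer`, so deploy it the same way as the producer (substituting the same role ARN from Step 2):

```bash
zip consumer.zip consumer.py
aws lambda create-function --function-name stock-consumer \
  --zip-file fileb://consumer.zip --handler consumer.lambda_handler \
  --runtime python3.12 --role <role-arn> \
  --environment Variables={TABLE_NAME=stock-trades}
```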

## Step 5: Create a DynamoDB table

```bash
aws dynamodb create-table --table-name stock-trades \
  --key-schema AttributeName=TradeId,KeyType=HASH \
  --attribute-definitions AttributeName=TradeId,AttributeType=S \
  --billing-mode PAY_PER_REQUEST
aws dynamodb wait table-exists --table-name stock-trades
```
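
A note on the consumer's `Price': str(trade['price'])`: it stores the price as a string because boto3's DynamoDB resource rejects Python floats. The usual alternative, if you want a numeric attribute instead, is `Decimal` — a minimal sketch of the conversion (no AWS calls involved):

```python
from decimal import Decimal

def to_dynamodb_number(price: float) -> Decimal:
    """Convert a float to the Decimal type boto3 requires for DynamoDB
    numbers. Round-tripping through str avoids binary-float artifacts
    like Decimal(123.4500000000000028...)."""
    return Decimal(str(price))

print(to_dynamodb_number(123.45))  # prints 123.45
```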

## Step 6: Connect the stream to the consumer

Replace `<stream-arn>` with the ARN returned by `aws kinesis describe-stream-summary --stream-name stock-stream`:

```bash
aws lambda create-event-source-mapping \
  --function-name stock-consumer \
  --event-source-arn <stream-arn> \
  --batch-size 100 --starting-position LATEST
```
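
The mapping delivers records base64-encoded inside `event['Records']`. You can sanity-check the consumer's decode logic locally with a fabricated event before wiring anything up (the `table.put_item` call is the only part skipped here):

```python
import base64, json

# Build a fake Kinesis event shaped like the one Lambda passes the consumer.
trade = {"ticker": "AAPL", "type": "BUY", "price": 187.5,
         "quantity": 10, "timestamp": 1700000000000}
event = {"Records": [{"kinesis": {
    "data": base64.b64encode(json.dumps(trade).encode()).decode()}}]}

# The consumer's decode step, minus the DynamoDB write.
for record in event["Records"]:
    payload = base64.b64decode(record["kinesis"]["data"]).decode()
    decoded = json.loads(payload)
    print(decoded["ticker"], decoded["price"])  # prints: AAPL 187.5
```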

## Step 7: Produce trades and verify

Invoke the producer, then check DynamoDB. The mapping polls the stream, so allow a few seconds before scanning:

```bash
aws lambda invoke --function-name stock-producer response.json
aws dynamodb scan --table-name stock-trades --limit 3 \
  --query 'Items[].{Ticker:Ticker.S,Type:Type.S,Price:Price.S}' --output table
```

## Cleanup

Find the mapping UUID with `aws lambda list-event-source-mappings --function-name stock-consumer`, then delete everything:

```bash
aws lambda delete-event-source-mapping --uuid <mapping-uuid>
aws lambda delete-function --function-name stock-producer
aws lambda delete-function --function-name stock-consumer
aws dynamodb delete-table --table-name stock-trades
aws kinesis delete-stream --stream-name stock-stream
aws iam detach-role-policy --role-name kinesis-tutorial-role --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
aws iam detach-role-policy --role-name kinesis-tutorial-role --policy-arn arn:aws:iam::aws:policy/AmazonKinesisReadOnlyAccess
aws iam delete-role-policy --role-name kinesis-tutorial-role --policy-name kinesis-dynamodb
aws iam delete-role --role-name kinesis-tutorial-role
```

A companion script, kinesis-data-streams.sh, automates all of the steps above, including cleanup. Run it with:

```bash
bash kinesis-data-streams.sh
```