Summary#

GitHub: this note shows how to set up a simple S3-triggered Lambda and build a CloudWatch dashboard to monitor the Lambda function.

  • Build a simple data pipeline that analyses each file uploaded to an S3 bucket
  • The Lambda function analyses the data and saves outputs/images back to the S3 bucket
  • Send a notification via email for each processed file
  • Monitor the performance of the Lambda by adding a CloudWatch dashboard

Architecture#

[Architecture diagram: aws_devops-CdkPipelineFhr.drawio]

Setup Project#

In this case we want to deploy the Lambda via an ECR image, so let's create a project with the structure below

|--bin
   |--s3-trigger-lambda.ts
|--lib
   |--s3-trigger-lambda-stack.ts
|--lambda-ecr
   |--index.py
   |--Dockerfile
   |--requirements.txt
   |--.dockerignore
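
The bin entry point just wires the stack into a CDK app. Below is a minimal sketch of bin/s3-trigger-lambda.ts, assuming the stack class is named S3TriggerLambdaStack and its props extend StackProps with the bucketName field used later in the policy statements; the stack id, bucket name and region are placeholders.

#!/usr/bin/env node
import * as cdk from "aws-cdk-lib";
import { S3TriggerLambdaStack } from "../lib/s3-trigger-lambda-stack";

const app = new cdk.App();

// stack id, bucket name and region are illustrative placeholders
new S3TriggerLambdaStack(app, "S3TriggerLambdaStack", {
  bucketName: "your-data-bucket",
  env: {
    account: process.env.CDK_DEFAULT_ACCOUNT,
    region: "ap-southeast-1",
  },
});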

CDK Stack#

Let's create a role for the Lambda function with permissions to

  • Execute and send logs to CloudWatch
  • Read and write data from S3
  • Send notifications via SNS

const role = new aws_iam.Role(this, "LambdaRoleAccessS3", {
  assumedBy: new aws_iam.ServicePrincipal("lambda.amazonaws.com"),
});

role.addManagedPolicy(
  aws_iam.ManagedPolicy.fromAwsManagedPolicyName("AWSLambdaExecute")
);

// lambda role inline policies
role.attachInlinePolicy(
  new aws_iam.Policy(this, "LambdaAccessS3Policy", {
    statements: [
      new aws_iam.PolicyStatement({
        effect: aws_iam.Effect.ALLOW,
        actions: ["sns:*"],
        resources: [
          `arn:aws:sns:ap-southeast-1:${this.account}:${topic.topicName}`,
        ],
      }),
      new aws_iam.PolicyStatement({
        effect: aws_iam.Effect.ALLOW,
        actions: ["s3:*", "s3-object-lambda:*"],
        resources: [`arn:aws:s3:::${props.bucketName}/*`],
      }),
    ],
  })
);
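
The inline policy above references an SNS topic, which also needs to be created in the stack. A minimal sketch, assuming an email subscription is enough for the notification requirement (topic name and email address are placeholders):

const topic = new aws_sns.Topic(this, "LambdaNotificationTopic", {
  topicName: "LambdaNotificationTopic",
});

// the email subscription must be confirmed from the inbox before messages arrive
topic.addSubscription(
  new aws_sns_subscriptions.EmailSubscription("your-email@example.com")
);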

Then create a Lambda function from an ECR image so that its dependencies are bundled in the image

const fn = new aws_lambda.Function(this, "ConvertAudioFunction", {
  functionName: "ConvertAudioFunction",
  code: aws_lambda.EcrImageCode.fromAssetImage(
    path.join(__dirname, "./../lambda-ecr")
  ),
  handler: aws_lambda.Handler.FROM_IMAGE,
  runtime: aws_lambda.Runtime.FROM_IMAGE,
  memorySize: 512,
  timeout: Duration.seconds(15),
  role: role,
  environment: {
    BUCKET_NAME: bucket.bucketName,
    TOPIC_ARN: topic.topicArn,
  },
});

Configure the bucket to trigger the Lambda when an object is created under the m4a/ prefix

bucket.addEventNotification(
  aws_s3.EventType.OBJECT_CREATED,
  new aws_s3_notifications.LambdaDestination(fn),
  { prefix: "m4a/" }
);
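
Finally, to monitor the Lambda as mentioned in the summary, we can add a CloudWatch dashboard to the same stack. A minimal sketch using the function's built-in metrics (the dashboard name is a placeholder):

const dashboard = new aws_cloudwatch.Dashboard(this, "LambdaMonitorDashboard", {
  dashboardName: "LambdaMonitorDashboard",
});

// plot invocations, errors and duration of the function
dashboard.addWidgets(
  new aws_cloudwatch.GraphWidget({
    title: "Invocations",
    left: [fn.metricInvocations()],
  }),
  new aws_cloudwatch.GraphWidget({
    title: "Errors",
    left: [fn.metricErrors()],
  }),
  new aws_cloudwatch.GraphWidget({
    title: "Duration",
    left: [fn.metricDuration()],
  })
);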

Lambda Handler#

Let's create a handler that reads the S3 event and writes the result back to S3. Take note of the following:

  • Read the object key from the S3 event
  • Download from and upload to S3
  • Use the Lambda ephemeral storage at /tmp (512 MB by default)

import os

import boto3

# s3 boto3 client
client = boto3.client("s3")

# temporary folder inside the lambda runtime
tmp_dir = "/tmp"


def handler(event, context):
    """
    convert m4a to mp3
    """
    print(event)
    # parse the object key from the s3 event
    try:
        full_name = event["Records"][0]["s3"]["object"]["key"]
    except (TypeError, KeyError, IndexError):
        full_name = "m4a/test.m4a"
    # keep only the file name, without prefix and extension
    file_name = full_name.split("/")[-1].split(".")[0]
    # download the file from s3 to /tmp
    client.download_file(
        os.environ["BUCKET_NAME"],
        full_name,
        f"{tmp_dir}/{file_name}.m4a"
    )
    # upload the result back to s3
    client.upload_file(
        f"{tmp_dir}/{file_name}.m4a",
        os.environ["BUCKET_NAME"],
        f"mp3/{file_name}.mp3"
    )
    return {"message": "hello lambda docker"}


if __name__ == "__main__":
    handler(event=None, context=None)

If we also need to send a notification via SNS, update the handler with an SNS client and publish a message after processing

import datetime

# sns boto3 client
sns_client = boto3.client("sns")

# publish a message with a millisecond timestamp
time_stamp = int(datetime.datetime.now().timestamp() * 1000)
sns_client.publish(
    TopicArn=os.environ["TOPIC_ARN"],
    Message=f"lambda send a message to sns {time_stamp}",
)

Reference#