Deploying Dockerized Applications on AWS Lambda: A Step-by-Step Guide

Implementing CI/CD pipelines for Docker applications, especially when deploying to AWS environments like Lambda, requires a well-thought-out approach to ensure smooth, automated processes for both development and production stages. The following outlines how to set up such a pipeline using GitLab and AWS services, for a Docker application scheduled to execute on AWS Lambda every 12 hours.

Overview

The goal is to automate the process from code commit to deployment, ensuring that any updates to the application are automatically tested and deployed to the development environment and, following approval, to production. GitLab for source control and CI/CD, together with AWS services such as Amazon ECR for image storage, Lambda for execution, and CloudWatch Events for scheduling, will be instrumental in this setup.

Application Containerization With Docker

Application containerization with Docker is a pivotal step in modernizing applications, ensuring consistent environments from development to production, and facilitating continuous integration and continuous deployment (CI/CD) processes. This section expands on how to effectively containerize an application using Docker, a platform that packages an application and all its dependencies into a Docker container to ensure it runs uniformly in any environment.

Understanding Docker Containers

Docker containers encapsulate everything an application needs to run: the application's code, runtime, libraries, environment variables, and configuration files. Unlike virtual machines, containers share the host system's kernel but run in isolated user spaces. This makes them lightweight, allowing for rapid startup and scalable deployment practices.

Dockerizing an Application: The Process

  1. Creating a Dockerfile:
    • A Dockerfile is a text document containing all the commands a user could call on the command line to assemble an image. Creating a Dockerfile involves specifying a base image (e.g., Python, Node.js), adding your application code, and defining commands to run the application.
    • Example: For a Python-based application, your Dockerfile might start with something like FROM python:3.8-slim, followed by COPY . /app to copy your application into the container, and CMD ["python", "./app/my_app.py"] to run your application (a complete example Dockerfile is sketched after this list).
  2. Building the Docker Image:
    • Once the Dockerfile is set up, use the docker build command to create an image. This image packages your application and its environment.
    • Command Example: docker build -t my_app:1.0 .
    • This command tells Docker to build an image named my_app with a tag of 1.0 based on the Dockerfile in the current directory (.).
  3. Running Your Docker Container:
    • After building the image, run your application in a container using the docker run command.
    • Command Example: docker run -d -p 5000:5000 my_app:1.0
    • This command starts a container based on the my_app:1.0 image, mapping port 5000 of the container to port 5000 on the host, allowing you to access the application via localhost:5000.
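
Putting these three steps together, a minimal Dockerfile for the hypothetical application above might look like the sketch below; the file names (my_app.py, requirements.txt) and the port are illustrative assumptions, not fixed requirements.

Dockerfile
# Minimal illustrative Dockerfile for a generic Python application
FROM python:3.8-slim

# Work from /app so relative paths are predictable
WORKDIR /app

# Copy the application source into the image
COPY . /app

# Install dependencies (assumes a requirements.txt at the project root)
RUN pip install --no-cache-dir -r requirements.txt

# Document the port the application listens on
EXPOSE 5000

# Start the application
CMD ["python", "my_app.py"]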

Best Practices for Dockerizing Applications

A few widely accepted practices keep images small, reproducible, and secure:

    • Use small, version-pinned base images (for example, python:3.8-slim) instead of broad latest tags.
    • Add a .dockerignore file so that artifacts such as .git, local virtual environments, and logs are excluded from the build context.
    • Order Dockerfile instructions so that rarely changing layers (dependency installation) come before frequently changing ones (application code), maximizing build-cache reuse.
    • Keep secrets out of the image and inject them at runtime through environment variables or a secrets manager.

Source Control With GitLab

Integrating a CI/CD pipeline with GitLab for deploying a Dockerized application to AWS Lambda involves several key steps, from setting up your GitLab repository for source control to automating deployments through GitLab CI/CD pipelines. In the context of our example — an e-commerce platform's price update microservice scheduled to run every 12 hours — let's break down how to set up source control with GitLab and provide a code example for the Lambda function. 

Initialize a GitLab Repository: Start by creating a new project in GitLab for your application. This repository will host your application code, Dockerfile, and .gitlab-ci.yml file.

Push Your Application to GitLab:

PowerShell
git clone <your-gitlab-repository-url>
# Add your application files to the repository
git add .
git commit -m "Initial commit with application and Dockerfile"
git push -u origin master


Set up .gitlab-ci.yml: The .gitlab-ci.yml file defines your CI/CD pipeline in GitLab. For deploying a Dockerized Lambda function, this file needs to include steps for building the Docker image, pushing it to Amazon ECR, and updating the Lambda function to use the new image.

Code Example for AWS Lambda Function

Before setting up the CI/CD pipeline, let's define the Lambda function. Assuming the microservice is written in Python, the function might look like this: 

Python
import requests
import boto3

def update_pricing_data(event, context):
    # Your code to fetch new pricing data
    pricing_data = requests.get("https://api.example.com/pricing").json()

    # Logic to update the database with new pricing data
    # For simplicity, we'll assume it's a direct call to an RDS instance or DynamoDB
    # Note: Ensure your Lambda function has the necessary permissions for database access
    db_client = boto3.client('dynamodb')
    for product in pricing_data['products']:
        # Example of updating DynamoDB (simplified)
        db_client.update_item(
            TableName='ProductPrices',
            Key={'productId': {'S': product['id']}},
            UpdateExpression='SET price = :val',
            ExpressionAttributeValues={':val': {'N': str(product['price'])}}
        )

    return {
        'statusCode': 200,
        'body': 'Product pricing updated successfully.'
    }


This function fetches pricing data from an external API and updates a DynamoDB table with the new prices.

Integrating AWS Lambda With GitLab CI/CD

Dockerfile: Ensure your Dockerfile is set up to containerize your Lambda function correctly. AWS provides base images for Lambda which you can use as your starting point.

Dockerfile
# Example Dockerfile for a Python-based Lambda function
FROM public.ecr.aws/lambda/python:3.8

# Copy function code and requirements.txt into the container image
COPY update_pricing.py requirements.txt ./

# Install the function's dependencies
RUN python3.8 -m pip install -r requirements.txt

# Set the CMD to your handler
CMD ["update_pricing.update_pricing_data"]


.gitlab-ci.yml Example: Define the pipeline in .gitlab-ci.yml for automating the build and deployment process.

YAML
stages:
  - build
  - deploy

build_image:
  stage: build
  script:
    - aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin <your-aws-account-id>.dkr.ecr.us-east-1.amazonaws.com
    - docker build -t my_ecr_repo/my_lambda_function:latest .
    - docker push my_ecr_repo/my_lambda_function:latest

deploy_lambda:
  stage: deploy
  script:
    - aws lambda update-function-code --function-name myLambdaFunction --image-uri my_ecr_repo/my_lambda_function:latest
  only:
    - master


This CI/CD pipeline automates the process of building your Docker image, pushing it to Amazon ECR, and updating the AWS Lambda function to use the new image. Make sure to replace placeholders like my_ecr_repo/my_lambda_function with your actual ECR repository URI and adjust AWS CLI commands based on your setup and region.
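
Note that the jobs above assume the runner can authenticate to AWS. One common approach, sketched here as an assumption rather than a requirement, is to store AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as masked CI/CD variables in the GitLab project settings; they are injected into the job environment and picked up automatically by the AWS CLI, so only the region needs to appear in the pipeline file.

YAML
# Illustrative addition to .gitlab-ci.yml: AWS_ACCESS_KEY_ID and
# AWS_SECRET_ACCESS_KEY are defined as masked CI/CD variables in the
# GitLab project settings, not committed to the repository.
variables:
  AWS_DEFAULT_REGION: us-east-1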

By following these steps and leveraging GitLab's CI/CD capabilities, you can automate the deployment process for your Dockerized AWS Lambda functions, ensuring that your e-commerce platform's price update microservice is always running with the latest codebase.

Deploying Docker Application on AWS Lambda

Deploying a Docker application on AWS Lambda involves several steps, starting from containerizing your application to configuring the Lambda function to use the Docker image. This process enables you to leverage the benefits of serverless architecture, such as scalability, cost-efficiency, and ease of deployment, for your containerized applications. Here’s how you can deploy a Docker application on AWS Lambda:

Containerize Your Application

Create a Dockerfile: Begin by defining a Dockerfile in your application's root directory. This file specifies the base image, dependencies, and other configurations needed to containerize your application.

Dockerfile
# Example Dockerfile for a Python-based application
FROM public.ecr.aws/lambda/python:3.8

# Copy the application source code and requirements.txt
COPY app.py requirements.txt ./

# Install any dependencies
RUN python3.8 -m pip install -r requirements.txt

# Define the handler function
CMD ["app.handler"]


Build the Docker Image: With the Dockerfile in place, build the Docker image using the Docker CLI. Ensure the image is compatible with AWS Lambda's container image requirements.

PowerShell
docker build -t my-lambda-app .

Push the Docker Image to Amazon ECR

Create an ECR Repository: If you haven't already, create a new repository in Amazon Elastic Container Registry (ECR) to store your Docker image.

PowerShell
aws ecr create-repository --repository-name my-lambda-app


Authenticate Docker to Your ECR Registry: Authenticate your Docker CLI to the Amazon ECR registry to push images.

PowerShell
aws ecr get-login-password --region <your-region> | docker login --username AWS --password-stdin <your-aws-account-id>.dkr.ecr.<your-region>.amazonaws.com


Tag and Push the Docker Image: Tag your local Docker image with the ECR repository URI and push it to ECR.

PowerShell
docker tag my-lambda-app:latest <your-aws-account-id>.dkr.ecr.<your-region>.amazonaws.com/my-lambda-app:latest
docker push <your-aws-account-id>.dkr.ecr.<your-region>.amazonaws.com/my-lambda-app:latest


Create and Configure the AWS Lambda Function:

  1. Create a New Lambda Function: Go to the AWS Lambda console and create a new Lambda function. Choose the "Container image" option as your source and select the Docker image you pushed to ECR (an equivalent AWS CLI call is sketched after this list).
  2. Configure Runtime Settings: Specify the handler information if required. For container images, the handler corresponds to the CMD or ENTRYPOINT specified in the Dockerfile.
  3. Adjust Permissions and Resources: Set the appropriate execution role with permissions that your Lambda function needs to access AWS resources. Also, configure memory, timeout, and other resources according to your application's requirements.
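
For reference, the same function can be created from the command line; the role ARN and the account/region placeholders below are assumptions you would replace with your own values.

PowerShell
aws lambda create-function --function-name my-lambda-app --package-type Image --code ImageUri=<your-aws-account-id>.dkr.ecr.<your-region>.amazonaws.com/my-lambda-app:latest --role <execution-role-arn>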

Testing and Deployment

  1. Deploy and Test: With the Lambda function configured, deploy it and perform tests to ensure it's working as expected. You can invoke the Lambda function manually from the AWS console or using the AWS CLI (an example invocation is shown after this list).
  2. Set Up Triggers (Optional): Depending on your use case, set up triggers to automatically invoke your Lambda function. For a Docker application that needs to execute periodically (e.g., every 12 hours), you can use Amazon CloudWatch Events (now part of Amazon EventBridge) to schedule the function, for example:
PowerShell
aws events put-rule --name "MyScheduledRule" --schedule-expression "rate(12 hours)"
aws lambda add-permission --function-name "myLambdaFunction" --statement-id "MyScheduledRuleInvoke" --action "lambda:InvokeFunction" --principal events.amazonaws.com --source-arn <arn-of-the-scheduled-rule>
aws events put-targets --rule "MyScheduledRule" --targets "Id"="1","Arn"="<Lambda-function-ARN>"
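
To test the deployed function manually, as mentioned in step 1 above, you can also invoke it directly from the AWS CLI and inspect the response written to response.json:

PowerShell
aws lambda invoke --function-name myLambdaFunction response.json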


Recap: Deploying a Docker Application on AWS Lambda

Container Image Support: Ensure your application fits within Lambda's container image requirements: the image must be built from an AWS Lambda base image or otherwise implement the Lambda Runtime API, and it must not exceed the 10 GB image size limit. You may need to adjust your Dockerfile accordingly.

Upload to ECR: Push your Docker image to Amazon ECR, which will serve as the source for Lambda to pull and execute the container.

Create Lambda Function: Configure a new Lambda function to use the container image from ECR as its source. Set the execution role with appropriate permissions for Lambda operations.

Scheduling Execution With AWS CloudWatch Events

CloudWatch Event Rule: Set up a CloudWatch Event rule to trigger your Lambda function every 12 hours. Use a cron expression for scheduling (e.g., cron(0 */12 * * ? *)).
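
For example, the rule created earlier with rate(12 hours) can equivalently be defined with a cron schedule:

PowerShell
aws events put-rule --name "MyScheduledRule" --schedule-expression "cron(0 */12 * * ? *)"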

Monitoring and Rollback

CloudWatch Metrics and Logs: Utilize CloudWatch for monitoring application logs and performance metrics. Set alarms for any critical thresholds to be notified of issues.
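
As one illustrative example (the alarm name and SNS topic ARN are placeholders), you could raise an alarm whenever the function reports any errors within a five-minute window:

PowerShell
aws cloudwatch put-metric-alarm --alarm-name "myLambdaFunction-errors" --namespace "AWS/Lambda" --metric-name Errors --dimensions Name=FunctionName,Value=myLambdaFunction --statistic Sum --period 300 --evaluation-periods 1 --threshold 1 --comparison-operator GreaterThanOrEqualToThreshold --alarm-actions <sns-topic-arn>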

Rollback Strategy: Ensure your CI/CD pipeline supports rolling back to previous versions in case of deployment failures or critical issues in production.
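
With container-based Lambda functions, one simple rollback path is to point the function back at a previously pushed image tag, which is a good reason to push immutable version tags (for example, the Git commit SHA) in addition to latest; the tag below is an illustrative placeholder.

PowerShell
aws lambda update-function-code --function-name myLambdaFunction --image-uri <your-aws-account-id>.dkr.ecr.<your-region>.amazonaws.com/my-lambda-app:<previous-tag>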

Conclusion 

Implementing CI/CD for Docker applications deployed to AWS environments, including Lambda for scheduled tasks, enhances operational efficiency, ensures code quality, and automates deployment processes. By leveraging AWS services and Docker, businesses can achieve a highly scalable and reliable deployment workflow for their applications.