Python AWS Lambda Monorepo — Part 3: Test, Build, and Deploy

Hector Ayala
8 min read · Nov 30, 2019

If you found this helpful or have feedback, feel free to share your thoughts with me via Twitter or LinkedIn.

In Part 2 of this series, we learned how to create a custom Python package and use make to share it across the Lambda functions. Now we will test our code locally, build it, deploy it to AWS, and run it on the cloud. In this part of the series we will use CircleCI along with the following tools:

  • AWS CLI 1.16
  • AWS SAM CLI 0.18

The AWS CLI is used to access and change AWS resources from our terminal. Follow the steps in the AWS CLI docs to set up the developer user profile on your machine. The AWS SAM CLI (Serverless Application Model) is a development framework that allows us to test our serverless application locally. With everything installed, we can now start programming.
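Before going further, it is worth confirming both CLIs are actually reachable. A tiny helper (not part of the repo, just a convenience sketch) can check the PATH for us:

```python
import shutil

def missing_tools(tools):
    """Return the required CLI tools that are not found on the PATH."""
    return [tool for tool in tools if shutil.which(tool) is None]

# The two CLIs this part of the series relies on:
print(missing_tools(["aws", "sam"]))  # an empty list means you're ready to go
```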

All the code can be downloaded on GitHub:

Jurassic Park III (2001) | Universal Studios/Amblin

Is this how you make dinosaurs?

Testing Locally

Serverless Application Model

With all our functions ready, we can test them locally to see if our program is working correctly. AWS SAM uses template files to define the AWS resources of the application. These templates can be used with CloudFormation to deploy the application. They can also be used to spin up a local Lambda environment to test our code. This will be our main use of SAM.

We start by adding a template.yml at the root of our project.

├── /packages
├── /services
├── Makefile
└── template.yml

These templates have a predefined structure. Let’s look at one of our function definitions:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Python Lambda Monorepo - Example

Resources:
  createdinosaur:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: create_dinosaur
      Description: Create Dinosaur Lambda
      Runtime: python3.7
      Handler: main.handler
      CodeUri: services/create_dinosaur
      Timeout: 20
      Environment:
        Variables:
          PYTHONPATH: ./packages
      Events:
        Api:
          Type: Api
          Properties:
            Path: /create_dinosaur
            Method: any

This describes the Lambda function running the code for create_dinosaur. The resource is called createdinosaur since SAM does not allow symbols in resource names. We also specify the Lambda function properties:

  • Runtime refers to the programming language environment the Lambda function will be set up with.
  • Handler specifies the code entry point for the Lambda function. The format is file.function, where function is the entry function where our code starts executing and file is the script file containing it. In our case, main is the file and handler is the entry function.
  • CodeUri specifies the directory path of the function relative to the template file.
  • Timeout specifies the maximum time the Lambda can run before timing out.
  • Environment.Variables allows us to define environment variables for our Python environment. In our case, we add the ./packages path to the PYTHONPATH variable since this will be the installation location of our packages.
  • Events.Api.Type defines the event source that is attached to the function to execute it. In our case we are attaching an API Gateway endpoint to be used for testing.
  • Events.Api.Properties.Path defines the path endpoint for our function.
  • Events.Api.Properties.Method states which HTTP method to use.

Note: The PYTHONPATH variable is defined relative to the root directory of the Lambda function
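For context, here is a minimal sketch of what the main.py entry point referenced by Handler: main.handler could look like. The shape follows the behavior described in this series (a "No dinosaur provided" message on empty input, a DynamoDB insert on success), but the exact code is illustrative, with the database write stubbed out:

```python
import json

def handler(event, context):
    """Entry point referenced by `Handler: main.handler` in template.yml.

    With `sam local invoke -e`, the event JSON arrives as a plain dict;
    behind an API Gateway event it arrives as a string in event["body"].
    """
    body = event.get("body") if isinstance(event, dict) else None
    dinosaur = json.loads(body) if isinstance(body, str) else (event or {})

    if not dinosaur.get("name"):
        return {"statusCode": 400,
                "body": json.dumps({"message": "No dinosaur provided"})}

    # The real function inserts `dinosaur` into DynamoDB here (see Part 2).
    return {"statusCode": 200,
            "body": json.dumps({"created": dinosaur["name"]})}
```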

With our template.yml file ready, we can use the AWS SAM CLI to run the Lambdas locally. Make sure your AWS CLI is set up to access your AWS account with an access key ID and secret access key. In our terminal, we'll run the invoke command from the project root directory:

sam local invoke createdinosaur --no-event

Note: We will run all our terminal commands from the project root directory unless specified otherwise

We see various things have happened. SAM invoked our Lambda locally. It did so by spinning up a local Docker container from a Python image in which our code ran. Then we see the actual code execution output: the START information, any log messages printed by our code, the END information, and the REPORT information containing metadata about our code execution.

In this case, we got a No dinosaur provided message. This is because we are not passing any event data to our Lambda function (note we used --no-event in our command). Let’s add a new directory in the root directory named requests.

├── /packages
├── /services
├── /requests
| └─ tyrannosaurus.json
├── Makefile
└── template.yml

In here we will add .json files which we will pass as event data to our Lambda function. Let’s add a request .json file to create a Tyrannosaurus:

// tyrannosaurus.json
{
  "name": "Tyrannosaurus rex",
  "diet": "carnivore",
  "period": "Cretaceous",
  "weight": 28700,
  "armor": false,
  "hybrid": false
}
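As request files accumulate, it helps to fail fast on malformed events before handing them to SAM. A small helper (not part of the repo, purely a convenience sketch) can confirm a request carries the fields the handlers expect:

```python
import json

# Fields the dinosaur handlers expect, per tyrannosaurus.json above.
REQUIRED_FIELDS = {"name", "diet", "period", "weight", "armor", "hybrid"}

def validate_event(raw: str) -> dict:
    """Parse a request JSON string and fail fast if fields are missing."""
    event = json.loads(raw)
    missing = REQUIRED_FIELDS - event.keys()
    if missing:
        raise ValueError(f"event is missing fields: {sorted(missing)}")
    return event

raw = """{"name": "Tyrannosaurus rex", "diet": "carnivore",
          "period": "Cretaceous", "weight": 28700,
          "armor": false, "hybrid": false}"""
print(validate_event(raw)["name"])  # Tyrannosaurus rex
```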

Let’s run the command again but adding the event data (-e parameter):

sam local invoke createdinosaur -e requests/tyrannosaurus.json

The function received the dinosaur data from the event parameter in the handler function. Now we see the function printed our dinosaur data and the successful response from the DynamoDB insert. If we go to DynamoDB we will see our new dinosaur.

With this knowledge, we can add the rest of the Lambda functions to our template file to test them locally.
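If you have many services, a throwaway script (hypothetical, not in the repo) can stamp out a resource block per services/ directory for you to paste into template.yml. Logical IDs strip underscores since SAM does not allow symbols in resource names:

```python
def resource_stub(function_name: str) -> str:
    """Render a template.yml Resources entry for one services/ directory."""
    logical_id = function_name.replace("_", "")  # SAM forbids symbols in IDs
    return (
        f"  {logical_id}:\n"
        "    Type: AWS::Serverless::Function\n"
        "    Properties:\n"
        f"      FunctionName: {function_name}\n"
        "      Runtime: python3.7\n"
        "      Handler: main.handler\n"
        f"      CodeUri: services/{function_name}\n"
        "      Timeout: 20\n"
    )

print(resource_stub("fight_dinosaurs"))
```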

Build and Deploy

We have our code, we have our packages. It's time to start pushing all these files to our AWS environment. To do so we'll use the CircleCI setup we did in Part 1. To define the CircleCI jobs we want to run, we need to add our CircleCI config script to our project.

├── /.circleci
| └─ config.yml
├── /packages
├── /services
├── /requests
├── Makefile
└── template.yml

We will go over our config file part by part. For more in-depth information on the concepts that make up the CircleCI config file and process, check out their docs. We'll start with the environment:

version: 2.1

orbs:
  aws-cli: circleci/aws-cli@0.1.4

jobs:
  build:
    docker:
      - image: circleci/python:3.7.0
    working_directory: ~/tmp

Orbs in CircleCI are pre-made packages of resources and jobs that can be reused in your deploy process. Orbs are stored in a registry and CircleCI provides various orbs for common steps. In this case we are using the circleci/aws-cli orb. This orb installs the AWS CLI in our deployment environment for us.

In our build job, we use a Docker executor, which defines the environment in which our script and process run. Similar to orbs, there are many pre-made images stored in a registry for common setups. In our case we use a Python 3.7 image. The working_directory specifies where in this image environment our script runs its steps. This takes us to the first steps:

    steps:
      - checkout
      - aws-cli/install
      - run:
          name: Configure ENV files based on environment
          command: |
            echo "Setting AWS environment"
            aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID
            aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY
            aws configure set region us-east-1

First, we check out our GitHub code into the CircleCI environment and install the AWS CLI using the orb. Then we run the aws configure commands to use the AWS access key and secret from the CircleCI environment variables we set up in the beginning.

      - run:
          name: Zip and deploy Lambda Functions
          command: |
            ROOT=$(pwd)
            # iterate through lambdas
            for LAMBDA_PATH in services/*/
            do
              FUNCTION_NAME=$(basename $LAMBDA_PATH)
              if [[ -f services/$FUNCTION_NAME/requirements.txt ]]; then
                echo "Installing packages..."
                pip install -r services/$FUNCTION_NAME/requirements.txt --target services/$FUNCTION_NAME/packages/ --find-links ./packages
              fi
              cd services/$FUNCTION_NAME
              echo "Building $FUNCTION_NAME..."
              zip $FUNCTION_NAME.zip *.py */ -r
              echo "Deploying $FUNCTION_NAME..."
              aws lambda update-function-code --function-name $FUNCTION_NAME --region us-east-1 --zip-file fileb://$FUNCTION_NAME.zip
              cd $ROOT
            done

This is our whole build and deploy script. First, we store our current starting directory. Then we go through each Lambda function directory in the services directory, storing the directory name. We check if the function has a requirements.txt file present. If so, we run the pip install command (the same install step we defined in our Makefile) to install our packages into the Lambda function directory. Then we go into the function directory and run the zip command. This command creates a .zip file that contains every file ending with the Python extension, plus any directories containing files (i.e. our packages).
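For local debugging, the zip step of that script can be mirrored in Python (an illustrative helper, not something the pipeline uses — CI runs the shell loop above):

```python
import zipfile
from pathlib import Path

def build_lambda_zip(service_dir: Path, out_dir: Path) -> Path:
    """Bundle one service directory into <name>.zip, keeping entries
    relative to the service root the way the CI `zip` call does."""
    archive = out_dir / f"{service_dir.name}.zip"
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(service_dir.rglob("*")):
            if path.is_file():
                zf.write(path, path.relative_to(service_dir))
    return archive
```

Pointing it at a service directory produces the layout update-function-code expects: main.py at the archive root with the packages/ directory beside it.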

Finally, we run the aws lambda update-function-code command to push this new zip file to the specified Lambda function. The process is repeated for each function. Now we define the workflow:

workflows:
  version: 2
  build-deploy:
    jobs:
      - build:
          filters:
            branches:
              only:
                - master

This is a simple workflow which runs the build job on our master branch only. Now we are ready to use this! Once we push this config file to the repo and start making changes to the master branch, we can see the build run in the CircleCI console:

Old CircleCI UI
New CircleCI UI

Inside our build, we see all the steps that executed as defined in our config.yml. Open each section and view the console outputs for the details on each step.

Old and New CircleCI UI

The process completed successfully and we can confirm everything worked by going into the AWS console and looking at the Lambda functions.

Run it!

Inside our Lambda function we see all our files are uploaded. They are now ready to run requests on AWS. Success!

We can run these Lambda functions manually by using the Test option in the console and passing a JSON object with request data to the function. We'll test out the fight_dinosaurs function. After importing a couple of herbivore and carnivore dinosaurs into our app, start a test with an empty JSON to run the function.

Once the function runs, it returns metadata, response data, and the console log output. If all is set up correctly, we'll see the function run correctly and the console output show the hunt process. In this case, Diplodocus survived the attack (phew).


This marks the end of our 3-part series. This was the overall structure of our monorepo:

├── /.circleci
| └─ config.yml
├── /packages
| └── /package1
| ├── /package1
| | ├──
| | └──
| └──
| ...
├── /services
| ├─ /function1
| | ├─
| | └─ requirements.txt
| ...
├── /requests
├── Makefile
└── template.yml

From here on you can create serverless services that leverage your libraries in a manageable and scalable way. You can improve on the deployment process and version your releases by enhancing the deployment scripts.

This is my first foray into writing technical articles and I’ve learned a lot in the process. Hopefully, this was clear and helpful in how to set up a Lambda monorepo, create Python packages to reuse across your services and use CircleCI to deploy your code. Now, build on!



Hector Ayala

Co-founder and CTO of Hyperion. Tech entrepreneur from Puerto Rico 🇵🇷 Interested in combining tech, business, and product design.