
Python is an excellent choice for building solutions on AWS. Services like AWS Lambda are purpose-built for running microservices (small applications that are mostly independent of each other), integrate easily with the AWS SDK, and natively support Python among other languages. In this article, we'll look at uv and other useful tools for easily developing Python-powered microservices.
Introduction to uv
uv is a Python package management tool similar to pip and Poetry. It acts as a complete solution for dependency installation, lockfiles, and virtual environment creation, and can optionally serve as a Python build backend. Another advantage is its ability to easily install a Python runtime of a specified version.
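For example, the following command downloads and installs a standalone Python runtime (the version shown is just an illustration):
uv python install 3.13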
uv allows for declarative project management using a standardized pyproject.toml file, for example:
[project]
name = "my-python-project"
version = "0.1.0"
description = "A sample project"
requires-python = ">=3.13"
dependencies = [
    "requests>=2.32.4",
]
Furthermore, it allows for project management through simple CLI commands, such as uv sync to sync dependencies to the virtual environment, and uv run to run Python scripts in your project.
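For example (assuming your project defines a main.py script):
uv sync
uv run main.py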
uv is developed by Astral, which also maintains various other useful Python tools. Compared to other Python management tools, uv stands out thanks to its speed, partly because it is built with Rust. For our particular use case, uv is especially useful thanks to its workspaces feature.
Isolating services with uv workspaces
uv workspaces allow you to create multiple Python packages as part of the same uv project. Each package has its own isolated source code and dependencies, and may depend on other packages in the same workspace; the term monorepo also comes to mind. Workspaces are inspired by the feature of the same name in Rust's Cargo package manager.
The workspace structure is especially useful for a microservice-oriented application. Each service (such as a Lambda function handler) can live in an isolated uv package. Additionally, shared utilities can be separated into non-runnable, library-style packages that the service packages depend on.
Here is an example file structure for a project using uv workspaces:
pyproject.toml
packages/
├── service1/
│   ├── pyproject.toml
│   └── src/
└── shared/
    ├── pyproject.toml
    └── src/
Each package has its own subdirectory and a pyproject.toml with its own dependencies. The root pyproject.toml is responsible for defining the workspace structure and its packages:
# The root pyproject.toml file:
[project]
name = "microservices-app"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = []

[tool.uv.sources]
service1 = { workspace = true }
shared = { workspace = true }

[tool.uv.workspace]
members = ["packages/*"]
Then, service1 can define its own dependencies and other attributes in its own pyproject.toml:
# service1/pyproject.toml:
[project]
name = "service1"
version = "0.1.0"
dependencies = [
    "shared>=0.1.0",
]

[build-system]
requires = ["uv_build"]
build-backend = "uv_build"
In this case, we specify the shared package as a dependency. Thanks to the [tool.uv.sources] section in the root pyproject.toml, uv knows that shared refers to another package in the workspace. We also use the [build-system] section to select uv as the build backend; running uv build --package=service1 creates a Python wheel for service1 in particular. Note that many uv commands, such as sync, add, run, and build, support the --package option to operate on a single package in the workspace. Similarly, the --all-packages option runs a command for all packages.
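For example, to sync only service1's dependencies, or to build wheels for every package in the workspace:
uv sync --package=service1
uv build --all-packages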
Writing a Lambda function
Note: the following examples assume an ARM64 architecture.
Lambda functions are the basis of serverless computing on AWS (along with AWS Fargate). We can use uv to help develop and build Lambda function handlers in Python. With workspaces, each Lambda function can have its own package containing its handler source code. The following example handler receives an event from SQS and uses the AWS SDK for DynamoDB to alter a table in response to the event:
# src/sample_lambda/main.py:
import boto3

dynamodb = boto3.resource("dynamodb")
users = dynamodb.Table("Users")

def lambda_handler(event, context):
    for record in event["Records"]:
        users.put_item(Item={
            "UserName": record["body"]
        })
    print("All records processed.")
Powertools for AWS Lambda (Python) is also a great choice to ease development by adding ergonomics such as logging, AWS X-Ray integration, type hints, and event data classes, among others. You can use the add subcommand to add a dependency to a specific package:
uv add --package=sample_lambda aws_lambda_powertools
We can use Powertools to improve the handler:
# src/sample_lambda/main.py:
import boto3
from aws_lambda_powertools import Logger
from aws_lambda_powertools.utilities.data_classes import SQSEvent, event_source

dynamodb = boto3.resource("dynamodb")
users = dynamodb.Table("Users")
logger = Logger()

@logger.inject_lambda_context
@event_source(data_class=SQSEvent)
def lambda_handler(event: SQSEvent, context):
    for record in event.records:
        users.put_item(Item={
            "UserName": record.body
        })
    logger.info("All records processed.")
Then, when our handler source code is ready to deploy, we can use uv to build a Python wheel that Lambda can run. In this case, we'll create a zip archive that bundles the handler's source code and dependencies together. Note that an alternative is to build a custom Lambda layer, or use a pre-built one, with the required dependencies.
An advantage of bundling all dependencies, including redundant packages such as boto3 that are already available in most Lambda runtimes, is the ability to lock dependency versions. This is considered a good practice: it prevents unexpected automatic dependency upgrades and works in conjunction with uv's lockfile feature.
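uv maintains the lockfile (uv.lock) automatically during commands like uv sync and uv add; you can also update it explicitly:
uv lock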
The following shell script uses the build command to create a wheel (which is itself a zip archive) for a package called sample_lambda, then uses uv's pip-compatible interface to export the package's dependencies and install them into that same wheel:
#!/bin/bash
set -euo pipefail

# Temporary directory to install deps to.
dependency_dir="$PWD/dist/sample_lambda_deps"
output_archive="$PWD/dist/sample_lambda-0.1.0-py3-none-any.whl"

# Cleaning the cache is sometimes necessary when working
# with workspaces.
uv cache clean
uv build --package=sample_lambda

# uv can create a pip-compatible requirements.txt file
# with your package's dependencies.
uv export \
    --frozen \
    --no-dev \
    --no-editable \
    --package=sample_lambda \
    -o requirements.txt > /dev/null

uv pip install \
    --no-sources \
    --no-installer-metadata \
    --no-compile-bytecode \
    --python-platform aarch64-manylinux2014 \
    --target "$dependency_dir" \
    -r requirements.txt

# Append the installed dependencies to the wheel.
pushd "$dependency_dir"
zip -qr "$output_archive" .
popd

# Clean up temporary files.
rm requirements.txt
rm -rf "$dependency_dir"
This will create a Lambda-ready zip archive at dist/sample_lambda-0.1.0-py3-none-any.whl that you can deploy using your favorite infrastructure as code tool, or the AWS SDK, CLI, or Console.
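For example, assuming the function already exists, you could update its code directly with the AWS CLI:
aws lambda update-function-code \
    --function-name sample_lambda \
    --zip-file fileb://dist/sample_lambda-0.1.0-py3-none-any.whl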
Deploying the Lambda function with Terraform
Terraform is a popular source-available infrastructure as code tool that can be used to easily deploy a Lambda function and various other AWS resources. Although it is not the only deployment option, it stands out for its simplicity, so we'll use it as an example:
locals {
  sample_dist_archive = "dist/sample_lambda-0.1.0-py3-none-any.whl"
  sample_source_hash  = fileexists(local.sample_dist_archive) ? filebase64sha256(local.sample_dist_archive) : ""
}

resource "aws_lambda_function" "sample" {
  function_name    = "sample_lambda"
  filename         = local.sample_dist_archive
  source_code_hash = local.sample_source_hash
  handler          = "sample_lambda.main.lambda_handler"
  runtime          = "python3.13"
  architectures    = ["arm64"]
  timeout          = 60
}
This configuration uses the zip archive generated by our shell script. The source_code_hash attribute ensures that Terraform detects an infrastructure change every time we build a new zip archive, even if the Terraform files remain the same. The handler attribute must refer to the Python module (sample_lambda.main) and function name (lambda_handler) that Lambda will look up and call at runtime.
See the full reference of the aws_lambda_function Terraform resource for more information. Since our sample function is meant to handle an SQS event, the aws_lambda_event_source_mapping resource can be used to automatically link it to an SQS queue:
resource "aws_lambda_event_source_mapping" "sample" {
event_source_arn = aws_sqs_queue.sample.arn
function_name = aws_lambda_function.sample.arn
}
Additional dev-only tools
You can specify development-only dependencies in the root pyproject.toml file as follows:
[dependency-groups]
dev = [
    "pytest>=8.4.1",
    "ruff>=0.12.4",
    "ty>=0.0.1a16",
]
The following is a list of related tools that you can add to enhance your uv development experience:
- Ruff - Also developed by Astral, Ruff lints and formats your project based on a given configuration.
- ty - A Python type checker by Astral. At the time of writing, this is still pre-release software; alternatives include Pyrefly, mypy, and Pyright, among others.
- pytest - A popular Python testing library that integrates well with uv projects. You can get started by creating a tests/ directory in the root folder and then running uv run pytest; see the sketch below.
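As a minimal, illustrative sketch (the file name and test are hypothetical), a test file placed in tests/ could look like this, and uv run pytest would discover and run it:
# tests/test_sample.py:
def test_record_bodies_extracted():
    # Uses the same simplified SQS event shape as our handler examples.
    event = {"Records": [{"body": "alice"}, {"body": "bob"}]}
    bodies = [record["body"] for record in event["Records"]]
    assert bodies == ["alice", "bob"]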