A Serverless v1.x plugin to automatically bundle dependencies from `requirements.txt` and make them available in your `PYTHONPATH`.

Requires Serverless >= v1.12.
```
npm install --save serverless-python-requirements
```
Add the plugin to your `serverless.yml`:

```yaml
plugins:
  - serverless-python-requirements
```
Compiling non-pure-Python modules or fetching their manylinux wheels is supported on non-Linux OSs via the use of Docker and the docker-lambda image. To enable Docker usage, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
```
To utilize your own Docker container instead of the default, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerImage: <image name>:tag
```

This must be the full image name and tag to use, including the runtime-specific tag if applicable.
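For instance, a configuration pointing at a runtime-specific docker-lambda build image might look like the sketch below (the exact image name and tag are an assumption for illustration; use whatever image matches your runtime):

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    # Hypothetical example: a runtime-specific docker-lambda build image.
    dockerImage: lambci/lambda:build-python3.6
```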
To help deal with potentially large dependencies (for example: numpy, scipy, and scikit-learn) there is support for compressing the libraries. This does require a minor change to your code to decompress them. To enable this, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    zip: true
```

and add this to your handler module before any code that imports your deps:

```python
import unzip_requirements
```
If you want to be able to use `sls invoke local` and don't have a check for Lambda or a `try`/`except ImportError` around that import, you can use the following option to make this plugin not delete the `unzip_requirements` helper:

```yaml
custom:
  pythonRequirements:
    cleanupZipHelper: false
```
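Alternatively, guarding the import keeps the same module runnable both on Lambda and locally without the helper present. A minimal sketch (the handler body is a hypothetical placeholder):

```python
# Guard the helper import so the module works on Lambda, where
# unzip_requirements decompresses the zipped deps, and also locally
# (e.g. under `sls invoke local`), where the helper may be absent.
try:
    import unzip_requirements  # noqa: F401
except ImportError:
    pass


def handler(event, context):
    # Hypothetical handler body for illustration.
    return {"statusCode": 200}
```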
You can omit a package from deployment by adding `#no-deploy` to the requirement's line in `requirements.txt`. For example, this will not install the AWS SDKs that are already installed on Lambda, but will install numpy:

```
numpy
boto3 #no-deploy
botocore #no-deploy
docutils #no-deploy
jmespath #no-deploy
python-dateutil #no-deploy
s3transfer #no-deploy
six #no-deploy
```
You can specify extra arguments to be passed to pip like this:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    pipCmdExtraArgs:
      - --cache-dir
      - .requirements-cache
```
The `.requirements` and `requirements.zip` (if using zip support) files are left behind to speed things up on subsequent deploys. To clean them up, run `sls requirements clean`. You can also create them (and `unzip_requirements` if using zip support) manually with `sls requirements install`.
If you are using your own Python library, you have to clean up `.requirements` on any update. You can use the following option to clean up `.requirements` every time you package:

```yaml
custom:
  pythonRequirements:
    invalidateCaches: true
```
Python 3.6 is supported; this requires an update to your `serverless.yml`:

```yaml
provider:
  name: aws
  runtime: python3.6
```

And be sure to clean up `.requirements` or `requirements.zip` if they exist, as python2.7 and python3.6 code can't coexist.
This plugin is influenced by serverless-wsgi from @logandk. However, I wanted a simpler pip install process. It now also supports bundling packages without the WSGI handler.