# aws
c
I'm having quota issues with my Lambda functions because they all pull in the whole `node_modules` folder. One function might use a package while another doesn't, but function serialization doesn't discriminate between the packages actually being used, so every function grows in size as more packages are added to the stack/project. Has anyone run across this issue? The compressed zip can't exceed 50 MB, and the uncompressed package can't exceed 250 MB. I know you can run the Lambda function from a container image instead, but how would you call AWS APIs inside of it (like `s3.putObject`) without just importing the aws-cli and executing it?
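For reference, this is all I'm trying to do inside the container; a minimal sketch of the `s3.putObject` call, assuming aws-sdk v2 is installed in the image (the bucket and key names are made-up placeholders):

```js
// Minimal sketch, assuming aws-sdk v2 is bundled into the container image.
const AWS = require("aws-sdk");
const s3 = new AWS.S3();

exports.handler = async () => {
    await s3.putObject({
        Bucket: "my-example-bucket",              // hypothetical bucket name
        Key: "example.json",                      // hypothetical object key
        Body: JSON.stringify({ hello: "world" }),
    }).promise();
    return { statusCode: 200 };
};
```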
Another option: if the .zip file is larger than 50 MB, upload it to the function from an Amazon Simple Storage Service (Amazon S3) bucket instead of inlining it.
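With a plain `aws.lambda.Function` that looks roughly like this (a sketch, not a drop-in: the bucket, key, and `lambdaRole` are hypothetical and assumed to exist elsewhere):

```js
const aws = require("@pulumi/aws");

// Sketch: deploy from a zip already uploaded to S3 instead of inlining code.
// Bucket, key, and role names below are hypothetical.
const fn = new aws.lambda.Function("big-handler", {
    s3Bucket: "my-deploy-bucket",
    s3Key: "builds/handler-v1.zip",
    handler: "index.handler",
    runtime: "nodejs18.x",
    role: lambdaRole.arn,  // an aws.iam.Role assumed to be defined elsewhere
});
```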
It also looks like using an S3 bucket for CallbackFunctions does not work either: https://github.com/pulumi/pulumi-aws/issues/137
My solution was to create a Dockerfile and have my CallbackFunction invoke the function running in the container. Basically using an extra Lambda function as a workaround. It seems to work well, and going forward this is what I'm going to do for any larger deployments (or Lambda handlers).
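The container-backed function itself can be declared in Pulumi roughly like this (a sketch under assumptions: the image is already built and pushed to ECR, and the role exists elsewhere; the URI and names are hypothetical):

```js
const aws = require("@pulumi/aws");

// Sketch: a Lambda backed by a container image instead of a zip.
// The image URI and role are hypothetical placeholders.
const containerLambdaFunction = new aws.lambda.Function("container-fn", {
    packageType: "Image",
    imageUri: "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-fn:latest",
    role: lambdaRole.arn,  // assumed to be defined elsewhere
    timeout: 60,
});
```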
To "call" AWS functions inside of the Docker container, you can just use the aws-sdk and pass in the ID's of the resources you're trying to use in the
Payload
. Not really that elegant of a solution but it works 🤷
```js
const AWS = require("aws-sdk");
const lambdaClient = new AWS.Lambda();

// Inside the CallbackFunction: resolve the Pulumi outputs and
// forward the resource IDs to the container-backed function.
const json = JSON.stringify({
    bucketId: bucket.id.get()
});
const lambdaResponse = await lambdaClient.invoke({
    FunctionName: containerLambdaFunction.name.get(),
    Payload: json
}).promise();
```
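On the container side, the handler receives that Payload already parsed as its event, so the IDs come straight off of it (tiny sketch; the real work would be the SDK call from earlier in the thread):

```js
// Container-side handler: the invoke Payload arrives as the event object.
exports.handler = async (event) => {
    console.log("bucket passed in:", event.bucketId);
    // ...use the aws-sdk here, e.g. s3.putObject against event.bucketId
    return { status: "done" };
};
```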
g
A little late, but I use esbuild to package all my functions; it's fast and does tree shaking, so each bundle only contains what that handler actually imports. I also use Lambda layers for aws-sdk and other large packages.
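A minimal build script sketch, assuming esbuild is installed and the entry point lives at `src/index.js` (paths are hypothetical); marking aws-sdk as external keeps it out of the bundle since the layer provides it:

```js
// build.js — minimal esbuild sketch; paths here are hypothetical.
const esbuild = require("esbuild");

esbuild.build({
    entryPoints: ["src/index.js"], // one bundle per handler entry point
    bundle: true,                  // include only modules actually imported
    platform: "node",
    target: "node18",
    external: ["aws-sdk"],         // provided by a Lambda layer instead
    outfile: "dist/index.js",
}).catch(() => process.exit(1));
```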