#aws

steep-refrigerator-3541

02/27/2024, 12:26 PM
Hi, I'm relatively new to utilizing Pulumi for deploying infrastructure as code. My objective is to use Pulumi within an AWS Lambda to orchestrate the deployment of infrastructure. The Lambda function should use Pulumi to create additional Lambdas and SNS subscriptions. I've been searching online but haven't come across any relevant examples. If you could provide a sample illustration, it would be greatly appreciated. Thank you!

plain-eye-2667

02/27/2024, 2:21 PM
@steep-refrigerator-3541 I believe this is possible. Here is an example in TS/JS of a Node.js Express API that uses the Pulumi Automation API, which means you can create resources on the fly:
import * as express from "express";
import * as pulumi from "@pulumi/pulumi";
import * as automation from "@pulumi/pulumi/automation";

const app = express();

app.get("/deploy", async (req, res) => {

    // Initialize a new stack programmatically without the CLI
    const stack = await automation.LocalWorkspace.createOrSelectStack({
        stackName: "dev",
        projectName: "inlineNode",
        // Specify the program: this describes the desired state of the infrastructure
        program: async () => {
            // This is where you'd define your infrastructure
        }
    });

    // Config values can be set on the stack via the Automation API before `up`,
    // for example:
    // await stack.setConfig("aws:region", { value: "us-east-1" });

    // Perform an upsert operation: update the infrastructure to match the desired state defined in the program
    const upResult = await stack.up({ onOutput: console.info });
    res.send(`update summary: \n${JSON.stringify(upResult.summary.resourceChanges, null, 4)}`);
});

const server = app.listen(3000, () => console.log(`Server is listening on port 3000.`));

// Cleanup on process termination
process.on('SIGINT', () => {
    console.info("Interrupted");
    server.close();
});
The only thing you need to do in the Lambda is to use layers that provide the lib:
@pulumi
and here is example code:
import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";
import * as automation from "@pulumi/pulumi/automation";

const app = async () => {
    // Initialize a new stack programmatically without the CLI
    const stack = await automation.LocalWorkspace.createOrSelectStack({
        stackName: "dev",
        projectName: "inlineNode",
        // Specify the program: this describes the desired state of the infrastructure
        program: async () => {
            // Define an AWS S3 bucket
            const bucket = new aws.s3.Bucket("myBucket", {
                bucket: "my-bucket-name",
            });

            // Define an AWS SNS topic
            const topic = new aws.sns.Topic("myTopic", {
                name: "my-topic-name",
            });

            // Return the bucket name and topic ARN as stack outputs
            return {
                bucketName: bucket.id,
                topicArn: topic.arn,
            };
        },
    });

    // Perform an upsert operation: update the infrastructure to match the desired state defined in the program
    const upResult = await stack.up({ onOutput: console.info });
    console.log(`stack outputs: \n${JSON.stringify(upResult.outputs, null, 4)}`);
};

app().catch(console.error);
Please keep in mind I haven't tested it, though.
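For the layers approach mentioned above, one possible way to package the @pulumi dependencies into a layer zip might look like this. This is a sketch, not from the thread; the directory layout and the `pulumi-libs` layer name are assumptions:

```shell
# Hypothetical layer build: Node.js Lambda layers expect a top-level nodejs/ folder.
build_pulumi_layer() {
  mkdir -p layer/nodejs
  # Install the libraries the inline program imports into layer/nodejs/node_modules
  npm install --prefix layer/nodejs @pulumi/pulumi @pulumi/aws
  # Zip with the nodejs/ prefix so Lambda unpacks it where the runtime resolves modules
  (cd layer && zip -r ../pulumi-layer.zip nodejs)
}
```

The zip could then be published with `aws lambda publish-layer-version --layer-name pulumi-libs --zip-file fileb://pulumi-layer.zip` and attached to the function.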

steep-refrigerator-3541

02/29/2024, 9:54 AM
much appreciated, i will test it

clean-machine-8609

03/09/2024, 11:00 PM
Got the same idea. But when I deployed the Lambda with Pulumi Automation API code, I got errors about modules not being found. CDK can bundle and deploy a TypeScript Lambda easily; I don't see why Pulumi can't do it too. I also have some concerns about the Lambda timeout, since the Pulumi deployment takes time. Tell me if you succeed ;)

plain-eye-2667

03/12/2024, 4:47 AM
Here is the example of the layers being used.
@clean-machine-8609 I think the libraries should work. BTW, my case is a bit different: I am using layers for a lib that uses OS-level files.

steep-refrigerator-3541

03/12/2024, 10:55 AM
@clean-machine-8609 I actually avoided using Lambda layers and created a Docker image, pushed it to ECR, and my Lambda was able to create IaC with Pulumi. So that's also a solution you can consider.

clean-machine-8609

03/12/2024, 1:29 PM
Do you have an example/link of Lambda with Docker & push to ECR?

steep-refrigerator-3541

03/13/2024, 8:57 AM
There is none, I had to figure it out. Here is my Dockerfile:
# Build Stage
FROM golang:1.21 as build
WORKDIR /app
RUN apt-get update && apt-get install -y tar
# Install Pulumi
RUN curl -fsSL https://get.pulumi.com | sh
RUN mkdir -p /app/bin/.pulumi
RUN mv /root/.pulumi/* /app/bin/.pulumi/
ENV PATH="$PATH:/app/bin/.pulumi/bin/"
COPY go.mod go.sum ./
RUN go mod download
COPY . .
RUN go build -o main ./cmd/main.go

# Runtime Stage
FROM public.ecr.aws/lambda/provided:al2023
WORKDIR /app
COPY --from=build /app /app
ENV PATH="${PATH}:/app/bin/.pulumi/bin/"
ENV PULUMI_HOME="/tmp/pulumi"
ENV PULUMI_CONFIG_PASSPHRASE=""
RUN chmod 777 /app/run.sh
ENTRYPOINT ["/app/run.sh"]
the run.sh script that gets executed:
#!/bin/bash
set -eo pipefail

# Look up the request ID for the current Lambda invocation
REQUEST_ID=$(curl -X GET -sI "http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation/next" \
  | awk -v FS=": " -v RS="\r\n" '/Lambda-Runtime-Aws-Request-Id/{print $2}')
export REQUEST_ID

# Function to handle errors
function error {
  curl -s -d "ERROR" "http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation/${REQUEST_ID}/error"
}

# Trap errors and call the error function
trap error ERR

pulumi login s3://your-pulumi-state


./main

curl -s -d "SUCCESS" "http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation/${REQUEST_ID}/response"
The building and pushing to ECR is well documented and very easy. I hope that helps
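For completeness, the build-and-push step could be sketched roughly like this. The account ID, region, and repository name are placeholders, not values from the thread:

```shell
# Hypothetical values; substitute your own account, region, and repository.
ACCOUNT_ID=123456789012
REGION=us-east-1
REPO=pulumi-lambda
IMAGE="$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/$REPO:latest"

push_image() {
  # Authenticate docker against the private ECR registry
  aws ecr get-login-password --region "$REGION" \
    | docker login --username AWS --password-stdin "$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com"
  # Build the image from the Dockerfile above, tag it, and push it
  docker build -t "$REPO" .
  docker tag "$REPO:latest" "$IMAGE"
  docker push "$IMAGE"
}
```

The Lambda can then be pointed at the image, e.g. with `aws lambda create-function --package-type Image --code ImageUri=$IMAGE ...` plus a function name and execution role.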