# general
So two questions here: 1. Do I need to use the CLI to update the lambda if the source code hash has changed, or will AWS do it for me? 2. Is there a Pulumi method to define the source code hash (in other words, create the hash for me)? The answer to 2 is no, but I think pretty much all of the languages we support (aside from YAML) have a built-in method to do the base64 SHA hashing. The answer to 1 is "I don't know, but I'm finding out now".
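For reference, a minimal sketch of that built-in hashing in TypeScript, using Node's standard `crypto` and `fs` modules (the `./layer.zip` path is just a placeholder): Lambda's `sourceCodeHash` expects the base64-encoded SHA-256 digest of the package file.
```typescript
import fs from "fs";
import crypto from "crypto";

// Compute the base64-encoded SHA-256 digest of a file, which is the
// format Lambda expects for the sourceCodeHash input.
function base64Sha256(filePath: string): string {
    const contents = fs.readFileSync(filePath);
    return crypto.createHash("sha256").update(contents).digest("base64");
}

// Example: hash a pre-built layer archive (path is hypothetical).
console.log(base64Sha256("./layer.zip"));
```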
Done a bit of investigating, and yes: if you're uploading to S3, you'll need a way of triggering a new layer version to be created when the layer code changes. There are two ways you can do this:
1. Use the `sourceCodeHash` input of the layer. If you use the Pulumi `aws.s3.BucketObject` resource, this emits a source code hash that you can use. However, it seems that even without the code changing, the source code hash output keeps changing, so you'll get a new layer version every time you run an update.
2. Version the S3 bucket and then pass the `versionId` output from the bucket object into the `s3ObjectVersion` input of the lambda layer.
If you choose to use method 2, you'll end up with something like this:
```typescript
import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";

// Enable versioning on the bucket so every upload of the layer code
// produces a new object version (and therefore a new versionId).
const bucket = new aws.s3.Bucket("lambdaBucket", {
    versioning: {
        enabled: true
    }
});

// Zip up the ./lambda directory and upload it as a single object.
const bucketObject = new aws.s3.BucketObject("lambdacode", {
    source: new pulumi.asset.AssetArchive({
        ".": new pulumi.asset.FileArchive("./lambda")
    }),
    bucket: bucket
});

// Passing the object's versionId as s3ObjectVersion means a new layer
// version is only published when the uploaded object actually changes.
const lambdaLayer = new aws.lambda.LayerVersion("layer", {
    layerName: "piers-test",
    s3Bucket: bucket.bucket,
    s3Key: bucketObject.key,
    s3ObjectVersion: bucketObject.versionId,
});
```
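Continuing the example above, a layer like this would then be attached to a function via its ARN. A minimal sketch (the role ARN, handler, and `./handler` path here are hypothetical placeholders, not from the thread):
```typescript
// Attach the layer to a function; Pulumi tracks the layer ARN as a
// dependency, so publishing a new layer version updates the function.
const fn = new aws.lambda.Function("handler", {
    runtime: "nodejs18.x",
    handler: "index.handler",
    role: "arn:aws:iam::123456789012:role/lambda-exec", // hypothetical role ARN
    code: new pulumi.asset.AssetArchive({
        ".": new pulumi.asset.FileArchive("./handler"),
    }),
    layers: [lambdaLayer.arn],
});
```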
@brave-planet-10645 thanks for this, I think I've come up with a decent alternative solution. Using zx, I can create a pre-Pulumi script that gets a list of the files changed in the most recent commit, scans to see if the file paths contain any layer files, and if so updates a JSON file containing hashes representing each layer. I can then pull the hashes from this file when creating my layers in Pulumi, and that way it should only update the code when necessary. It looks something like this, using tsx in conjunction with zx:
```typescript
#!/usr/bin/env tsx
import { $ } from "zx";
import fs from "fs";
import path from "path";
import crypto from "crypto";
import {
    getChildDirectories,
    srcLayerPath,
    hashFilePath
} from "@lambda/script/helper";
import { rootDir } from "@root/root-dir";

async function updateHashes() {
    // List the files touched by the most recent commit.
    const output = await $`git diff --name-only HEAD~1`;
    const updated = output.stdout.split("\n");
    const layers = getChildDirectories(srcLayerPath);
    const hashFile = fs.readFileSync(hashFilePath, "utf8");
    const hashes = JSON.parse(hashFile);

    for (const layer of layers) {
        const relPath = path.relative(rootDir, `${srcLayerPath}/${layer}`);

        // If any changed file lives under this layer's directory,
        // record a fresh hash for the layer.
        if (updated.find((file) => file.startsWith(relPath))) {
            hashes[layer] = createHash();
        }
    }

    fs.writeFileSync(hashFilePath, JSON.stringify(hashes));
}

function createHash() {
    // Hashes the current timestamp, so this is effectively a random
    // marker rather than a digest of the layer's contents.
    return crypto.createHash("sha256").update(Date.now().toString()).digest("base64");
}

updateHashes();
```
Actually, I believe I may have to modify this approach, as I think the hash needs to be generated from the zip file rather than just being random.
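A minimal sketch of that modification, assuming each layer is built into an archive at a path like `dist/layers/<layer>.zip` (that path is a guess, not from the thread): hash the zip's bytes instead of the timestamp, so the value only changes when the packaged code does.
```typescript
import fs from "fs";
import crypto from "crypto";

// Base64-encoded SHA-256 of the built layer archive; this matches the
// format Lambda expects for sourceCodeHash and stays stable across runs
// as long as the zip itself doesn't change.
function createHash(zipPath: string): string {
    const zipBytes = fs.readFileSync(zipPath);
    return crypto.createHash("sha256").update(zipBytes).digest("base64");
}

// Usage inside the loop above (the dist path is hypothetical):
// hashes[layer] = createHash(`dist/layers/${layer}.zip`);
```
One caveat: zip archives embed file timestamps, so rebuilding the zip from unchanged sources can still produce a different hash unless the build zeroes those timestamps out.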