# aws
b
Does anyone else struggle with the `sourceCodeHash` of AWS Lambda? Earlier I posted a Python function that replicates the `filebase64sha256` function of Terraform, so that I can use `sourceCodeHash` to know whether my Lambda (or its Layer) actually needs to be updated by Pulumi or not.

Today I realised that my weeks of struggling are due to zip files being non-deterministic: if you zip the same file twice, you get two different checksums 😞 Making a zip deterministic is difficult because of the metadata included in it, so I found that the only reliable way to checksum a zip archive is to combine the CRC of each file, and I wrote a function that does exactly that (I called it `zipbase64sha256`).

Unfortunately, every time I use my function to populate `sourceCodeHash`, my checksum gets overwritten by Pulumi (which I suspect uses the regular, non-deterministic `filebase64sha256` of Terraform). How should I go about making a deterministic checksum, so that if I build my archive again with the same content, Pulumi doesn't nag me to upload it to AWS Lambda?
c
@bright-orange-69401 I don't necessarily have an answer for you, but I can point out a section of code that @gentle-diamond-70147 showed me a while back, which as I understand it is responsible for auto-populating the source code hash: https://github.com/pulumi/pulumi-aws/blob/master/provider/resources.go#L1501
b
Ah yes! Thanks @colossal-ram-89482 I can see in the code you linked:

> We also automatically populate the asset hash property

There lies the great evil 😈 Auto-populating is fine as long as it doesn't overwrite user input, which is exactly what seems to be happening here. I also think that combining the CRC of each file in a zip is a better method than hashing the archive as a whole. This is particularly true for a `Layer`: I once tried to write Pulumi code on a train (before the lockdown) with intermittent 4G, and having to re-upload a 30 MB archive every time was just impossible 😁
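For context, this is roughly how I'm trying to wire it up (a sketch only: the resource names, paths, and runtime are just examples, and `source_code_hash` is the property I expect to respect my value instead of being overwritten):

```python
import pulumi
import pulumi_aws as aws

# zipbase64sha256 is the helper sketched earlier in the thread.
layer_archive = "build/layer.zip"

layer = aws.lambda_.LayerVersion(
    "deps-layer",
    layer_name="deps",
    code=pulumi.FileArchive(layer_archive),
    compatible_runtimes=["python3.9"],
    # Feed the deterministic checksum so an identical rebuild is a no-op
    # and the 30 MB archive doesn't get re-uploaded on every `pulumi up`.
    source_code_hash=zipbase64sha256(layer_archive),
)
```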