# automation-api
l
Given that you'd need the Pulumi binaries, the plugin binaries, and all the libraries, I'd guess that the end result would be too big for a lambda. This might be better suited to containers.
k
At some point I managed to squeeze the PyTorch library into a Lambda, so I doubt size would be an issue. I think it would just be additional hassle to package it, unless there is a Pulumi-supported way to do so.
You are right, the limit is still 250 MB for a custom runtime. I am still hopeful that Pulumi is not on par with numpy/pandas/pytorch (a ~1 GB library).
To rephrase the question: are there any known obstacles that would prevent packaging Pulumi into a container like any other Python program?
l
You would also need a user and a home directory. Do lambdas support that? Honestly, even if this were possible, it's unlikely to be worth it. When the automation-api is changed to not rely on the CLI it might be worth trying, but until then, stick with containers.
BTW Pulumi isn't a Python program. Pulumi programs can be Python programs, but Pulumi itself is a Go program.
k
Thank you, I will make it easy for myself by sticking to a small dedicated server for automation, or containers.
f
We created a custom AWS container image to run our Lambda in and installed the Pulumi CLI in that container: https://docs.aws.amazon.com/lambda/latest/dg/images-create.html
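For reference, a minimal sketch of what the handler side of that setup might look like, assuming the container image already bundles the Pulumi CLI and the needed resource plugins, a backend is configured (e.g. a `PULUMI_ACCESS_TOKEN` environment variable), and Pulumi's writable state is redirected to `/tmp`; the project and stack names here are hypothetical placeholders:

```python
import os
from pulumi import automation as auto

# Lambda's filesystem is read-only except /tmp, so point the Pulumi
# CLI's home/state directories there. This assumes the container image
# already has the CLI and resource plugins installed, and that a
# backend (e.g. PULUMI_ACCESS_TOKEN) is configured for the function.
os.environ["HOME"] = "/tmp"
os.environ["PULUMI_HOME"] = "/tmp/.pulumi"


def pulumi_program():
    # Hypothetical inline program; replace with real resources.
    import pulumi
    pulumi.export("greeting", "hello from lambda")


def handler(event, context):
    # "lambda-automation" / "dev" are placeholder project/stack names.
    stack = auto.create_or_select_stack(
        stack_name="dev",
        project_name="lambda-automation",
        program=pulumi_program,
    )
    result = stack.up(on_output=print)
    return {"outputs": {k: v.value for k, v in result.outputs.items()}}
```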
k
@famous-magician-5742 So there are no fundamental obstacles, and there is at least one success story. Thank you.