# automation-api
a
Short question: for Python, is it possible to have multiple concurrent Pulumi sessions if running them in separate threads? (We currently run a larger set of independent deployment jobs in serial, but would like to run them in parallel if possible...)
l
You can run multiple instances of Pulumi in parallel from your shell. Does that achieve the same result?
a
Currently we use the automation API as it is a lot faster in our case. We have a fair amount of common state that can be handled before we even start the first Pulumi stack. I can make a test for this, but if somebody already knows that this is not possible due to some singletons in the code (or something else), then I will not try. If it is not possible, we will have to start separate shell jobs, but I would like to avoid that if possible.
l
Ah, I see. I'm afraid I don't know! I haven't seen a multithreaded AutomationAPI program. I'm pretty confident that you can do it using a multi-process solution (.exec etc.), but you may be able to do it using threads, too. I'd be keen to hear your results!
l
@echoing-dinner-19531 might know the latest on concurrent Python automation API - I believe we did some work here in the past to improve it.
I will also mention that if you're trying to scale out automation API deployments, you may want to look at Pulumi Deployments, which offers REST APIs for running updates: https://www.pulumi.com/docs/pulumi-cloud/deployments/
a
Thanks Evan. I will read up on that as well.
e
I don't believe Python has been made fully multithread-safe for automation programs yet. But it should be fine to use Python multiprocessing; the engine is parallel-safe, it's just the Python runtime that isn't.
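A minimal sketch of that multiprocessing approach (the project name, stack names, and the inline program below are placeholders, not anything from this thread):

```python
# Hedged sketch: run independent Pulumi stack updates in separate processes.
from multiprocessing import Pool

import pulumi
from pulumi import automation as auto


def pulumi_program():
    # Placeholder inline program; replace with your real resources.
    pulumi.export("greeting", "hello")


def deploy(stack_name: str) -> str:
    # Each worker process gets its own stack and its own engine run,
    # so nothing is shared with the other workers.
    stack = auto.create_or_select_stack(
        stack_name=stack_name,
        project_name="my-project",  # placeholder project name
        program=pulumi_program,
    )
    result = stack.up(on_output=print)
    return f"{stack_name}: {result.summary.result}"


if __name__ == "__main__":
    stacks = ["job-a", "job-b", "job-c"]  # independent deployment jobs
    with Pool(processes=len(stacks)) as pool:
        for line in pool.map(deploy, stacks):
            print(line)
```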
a
Thanks. I assume you mean starting different Python processes for the jobs... The issue is the common state, which will then need to be "done" in all the workers. But if that is the only way out, then so be it.
e
If you use multiprocessing, you should be fine sharing the common state
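For the read-only common state, one option is to build it once in the parent and pass it to each worker; a rough sketch (the state contents and names are made up):

```python
# Hedged sketch: share immutable common state with multiprocessing workers.
from functools import partial
from multiprocessing import Pool


def load_common_state() -> dict:
    # Expensive setup done once in the parent process.
    return {"region": "eu-west-1", "image_tag": "v1.2.3"}


def deploy(stack_name: str, common: dict) -> str:
    # Each worker receives a pickled copy; since the state is read-only,
    # no synchronisation is needed. The Pulumi update would go here.
    return f"{stack_name} deployed with {common['image_tag']}"


if __name__ == "__main__":
    common = load_common_state()
    jobs = ["job-a", "job-b", "job-c"]
    with Pool() as pool:
        print(pool.map(partial(deploy, common=common), jobs))
```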
a
Thanks. I'll have a look.
a
One way to do this in Python is to have an API, for example FastAPI, that accepts your requests and dispatches async tasks, and then have Celery (or similar) run those tasks; you define the concurrency in Celery.
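Roughly what that could look like (broker URL, endpoint, and task body are placeholders; the actual Pulumi update would go inside the task):

```python
# Hedged sketch: FastAPI accepts the request, Celery runs the deployment.
from celery import Celery
from fastapi import FastAPI

app = FastAPI()
celery_app = Celery("deploys", broker="redis://localhost:6379/0")


@celery_app.task
def run_deployment(stack_name: str) -> str:
    # Call the Pulumi automation API for this stack here.
    return f"deployed {stack_name}"


@app.post("/deploy/{stack_name}")
def deploy(stack_name: str):
    # Enqueue the job; the Celery worker's concurrency setting controls
    # how many deployments run at the same time.
    task = run_deployment.delay(stack_name)
    return {"task_id": task.id}
```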
a
I think I will try out the multiprocessing approach after my vacation 🙂
a
any updates on this?
a
We tried out the multiprocessing support first, but ended up using plain processes - via subprocess - as we also wanted to "catch" the output from each process... It works well and there are no conflicts, as the processes run in different Pulumi stacks. For the common state - which is all immutable/read-only - we optimized the API a bit to avoid loading too much data.
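For reference, a minimal sketch of that kind of subprocess setup (the deploy_stack.py script name and its arguments are placeholders):

```python
# Hedged sketch: run each deployment as its own process and capture its output.
import subprocess
from concurrent.futures import ThreadPoolExecutor

STACKS = ["job-a", "job-b", "job-c"]


def run_job(stack_name: str) -> tuple[str, int, str]:
    # Each job is a separate process with its own Pulumi runtime;
    # capture_output lets the parent collect stdout/stderr per stack.
    proc = subprocess.run(
        ["python", "deploy_stack.py", "--stack", stack_name],
        capture_output=True,
        text=True,
    )
    return stack_name, proc.returncode, proc.stdout


if __name__ == "__main__":
    # Threads are fine here: each one just waits on its child process.
    with ThreadPoolExecutor(max_workers=len(STACKS)) as ex:
        for name, code, out in ex.map(run_job, STACKS):
            print(f"{name} exited with {code}\n{out}")
```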
a
thanks