alert-zebra-27114

06/26/2023, 7:48 AM
Short question: for Python, is it possible to have multiple concurrent Pulumi sessions if running them in separate threads? (We currently run a larger set of independent deployment jobs in serial, but would like to run them in parallel if possible...)

little-cartoon-10569

06/26/2023, 8:30 AM
You can run multiple instances of Pulumi in parallel from your shell. Does that achieve the same result?

alert-zebra-27114

06/26/2023, 8:36 AM
Currently we use the automation API as this is a lot faster in our case. We have a fair amount of common state that can be handled before we even start the first Pulumi stack. I can make a test for this, but if somebody already knows that this is not possible due to some singletons in the code (or something else), then I won't try. If it is not possible, we will have to start separate shell jobs, but I would like to avoid that if possible.

little-cartoon-10569

06/26/2023, 8:38 AM
Ah, I see. I'm afraid I don't know! I haven't seen a multithreaded Automation API program. I'm pretty confident that you can do it using a multi-process solution (.exec etc.), but you may be able to do it using threads, too. I'd be keen to hear your results!

lemon-agent-27707

06/26/2023, 3:20 PM
@echoing-dinner-19531 might know the latest on concurrent Python Automation API - I believe we did some work here in the past to improve it.
I will also mention that if you're trying to scale out Automation API deployments, you may consider looking at Pulumi Deployments, which offers REST APIs for running updates https://www.pulumi.com/docs/pulumi-cloud/deployments/

alert-zebra-27114

06/26/2023, 6:31 PM
Thanks Evan. I will read up on that as well.

echoing-dinner-19531

06/28/2023, 8:23 AM
I don't believe Python has been made fully multithread-safe for automation programs yet. But it should be fine to use Python multiprocessing; the engine is parallel-safe, it's just the Python runtime that isn't.

alert-zebra-27114

06/28/2023, 8:26 AM
Thanks. I assume you mean starting different Python processes for the jobs... The issue is the common state, which will then need to be "done" in all the workers. But if that is the only way out, then so be it.

echoing-dinner-19531

06/28/2023, 9:31 AM
If you use multiprocessing, you should be fine sharing the common state.

alert-zebra-27114

06/28/2023, 9:32 AM
Thanks. I'll have a look.

able-thailand-87943

06/29/2023, 3:27 PM
One way to do this in Python is to have an API, for example FastAPI, that accepts your requests and dispatches them as async tasks, then have Celery (or similar) run those tasks, with the concurrency defined in Celery.

alert-zebra-27114

06/30/2023, 6:43 AM
I think I will try out the multiprocessing approach after my vacation 🙂