# python
b
It looks like you're defining two different programs inside the same stack - you can't do that. Pulumi drives towards a desired state. Define the S3 bucket and the DynamoDB table inside the same program, and Pulumi will handle creating them in parallel if there are no dependencies between them.
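For reference, a minimal sketch of a single program holding both resources (resource names and arguments here are illustrative, assuming `pulumi_aws` with an AWS provider already configured for the stack):

```python
# Sketch only: resource names/arguments are illustrative; an AWS
# provider is assumed to be configured for the stack.
import pulumi
import pulumi_aws as aws

# Both resources live in the same program. Pulumi creates them in
# parallel because neither depends on the other.
bucket = aws.s3.Bucket("my-bucket")

table = aws.dynamodb.Table(
    "my-table",
    attributes=[aws.dynamodb.TableAttributeArgs(name="id", type="S")],
    hash_key="id",
    billing_mode="PAY_PER_REQUEST",
)

pulumi.export("bucket_name", bucket.id)
pulumi.export("table_name", table.name)
```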
h
Oh, got it. Thank you!
For my use case, I will be running the code from Airflow DAGs based on a few conditions. Will it be possible to add new resources to an existing stack, if the resource is not already defined in the initial program, using `auto.select_stack`? Or do you have to create a new stack (state file) for each subtask?
b
Stacks are just separate invocations of programs; you'll need a new project and stack for each new Pulumi program you want to run.
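A minimal sketch of that pattern with the Automation API, assuming `pulumi.automation` is available and a Pulumi backend is logged in on the worker; the project/stack names and inline program are illustrative, e.g. one project per Airflow task:

```python
# Sketch only: project/stack names and the inline program are
# illustrative; assumes pulumi.automation and a logged-in backend.
import pulumi
import pulumi_aws as aws
from pulumi import automation as auto


def bucket_program():
    # The Pulumi program for this particular task.
    bucket = aws.s3.Bucket("task-bucket")
    pulumi.export("bucket_name", bucket.id)


# One project + stack (and therefore one state file) per Pulumi
# program, e.g. one per Airflow task that provisions infrastructure.
stack = auto.create_or_select_stack(
    stack_name="dev",
    project_name="bucket-task",
    program=bucket_program,
)
result = stack.up(on_output=print)
print(result.outputs["bucket_name"].value)
```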
h
Okay, thank you, I will test it out.