# python
h
Hello.. I am trying to create an S3 bucket and a DynamoDB table in the same stack, and to run them in parallel since they don't depend on each other. I have used the
auto.create_or_select_stack
function and am running it as 2 separate tasks. I'm noticing that after the DynamoDB table is created, it deletes the S3 bucket. I'm guessing it's because it's replacing the state file.
aws:dynamodb:Table msd-test-dynamo creating
INFO - +  aws:dynamodb:Table msd-test-dynamo created
INFO - -  aws:s3:Bucket msd-test-s3 deleting
INFO - -  aws:s3:Bucket msd-test-s3 deleted
How do I add resources to an existing stack without deleting the existing resources? Also, it would be helpful if there were a best practices doc for state, stack, and project management.
b
it looks like you're defining two different programs inside the same stack, and you can't do that. Pulumi drives toward a desired state, so each run replaces the stack's state with whatever the current program declares. Define the S3 bucket and the DynamoDB table inside the same program, and Pulumi will create them in parallel if there are no dependencies between them
h
oh got it, Thank you!
For my use case, I will be running the code from Airflow DAGs based on a few conditions. Will it be possible to add new resources to an existing stack, if the resource is not already defined in the initial program, using
auto.select_stack
? Or do you have to create a new stack (state file) for each subtask?
b
stacks are just invocations of programs; you'll need a new project and stack for each new Pulumi program you want to run
h
okay, thank you, will test it out..