sparse-optician-70334
09/15/2023, 4:15 PM
```python
import os

import pulumi
from pulumi_aws import s3

# Define the S3 bucket
bucket = s3.Bucket("mybucket")

# Upload each file in a local folder (here "site", a placeholder path)
# as its own BucketObject; Pulumi tracks each resource it creates and
# deletes it again on `pulumi destroy`
objs = {}
for key in os.listdir("site"):
    objs[key] = s3.BucketObject(
        key,
        bucket=bucket.id,
        source=pulumi.FileAsset(os.path.join("site", key)),
    )

# Export the name of the bucket
pulumi.export("BucketName", bucket.id)
```
It suggested this for loop. But that seems unreasonably slow when the bucket holds many objects; it should issue a single bulk "empty the bucket" operation instead.
How can I tear down resources that are no longer needed, including their side effects?
stocky-restaurant-98004
09/15/2023, 5:31 PM
Setting `force_destroy` on the bucket will delete it even if it still has objects in it.
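For example (a minimal sketch; the bucket name is just a placeholder):
```python
import pulumi
from pulumi_aws import s3

# force_destroy tells the provider to empty the bucket (delete all
# objects) before deleting the bucket itself on `pulumi destroy`
bucket = s3.Bucket(
    "mybucket",
    force_destroy=True,
)

pulumi.export("BucketName", bucket.id)
```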
The synced-folder package will give you functionality like `aws s3 sync` and should be a lot faster than a for loop for a large number of bucket objects: https://www.pulumi.com/registry/packages/synced-folder/
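Roughly like this (untested sketch, assuming a local `./site` folder to sync):
```python
import pulumi
import pulumi_synced_folder as synced_folder
from pulumi_aws import s3

bucket = s3.Bucket("mybucket", force_destroy=True)

# Syncs the whole folder into the bucket in one operation,
# instead of creating one BucketObject resource per file
folder = synced_folder.S3BucketFolder(
    "synced-folder",
    path="./site",
    bucket_name=bucket.bucket,
    acl="private",
)
```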
sparse-optician-70334
09/15/2023, 5:50 PM