I'm attempting to use Pulumi to manage a workflow where static assets get pushed into a bucket from a CI build, and I need to copy them into the production bucket (behind CloudFront) with specific cache-control headers for deployment. It's unclear to me what approach to take for copying the objects.

Initially I was thinking of using `aws.s3.getObjects` to enumerate the relevant assets in the build bucket, followed by `aws.s3.getObject` to download each of them, and finally recreating them as new `BucketObjectv2`s with the cache-control headers specified. However, it doesn't look like that will work, because `aws.s3.getObject` is hard-coded to only return the object body when the Content-Type is `text/*` or `application/json`, and I have some images.

I also looked into using `RemoteAsset`, but assets in general seem to be a bit rough around the edges at the moment based on current issues like https://github.com/pulumi/pulumi/issues/6235, and it's unclear to me how to actually use `RemoteAsset` with authentication (if that's even possible). Another approach could be downloading the assets from S3 into a temporary directory and then using `FileAsset`.

Is there a better way to accomplish this? I think things would be easier if I did the initial upload into the build bucket with Pulumi, but I want to avoid going down the wrong path here.
If anybody is interested: I ended up constructing an S3 client with the AWS SDK, downloading the objects to a temporary directory, and re-uploading them with `FileAsset`.
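Roughly, this is the shape of it (a sketch in TypeScript using `@aws-sdk/client-s3`; the bucket names, temp-file naming, and cache-control value are placeholders, and it doesn't handle `ListObjectsV2` pagination beyond 1000 keys):

```typescript
import * as aws from "@pulumi/aws";
import * as pulumi from "@pulumi/pulumi";
import { GetObjectCommand, ListObjectsV2Command, S3Client } from "@aws-sdk/client-s3";
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Placeholder bucket names.
const buildBucket = "my-build-bucket";
const prodBucket = "my-prod-bucket";

export = async () => {
    const s3 = new S3Client({});
    const tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), "build-assets-"));

    // List the objects in the build bucket.
    const listing = await s3.send(new ListObjectsV2Command({ Bucket: buildBucket }));

    for (const entry of listing.Contents ?? []) {
        const key = entry.Key!;

        // Download the object to a local temp file so it can be wrapped in a FileAsset.
        const obj = await s3.send(new GetObjectCommand({ Bucket: buildBucket, Key: key }));
        const localPath = path.join(tmpDir, key.replace(/[^A-Za-z0-9._-]/g, "_"));
        fs.writeFileSync(localPath, Buffer.from(await obj.Body!.transformToByteArray()));

        // Re-upload into the production bucket with the desired cache-control header.
        new aws.s3.BucketObjectv2(key.replace(/[^A-Za-z0-9._-]/g, "-"), {
            bucket: prodBucket,
            key,
            source: new pulumi.asset.FileAsset(localPath),
            contentType: obj.ContentType,
            cacheControl: "public, max-age=31536000, immutable",
        });
    }
};
```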