# general
m
How are people uploading static sites [to s3, but it's a pretty general problem] via pulumi without it costing thousands of credits? The documented examples all use a BucketObject resource per file. I'm hoping someone has shared a component resource which does an s3 sync under the hood.
q
have you considered using Docker?
b
I can't find it now, unfortunately, but a community user wrote a dynamic provider that would "throw files in s3" as a single resource. I'll try to track it down
m
Similarly to the script idea mentioned, we leverage the aws CLI, because the sync command is great:
aws s3 sync ./static-site s3://$(pulumi stack output contentFilesBucketId) --delete
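For reference, a minimal sketch of the Pulumi-side export that command relies on; the bucket resource name here is illustrative, only the contentFilesBucketId export name comes from the command above:

import * as aws from "@pulumi/aws";

// Bucket holding the static site's files (name is illustrative).
const contentFiles = new aws.s3.Bucket("content-files");

// Exported so the CLI script can resolve it via `pulumi stack output`.
export const contentFilesBucketId = contentFiles.id;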
m
You could also try leveraging the command provider, so you can use aws s3 sync as one resource within pulumi
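A minimal sketch of that approach, assuming @pulumi/command's local.Command and a ./static-site directory; the triggers wiring is one way (not the only way) to force re-syncs:

import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";
import * as command from "@pulumi/command";

const bucket = new aws.s3.Bucket("content-files");

// A single resource that shells out to the aws CLI on create; triggers
// forces a replacement (and hence a re-sync) when the bucket name
// changes. Hashing ./static-site would be more robust in practice.
const sync = new command.local.Command("sync-site", {
    create: pulumi.interpolate`aws s3 sync ./static-site s3://${bucket.bucket} --delete`,
    triggers: [bucket.bucket],
});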
g
Ah, @billowy-army-68599, I think I found the one you're talking about: https://www.npmjs.com/package/@wanews/pulumi-throw-files-in-s3
That being said, I also mentioned in the #aws thread that I've heard of (but haven't seen code for) a solution that uploads a file archive and uses Glue to unzip it.
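For anyone who wants the single-resource behaviour without pulling in that package, here's a minimal sketch of the same dynamic-provider idea, shelling out to the aws CLI. All names are illustrative, not the package's actual API:

import * as pulumi from "@pulumi/pulumi";
import { execSync } from "child_process";

// Illustrative provider: create and update both just run `aws s3 sync`.
const syncProvider: pulumi.dynamic.ResourceProvider = {
    async create(inputs) {
        execSync(`aws s3 sync ${inputs.sourceDir} s3://${inputs.bucket} --delete`);
        return { id: `${inputs.bucket}-sync`, outs: inputs };
    },
    async update(_id, _olds, news) {
        execSync(`aws s3 sync ${news.sourceDir} s3://${news.bucket} --delete`);
        return { outs: news };
    },
};

class S3Sync extends pulumi.dynamic.Resource {
    constructor(
        name: string,
        args: { sourceDir: pulumi.Input<string>; bucket: pulumi.Input<string> },
        opts?: pulumi.CustomResourceOptions,
    ) {
        super(syncProvider, name, args, opts);
    }
}

// Usage: one resource for the whole site.
const site = new S3Sync("static-site", { sourceDir: "./static-site", bucket: "my-bucket" });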
b
that's the one! great find!
m
Great, thanks all 🙂
Is there a convenient way I can expose my current pulumi session credentials to a pulumi.Command somewhat idiomatically?
I.e. I use assumeRole in the aws provider config, so the CLI usage of aws s3 sync would need to assume that role too
would be nice to re-use the pulumi session rather than create another one
ah, guess I can use the environment arg
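A minimal sketch of that environment route; since there's no public API to extract the provider's own session, this re-assumes the role with the AWS SDK's STS client and hands the short-lived credentials to the command (the role ARN and bucket are placeholders):

import * as pulumi from "@pulumi/pulumi";
import * as command from "@pulumi/command";
import { STSClient, AssumeRoleCommand } from "@aws-sdk/client-sts";

// Placeholder: match this to the role in your aws provider's assumeRole config.
const roleArn = "arn:aws:iam::123456789012:role/deployer";

const creds = pulumi
    .output(new STSClient({}).send(new AssumeRoleCommand({
        RoleArn: roleArn,
        RoleSessionName: "pulumi-s3-sync",
    })))
    .apply(resp => resp.Credentials!);

const sync = new command.local.Command("sync-site", {
    create: "aws s3 sync ./static-site s3://my-bucket --delete",
    // Hand the assumed-role credentials to the aws CLI via env vars.
    environment: {
        AWS_ACCESS_KEY_ID: creds.apply(c => c.AccessKeyId!),
        AWS_SECRET_ACCESS_KEY: pulumi.secret(creds.apply(c => c.SecretAccessKey!)),
        AWS_SESSION_TOKEN: pulumi.secret(creds.apply(c => c.SessionToken!)),
    },
});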
g
That would be the route I would go for now. The Command provider is still in preview and may be missing some functionality that would make sense for general use; if you think it's a good feature, it would be great if you'd add an issue on the repo! https://github.com/pulumi/pulumi-command/issues