# aws
c
How do people generally upload their static site files to s3?
m
Two general approaches:
1. Using a helper function to crawl a directory, then a for loop over the results to declare every S3 object in Pulumi.
2. Using the AWS SDK, or the AWS CLI's `aws s3 sync`.
c
For the second one, do you use the local command stuff in Pulumi, or just do it outside?
m
Depends on the use case, but more often, I personally have done the content upload as a separate step from infrastructure. If you integrate it with your Pulumi code, it's more seamless to the user, but it could add more complexity.
`aws s3 sync` is a superior command to even what the AWS SDK provides for file uploading.
l
Don't `FileAsset` and `BucketObject`/`BucketObjectv2` combine to do uploads in a more Pulumi-idiomatic way?
Though not so great for large directories and/or files that are unknown in advance...
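For smaller sites, that pattern might look something like this sketch (the bucket and the `./dist` path are placeholders; the `mime` package is assumed here for setting content types):

```typescript
import * as fs from "fs"
import * as path from "path"
import * as aws from "@pulumi/aws"
import * as pulumi from "@pulumi/pulumi"
import * as mime from "mime"

const siteBucket = new aws.s3.BucketV2("site")

// Recursively walk a local directory, declaring one BucketObjectv2 per file.
function uploadDirectory(dir: string, prefix = "") {
    for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
        const filePath = path.join(dir, entry.name)
        const key = prefix ? `${prefix}/${entry.name}` : entry.name
        if (entry.isDirectory()) {
            uploadDirectory(filePath, key)
        } else {
            new aws.s3.BucketObjectv2(key, {
                bucket: siteBucket.id,
                key,
                // FileAsset hands the file contents to Pulumi, which diffs
                // and uploads them as part of `pulumi up`.
                source: new pulumi.asset.FileAsset(filePath),
                contentType: mime.getType(filePath) ?? undefined,
            })
        }
    }
}

uploadDirectory("./dist")
```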
m
If you're using the Pulumi Service and getting charged per resource managed, that's also something to take into consideration.
💯 1
c
Also, it deletes the previous deployment's files, which means for single-page apps you can end up breaking users who have already loaded the site but still need to load specific chunks when they navigate to a new page.
m
FWIW, `aws s3 sync` with the `--delete` flag would have the same behavior. This might be something you could solve with CloudFront, which you should probably be using in front of a private S3 bucket instead of a public S3 bucket.
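A rough sketch of that setup, using an Origin Access Identity so the bucket stays private (resource names here are just illustrative):

```typescript
import * as aws from "@pulumi/aws"
import * as pulumi from "@pulumi/pulumi"

const siteBucket = new aws.s3.BucketV2("site")

// An Origin Access Identity lets CloudFront read objects while the bucket
// itself stays private.
const oai = new aws.cloudfront.OriginAccessIdentity("site-oai")

new aws.s3.BucketPolicy("site-policy", {
    bucket: siteBucket.id,
    policy: pulumi.all([siteBucket.arn, oai.iamArn]).apply(([bucketArn, oaiArn]) =>
        JSON.stringify({
            Version: "2012-10-17",
            Statement: [{
                Effect: "Allow",
                Principal: { AWS: oaiArn },
                Action: "s3:GetObject",
                Resource: `${bucketArn}/*`,
            }],
        })),
})

new aws.cloudfront.Distribution("site-cdn", {
    enabled: true,
    defaultRootObject: "index.html",
    origins: [{
        originId: "s3-site-origin",
        domainName: siteBucket.bucketRegionalDomainName,
        s3OriginConfig: { originAccessIdentity: oai.cloudfrontAccessIdentityPath },
    }],
    defaultCacheBehavior: {
        targetOriginId: "s3-site-origin",
        viewerProtocolPolicy: "redirect-to-https",
        allowedMethods: ["GET", "HEAD"],
        cachedMethods: ["GET", "HEAD"],
        forwardedValues: { queryString: false, cookies: { forward: "none" } },
    },
    restrictions: { geoRestriction: { restrictionType: "none" } },
    viewerCertificate: { cloudfrontDefaultCertificate: true },
})
```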
c
Totally, but isn't there always the case where you hit an edge node that doesn't have the file cached?
m
Yeah, I'm not positive on the best approach there.
c
This gives me some ideas anyway. I've not seen any good write-ups on how best to handle static website files; most approaches work but have edge cases.
m
I would think AWS would have some of the best advice in this case.
s
FWIW, I would also use `aws s3 sync`. I often use Make to coordinate between IaC and stuff outside of IaC (e.g. when I used to mix TF with Serverless Framework), so it's a familiar pattern for me. You might also consider the `pulumi-command` provider: https://github.com/pulumi/pulumi-command
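For example, a minimal sketch of that (the bucket name and `./dist` path are placeholders):

```typescript
import { local } from "@pulumi/command"

// Run `aws s3 sync` as a step of `pulumi up` via the pulumi-command provider.
const syncSite = new local.Command("sync-site", {
    // Deliberately omitting --delete: as noted above, deleting old chunks
    // can break single-page-app sessions that are already loaded.
    create: "aws s3 sync ./dist s3://my-site-bucket",
    // A timestamp trigger re-runs the command on every update; hashing the
    // directory contents would be a more precise trigger.
    triggers: [new Date().toISOString()],
})
syncSite.stdout.apply(console.log)
```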
👍 1
c
For reference, I had a dynamic provider which used the AWS SDK to copy files into S3. It recently broke, I think due to upgrading TypeScript or upgrading dependencies. It handled uploading the HTML files last, plus a few other nice things around cache control. This is what I've settled on for now:
```typescript
import * as pulumi from "@pulumi/pulumi"
import { local } from "@pulumi/command"

// `projectRoot` and `consoleWebBucket` are defined elsewhere in the program.

// Hashed assets can be cached forever, so they get long-lived, immutable headers.
const uploadAssetsCommand = pulumi.interpolate`aws s3 cp ${projectRoot}/dist/assets s3://${consoleWebBucket.bucket}/assets --recursive --cache-control public,max-age=31536000,immutable`
uploadAssetsCommand.apply(console.log)
const uploadAssetsCmd = new local.Command(`upload-assets`, {
    create: uploadAssetsCommand,
    // Always trigger
    triggers: [new Date().toISOString()],
})
uploadAssetsCmd.stdout.apply(console.log)
uploadAssetsCmd.stderr.apply(console.log)

// Top-level files (index.html etc.) must never be cached, so clients always
// pick up references to the latest hashed assets.
const uploadToplevelFiles = pulumi.interpolate`aws s3 cp ${projectRoot}/dist s3://${consoleWebBucket.bucket} --cache-control no-store --recursive --exclude "assets/*"`
uploadToplevelFiles.apply(console.log)
const topLevelCmd = new local.Command(`upload-top-level-files`, {
    create: uploadToplevelFiles,
    // Always trigger
    triggers: [new Date().toISOString()],
    // Upload HTML last so it never references assets not yet in the bucket.
}, { dependsOn: [uploadAssetsCmd] })
topLevelCmd.stdout.apply(console.log)
topLevelCmd.stderr.apply(console.log)
```