07/16/2021, 2:43 PM
Is there any way to get a list of files from a ? Alternatively, is there an existing pattern for expanding a zip/tarball archive into an S3 bucket? I have the assets of a website in a tarball and would like to decompress it into an S3 bucket that's serving our web content (that is, I do not want to simply upload the tarball as a single bucket object). Thanks in advance.


07/16/2021, 2:49 PM
Not exactly what you’re asking for … but maybe a partial solution:
```typescript
import * as fs from "fs";
import * as path from "path";
import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";
import * as mime from "mime";

// Assumes a bucket defined elsewhere in the program, e.g.:
// const staticSiteBucket = new aws.s3.Bucket("static-site");

const scanToS3 = (dirPath: string, options?: { trimDirPath?: boolean; ignoreTypes?: RegExp | string }): void => {
  for (const item of fs.readdirSync(dirPath)) {
    const itemDirPath = path.join(dirPath, item);
    let itemKey = itemDirPath;
    if (options?.trimDirPath) {
      // Strip the root folder (e.g. "./dist" -> "dist") from the object key.
      const dirPathArr = dirPath.split("/");
      const rootFolder = dirPathArr[0] === "." ? dirPathArr[1] : dirPathArr[0];
      itemKey = itemDirPath.replace(new RegExp(`^${rootFolder}`), "");
    }
    if (fs.statSync(itemDirPath).isFile()) {
      // Upload unless the file matches the optional ignore pattern.
      if (!options?.ignoreTypes || !item.match(options.ignoreTypes)) {
        new aws.s3.BucketObject(itemDirPath, {
          bucket: staticSiteBucket,
          source: new pulumi.asset.FileAsset(itemDirPath),
          key: itemKey,
          contentType: mime.getType(itemDirPath) || undefined,
        });
      }
    } else {
      // Recurse into subdirectories.
      scanToS3(itemDirPath, options);
    }
  }
};

scanToS3("./dist/", { trimDirPath: true });
```
You’d have to unzip the file first


07/16/2021, 2:51 PM
@millions-furniture-75402 Yeah, that's the path I think I'm going to have to go down