icy-fall-96489
03/26/2025, 2:01 PM
boto3 client to execute a task? I may be going about this the wrong way (long-time SWE, though I've only been doing DevOps for a bit under a year), so I welcome any feedback.
I am building a CI/CD pipeline with AWS CodeCommit, CodeBuild, and CodePipeline. CodePipeline requires an S3 bucket to hold the artifacts (i.e. the code associated with a PR being merged into Main). The problem is that nothing cleans up that bucket, and a standard lifecycle policy is too coarse: I need to keep the last N artifacts (N varies by team) and delete the rest.
My choices are to do this via a Lambda or via the CI/CD process itself; I'm hoping to do it as part of the pipeline and not use a Lambda. Ordinarily, I would just write the required functions using boto3 to verify the bucket exists, get the list of objects ordered by creation date, and delete all but the last N. My concern is that this seems like a workaround when all the other actions use Pulumi resources/objects to do the work. When I found out about Dynamic Providers, it seemed like the more correct way to integrate boto3.
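[Editor's note: a minimal sketch of the cleanup logic described above, outside Pulumi. The bucket name and keep count are placeholders; the selection step is a pure function so it can be exercised without AWS credentials.]

```python
from datetime import datetime, timezone


def keys_to_delete(objects, keep_last):
    """Given [{'Key': ..., 'LastModified': datetime}, ...] (the shape
    s3.list_objects_v2 returns under 'Contents'), return the keys of
    everything except the newest `keep_last` objects."""
    ordered = sorted(objects, key=lambda o: o["LastModified"], reverse=True)
    return [o["Key"] for o in ordered[keep_last:]]


def prune_artifacts(bucket, keep_last):
    """Delete all but the newest `keep_last` artifacts in `bucket`.
    Requires AWS credentials; boto3 is imported lazily so the pure
    selection logic above can be tested without it."""
    import boto3

    s3 = boto3.client("s3")
    s3.head_bucket(Bucket=bucket)  # raises ClientError if the bucket is missing
    objects = []
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket):
        objects.extend(page.get("Contents", []))
    doomed = keys_to_delete(objects, keep_last)
    # delete_objects accepts at most 1000 keys per call
    for i in range(0, len(doomed), 1000):
        s3.delete_objects(
            Bucket=bucket,
            Delete={"Objects": [{"Key": k} for k in doomed[i : i + 1000]]},
        )
```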
So far I have been unsuccessful. I haven't been able to even just return the results of an s3.head_bucket call. I am not a strong Python programmer (I started using Python when I moved to DevOps, since it is the company standard), so I could be doing things wrong, but I can't figure out how to see what the dynamic provider is doing.
I've tried logging (the log files are blank), even adding an atexit.register(flush_logs) call to my code. I know the dynamic provider runs in a subprocess, but most of what I've googled points to the same way of adding a logger to a subprocess: at the top of the file, add the standard logger = logging.getLogger(...) boilerplate and a FileHandler if desired.
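[Editor's note: for reference, a minimal dynamic provider of the shape being described. All names are hypothetical, and this is a Pulumi stack fragment, not a standalone script. It uses pulumi.log rather than the logging module; per the follow-up below, those messages only surface when running with engine verbosity flags such as `pulumi preview --logtostderr -v=9`.]

```python
# Sketch of a dynamic provider that checks a bucket exists.
# Hypothetical names throughout; runs only inside a Pulumi program.
import pulumi
from pulumi import dynamic


class BucketCheckProvider(dynamic.ResourceProvider):
    def create(self, props):
        # The provider is serialized and run in a separate process,
        # so import boto3 inside the method rather than at module top.
        import boto3

        s3 = boto3.client("s3")
        s3.head_bucket(Bucket=props["bucket"])  # raises if the bucket is missing
        # Visible under `pulumi preview --logtostderr -v=9`
        pulumi.log.info(f"bucket {props['bucket']} exists")
        return dynamic.CreateResult(id_=props["bucket"], outs=props)


class BucketCheck(dynamic.Resource):
    def __init__(self, name, bucket, opts=None):
        super().__init__(BucketCheckProvider(), name, {"bucket": bucket}, opts)
```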
I'm happy to add some code to this if it helps, but I'm hoping someone can point me to an example or something other than the Pulumi doc examples. I've done several Pulumi projects (basics like creating an S3 state bucket, deploying EKS, Helm charts, EC2, IAM, ECR), but the dynamic provider just isn't clicking for me.

icy-fall-96489
03/26/2025, 2:28 PM
pulumi preview --logtostderr -v=9 > pulumi-debug.log 2>&1 does output the pulumi.log.info messages to the specified file. pulumi preview alone does not output anything to the file configured in the code, nor to the console.

wonderful-umbrella-73154
03/26/2025, 2:36 PM
Command provider? I use that all the time to run simple aws cli commands during stack execution.
https://www.pulumi.com/registry/packages/command/api-docs/local/command/

wonderful-umbrella-73154
03/26/2025, 2:37 PM
stdout and stderr as Outputs of running the Command, for simple-ish use cases

icy-fall-96489
03/26/2025, 3:03 PM

wonderful-umbrella-73154
03/26/2025, 3:04 PM
aws s3api head-bucket --bucket mybucketname could be one to start playing with

wonderful-umbrella-73154
03/26/2025, 3:05 PM
.stdout as one of the outputs (things that will be available after it executes)

icy-fall-96489
03/26/2025, 3:05 PM

wonderful-umbrella-73154
03/26/2025, 3:05 PM
Outputs and all the async stuff going on, so that you can reference them correctly

icy-fall-96489
03/26/2025, 3:09 PM
.apply(lambda ...) and pulumi.Output.all(something).apply(lambda ...). Async in Python is still nonsensical to me (it was so much easier to implement in C#).

icy-fall-96489
03/26/2025, 3:11 PM
command road and see what I find. Thanks for the info.

wonderful-umbrella-73154
03/26/2025, 3:11 PM

wonderful-umbrella-73154
03/26/2025, 3:12 PM
Command. It's a great choice for quickly filling in tiny gaps, or for something custom that the main Provider libraries may not cover.
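[Editor's note: a sketch of how the head-bucket suggestion above might look with the Command provider, consuming the stdout Output with .apply as discussed. Resource and bucket names are placeholders; this is a Pulumi stack fragment, not a standalone script.]

```python
# Run `aws s3api head-bucket` during stack execution via the Command
# provider. Names and bucket are placeholders.
import pulumi
from pulumi_command import local

check = local.Command(
    "check-artifact-bucket",
    create="aws s3api head-bucket --bucket my-artifact-bucket",
)

# stdout/stderr are Outputs: transform them with .apply once they resolve.
pulumi.export(
    "bucket_check",
    check.stdout.apply(lambda out: out or "head-bucket printed no output"),
)
```

Note that head-bucket prints nothing on success; if the bucket is missing, the CLI exits non-zero and the Command resource itself fails, which is the real signal here.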