👋 How do you make `pulumi_command.local` to b...
# general
f
👋 How do you make `pulumi_command.local` depend on a `pulumi.Output`? Imagine that I have two Ansible playbooks `pl1` and `pl2` that I need to run in sequential order (they can't be part of the same command), and therefore I need `pl2` to not start before `pl1` finishes. What would be the dependency tie-up between the two? Currently I do a hacky job of creating a function that runs `pl1` and returns a bool `pulumi.Output`, and then I use this output in the `environment` as a dependency:
```python
from typing import List

import pulumi
from pulumi_command import local


def deploy(depends_on: List[pulumi.Output]):
    local.Command(
        "kubeone apply",
        local.CommandArgs(
            create="kubeone apply -y -m kubeone.yaml -c credentials.yaml && cp *-kubeconfig ~/.kube",
            environment={
                # NOTE: f-string interpolation does not resolve an Output's value;
                # the real ordering comes from passing the Outputs via `triggers`.
                "PULUMI_DEPENDS_ON": f"{depends_on[0]} {depends_on[1]}",
            },
            triggers=depends_on,
        ),
    )
```
but this feels like a square peg in a round hole
e
Have you looked at ResourceOptions and tried just setting dependsOn: https://www.pulumi.com/docs/concepts/options/dependson/ ?
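For reference, a minimal sketch of what that would look like in Python (the resource names and playbook commands here are made up):
```python
import pulumi
from pulumi_command import local

# "first" must finish before "second" starts.
first = local.Command("first", create="ansible-playbook pl1.yaml")

second = local.Command(
    "second",
    create="ansible-playbook pl2.yaml",
    # depends_on creates an explicit ordering between the two commands.
    opts=pulumi.ResourceOptions(depends_on=[first]),
)
```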
f
@echoing-dinner-19531, I sure did, and I apologize for a poorly explained issue. Whereas I can get a resource object out of a local command (`c = local.Command(args)`), I am looking for a way to return the same for a task that doesn't use a resource at all. Imagine that I generate a YAML file using outputs and native language libs, so I don't interact with a Pulumi-managed resource; the question then is how to make the next step (which is a `local.Command`) depend on that function's completion.
Maybe an example would explain it better than I do. I am templating a file within the Pulumi program:
```python
import os

import pulumi
from jinja2 import Environment, FileSystemLoader


def render_credentials(subnet_id: pulumi.Output):
    def render_creds(subnet_id):
        # Render credentials.yaml from a Jinja2 template once the subnet ID is known.
        env = Environment(loader=FileSystemLoader("."))

        template = env.get_template("infra_modules/credentials_dev.yaml.j2")

        with open("credentials.yaml", "w") as f:
            f.write(
                template.render(
                    os.environ,
                    subnet_id=subnet_id,
                )
            )
        return True

    return subnet_id.apply(render_creds)
```
Then, in order to make sure that my next steps will depend on this step, I return an output:
```python
creds_done = kubeone_cluster.render_credentials(subnet_id=sub.id)

cluster = kubeone_cluster.deploy(depends_on=[yaml_done, creds_done])
```
But maybe I am rubbing it the wrong way.
e
oh right, yeah, doing something like you've done is probably the simplest way for now. Just make the resource depend on that Output value somehow. You can "fake" a dependency with something like `pulumi.Output.all(a, b).apply(lambda args: args[0])` to make an output that depends on b but doesn't use its value in any way.
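For example, a sketch of how that trick could gate a command on the earlier `render_credentials` step (the function name and command here are illustrative, not from the thread):
```python
import pulumi
from pulumi_command import local


def deploy_after(subnet_id: pulumi.Output, creds_done: pulumi.Output) -> local.Command:
    # "Fake" dependency: the combined Output carries the dependencies of both
    # subnet_id and creds_done, but its value is just subnet_id.
    gated_subnet_id = pulumi.Output.all(subnet_id, creds_done).apply(lambda args: args[0])

    return local.Command(
        "kubeone-apply",
        create="kubeone apply -y -m kubeone.yaml -c credentials.yaml",
        # Because this input depends on creds_done, the command is not created
        # until render_creds has finished writing credentials.yaml.
        environment={"SUBNET_ID": gated_subnet_id},
    )
```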
f
nice, this is a very useful trick, I have to try it
Wanted to pick this up again 😃 Not sure I understand what happens in that `Output.all` function. Can someone clarify? Thanks!
e
The dependencies of an Output returned by the `all` method are the union of the dependencies of all the outputs passed into it, so this tracks the dependencies of both a and b. The apply then doesn't change the dependencies but takes only the value of a.
f
@echoing-dinner-19531 will the apply func execute during a preview though? E.g. if I have two steps that I need to execute sequentially: 1. step1 creates a file, 2. step2 reads that file. I want step2 to not try reading the file until step1 is finished. I was trying to use `Output.all(...).apply(...)` for that, but to no avail at this point.
e
apply will only run in a preview if all the inputs are known. For example the ID of a bucket object that needs to be created can't be known at preview time, so any applies off it won't run. But in terms of sequentialising things it should be fine to use output apply to do that.
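A sketch of that sequential pattern along those lines (the file path and commands are made up):
```python
import pulumi
from pulumi_command import local

# step1 writes the file; its stdout is an Output that resolves once it has run.
step1 = local.Command("step1", create="echo hello > /tmp/example.txt")

# step2's create string is derived from step1's stdout Output, so step2 is not
# created (and the file is not read) until step1 has finished.
step2 = local.Command(
    "step2",
    create=step1.stdout.apply(lambda _: "cat /tmp/example.txt"),
)
```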
f
then I might not have mastered my artificial dependencies yet. Will poke around more