# python
i
QQ: I'm still playing with this, but I think it would be powerful if `Resource` could be made Awaitable, roughly:
```python
class Resource:
    def __await__(self):
        return self.id.future().__await__()
```
This allows me to manage my resources in async more conveniently, including doing the I/O to package my series of lambda src code. Wdyt? Obviously we'd need an async version of `register_resource()`, and externalizing the loop may feel icky.
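To make the idea above concrete, here is a minimal runnable sketch (toy names, not the Pulumi SDK) of a `Resource` whose `__await__` delegates to an underlying future, so that `await resource` yields the resolved id:

```python
import asyncio

# Toy model (assumed names, not the real SDK): a Resource whose id
# resolves asynchronously. __await__ delegates to the underlying future,
# so "await res" yields the resolved id.
class Resource:
    def __init__(self, loop):
        self._id_future = loop.create_future()

    def resolve(self, value):
        self._id_future.set_result(value)

    def __await__(self):
        return self._id_future.__await__()

async def main():
    loop = asyncio.get_running_loop()
    res = Resource(loop)
    # Simulate the engine resolving the id on a later tick.
    loop.call_soon(res.resolve, "i-12345")
    return await res

print(asyncio.run(main()))  # i-12345
```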
w
I’d be really interested to see a more complete code example of how you would expect to use this. Note that although we use async internally, we expose a data-flow based programming model on top of that using `Output` and `apply`. Our goal is that you rarely would want/need to be exposed to the asynchrony directly, though it’s always interesting to understand use cases where it may be helpful. There’s also an additional subtlety: we track dependency information through this same data flow, so offering ways to peek behind the abstraction can lead to loss of accurate dependency tracking, which we’re trying to make sure is something you just get for free from writing Pulumi programs naturally.
i
I'm sending you an email with my sample code. Frankly I'm having some issues with `apply`, and the code should show you why. RE lossy dependency metadata: I've been thinking about this as well and understand the need to support the synchronous programming model. I see `__await__` as an additional hook to augment the pre-existing dependency metadata captured as `Output` dependencies. I believe it should enable both the sync and async programming models. I'm sure the devil's in the details. The code I'm sending you should clarify what I meant.
i
What do you mean by this?
> including doing the I/O to package my series of lambda src code
You should be able to pass anything awaitable as an input to a `Resource` - is that what you’d like to do here? Do some sort of I/O asynchronously and then use the data from it for a resource?
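A hedged sketch of that I/O use case, using illustrative names (`read_lambda_source`, `FunctionResource` are not real SDK APIs): do the I/O asynchronously and hand the resulting awaitable to the resource, which awaits it when it needs the value.

```python
import asyncio

# Hypothetical sketch: asynchronously read lambda source (stand-in for
# real file/network I/O) and pass the awaitable as a resource input.
async def read_lambda_source():
    await asyncio.sleep(0)  # stand-in for real async I/O
    return "def handler(event, ctx): ..."

class FunctionResource:
    def __init__(self, code_awaitable):
        # The engine would await this input during resource registration.
        self.code = code_awaitable

async def main():
    fn = FunctionResource(read_lambda_source())
    return await fn.code

print(asyncio.run(main()))  # def handler(event, ctx): ...
```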
i
@incalculable-sundown-82514 thanks. I see now that the I/O use case is possible. The other use case I have is building a multi-step Step Function (SFN) with various lambdas. I find `apply` to be limiting when building the SFN definition that depends on multiple instances of `lambda.arn`.
i
Yeah - in Node, we have a combinator `pulumi.all` which accepts a list of outputs and produces an output of a list. It makes it way easier to compose multiple outputs like that. We should probably include a combinator like that in Python, too, since it’s super useful. That way you could do `pulumi.all(arn1, arn2, arn3).apply(lambda arns: do_something_with_arns(arns))`.
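The combinator described above can be modeled with a small self-contained sketch (a toy `Output`, not the real SDK): `all` gathers several outputs into an output of a list, which can then be transformed with `apply`.

```python
import asyncio

# Toy sketch of the combinator (assumed implementation, not the SDK):
# an Output wraps an awaitable value; apply transforms it; all combines
# several outputs into an output of a list.
class Output:
    def __init__(self, coro):
        self._coro = coro

    def apply(self, fn):
        async def run():
            return fn(await self._coro)
        return Output(run())

    @staticmethod
    def all(*outputs):
        async def gather_all():
            return list(await asyncio.gather(*(o._coro for o in outputs)))
        return Output(gather_all())

async def value(v):
    return v

async def main():
    arn1, arn2, arn3 = (Output(value(f"arn:{i}")) for i in (1, 2, 3))
    combined = Output.all(arn1, arn2, arn3).apply(lambda arns: ",".join(arns))
    return await combined._coro

print(asyncio.run(main()))  # arn:1,arn:2,arn:3
```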
i
Got it. Thanks. But I find it more expressive to be able to define my own coro that would `await` pulumi `Resource`s. Perhaps instead of ending with `asyncio.ensure_future(coro)` we swap in pulumi's coro registration API. This way I'm not limited to the `apply` and `Output` API.
i
It is more expressive, but one of the things Pulumi does is aggressively track dependencies between resources, and it uses `apply` and the asynchronous resolution of output properties to do this. If you were able to `await` a Resource and get its properties directly, you’d be able to defeat our dependency tracking.
Am I understanding your proposal correctly?
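A toy illustration of the tracking point above (assumed representation, not the real SDK): `apply` carries the dependency set along with the value, so the result remembers which resources it came from, whereas a raw `await` that returned the plain value would drop that bookkeeping.

```python
# Toy model: an Output pairs a value with the set of resources it
# depends on; apply transforms the value while inheriting the deps.
class Output:
    def __init__(self, value, deps):
        self.value = value
        self.deps = set(deps)

    def apply(self, fn):
        # The transformed result keeps the same dependency set.
        return Output(fn(self.value), self.deps)

bucket_arn = Output("arn:aws:s3:::my-bucket", deps={"bucket"})
policy = bucket_arn.apply(lambda arn: '{"Resource": "%s"}' % arn)
print(policy.deps)  # {'bucket'}
```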
i
Yes, you got it. I've been thinking about the lossy dependency concerns as well. I wonder if `__await__` on `Resource` can be used as a hook to augment the pre-existing dependency graph - possibly with the help of a custom executor, pulumi decorators, or perhaps the stacktrace (hopefully not). I'm brainstorming, but am just curious if you have plans to bring full async programming to the hands of clients.
@incalculable-sundown-82514 off-hand, do you know when `pulumi.all` will make it into the SDK?
i
Can you open up an issue for it? I can tackle it in the next week or so.
i
@incalculable-sundown-82514, just to close the thread, I was curious whether you guys are planning to bring an async programming model in the future. It sounds like there's no plan for it, either because it's not possible or because it's too much work -- forgive me for not being able to tell which conclusion is more likely.
i
If you’re interested in it, feel free to open an issue for it and we can start discussing designs - I’m afraid I’m having a hard time wrapping my head around the implications of what you’re suggesting, so I think it would be helpful to have a centralized discussion point for this.