# python
can you explain a little more? Where do you need to grab the resources from?
Thanks for the response, @billowy-army-68599. Sure! Here's a simple one: there's an S3 bucket and Glue jobs provisioned/managed by Pulumi TS. In order for the DAG task (which is all in Python) to call on and use the TS resources, it must know those resources by *name*, a priori. There's just no way to do this during deployment. It seems I must first name those TS resources and then literally copy/paste that string into the DAG task configuration. Like I said, this kind of thing can easily be hard-coded, but that'll get unwieldy very quickly 😅
As Airflow DAGs and Pulumi programs never execute in the same environment, one possible workaround is to store the names in an external store so the DAG can fetch them at runtime.
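For example, here's a minimal sketch of the DAG side, assuming the Pulumi TS program publishes the generated names to AWS SSM Parameter Store (e.g. via an `aws.ssm.Parameter` resource) under a hypothetical key like `/data-platform/raw-bucket-name`, and that the Airflow workers have boto3 installed plus IAM permission to read the parameter:

```python
# Sketch only: the parameter key and IAM setup are assumptions, not from this thread.
import boto3

def get_resource_name(param_name: str) -> str:
    """Fetch a Pulumi-managed resource name from SSM at DAG runtime."""
    ssm = boto3.client("ssm")
    return ssm.get_parameter(Name=param_name)["Parameter"]["Value"]

# Inside a DAG task: resolve the bucket name at runtime instead of
# copy/pasting the string Pulumi generated.
bucket_name = get_resource_name("/data-platform/raw-bucket-name")
```

This keeps the Pulumi program as the single source of truth for names: if a resource gets renamed, only the parameter value changes, not every DAG that references it.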