modern-napkin-96707
03/17/2021, 3:05 PM
I can run python my-dataflow-script.py --template_location '<gs://my-template-bucket/my-dataflow-template>', or I can hardcode the template_location in the script itself and just run python my-dataflow-script.py, which then packages the Beam application as a template to be run in Dataflow.
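For reference, a minimal sketch of what such a templating script might look like; the trivial pipeline body and the option handling below are illustrative, not the actual script from this thread:

# my-dataflow-script.py (illustrative sketch)
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # PipelineOptions picks up --template_location (and the other Dataflow
    # flags) from sys.argv when the script is run from the command line.
    options = PipelineOptions()

    # Hardcoded alternative to passing --template_location as a flag:
    # from apache_beam.options.pipeline_options import GoogleCloudOptions
    # options.view_as(GoogleCloudOptions).template_location = (
    #     'gs://my-template-bucket/my-dataflow-template')

    # With DataflowRunner and template_location set, running the pipeline
    # stages it as a template instead of executing it immediately.
    with beam.Pipeline(options=options) as p:
        (p
         | 'Create' >> beam.Create(['hello', 'world'])
         | 'Print' >> beam.Map(print))


if __name__ == '__main__':
    run()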
I’ve tried calling the Dataflow script from another Python script using exec(open('my-dataflow-script.py').read()), which works, but trying the same thing in Pulumi’s __main__.py fails with:
TypeError: cannot pickle 'TaskStepMethWrapper' object
I guess apache_beam tries to pickle the whole Pulumi program, or does something else that probably doesn’t make sense.
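A bare-bones sketch of that failing arrangement; the surrounding Pulumi code here is a placeholder, not the actual program:

# __main__.py (illustrative)
import pulumi

# ... ordinary Pulumi resource declarations would go here ...

# Running the Beam script inline executes apache_beam in the same interpreter
# and globals as the Pulumi program, so when Beam pickles the pipeline's main
# session / closures it can pick up Pulumi/asyncio runtime objects, which is
# presumably where the "cannot pickle 'TaskStepMethWrapper' object" comes from.
exec(open('my-dataflow-script.py').read())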
Any experience with Dataflow + Pulumi and getting this working?

gentle-diamond-70147
03/17/2021, 3:47 PM
Are you trying to (A) call my-dataflow-script.py from your Pulumi application, or are you trying to (B) call your Pulumi application from my-dataflow-script.py?
(A) is supported. The best way would be to create a Dynamic Provider to call your script at the right "event" (create, update, etc) - https://www.pulumi.com/docs/intro/concepts/resources/#dynamicproviders.
(B) is not supported, as the pulumi cli (engine) must be the invocation point of your Pulumi application.
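A rough sketch of what (A) could look like: a dynamic provider whose create step shells out to the templating script, so apache_beam runs in its own process and never touches the Pulumi program's objects. The resource/provider names, the subprocess call, and the lack of update/delete handling are illustrative assumptions, not a confirmed implementation.

# dataflow_template.py (illustrative sketch of option A)
from pulumi.dynamic import CreateResult, Resource, ResourceProvider


class DataflowTemplateProvider(ResourceProvider):
    def create(self, props):
        # Import inside the method so the pickled provider doesn't rely on
        # module-level state from the Pulumi program.
        import subprocess

        # Stage the Dataflow template in a separate process.
        subprocess.run(
            ["python", props["script"],
             "--template_location", props["template_location"]],
            check=True,
        )
        return CreateResult(id_=props["template_location"], outs=props)


class DataflowTemplate(Resource):
    def __init__(self, name, script, template_location, opts=None):
        super().__init__(
            DataflowTemplateProvider(),
            name,
            {"script": script, "template_location": template_location},
            opts,
        )


# Then, in __main__.py:
# DataflowTemplate(
#     "my-dataflow-template",
#     script="my-dataflow-script.py",
#     template_location="gs://my-template-bucket/my-dataflow-template",
# )

In practice you would probably also implement diff/update so the template gets re-staged when the script changes, but the create hook above is the core of the pattern.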