# python
e
@red-match-15116 - would really value your input on @full-artist-27215 & @purple-appointment-84502's feedback above. Looks like this has been raised before, as far back as 2018, even by Joe himself (see https://github.com/pulumi/pulumi/issues/3635, https://github.com/pulumi/pulumi/issues/1641), but has yet to be resolved. Is there any mechanism allowing pulumi to be invoked from (or for automation's `work_dir` param to be set to) a higher-level directory than the project dir, e.g. a directory further up the hierarchy from the one containing the yaml and `__main__.py` files? That way common code can be shared between multiple pulumi projects as sibling or parent modules/packages and imported in `__main__.py`, and python would resolve them (as they'd naturally be included in `sys.path`). Or is there a different, more obvious solution or workaround that I'm missing?
r
e
Sure that works, but that imported module is within the project dir itself
The problem is trying to import a module or package that's shared across multiple pulumi projects in a monorepo. In this case you want them defined somewhere else in the tree and to run python from a common ancestor.
Our structure is what @purple-appointment-84502 specified:
r
Ah, got it. Thanks for clarifying.
So your resources are defined in components?
👍 1
e
where `deploy/__main__.py` deploys the au and us infrastructure projects via pulumi automation. We call `auto.select_stack`, passing in `work_dir` as `infra/us`.
Yeah, `infra/us/__main__.py` has `from infra.components import Vpc`
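For reference, a minimal sketch of that automation call under the layout described (`deploy/__main__.py` driving the `infra/us` project); the stack name is hypothetical:
```python
# deploy/__main__.py (sketch)
from pulumi import automation as auto

# Select the existing stack for the us project, pointing the workspace at
# the project directory that holds Pulumi.yaml and __main__.py.
stack = auto.select_stack(
    stack_name="us-dev",    # hypothetical stack name
    work_dir="infra/us",    # path relative to wherever the deploy script is invoked
)
stack.preview()
```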
r
What doesn’t work?
e
ModuleNotFoundError: No module named 'infra'
r
This happens when you run `python __main__.py` from `deploy`?
e
Same same
it's because only infra/us is on sys.path
r
same to what? What two things result in that error? (Sorry just trying to wrap my head around this)
e
yeah sorry
same for both the CLI and auto
CLI is obvious because you have to `cd` to `infra/us` to run `pulumi up`
r
Ah okay got it
e
but running the automation script in `deploy` results in the same error (presumably because setting the `work_dir` to `infra/us` in effect does the same as using the CLI in that directory)
r
> presumably because setting the `work_dir` to `infra/us` in effect does the same as using the CLI in that directory
Yeah that’s right…
e
Other python tools I've worked with usually decouple configuration from the execution context, allowing you to pass in a path to the config file(s) via a param or a CLI arg, and then you can invoke the tool from wherever you want... in our case that's really the `infra` directory
r
What happens if you do `from ..components import Vpc`? (I hear you, I'm just not sure we have the capacity to take this on right now, so I'm trying to figure out if there's a workaround)
e
Yup, I'll settle for a workaround too 🙂
Sadly relative imports don't work either. Error is the same for both CLI and automation:
ImportError: attempted relative import with no known parent package
(note that the same error occurs whether I make the pulumi project a package or not)
r
Ugh, bummer
> Other python tools I've worked with usually decouple configuration from the execution context
I think there’s a way to do this. Gimme a second.
🙌 2
Hmm okay, no, I'm coming up short unfortunately. I guess the only hack that seems viable is to manipulate `sys.path` as @full-artist-27215 mentioned?
e
Yeah, I've not had luck with that so far either 😞
r
> Yeah, I've not had luck with that so far either
Hmm, how do you mean? I would think
```python
import os, sys
sys.path.append(os.path.abspath(".."))  # append the directory containing components.py
```
would then let you do `from components import Vpc`
e
Yup, my bad. I was appending an existing pathlib handle which I thought contained the parent, but didn't.
Strategically what would be the best solution to this?
Allowing the CLI to be invoked from outside the project dir (maybe passing in config via an arg, which itself references where `__main__.py` exists), and then modifying automation to allow the same?
(Happy to write whatever the best mechanism would be as a feature request)
r
I’m honestly not sure what the solution is yet… but just capturing the details of what you described in this thread would be super helpful so we can bring it up for prioritization.
e
Will do. Thanks for your help as always!
r
So there actually is a mechanism to indicate where your `__main__.py` file is, by passing in a `main` key to the `Pulumi.yaml` file, but I'm not sure that actually solves your problem since you have multiple pulumi projects in one repo.
e
Indeed. This is more about python being able to resolve module imports outside the project directory, and it doesn't seem like pulumi (be it cli or automation) enables that right now.
💯 1
r
Okay, another idea: you can install local packages into your `venv` with `pip install -e`. I'm not sure if that would work with your directory structure, since `infra` contains both the components and the pulumi programs, so dunno if that would cause some weird recursive issues - but if there was a sibling directory to `infra` called `resources` or something, which contained `components.py` and `__init__.py`, you could conceivably `pip install -e /path/to/resources` and then be able to do `from resources.components import Vpc`. Again, not the ideal solution but a possible workaround.
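To make that concrete, a rough sketch of what the editable install could look like; the layout and names here are assumptions, with `setup.py` placed one level above the package so pip can find it:
```python
# shared/setup.py (hypothetical: shared/ sits beside infra/ and contains the
# package directory resources/, which holds __init__.py and components.py)
from setuptools import setup

setup(
    name="resources",
    version="0.0.1",
    packages=["resources"],
)
```
Then `pip install -e shared` into the project venv should make `from resources.components import Vpc` resolvable from any of the pulumi programs.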
Also noticed this Note in the docs
e
Yeah, I'll admit I'm not sure what that note means - no detail on how to package a project that isn't defined in the root dir.
Using pip for a local package is a nice idea tho - I'll give that a try
f
What I ended up settling on at the moment is having my projects in sibling directories to my shared code. In each project directory I have my `__main__.py` file, along with `Pulumi.yaml` and my stack config files. All my shared code imports are absolute, and I add
```python
import sys
sys.path.insert(0, "..")
```
to the top of each `__main__.py` file. So far this works with a minimal amount of fuss (though being able to control things more concretely with some CLI arguments would ultimately be fantastic). I just `cd pulumi/<PROJECT>` and run `pulumi up` and things work. (I'm maybe doing some unconventional virtualenv management, though, since all the projects share a virtualenv that I manage outside of Pulumi's built-in support via project config options. That may affect the overall suitability of this approach somewhat.)
(happy to share more details if that's useful)
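For illustration, roughly what the top of each project's `__main__.py` looks like under that approach (directory and import names are illustrative):
```python
# pulumi/<PROJECT>/__main__.py (sketch)
import sys

# The shared code lives in a sibling directory of the project directories,
# so put the common parent on sys.path before the absolute imports below.
sys.path.insert(0, "..")

from components import Vpc  # hypothetical shared module, e.g. pulumi/components.py

vpc = Vpc("example")  # hypothetical use of the shared component
```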
e
Cheers @full-artist-27215 - that's exactly where I ended up with @red-match-15116 too.
🎉 1
f
I'm by no means a Python expert, but maybe tweaking Pulumi to be able to run a module via `runpy.run_module` (https://docs.python.org/3/library/runpy.html#runpy.run_module), as opposed to just `runpy.run_path`, would be useful for dealing with this kind of use case? I think that might alleviate the need to manually patch `sys.path`.
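For illustration, roughly how the two runpy entry points differ, using the example layout from above (this is not how Pulumi currently invokes programs):
```python
import runpy

# Running the program file directly treats it as a standalone script, so the
# surrounding `infra` package isn't importable:
#   runpy.run_path("infra/us/__main__.py", run_name="__main__")

# The suggestion: resolve it as a package on sys.path instead. Run from the
# repo root (so `infra` is importable), this executes infra/us/__main__.py
# with its package set to infra.us, much like `python -m infra.us`:
runpy.run_module("infra.us", run_name="__main__", alter_sys=True)
```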
🤔 1
👍 1
e
FWIW Komal, your local package suggestion does work but seems to require a setup.py. Until pulumi offers a better solution, I think I prefer the sys.path "hack" + a comment, as it's more explicit and lives right alongside the problematic imports.
👍🏽 1
g
The CLI accepts a cwd string - I wonder if it would help in your case:
```
  -C, --cwd string                   Run pulumi as if it had been started in another directory
```
I'll soon be dividing my code into reusable modules as well
e
Nice find @great-sunset-355 - I'd missed that option. A quick check sadly suggests it just changes the execution context of python to that directory.
If I `cd` to `infra` in my example above and run `pulumi --cwd=us preview`, I get `ModuleNotFoundError: No module named 'infra'`.
If I `cd` to `us` and run `pulumi --cwd=./.. preview`, I get `error: no Pulumi.yaml project file found`.
g
ouch - well, I expect to deal with the same/similar problem around next week; sorry I cannot spend more time on this now. But it would be great if you could share a repo with a minimal example of your setup
b
As a datapoint, we are hitting the same issue. We have a pulumi stack embedded into a project (as a subdirectory) and no amount of fiddling with sys.path seems to make all use cases happy (unit tests, running pulumi from the subdirectory manually, and calling the stack from the automation API). If stacks are limited to a main file, everything works, but as soon as we break things out into separate files, things blow up. If there is any way we can influence the roadmap, that's our #1 blocker for wider pulumi adoption 😕
r
@brave-knife-93369 I see that you saw @billowy-army-68599's reference to the MIT project that has a fairly complex project structure in the linked thread: https://pulumi-community.slack.com/archives/CDE799L1M/p1624482814134200 If this continues to be a #1 blocker, I would suggest starting by opening an issue and we will prioritize accordingly, but it does appear there is existing prior art that you can base your project off of.
b
Yes I think this will allow us not to have to resort to path hacks. I'll report back on how well that works for us! Thanks!
e
FYI @red-match-15116 I've opened https://github.com/pulumi/pulumi/issues/7360 and replicated/documented the issue at https://github.com/followben/pulumi-example/
r
Thanks @enough-leather-70274. Curious if you've looked at the MIT example and tried to replicate their workflow (it seems to work for them)
e
Yeah, as @brave-knife-93369 pointed out, it requires packaging the repo and installing it in a venv. Not sure that's a great fit for our use case (we're not re-using or publishing the projects themselves), but I guess it's an alternate workaround to hacking `sys.path`. Once again, the latter feels clearer as to what we're working around and why it's there (i.e. maybe just better for us from a maintenance PoV?)
👍🏽 1