# automation-api
c
I'm using Python with the Automation API. I haven't been able to reference modules that refer to stack metadata, because these are initialized outside of the stack initialization. Is there a way to package code with an inline program so that it's used consistently across all stacks?
For example:
import pulumi

config = pulumi.Config()
environment = config.require('environment')
project_name = pulumi.get_project()
AutoTag(environment)  # user-defined tagging helper
I need these lines in every single stack I'm creating
l
I think your normal language constructs would work here. Put that code in a function, call it in the normal way from each project...
c
Right, but the lines after this would be different 🤔
l
So only those lines would be in the function...?
➕ 1
c
Each stack would provision its own resources. The config and tagging would be used everywhere, but the resources would be different depending on the stack.
If I could chain functions, that might work, but classes haven't worked for me.
l
Have you used either aws:defaultTags or StackTransformation? Both of those can be set up before you create the resources and then be forgotten about. Should work with this style of code.
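For reference, a minimal sketch of the transformation approach in Python (the helper name and tag keys are illustrative assumptions, not the poster's actual AutoTag class); aws:defaultTags would instead be configured on the AWS provider itself:

import pulumi

# Sketch: register a stack transformation once, before any resources are
# declared, and every resource created afterwards flows through it.
def register_auto_tags(tags: dict):
    def add_tags(args: pulumi.ResourceTransformationArgs):
        # A real implementation would first check args.type_ against a list
        # of taggable resource types before merging tags.
        props = dict(args.props)
        props["tags"] = {**tags, **(props.get("tags") or {})}
        return pulumi.ResourceTransformationResult(props=props, opts=args.opts)

    pulumi.runtime.register_stack_transformation(add_tags)

# Call this at the top of the program, before any resource declarations, e.g.:
# register_auto_tags({"environment": environment, "project": pulumi.get_project()})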
c
I'm using StackTransformation. Is it possible to load a StackTransformation during stack creation?
l
It is configured in runtime, before any resources are created.
So it should always apply.
c
@billowy-army-68599 do you know if it's possible to load StackTransformation into LocalWorkspaceOptions or anytime before stack.up?
r
I’m not sure what exactly a StackTransformation is - do you have a doc reference? But in general, no. Concepts available from within a pulumi program are not available outside of the program itself.
I’m kinda struggling to wrap my head around what the exact issue is… what are you hoping to achieve?
l
(shares a link to the StackTransformation documentation)
r
Thanks for the link @little-cartoon-10569. Today I learned! Okay, but yeah, this is still something you would call from within your InlineProgram using the Automation API.
Going back to tenwit’s original suggestion though… why can’t these lines be a function that is called from within each stack?
import pulumi

def tag_env():
    # Shared setup: read config and apply tagging before any resources are declared.
    config = pulumi.Config()
    environment = config.require('environment')
    project_name = pulumi.get_project()
    AutoTag(environment)

def my_inline_pulumi_program():
    tag_env()
    ...

def my_other_inline_program():
    tag_env()
    ...
I feel like I’m missing something
l
Yeah, it's not even each stack, it's just each program... It looks good to me.
c
Is it possible to run multiple inline programs in the same stack?
l
Programs contain stacks, not vice versa...
r
stacks are usually used to denote environments like staging, dev, prod of the same program.
that doesn’t necessarily have to be the case. for instance, in the little webapp i built, i use stacks to just pump out different resources from the same program: https://github.com/komalali/self-service-platyform/blob/main/app/sites.py#L77-L92
c
When you initialize a stack you pass the inline program. That's what I'm referring to
r
If you pass in a different function to the stack it will override your program and deploy the new program. So no, you cannot run multiple inline programs in a stack.
but your inline program function can call other functions
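To make that concrete, here is a minimal sketch of how the pieces might fit together with the Automation API (the program functions, project name, and stack name below are illustrative assumptions; AutoTag is the poster's own helper):

import pulumi
from pulumi import automation as auto

def tag_env():
    # Shared setup reused by every inline program (as in the example above).
    config = pulumi.Config()
    environment = config.require("environment")
    AutoTag(environment)  # the poster's own tagging helper, assumed to exist

def web_program():
    tag_env()
    # ... declare this stack's resources ...

def data_program():
    tag_env()
    # ... declare a different set of resources ...

# One inline program per stack; each program reuses the shared helper.
stack = auto.create_or_select_stack(
    stack_name="dev",
    project_name="my-project",
    program=web_program,
)
stack.set_config("environment", auto.ConfigValue(value="dev"))
stack.up(on_output=print)

Swapping program=web_program for program=data_program on another stack deploys a different set of resources while the config and tagging setup stays identical.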