python
  • s

    sparse-tomato-5980

    06/08/2021, 8:51 PM
    Hey folks - is there a way to quiet things that are logged with
    pulumi.log.warn()
    ?
    f
    • 2
    • 3
  • g

    gorgeous-minister-41131

    06/11/2021, 10:36 PM
Not sure if this is a Python-specific thing, but is there no way to modify the values of an Input on an object after it has been instantiated? For example:
d = self.create_deployment(
    *args,
    **kwargs,
    extra_annotations={"app.kubernetes.io/group": "worker"},
    extra_labels={"app.kubernetes.io/group": "worker"},
)
d.spec.replicas = 2
I find myself having to write custom classes that use mixins to modify object properties before they're passed to the constructor, so the object can be modified just before it's actually created…
  • g

    gorgeous-minister-41131

    06/11/2021, 10:37 PM
TL;DR: d is a Kubernetes Deployment object. create_deployment creates the object, but once it's created, I assume there's no way to ever change replicas without creating a whole new Deployment object with the same URN to replace the old one…?
  • g

    gorgeous-minister-41131

    06/11/2021, 11:17 PM
I gather it's not possible, since the object has already been created and the URN exists, so using helper classes to allow Input changes prior to object creation is the only way…
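A minimal sketch of the pattern described above, assuming pulumi_kubernetes: keep the inputs as plain args objects, mutate them while they are still ordinary Python values, and only then construct the Deployment (once the resource is registered, its inputs can no longer be edited in place). The names and spec below are placeholders.
import pulumi_kubernetes as k8s

# Build the inputs first, as plain Python objects.
args = dict(
    metadata=k8s.meta.v1.ObjectMetaArgs(labels={"app.kubernetes.io/group": "worker"}),
    spec=k8s.apps.v1.DeploymentSpecArgs(
        replicas=1,
        selector=k8s.meta.v1.LabelSelectorArgs(match_labels={"app": "worker"}),
        template=k8s.core.v1.PodTemplateSpecArgs(
            metadata=k8s.meta.v1.ObjectMetaArgs(labels={"app": "worker"}),
            spec=k8s.core.v1.PodSpecArgs(
                containers=[k8s.core.v1.ContainerArgs(name="worker", image="nginx")],
            ),
        ),
    ),
)

# Tweak the inputs while they are still plain values...
args["spec"].replicas = 2

# ...and only then register the resource.
d = k8s.apps.v1.Deployment("worker", **args)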
  • g

    gorgeous-lifeguard-69736

    06/18/2021, 11:34 AM
Hi, how can I assign a role to a GCP service account? I tried the below, but it fails with:
    gcp:serviceAccount:IAMMember (log-writer-iam):
      error: 1 error occurred:
            * Error applying IAM policy for service account 'projects/secret-stash-stadium/serviceAccounts/web-sa@secret-stash-stadium.iam.gserviceaccount.com': Error setting IAM policy for service account 'projects/secret-stash-stadium/serviceAccounts/web-sa@secret-stash-stadium.iam.gserviceaccount.com': googleapi: Error 400: Invalid service account (<pulumi.output.output object at 0x10c8f3310>)., badRequest
    What am I doing wrong?
sa = serviceaccount.Account(
    resource_name="sa",
    account_id="web-sa",
)

log_writer_iam = serviceaccount.IAMMember(
    resource_name="log-writer-iam",
    member=f"serviceAccount:{sa.email}",
    role="roles/logging.logWriter",
    service_account_id=sa.name,
)
    s
    n
    • 3
    • 7
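The error above shows the repr of a pulumi Output being sent to the API: an f-string stringifies the Output immediately instead of waiting for it to resolve. A sketch of the usual fix, building the member string with pulumi.Output.concat (an .apply() works equally well):
import pulumi
from pulumi_gcp import serviceaccount

sa = serviceaccount.Account(
    resource_name="sa",
    account_id="web-sa",
)

log_writer_iam = serviceaccount.IAMMember(
    resource_name="log-writer-iam",
    # Build the member string from the Output so it resolves to the real
    # email at deploy time instead of the Output's repr.
    member=pulumi.Output.concat("serviceAccount:", sa.email),
    role="roles/logging.logWriter",
    service_account_id=sa.name,
)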
  • f

    full-island-88669

    06/18/2021, 2:37 PM
Hi! I'm having an issue using the Pulumi Python package:
    Could not find a version that satisfies the requirement grpcio==1.38.0
    More info: https://stackoverflow.com/questions/68036977/cannot-use-my-app-on-bitbucket-pipelines-could-not-find-a-version-that-satisfie
  • g

    great-sunset-355

    06/19/2021, 8:37 AM
Hi, was anyone able to run Pulumi with a Python debugger? When I run
    python __main__.py
    I get an error
    Program run without the Pulumi engine available
    . I found this issue https://github.com/pulumi/pulumi/issues/1372#issuecomment-583086422 I tried what @white-balloon-205 suggested but I could not find the process. Here is the code I tried from: https://www.pulumi.com/docs/intro/concepts/config/#structured-configuration
    import pulumi
    
    config = pulumi.Config()
    data = config.require_object("data")
    print("Active:", data.get("active"))
    For some reason I do not even get the output of
    print
    when I run
    pulumi up
    Any clues?
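Not an official feature, but one workaround (a sketch, assuming the debugpy package is installed in the project's virtualenv) is to attach the debugger from inside the program while it runs under pulumi preview or pulumi up, rather than running python __main__.py directly, which fails because the Pulumi engine is not available:
import debugpy
import pulumi

debugpy.listen(5678)       # open a debug adapter port on localhost:5678
debugpy.wait_for_client()  # block until the IDE attaches
debugpy.breakpoint()       # pause here once attached

config = pulumi.Config()
data = config.require_object("data")
print("Active:", data.get("active"))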
  • p

    purple-appointment-84502

    06/22/2021, 8:52 AM
Hi all, struggling with the Pulumi Automation API in Python and some import issues, wondering if you can help. I'm trying to import a common component resource from a sibling directory as a module. In my __main__.py files, I'm trying to import:
    from ..common.commonComponentResource
    And I'm getting relative import errors. From my understanding, the
    work_dir
specified as part of the Automation API needs to be where the stack .yaml files are, and this is then used as the Python working directory too. Is there any way to have a different work directory to get around the relative import errors? The project structure I have is:
-- Common
---- CommonComponentResource.py
---- __init__.py
-- Stack1
---- __main__.py
---- yaml
-- Stack2
Thanks for your help!
    b
    • 2
    • 2
  • f

    full-artist-27215

    06/22/2021, 2:45 PM
    I may be having similar issues as @purple-appointment-84502 above, if I'm understanding correctly (I'm having a bit of trouble following the directory layout structure as well). I'm trying to manage multiple Pulumi projects in a single repository that share common resource code (e.g., I have defined some component resources that each project uses). I happen to also be using Pants (https://pantsbuild.org) to manage our other Python code, and I can use the Automation API (based on the "Local Program" pattern in the documentation) to run things under Pants, and therefore take advantage of its Python dependency management to resolve the common code. However, I don't want to forego the use of the
    pulumi
CLI to interact with these projects and their stacks (e.g., if something goes wrong, I'd like to easily be able to do something like
    pulumi export
    -> edit file ->
    pulumi import
    , without having to also encode that via the Automation API). The best I've managed so far (though I'm still investigating) is to manipulate the
    sys.path
    in my various
    __main__.py
    in order to get the shared code importing correctly, but that seems like a hack I'd rather avoid. Is there any guidance for how to structure things (with directory structures, configuration parameters, etc.) to be able to juggle multiple related Pulumi Python projects that share code? Thanks in advance.
    n
    • 2
    • 1
  • e

    enough-leather-70274

    06/22/2021, 11:18 PM
    @red-match-15116 - would really value your input on @full-artist-27215 & @purple-appointment-84502's feedback above. Looks like this has been raised before as far back as 2018 even by Joe himself (see https://github.com/pulumi/pulumi/issues/3635, https://github.com/pulumi/pulumi/issues/1641), but has yet to be resolved. Is there any mechanism allowing pulumi to be invoked from (or for automation's
    work_dir
param to be set to) a higher-level directory than the project dir, e.g. a directory further up the hierarchy from the one containing the yaml and __main__.py files? That way common code can be shared between multiple Pulumi projects as sibling or parent modules/packages, imported in __main__.py, and Python would resolve them (as they'd naturally be included in sys.path). Or is there a different, more obvious solution or workaround that I'm missing?
    r
    f
    +2
    • 5
    • 61
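One common workaround for the shared-code problem discussed above (a sketch, not an official recommendation): turn the shared code into a small installable package and add it to each project's virtualenv as an editable dependency, so imports no longer depend on work_dir or sys.path tweaks. The layout mirrors the poster's Common/Stack1 structure; the package name and file names are otherwise assumptions.
# common/setup.py, assuming a layout like:
#
#   common/
#     setup.py
#     common/
#       __init__.py
#       CommonComponentResource.py
#   stack1/
#     Pulumi.yaml
#     __main__.py        # can then do: from common.CommonComponentResource import ...
#     requirements.txt   # contains the line: -e ../common
#
from setuptools import setup, find_packages

setup(name="common", version="0.0.1", packages=find_packages())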
  • h

    hallowed-ice-8403

    06/23/2021, 8:48 AM
Hello, I am trying to create an S3 bucket and a DynamoDB table in the same stack, running them in parallel as they are not dependent on each other. I have used the
auto.create_or_select_stack
function and am running it as two separate tasks. I am noticing that after the DynamoDB table is created, it deletes the S3 bucket. I am guessing it's because it's replacing the state file.
    aws:dynamodb:Table msd-test-dynamo creating
    INFO - +  aws:dynamodb:Table msd-test-dynamo created
    INFO - -  aws:s3:Bucket msd-test-s3 deleting
    INFO - -  aws:s3:Bucket msd-test-s3 deleted
How do I add resources to an existing stack without deleting existing resources? Also, it would be helpful if there were a best-practices doc for state, stack, and project management.
    b
    • 2
    • 5
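If each task passes its own program to auto.create_or_select_stack for the same stack, every up only declares that task's resource, so the other one is seen as removed and deleted. A minimal sketch (stack and project names are placeholders) of declaring both resources in a single inline program so one up manages both:
import pulumi_aws as aws
from pulumi import automation as auto


def pulumi_program():
    # Both resources are declared in the same program, so neither run
    # removes the other from the stack's state.
    aws.s3.Bucket("msd-test-s3")
    aws.dynamodb.Table(
        "msd-test-dynamo",
        attributes=[aws.dynamodb.TableAttributeArgs(name="id", type="S")],
        hash_key="id",
        billing_mode="PAY_PER_REQUEST",
    )


stack = auto.create_or_select_stack(
    stack_name="dev",
    project_name="msd-test",
    program=pulumi_program,
)
stack.up(on_output=print)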
  • h

    happy-alarm-59675

    06/23/2021, 10:07 AM
Hi, I'm trying to output a list from my custom provider, like in this example here: https://github.com/pulumi/examples/blob/master/aws-py-ec2-provisioners/provisioners.py#L159-L160 But when I do this,
    pulumi up
    either hangs, or I get this error:
    File ".venv/lib/python3.9/site-packages/pulumi/runtime/rpc.py", line 79, in _get_list_element_type
            raise AssertionError(f"Unexpected type. Expected 'list' got '{typ}'")
        AssertionError: Unexpected type. Expected 'list' got '<class 'list'>'
        error: an unhandled error occurred: Program exited with non-zero exit code: 1
The code, which is just some playing around with dynamic providers: https://gist.github.com/ederst/406438f594dd82b3c614df43658b3bf8 My guess is I am running into https://github.com/pulumi/pulumi/pull/7049, and I'll have to wait until this is resolved and use some different method to expose a list as an output (Output[str] and ','.join()) in the meantime?
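In the meantime, a sketch of the interim workaround mentioned above: have the dynamic provider return the list comma-joined as an Output[str], and split it again on the consumer side. The provider and values here are placeholders:
import pulumi
from pulumi.dynamic import CreateResult, Resource, ResourceProvider


class MyProvider(ResourceProvider):
    def create(self, props):
        items = ["a", "b", "c"]  # whatever the provider actually computes
        # Return the list as a single comma-joined string output.
        return CreateResult(id_="my-id", outs={"items": ",".join(items)})


class MyResource(Resource):
    items: pulumi.Output[str]

    def __init__(self, name, opts=None):
        super().__init__(MyProvider(), name, {"items": None}, opts)


res = MyResource("example")
items_list = res.items.apply(lambda s: s.split(","))  # back to a list of str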
  • g

    great-sunset-355

    06/23/2021, 9:13 PM
    Hi does anyone have an example of moderately complex project with packages and functions?
    b
    e
    b
    • 4
    • 13
  • g

    great-sunset-355

    06/25/2021, 11:55 AM
Hi, was anyone able to get code completion working in PyCharm? There seems to be something with _utilities.lazy_import such that PyCharm does not provide any completion at all. In VS Code completion works, but it's not great either: when I use Show Hover on the resource it shows almost nothing. For this example:
    """A Python Pulumi program"""
    
    import pulumi
    
    from pulumi_aws import iam
    
    iam.Role()
    Show Hover action only shows
    (class) Role
- while I expect the whole docstring to show up. But it looks like the cause of this is how the docstrings are written, and possibly the usage of the @overload decorator as well:
    class MyClass:
        @overload
        def __init__(self, a:int)->None:
            """init INT"""
            ...
        def __init__(self, *args, **kwargs) -> None:
            pass # real implementation
If classes had a class docstring like this, it would at least show that:
    class MyClass:
        """A class doc."""
        @overload
        def __init__(self, a:int)->None:
            """init INT"""
            ...
        def __init__(self, *args, **kwargs) -> None:
            pass # real implementation
I wonder if Pulumi adheres to any of the docstring style guides or "just generates docstrings".
    r
    • 2
    • 10
  • a

    alert-mechanic-59024

    06/25/2021, 4:03 PM
Hey guys. Currently using the latest versions of pulumi (Python SDK) and pulumi-kubernetes, and seeing this:
File "/pulumi/projects/venv/lib/python3.9/site-packages/pulumi_kubernetes/apiextensions/v1/CustomResourceDefinition.py", line 121, in __init__
    self._internal_init(resource_name, *args, **kwargs)
TypeError: _internal_init() got an unexpected keyword argument 'status'
I've considered modifying the upstream YAML files to remove the offending code, but I'd prefer to either wait for the bug to be resolved or roll back to an earlier version of pulumi - I believe v2 did not have this issue. Could anyone tell me if they have seen this issue and, if so, how they went about resolving it? Alternatively, could someone offer up their requirements.txt file with a list of compatible Python library versions? Thanks!
    g
    • 2
    • 1
  • m

    mammoth-refrigerator-77806

    06/28/2021, 7:34 PM
    in a
    aws.ecs.TaskDefinition()
    resource I’d like to use other resource attributes in the
    container_definitions
    parameter which requires a valid JSON document (currently declaring within
    json.dumps
    ). However, when I try this I get the error
    TypeError: Object of type Output is not JSON serializable
    . Is there a way to do this?
    e
    • 2
    • 1
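A minimal sketch of the usual pattern: lift the Output-valued attributes with pulumi.Output.all() and call json.dumps() inside apply(), so the JSON string is only built once the values resolve (the log group, region, and container values below are placeholders):
import json
import pulumi
import pulumi_aws as aws

log_group = aws.cloudwatch.LogGroup("app-logs")

container_definitions = pulumi.Output.all(log_group.name).apply(
    lambda args: json.dumps([{
        "name": "app",
        "image": "nginx",
        "logConfiguration": {
            "logDriver": "awslogs",
            "options": {
                "awslogs-group": args[0],
                "awslogs-region": "us-east-1",
                "awslogs-stream-prefix": "app",
            },
        },
    }])
)

task_definition = aws.ecs.TaskDefinition(
    "app-task",
    family="app",
    cpu="256",
    memory="512",
    network_mode="awsvpc",
    requires_compatibilities=["FARGATE"],
    container_definitions=container_definitions,
)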
  • g

    great-sunset-355

    06/29/2021, 6:02 PM
Generated objects in Python have unreal names - this is not Pythonic; it looks like Java or C# at best 😞
    ServiceSourceConfigurationAuthenticationConfigurationArgs
Is there any chance to remedy this without dropping into dictionaries? PEP 8 suggests a line length of 79 characters; the name above alone is 57!
    r
    • 2
    • 12
  • i

    incalculable-action-69391

    06/30/2021, 3:42 AM
Does anyone please know how to set environment vars in Pulumi with Python? This information is NOT covered even once in the documentation. Is it a dictionary {key: value}?
    g
    • 2
    • 1
  • i

    incalculable-action-69391

    06/30/2021, 3:42 AM
    b
    • 2
    • 1
  • a

    ambitious-father-68746

    06/30/2021, 2:38 PM
    Hi, I currently need to create a set of resources conditionally based on some Outputs. Using
    if
    doesn't work because an Output is not a boolean. I've found a solution where I can shove the creation of the resource inside an
    apply()
    , but then I can't refer to that resource from outside the
    apply().
    Any ideas?
    b
    • 2
    • 8
  • g

    great-sunset-355

    06/30/2021, 4:37 PM
    How can I do this
    apply
    correctly, it keeps biting me?
cfg = pulumi.Config()
ssm.Parameter(
    f"{prefix}-app-dn-credentials",
    name=f"/{prefix}",
    value=json.dumps(
        {
            "user": "user",
            "password": cfg.require_secret('secret').apply(lambda x: x)
        }
    )
)
    r
    b
    +3
    • 6
    • 34
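A sketch of one way to restructure this: .apply(lambda x: x) still returns an Output, which json.dumps() cannot serialize, so build the whole JSON string inside the apply instead and pass the resulting Output[str] as the parameter value (prefix is a placeholder here):
import json
import pulumi
import pulumi_aws as aws

cfg = pulumi.Config()
prefix = "myapp"  # stands in for the poster's prefix

param = aws.ssm.Parameter(
    f"{prefix}-app-dn-credentials",
    name=f"/{prefix}",
    type="SecureString",
    # The whole JSON document is built only after the secret resolves.
    value=cfg.require_secret("secret").apply(
        lambda secret: json.dumps({"user": "user", "password": secret})
    ),
)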
  • f

    few-pillow-1133

    07/01/2021, 12:59 PM
Any idea why a CLI deployment could be failing? We're trying to deploy from GitLab to Azure, and it seems not to work:
    raise invoke_error
    Exception: invoke of azure-native:resources:getResourceGroup failed: invocation of azure-native:resources:getResourceGroup returned an error: building auth config: obtain subscription(f15d5330-8e98-43e0-ac1e-3a08e5702508) from Azure CLI: Error parsing json result from the Azure CLI: Error waiting for the Azure CLI: exit status 1
    error: an unhandled error occurred: Program exited with non-zero exit code: 1
    Config looks like below
    config:
      azure-native:clientId: xxx
      azure-native:clientSecret:
        secure: xxx
      azure-native:environment: public
      azure-native:location: xxx
      azure-native:subscriptionId: xxx
      azure-native:tenantId: xxx
      azure:clientId: xxx
      azure:clientSecret:
        secure: xxx
      azure:environment: public
      azure:location: xxx
      azure:subscriptionId: xxx
      azure:tenantId: xxx
      service-bus:data:
         ....
  • n

    numerous-pencil-44890

    07/02/2021, 4:15 PM
    Hi, does anyone know how to set a provider’s config programmatically? I have a stack that creates a GKE cluster which uses the
    gcp:project
    and other configs, but in my application (k8s namespace) stack I just want to inherit the GCP project, region, zone from a stack reference so that it can’t be screwed up when deploying between multiple clusters. The docs mention that you can create the provider with those values: “passed to the constructor of
    new gcp.Provider
    to construct a specific instance of the GCP provider”. But I don’t see any examples doing that. [https://www.pulumi.com/docs/intro/cloud-providers/gcp/#configuration]
    b
    • 2
    • 2
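A hedged sketch of what the docs describe (the stack reference name and output keys are placeholders): read project/region/zone from the cluster stack, construct an explicit gcp.Provider from them, and pass it to resources via ResourceOptions so the local gcp:* config can't override it:
import pulumi
import pulumi_gcp as gcp

infra = pulumi.StackReference("org/gke-infra/prod")  # placeholder stack name

gcp_provider = gcp.Provider(
    "gcp",
    project=infra.get_output("project"),
    region=infra.get_output("region"),
    zone=infra.get_output("zone"),
)

# Any gcp resource created with this provider uses the inherited settings.
sa = gcp.serviceaccount.Account(
    "app-sa",
    account_id="app-sa",
    opts=pulumi.ResourceOptions(provider=gcp_provider),
)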
  • h

    hallowed-ice-8403

    07/05/2021, 6:39 AM
Hello, I am trying to provision base infrastructure components in AWS using Pulumi. Is there a sample Git project using the Python Automation API that provisions micro-stacks for each module (a group of AWS components) within the same project?
  • a

    ambitious-article-39970

    07/05/2021, 11:58 AM
Hey guys, quick question: trying to deploy ECS Fargate and pass it a secret in the container definitions.
task_definition = aws.ecs.TaskDefinition('pulumi-app-task',
    family='fargate-task-definition',
    cpu='256',
    memory='512',
    network_mode='awsvpc',
    tags=global_tags,
    requires_compatibilities=['FARGATE'],
    execution_role_arn=role.arn,
    container_definitions=json.dumps([{
        'secrets': json.dumps([{'db_password': f"{db_password.id}"}]),
        'name': 'pulumi-test-app',
        'image': 'nginx',
        'portMappings': [{
            'containerPort': 80,
            'hostPort': 80,
            'protocol': 'tcp'
        }]
    }])
)
Having issues with this line: 'secrets': json.dumps([{'db_password': f"{db_password.id}"}]),
I assume this is because it's returning an AWS ARN; I'm getting an unmarshalling error converting the string to JSON. Is this the case?
error: aws:ecs/taskDefinition:TaskDefinition resource 'pulumi-app-task' has a problem: ECS Task Definition container_definitions is invalid: Error decoding JSON: json: cannot unmarshal string into Go struct field ContainerDefinition.Secrets of type []*ecs.Secret. Examine values at 'TaskDefinition.ContainerDefinitions'.
I'm new to Pulumi and haven't quite figured out how to use a debugger with it yet, so I'm having to guess a lot at what the values that come back look like. (I know the second json.dumps isn't needed; it's there for debugging.)
    b
    g
    • 3
    • 6
  • a

    ambitious-article-39970

    07/05/2021, 11:58 AM
db_password = aws.ssm.Parameter("pulumi-db-secret",
    type="SecureString",
    value=db_password_ssm)
  • a

    ambitious-article-39970

    07/05/2021, 1:37 PM
So I figured it out a bit further; the issue seems to revolve around the db_password object, as the values it spits out work when hardcoded.
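For reference, a hedged sketch of the shape ECS expects: secrets must be a list of objects with name and valueFrom keys (not a nested json.dumps), and since the parameter ARN is an Output, the JSON has to be built inside apply(). DB_PASSWORD and the placeholder parameter value are assumptions, and the execution role from the original snippet is omitted here even though ECS still needs it to read the secret:
import json
import pulumi_aws as aws

db_password = aws.ssm.Parameter(
    "pulumi-db-secret",
    type="SecureString",
    value="placeholder-value",  # stands in for db_password_ssm
)

container_definitions = db_password.arn.apply(
    lambda arn: json.dumps([{
        "name": "pulumi-test-app",
        "image": "nginx",
        # DB_PASSWORD is the environment variable name the container will see.
        "secrets": [{"name": "DB_PASSWORD", "valueFrom": arn}],
        "portMappings": [{"containerPort": 80, "hostPort": 80, "protocol": "tcp"}],
    }])
)

task_definition = aws.ecs.TaskDefinition(
    "pulumi-app-task",
    family="fargate-task-definition",
    cpu="256",
    memory="512",
    network_mode="awsvpc",
    requires_compatibilities=["FARGATE"],
    container_definitions=container_definitions,
)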
  • m

    many-yak-61188

    07/06/2021, 3:32 PM
    hello all 👋 complete noob here. Running into an issue utilizing
    pulumi
    in a github
    workflow
    where all dependencies are managed via
    poetry
. I think I do not understand the combination of
    poetry
    in github
    workflows
    which is causing the error in pulumi and not directly an issue with pulumi itself. Describing the issue in the thread with more details
    • 1
    • 9
  • m

    many-yak-61188

    07/07/2021, 3:27 AM
Following up here from the thread above. After a lot of painful experimentation I settled on this workaround. Bottom line:
    pulumi/actions@v3
    and
    poetry
don't play nice together in a GitHub workflow environment. So instead of simply being able to do
# ----------------------------------------------
#          install & configure poetry
# ----------------------------------------------
- name: Install Poetry
  uses: snok/install-poetry@v1.1.6
  with:
      virtualenvs-create: true

- name: Install dependencies
  run: |
      poetry install --no-interaction

# ----------------------------------------------
#          Run pulumi in preview mode
# ----------------------------------------------

- name: Run pulumi preview
  uses: pulumi/actions@v3
  with:
      command: preview
      stack-name: dev
      cloud-url: s3://accrue-pulumi
(where poetry installs the pulumi packages), I had to install the pulumi packages via pip... again:
# ----------------------------------------------
#          install & configure poetry
# ----------------------------------------------
- name: Install Poetry
  uses: snok/install-poetry@v1.1.6
  with:
      virtualenvs-create: true

- name: Install dependencies
  run: |
      poetry install --no-interaction

# ----------------------------------------------
#          Run pulumi in preview mode
# ----------------------------------------------

- name: Install pulumi via pip
  run: |
      pip install pulumi
      pip install pulumi-aws

- name: Run pulumi preview
  uses: pulumi/actions@v3
  with:
      command: preview
      stack-name: dev
      cloud-url: s3://accrue-pulumi
    If I come up with a better way, I'll post it in the thread
    • 1
    • 1
  • s

    some-twilight-56575

    07/07/2021, 4:44 PM
I could use some help with this use case: I need to create a k8s secret (TLS) from a self-signed cert generated by Pulumi.
key = pulumi_tls.PrivateKey(
    "cluster-issuer-key", algorithm="RSA", rsa_bits=4096
)
# private_key_pem public_key_pem
ca = pulumi_tls.SelfSignedCert(
    "cluster-issuer-cert",
    is_ca_certificate=True,
    private_key_pem=key.private_key_pem,
    validity_period_hours=87600,
    key_algorithm="RSA",
    subjects=[
        pulumi_tls.SelfSignedCertSubjectArgs(
            common_name=f"{stack_name} Communication CA"
        )
    ],
    allowed_uses=[
        "cert_signing",
        "key_encipherment",
        "digital_signature",
        "server_auth",
    ],
)

issuer_secret = pulumi_kubernetes.core.v1.Secret(
    "cluster-issuer-secret",
    metadata={"name": "ca-issuer", "namespace": "cert-manager"},
    type="kubernetes.io/tls",
    data={"tls.key": key.private_key_pem, "tls.crt": key.cert_pem},
)
    it appears data must be pre base64 encoded
    b
    • 2
    • 3
b

billowy-army-68599

07/07/2021, 4:49 PM
try this:
issuer_secret = pulumi_kubernetes.core.v1.Secret(
    "cluster-issuer-secret",
    metadata={"name": "ca-issuer", "namespace": "cert-manager"},
    type="kubernetes.io/tls",
    string_data={"tls.key": key.private_key_pem, "tls.crt": key.cert_pem},
)
s

some-twilight-56575

07/07/2021, 5:17 PM
testing
worked thanks