# pulumi-kubernetes-operator

prehistoric-kite-30979

09/27/2021, 7:56 PM
Do I have to configure the secrets provider in the CRD if it already exists? I’m getting complaints about setting the passphrase env var.
```
{"level":"debug","ts":1632772421.0293512,"logger":"controller_stack","msg":"stackConfig loaded","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","stack":{},"stackConfig":{"secretsprovider":"awskms://alias/REDACTED","encryptedkey":"REDACTED","config":{"tetrate:metadata":{"owner":"maintainability","source":"cloud/projects/operations/storage/aws"}}}}
{"level":"debug","ts":1632772421.770783,"logger":"controller_stack","msg":"Updated stack config","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","Stack.Name":"operations.storage.aws","config":{}}
{"level":"debug","ts":1632772421.771033,"logger":"controller_stack","msg":"InstallProjectDependencies","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","workspace":"/tmp/pulumi_auto636600156/cloud/projects/operations/storage/aws"}
{"level":"debug","ts":1632772422.9292898,"logger":"controller_stack","msg":"NPM/Yarn","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","Dir":"/tmp/pulumi_auto636600156/cloud/projects/operations/storage/aws","Path":"/usr/bin/yarn","Args":["/usr/bin/yarn","install"],"Stdout":"➤ YN0000: ┌ Resolution step"}
...
{"level":"debug","ts":1632772426.5784225,"logger":"controller_stack","msg":"NPM/Yarn","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","Dir":"/tmp/pulumi_auto636600156/cloud/projects/operations/storage/aws","Path":"/usr/bin/yarn","Args":["/usr/bin/yarn","install"],"Stdout":"➤ YN0000: Done with warnings in 3s 651ms"}
{"level":"info","ts":1632772426.6382632,"logger":"controller_stack","msg":"Checking current HEAD commit hash","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","Current commit":"REDACTED"}
{"level":"info","ts":1632772426.6382985,"logger":"controller_stack","msg":"New commit hash found","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","Current commit":"REDACTED","Last commit":""}
{"level":"error","ts":1632772428.426037,"logger":"controller_stack","msg":"Failed to refresh stack","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","Stack.Name":"operations.storage.aws","error":"refreshing stack \"operations.storage.aws\": failed to refresh stack: exit status 255\ncode: 255\nstdout: \nstderr: warning: A new version of Pulumi is available. To upgrade from version '3.11.0' to '3.13.0', visit https://pulumi.com/docs/reference/install/ for manual instructions and release notes.\nerror: getting secrets manager: passphrase must be set with PULUMI_CONFIG_PASSPHRASE or PULUMI_CONFIG_PASSPHRASE_FILE environment variables\n\n","errorVerbose":"failed to refresh stack: exit status 255\ncode: 255\nstdout: \nstderr: warning: A new version of Pulumi is available. To upgrade from version '3.11.0' to '3.13.0', visit https://pulumi.com/docs/reference/install/ for manual instructions and release notes.\nerror: getting secrets manager: passphrase must be set with PULUMI_CONFIG_PASSPHRASE or PULUMI_CONFIG_PASSPHRASE_FILE environment variables\n\n\nrefreshing stack \"operations.storage.aws\"\ngithub.com/pulumi/pulumi-kubernetes-operator/pkg/controller/stack.(*reconcileStackSession).RefreshStack\n\t/home/runner/work/pulumi-kubernetes-operator/pulumi-kubernetes-operator/pkg/controller/stack/stack_controller.go:814\ngithub.com/pulumi/pulumi-kubernetes-operator/pkg/controller/stack.(*ReconcileStack).Reconcile\n\t/home/runner/work/pulumi-kubernetes-operator/pulumi-kubernetes-operator/pkg/controller/stack/stack_controller.go:237\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.9.0/pkg/internal/controller/controller.go:298\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.9.0/pkg/internal/controller/controller.go:253\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.9.0/pkg/internal/controller/controller.go:214\nruntime.goexit\n\t/opt/hostedtoolcache/go/1.16.2/x64/src/runtime/asm_amd64.s:1371","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.9.0/pkg/internal/controller/controller.go:298\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.9.0/pkg/internal/controller/controller.go:253\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.9.0/pkg/internal/controller/controller.go:214"}
```
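Context for the error above: Pulumi falls back to the passphrase secrets provider when no other provider is configured for the stack, and the passphrase provider requires `PULUMI_CONFIG_PASSPHRASE` (or `PULUMI_CONFIG_PASSPHRASE_FILE`) in the process environment. If the passphrase provider were actually intended here, the operator pod would need that variable wired in. A hypothetical Deployment fragment as a sketch; the Secret name `pulumi-passphrase` and key `passphrase` are invented for illustration:

```yaml
# Hypothetical: env wiring on the operator Deployment's container.
# Secret name/key below are placeholders, not from this thread.
env:
  - name: PULUMI_CONFIG_PASSPHRASE
    valueFrom:
      secretKeyRef:
        name: pulumi-passphrase
        key: passphrase
```

In this thread, though, the stack uses an `awskms://` provider, so the real question is why the operator isn't picking that up.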
```yaml
spec:
  backend: s3://REDACTED
  branch: refs/heads/master
  gitAuth:
    sshAuth:
      sshPrivateKey:
        type: Secret
        ...
  projectRepo: git@github.com:tetrateio/tetrate.git
  refresh: true
  repoDir: cloud/projects/operations/storage/aws
  stack: operations.storage.aws
  useLocalStackOnly: true
```
the stack for context ^^
@sparse-park-68967 any ideas? I’m getting to the point where I need to do some println debugging 😅

sparse-park-68967

09/27/2021, 10:51 PM
On mobile. Will look shortly!

prehistoric-kite-30979

09/27/2021, 10:52 PM
thanks 🙂

sparse-park-68967

09/28/2021, 12:31 AM
So the stack already exists, right? And it’s set up with a (non-default) secrets provider? Can you check the contents of the Pulumi.yaml and the Pulumi stack YAML? We can chat over DM if there is anything sensitive.
Looks like the secret provider is indeed being loaded:
```
{"level":"debug","ts":1632772421.0293512,"logger":"controller_stack","msg":"stackConfig loaded","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","stack":{},"stackConfig":{"secretsprovider":"awskms://alias/REDACTED","encryptedkey":"REDACTED","config":{"tetrate:metadata":{"owner":"maintainability","source":"cloud/projects/operations/storage/aws"}}}}
{"level":"debug","ts":1632772421.770783,"logger":"controller_stack","msg":"Updated stack config","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","Stack.Name":"operations.storage.aws","config":{}}
```
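Decoded from the JSON in the log above, that `stackConfig` corresponds to a stack settings file (here it would be `Pulumi.operations.storage.aws.yaml`) shaped roughly like the following; the REDACTED values are as in the log:

```yaml
# Rough YAML equivalent of the stackConfig object from the log above.
secretsprovider: awskms://alias/REDACTED
encryptedkey: REDACTED
config:
  tetrate:metadata:
    owner: maintainability
    source: cloud/projects/operations/storage/aws
```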
Err, yeah, so there might be a bug. Can you try just setting the secretsProvider in the CR? You don’t need to set the key material etc.
I expect that to work.
Please open an issue, but hopefully just setting the secretsProvider should be sufficient to get you around this.
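A minimal sketch of that suggestion, reusing the spec shown earlier in the thread (the `secretsProvider` value mirrors the redacted awskms alias from the logs; this is an assumption about the intended shape, not a tested manifest):

```yaml
# Sketch: same Stack CR as above, with the secrets provider set
# explicitly so the operator doesn't fall back to the passphrase provider.
spec:
  backend: s3://REDACTED
  secretsProvider: awskms://alias/REDACTED
  projectRepo: git@github.com:tetrateio/tetrate.git
  branch: refs/heads/master
  repoDir: cloud/projects/operations/storage/aws
  stack: operations.storage.aws
  refresh: true
  useLocalStackOnly: true
```

Note the key material (the encrypted key) stays in the stack YAML; only the provider URI goes in the CR.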

prehistoric-kite-30979

09/28/2021, 1:09 AM
I’ll give this a go tomorrow morning and report back (will raise the relevant issues)
It is doing two yarn installs by the look of it
(may be related)
it appears to do one at the root and then another at the repoDir

sparse-park-68967

09/28/2021, 2:42 AM
I will have to look. I do think that installing dependencies intrinsically in the Automation API was added later, and perhaps there is a redundant call to install dependencies. I would expect it to run against the repoDir in both cases, though.

prehistoric-kite-30979

09/28/2021, 6:49 PM
Adding it to the stack CR did solve the issue; I’ll open an issue later today.
OK, so the next failure I get is on stack refresh…
```
{"level":"error","ts":1632858513.1335511,"logger":"controller_stack","msg":"Failed to refresh stack","Request.Namespace":"pulumi-operator","Request.Name":"operations.storage.aws","Stack.Name":"operations.storage.aws","error":"failed to get permalink","errorVerbose":"failed to get permalink\ngithub.com/pulumi/pulumi/sdk/v3/go/auto.init\n\t/home/runner/go/pkg/mod/github.com/pulumi/pulumi/sdk/v3@v3.13.0/go/auto/stack.go:682\nruntime.doInit\n\t/opt/hostedtoolcache/go/1.16.2/x64/src/runtime/proc.go:6265\nruntime.doInit\n\t/opt/hostedtoolcache/go/1.16.2/x64/src/runtime/proc.go:6242\nruntime.doInit\n\t/opt/hostedtoolcache/go/1.16.2/x64/src/runtime/proc.go:6242\nruntime.doInit\n\t/opt/hostedtoolcache/go/1.16.2/x64/src/runtime/proc.go:6242\nruntime.main\n\t/opt/hostedtoolcache/go/1.16.2/x64/src/runtime/proc.go:208\nruntime.goexit\n\t/opt/hostedtoolcache/go/1.16.2/x64/src/runtime/asm_amd64.s:1371","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.9.0/pkg/internal/controller/controller.go:298\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.9.0/pkg/internal/controller/controller.go:253\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.9.0/pkg/internal/controller/controller.go:214"}
```
I’ve never seen the failed to get permalink error before 🤔
Looking through the code, refresh basically always errors when not using the Pulumi Service backend.
I will open a PR to match the update-stack behaviour when it’s unable to find a permalink.

sparse-park-68967

09/28/2021, 8:11 PM
Thanks!

prehistoric-kite-30979

09/28/2021, 9:38 PM
Also created an issue for the secrets provider: https://github.com/pulumi/pulumi-kubernetes-operator/issues/201
Let me know if it’s missing anything.

sparse-park-68967

09/29/2021, 12:56 AM
Looks good to me. Thanks a bunch.