08/04/2022, 9:35 AM
Hello, I'm trying to use
to deploy CRDs into my EKS cluster (TypeScript) and I'm running into issues. By the way, I'm new to TypeScript as well as to Pulumi, so please kindly bear with me 🙏 Code:
import * as k8s from "@pulumi/kubernetes";
import * as eks from "@pulumi/eks";

export default {
    install_crds(cluster: eks.Cluster) {
        new k8s.yaml.ConfigFile("argocd_namespace", {
            file: "kubernetes_cluster_components/namespaces/argocd-namespace.yaml",
        }, { providers: { "kubernetes": cluster.provider } });
    }
}

pulumi:pulumi:Stack  k8s-moralis-aws-dev-argo-test  running.    error: an unhandled error occurred: Program exited with non-zero exit code: -1
I0804 09:23:53.878138   22054 deployment_executor.go:162] deploymentExecutor.Execute(...): exiting provider canceller
     Type                 Name                           Plan     Info
     pulumi:pulumi:Stack  k8s-moralis-aws-dev-argo-test           1 error; 39 messages

  pulumi:pulumi:Stack (k8s-moralis-aws-dev-argo-test):
    Cloud Provider: aws Stack: aws-dev-argo-test

    error: an unhandled error occurred: Program exited with non-zero exit code: -1
The error message is not very descriptive, which makes it difficult to troubleshoot. Can someone please help me here 🙏


08/04/2022, 6:11 PM
can you try with this instead
new k8s.yaml.ConfigFile("argocd_namespace", {
    file: "kubernetes_cluster_components/namespaces/argocd-namespace.yaml",
}, {provider: cluster.provider })
if this doesn't work:
• what's in the
file?
• can you double-check that the file path is correct? i think it should be relative to where this file is
• is this literally everything that's in your index.ts file?


08/05/2022, 7:34 AM
Hey Mike, thanks for looking into it. I found the issue. You were of course right about the provider stuff, but there was more to it as well. I had installed the pip version of awscli, and for some reason it wasn't compatible. I kept getting the error
"error: exec plugin: invalid apiVersion "<|>"
So I had to install awscli through brew and update the kubeconfig file again. This fixed my issues.
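For anyone who hits the same error later: the fix described above roughly corresponds to the following commands. This is a sketch, not verbatim from the thread — `<cluster-name>` and `<region>` are placeholders you'd replace with your own values.

```shell
# Remove the pip-installed AWS CLI and install the Homebrew build instead
pip uninstall awscli
brew install awscli

# Regenerate the kubeconfig entry for the EKS cluster, so the exec
# credential plugin stanza is rewritten by the new CLI version
aws eks update-kubeconfig --name <cluster-name> --region <region>
```

The "invalid apiVersion" error comes from the exec credential plugin section that `aws eks update-kubeconfig` writes into your kubeconfig, which is why regenerating it after switching CLI versions matters.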
šŸ‘ 1