# aws
And just in case anyone knows the answer to the following issue: I’m trying to make a simple EKS cluster via Pulumi. My code and package.json: It always errors with the following (the cluster itself is deployed):
```
pulumi-nodejs:dynamic:Resource (newcluster-vpc-cni):
  error: Command failed: kubectl apply -f /var/folders/93/trfs1ns93nx39y22gbwx6hmr0000gn/T/tmp-13508VziZRSVp56CV.tmp
  error: You must be logged in to the server (the server has asked for the client to provide credentials)
```
Full error log, the YAML it tries to deploy, and my computer’s kubeconfig: As far as I can tell, this means my kubeconfig cannot connect to the cluster correctly. I have no AWS config file on my computer and have exported AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (working credentials; we constantly deploy and interact with clusters using these credentials in the CLI and code I’m trying to replace with Pulumi). I assume this is because it did not authenticate correctly using aws-iam-authenticator?
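For reference, an EKS kubeconfig that authenticates via `aws-iam-authenticator` typically contains a `users` entry along these lines (the cluster name is a placeholder, and the exact `apiVersion` depends on your client versions). The authenticator derives a token from the same AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY credentials in the environment:

```yaml
# Illustrative kubeconfig user entry for EKS; <cluster-name> is a placeholder.
users:
- name: aws
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      command: aws-iam-authenticator
      args:
        - token
        - -i
        - <cluster-name>
```

If `kubectl` picks up a different kubeconfig than the one containing this entry, requests reach the API server without valid credentials, which produces exactly the "You must be logged in to the server" error above.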
The provider will construct its own kubeconfig, but allows for overrides via the `KUBECONFIG` environment variable. It looks like you have a kubeconfig file already in your home directory. I suspect this is conflicting with the kubeconfig that is being set for the EKS cluster itself. Can you try removing that file and the `KUBECONFIG` environment variable if you have it set, and try again?
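The suggested fix can be sketched as the shell steps below. This assumes the stale file is the default `~/.kube/config` and the override variable is `KUBECONFIG`; moving the file aside (rather than deleting it) keeps the change reversible:

```shell
# 1. See whether a kubeconfig override is currently set.
echo "KUBECONFIG=${KUBECONFIG:-<unset>}"

# 2. Clear the override for this shell session.
unset KUBECONFIG

# 3. Move any existing default kubeconfig aside instead of deleting it.
if [ -f "$HOME/.kube/config" ]; then
  mv "$HOME/.kube/config" "$HOME/.kube/config.bak"
fi

# 4. Re-run the deployment; the provider now uses the kubeconfig it
#    constructs for the EKS cluster itself.
# pulumi up
```

To restore the original setup afterwards, move `~/.kube/config.bak` back into place.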
(Copied my response in the support ticket here to help others if they have questions on this too.)