# kubernetes
r
How do I fix this?
b
Can you provide info about how the `kubeconfig` stack output is being created, and what's in the JSON file itself? There are a lot of reasons this could be failing. Do you know how authentication worked before (IAM authenticator, static token, etc.)?
r
The stack output is created like this:
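(A minimal sketch of what such an export typically looks like with pulumi-eks, assuming TypeScript; the resource and profile names below are placeholders, not the ones from this thread.)

```typescript
// Minimal sketch, assuming TypeScript and a named AWS profile; names are placeholders.
import * as eks from "@pulumi/eks";

// Create the EKS cluster with the high-level pulumi-eks component.
const cluster = new eks.Cluster("my-cluster", {
    // Tie the generated kubeconfig to a named AWS profile so credentials
    // resolve the same way for kubectl as they do for Pulumi.
    providerCredentialOpts: { profileName: "my-profile" },
});

// Export the generated kubeconfig as a stack output.
export const kubeconfig = cluster.kubeconfig;
```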
Here I'm using pulumi-eks rather than aws/eks to create the EKS cluster.
👆The json file with some information removed.
b
That does look sufficient for authentication to work. What happens when you run `AWS_PROFILE="<your profile name>" aws eks get-token --cluster-name <your cluster name>`, replacing the two placeholders with the values from your JSON file? If that prints out a token of some kind, I'm not sure what's wrong with your setup!
r
It does print out a valid token with a valid expiration timestamp: `"expirationTimestamp": "2021-09-24T21:23:54Z"`...
b
The only other thing I could think of is an outdated `kubectl` that doesn't understand the format of that JSON file.
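(For reference, the user entry in a kubeconfig of this kind typically carries an exec section along these lines; the exact apiVersion and arguments vary by pulumi-eks and AWS CLI version, and the values below are placeholders.)

```json
{
  "name": "aws",
  "user": {
    "exec": {
      "apiVersion": "client.authentication.k8s.io/v1alpha1",
      "command": "aws",
      "args": ["eks", "get-token", "--cluster-name", "<your cluster name>"],
      "env": [{ "name": "AWS_PROFILE", "value": "<your profile name>" }]
    }
  }
}
```

kubectl shells out to that command to fetch a token, so an older kubectl that doesn't recognize the exec credential apiVersion could fail even though the command itself works.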
r
I have version 1.21.3, which is pretty new. It's not only `kubectl` that doesn't work; the Pulumi provider is broken as well (it used to work). Not sure what the issue is. Will keep digging into it. Thanks, Jesse, for your help!
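(For context, a hedged sketch of how a kubeconfig stack output is typically fed into the Pulumi Kubernetes provider; the stack reference path and resource names are illustrative, not taken from this thread.)

```typescript
// Minimal sketch: consume the exported kubeconfig in another Pulumi program.
// The stack reference path and resource names are placeholders.
import * as pulumi from "@pulumi/pulumi";
import * as k8s from "@pulumi/kubernetes";

// Read the kubeconfig exported by the EKS stack.
const infra = new pulumi.StackReference("my-org/my-eks-project/dev");
const kubeconfig = infra.getOutput("kubeconfig");

// Build an explicit Kubernetes provider from that kubeconfig. pulumi-eks
// exports it as a JSON object, so serialize it to a string for the provider.
const provider = new k8s.Provider("eks-k8s", {
    kubeconfig: kubeconfig.apply(JSON.stringify),
});

// Any resource created with this provider authenticates via the exec section
// of the kubeconfig (i.e., `aws eks get-token` under the named profile).
const ns = new k8s.core.v1.Namespace("apps", {}, { provider });
```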