# kubernetes
g
We're having some problems trying to perform an `up` on an existing cluster in EKS. I determined that we needed to associate an AWS IAM Role with a Kubernetes group in order for us to connect to each other's clusters. Now, however, after changing the CI from the original IAM user to a service account that assumes the role, we get the following error on `up`:
```
Configured Kubernetes cluster is unreachable: unable to load schema information from the API server: the server has asked for the client to provide credentials
```
We have our Pulumi code outputting the kubeconfig file, and it's the same one I am currently connected with, so it can't be that the cert is expired or the kubeconfig is invalid. Any help is appreciated.
b
so if you look at the kubeconfig, it actually uses `aws eks get-token` (https://github.com/pulumi/pulumi-eks/blob/c5fcceb8746b0ae2c1ef859fe1a7e4f70ec12398/nodejs/eks/cluster.ts#L187), so it may be that your AWS IAM role doesn't have permission to do that
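roughly speaking, the user entry in that generated kubeconfig looks like this (a TypeScript sketch paraphrased from the linked cluster.ts, not the exact source; the cluster name and the exact `apiVersion` are placeholders):
```typescript
// Sketch of the user entry pulumi-eks emits into the generated kubeconfig.
// kubectl shells out to `aws eks get-token`, so whatever AWS identity the
// caller currently holds (user or assumed role) is what the API server sees.
const kubeconfigUser = {
    name: "aws",
    user: {
        exec: {
            apiVersion: "client.authentication.k8s.io/v1alpha1", // version may differ
            command: "aws",
            args: ["eks", "get-token", "--cluster-name", "my-cluster"], // placeholder name
        },
    },
};
```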
g
I bound it to the administrator policy for testing
b
makes sense, ultimately that message is coming from kubectl not having access
g
ok, that's good information
so it's a kubectl error, not an AWS error
is this discussion relevant?
I want to use the links in these comments, but they seem dead
would I need to use that?
b
yes, I think that's what's missing, you'd need to add it to the `aws-auth` ConfigMap
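for context, a mapped role in `aws-auth` ends up looking roughly like this (a hypothetical sketch using `@pulumi/kubernetes`; the ARN, username, and group are placeholders, and with pulumi-eks you'd normally let `roleMappings` write this rather than creating it by hand):
```typescript
import * as k8s from "@pulumi/kubernetes";

// Hypothetical sketch of the aws-auth ConfigMap contents after mapping an
// IAM role; pulumi-eks manages this for you via roleMappings.
const awsAuth = new k8s.core.v1.ConfigMap("aws-auth", {
    metadata: { name: "aws-auth", namespace: "kube-system" },
    data: {
        mapRoles: `
- rolearn: arn:aws:iam::111122223333:role/ci-deployer   # placeholder ARN
  username: ci-deployer
  groups:
    - system:masters
`,
    },
});
```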
g
I did that already with the `roleMappings` in the cluster
the role is in the configmap
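for reference, the mapping is set up roughly like this (a sketch; the role name, ARN, and `system:masters` group are placeholders from my side):
```typescript
import * as aws from "@pulumi/aws";
import * as eks from "@pulumi/eks";

// Look up an existing IAM role by name (placeholder) and map it into the
// cluster; pulumi-eks turns each entry into a mapRoles line in aws-auth.
const ciRole = aws.iam.Role.get("ci-role", "ci-deployer");

const cluster = new eks.Cluster("my-cluster", {
    roleMappings: [{
        roleArn: ciRole.arn,
        username: "ci-deployer",
        groups: ["system:masters"],
    }],
});
```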
b
ah, interesting. in your CI pipeline, can you maybe try a kubectl command with the generated kubeconfig?
g
that's an idea!
Turns out Pulumi also needs to know what the role is, @billowy-army-68599, via `providerCredentialOpts`
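for anyone who hits this later, the fix looks roughly like this (a sketch with a placeholder ARN; as I understand it, `providerCredentialOpts` tells pulumi-eks which role to assume for its own connection to the cluster):
```typescript
import * as eks from "@pulumi/eks";

// Sketch: providerCredentialOpts should make pulumi-eks assume this role
// when it fetches a token to talk to the cluster (e.g. to manage aws-auth).
// Without it, the provider uses the default AWS credentials, which in our
// setup weren't the mapped role, hence the "provide credentials" error.
const cluster = new eks.Cluster("my-cluster", {
    providerCredentialOpts: {
        roleArn: "arn:aws:iam::111122223333:role/ci-deployer", // placeholder
    },
    roleMappings: [/* ... as above ... */],
});
```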
b
did you get it fixed?
g
yeah, we had a support call with Pulumi
the Pulumi code also has to be aware of the role