I created an EKS cluster a while ago. Right now I'm using AssumeRole to authenticate with Kubernetes (i.e., aws-iam-authenticator with a role). When anyone else tries to do certain things in AWS using the same role, they get this error message:
```
error: configured Kubernetes cluster is unreachable: unable to load schema
information from the API server: the server has asked for the client to provide
credentials
```
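For reference, the kubeconfig user entry looks roughly like this (cluster name and role ARN below are placeholders, not my real values):

```yaml
users:
- name: aws-iam-user
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1alpha1
      command: aws-iam-authenticator
      args:
        - token
        - "-i"
        - my-cluster                                   # EKS cluster name (placeholder)
        - "-r"
        - arn:aws:iam::111111111111:role/eks-admin     # shared role everyone assumes (placeholder)
```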
I'm able to reproduce this with a test AWS account by setting up
~/.aws/credentials to point to the new credentials. The new user has the same
access level I do, so they should be able to assume the role.
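Concretely, the test setup looks something like this (keys, account ID, and profile names are made up):

```ini
# ~/.aws/credentials -- the test user's long-lived keys
[test-user]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# ~/.aws/config -- profile that assumes the shared role using those keys
[profile eks-admin]
role_arn = arn:aws:iam::111111111111:role/eks-admin
source_profile = test-user
```

(`aws sts get-caller-identity --profile eks-admin` is a quick way to confirm the AssumeRole itself succeeds; the failure only shows up on the Kubernetes API call.)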
I do see what appears to be some discussion of this in
https://github.com/pulumi/pulumi-eks/pull/205
but it's unclear what I should do on my end to fix this. Any ideas?
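In case it's relevant: my (possibly mistaken) understanding from that thread is that the assumed role needs a mapRoles entry in the cluster's aws-auth ConfigMap, something along these lines (ARN, username, and group here are placeholders):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    - rolearn: arn:aws:iam::111111111111:role/eks-admin
      username: eks-admin
      groups:
        - system:masters
```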