# aws
Hello all! I am trying to create a cluster with multiple node groups following this tutorial (my code is attached): https://www.pulumi.com/docs/guides/crosswalk/aws/eks/#configuring-your-eks-clusters-worker-nodes-and-node-groups

`pulumi up` runs successfully, but when I access the cluster no nodes appear in it. I can see the instances running in the EC2 panel, but no system pods, such as coredns, get deployed. Also, when I run
kubectl get nodes
I don't see any nodes. I used
aws eks update-kubeconfig --name CLUSTER_NAME --region us-east-2
to point kubectl to the cluster, and the configured user has admin access to everything. Any idea why this is happening? I appreciate your attention.
In these pictures you can see what I described: the instances are running (EC2 panel), but no node group shows up in EKS.
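For reference, the pattern I followed from that tutorial looks roughly like this; it is a simplified sketch with placeholder resource names, instance types, and sizes, not my exact code:

```typescript
import * as aws from "@pulumi/aws";
import * as eks from "@pulumi/eks";

// IAM role and instance profile for the worker nodes (placeholder names).
const workerRole = new aws.iam.Role("my-worker-role", {
    assumeRolePolicy: JSON.stringify({
        Version: "2012-10-17",
        Statement: [{
            Effect: "Allow",
            Action: "sts:AssumeRole",
            Principal: { Service: "ec2.amazonaws.com" },
        }],
    }),
});
[
    "arn:aws:iam::aws:policy/AmazonEKSWorkerNodePolicy",
    "arn:aws:iam::aws:policy/AmazonEKS_CNI_Policy",
    "arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryReadOnly",
].forEach((policyArn, i) => {
    new aws.iam.RolePolicyAttachment(`my-worker-role-policy-${i}`, {
        role: workerRole.name,
        policyArn,
    });
});
const instanceProfile = new aws.iam.InstanceProfile("my-instance-profile", {
    role: workerRole.name,
});

// Cluster without the default node group; instanceRoles puts the worker
// role into the aws-auth config map so the nodes can register.
const cluster = new eks.Cluster("my-cluster", {
    skipDefaultNodeGroup: true,
    instanceRoles: [workerRole],
});

// Self-managed node group (an EC2 Auto Scaling group under the hood).
const fixedNodeGroup = new eks.NodeGroup("my-cluster-ng", {
    cluster: cluster,
    instanceType: "t3.medium",
    desiredCapacity: 2,
    minSize: 1,
    maxSize: 3,
    instanceProfile: instanceProfile,
}, { providers: { kubernetes: cluster.provider } });

export const kubeconfig = cluster.kubeconfig;
```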
I think that you need to use `ManagedNodeGroup`s to see them in the EKS console; the tutorial's `NodeGroup` resources are self-managed EC2 Auto Scaling groups, so they are not listed as node groups there.
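Something like the following might work. It is a minimal sketch that swaps `eks.NodeGroup` for `eks.ManagedNodeGroup`; the resource names, instance types, and scaling sizes are placeholders, not taken from your attached code:

```typescript
import * as aws from "@pulumi/aws";
import * as eks from "@pulumi/eks";

// Node role for the managed node group, with the standard worker policies.
const nodeRole = new aws.iam.Role("managed-ng-role", {
    assumeRolePolicy: JSON.stringify({
        Version: "2012-10-17",
        Statement: [{
            Effect: "Allow",
            Action: "sts:AssumeRole",
            Principal: { Service: "ec2.amazonaws.com" },
        }],
    }),
});
[
    "arn:aws:iam::aws:policy/AmazonEKSWorkerNodePolicy",
    "arn:aws:iam::aws:policy/AmazonEKS_CNI_Policy",
    "arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryReadOnly",
].forEach((policyArn, i) => {
    new aws.iam.RolePolicyAttachment(`managed-ng-role-policy-${i}`, {
        role: nodeRole.name,
        policyArn,
    });
});

// Cluster without the default node group; the node role is passed in
// instanceRoles so pulumi/eks keeps it mapped in the aws-auth config map.
const cluster = new eks.Cluster("my-cluster", {
    skipDefaultNodeGroup: true,
    instanceRoles: [nodeRole],
});

// Managed node group: created through the EKS API, so it is listed
// under the cluster in the EKS console (unlike self-managed NodeGroups).
const managedNodeGroup = new eks.ManagedNodeGroup("my-cluster-managed-ng", {
    cluster: cluster,
    nodeRole: nodeRole,
    instanceTypes: ["t3.medium"],
    scalingConfig: {
        desiredSize: 2,
        minSize: 1,
        maxSize: 3,
    },
});

export const kubeconfig = cluster.kubeconfig;
```

If I remember right, the Pulumi examples still pass the node role to the cluster's `instanceRoles` even for managed node groups, so the `aws-auth` mapping stays in place and the nodes can join.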