# general
c
Hi, I have Pulumi managing some EC2 instances, and I got an alert from AWS that one of them is unstable and is scheduled for replacement. What's the best way to deal with that? Would this work?
1. Manually terminate the instance in the EC2 console
2. Run `pulumi refresh` to alert Pulumi that the instance is gone
3. Run `pulumi up` to get a new one
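For reference, a minimal sketch of the kind of program this assumes (the resource name, AMI ID, and instance type are illustrative, not from my actual stack). Once `pulumi refresh` drops the terminated instance from state, `pulumi up` sees the declared resource missing and creates a replacement:
```typescript
import * as aws from "@pulumi/aws";

// Hypothetical EC2 instance managed by the stack. After `pulumi refresh`
// removes the terminated instance from state, `pulumi up` recreates it
// to match this declaration.
const web = new aws.ec2.Instance("web", {
    ami: "ami-0abcdef1234567890", // illustrative AMI ID
    instanceType: "t3.micro",
    tags: { Name: "web" },
});

export const publicIp = web.publicIp;
```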
p
marked for answer 🙂
m
Yes that should work!
I do this sometimes 😄 not for EC2 but for K8S
c
Thanks 🙂 I've realised that maybe an even better answer is to set up an EKS Node Group, so that (at least for EKS nodes) I don't need to run `pulumi up` to get back to the right number of nodes https://www.pulumi.com/registry/packages/eks/api-docs/nodegroup/
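Something like this, roughly (a sketch based on the linked NodeGroup docs; the cluster setup and IAM wiring here are assumed, not my actual code):
```typescript
import * as aws from "@pulumi/aws";
import * as eks from "@pulumi/eks";

// Worker node role; the standard EKS worker policies (AmazonEKSWorkerNodePolicy,
// AmazonEKS_CNI_Policy, AmazonEC2ContainerRegistryReadOnly) still need to be
// attached - omitted here for brevity.
const nodeRole = new aws.iam.Role("node-role", {
    assumeRolePolicy: JSON.stringify({
        Version: "2012-10-17",
        Statement: [{
            Action: "sts:AssumeRole",
            Effect: "Allow",
            Principal: { Service: "ec2.amazonaws.com" },
        }],
    }),
});
const nodeProfile = new aws.iam.InstanceProfile("node-profile", { role: nodeRole.name });

const cluster = new eks.Cluster("cluster", {
    skipDefaultNodeGroup: true,
    instanceRoles: [nodeRole],
});

// The node group keeps desiredCapacity instances running, so an unhealthy
// node that gets terminated is replaced without needing a `pulumi up`.
const nodeGroup = new eks.NodeGroup("node-group", {
    cluster: cluster,
    instanceType: "t3.medium",
    desiredCapacity: 2,
    minSize: 1,
    maxSize: 3,
    instanceProfile: nodeProfile,
});
```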