# aws
m
hello, I'm trying to create a new EKS cluster with Pulumi; this is my code:
from pulumi_eks import (Cluster, ClusterNodeGroupOptionsArgs,
                        KubeconfigOptionsArgs, VpcCniOptionsArgs)

# vpc, dmz_subnets, private_subnets, sg and common_tags are defined earlier
eks = Cluster(
    'eks-locust',
    vpc_id=vpc.id,
    public_subnet_ids=[subnet.id for subnet in dmz_subnets],
    private_subnet_ids=[subnet.id for subnet in private_subnets],
    cluster_security_group=sg,
    kubernetes_service_ip_address_range='192.168.0.0/16',
    vpc_cni_options=VpcCniOptionsArgs(cni_custom_network_cfg=True),
    node_group_options=ClusterNodeGroupOptionsArgs(
        min_size=2,
        max_size=150,
        node_associate_public_ip_address=False,
        node_security_group=sg),
    provider_credential_opts=KubeconfigOptionsArgs(
        profile_name='671822246166_Admin'),
    tags=common_tags
)
but I keep getting the error:
Setting nodeGroupOptions, and any set of singular node group option(s) on the cluster, is mutually exclusive. Choose a single approach.
I read through the source and can't figure out what's conflicting with node_group_options. Anyone got any ideas what I'm doing wrong? https://github.com/pulumi/pulumi-eks/blob/master/nodejs/eks/cluster.ts#L361-L375
w
I think the conflict is with node_associate_public_ip_address
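From the error text, my read is that the cluster takes either a single node_group_options bag or the singular per-node-group args, never both. If I'm reading cluster.ts right, the two allowed shapes look roughly like this (untested sketch, reusing your sg):

from pulumi_eks import Cluster, ClusterNodeGroupOptionsArgs

# shape 1: everything bundled into node_group_options
eks_bundled = Cluster(
    'eks-locust',
    node_group_options=ClusterNodeGroupOptionsArgs(
        min_size=2,
        max_size=150,
        node_associate_public_ip_address=False,
        node_security_group=sg))

# shape 2: only the singular args, set directly on the cluster
eks_singular = Cluster(
    'eks-locust-2',
    min_size=2,
    max_size=150,
    node_associate_public_ip_address=False)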
n
@magnificent-battery-62880 - did you ever figure this out? I'm seeing the exact same behavior in code that has been working for months and now sporadically fails with the same error you're seeing.
m
nope, I did not, but I ended up creating my own managed node group and turning the default one off, so it's no longer an issue for me.
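In case it helps, the shape of it was roughly this (from memory, untested; the role name is a placeholder and the policies are just the standard node-role setup):

import json
import pulumi_aws as aws
from pulumi_eks import Cluster, ManagedNodeGroup

# worker-node role with the usual EKS node policies attached
node_role = aws.iam.Role(
    'eks-locust-node-role',
    assume_role_policy=json.dumps({
        'Version': '2012-10-17',
        'Statement': [{
            'Effect': 'Allow',
            'Action': 'sts:AssumeRole',
            'Principal': {'Service': 'ec2.amazonaws.com'}}]}))
for i, arn in enumerate([
        'arn:aws:iam::aws:policy/AmazonEKSWorkerNodePolicy',
        'arn:aws:iam::aws:policy/AmazonEKS_CNI_Policy',
        'arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryReadOnly']):
    aws.iam.RolePolicyAttachment(
        f'eks-locust-node-policy-{i}', role=node_role.name, policy_arn=arn)

# skip the default node group so node_group_options never comes into play
eks = Cluster(
    'eks-locust',
    skip_default_node_group=True,
    instance_roles=[node_role],
    vpc_id=vpc.id,
    private_subnet_ids=[s.id for s in private_subnets])

ManagedNodeGroup(
    'eks-locust-nodes',
    cluster=eks.core,  # CoreData; newer pulumi-eks versions also accept the cluster
    node_role=node_role,
    scaling_config=aws.eks.NodeGroupScalingConfigArgs(
        min_size=2, desired_size=2, max_size=150))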
w
I ended up using the same approach as @magnificent-battery-62880, but it doesn't solve my problem: I need multiple node groups, each with distinct user data.
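The direction I'm exploring is one launch template per node group, since managed node groups only take custom user data via a launch template, and it has to be MIME multipart. Rough, untested sketch; names are placeholders, and it assumes ManagedNodeGroup passes launch_template through to aws.eks.NodeGroup:

import base64
import pulumi_aws as aws
from pulumi_eks import ManagedNodeGroup

def mime_user_data(script: str) -> str:
    # EKS managed node groups expect MIME multipart user data, base64-encoded
    body = ('MIME-Version: 1.0\n'
            'Content-Type: multipart/mixed; boundary="==BOUNDARY=="\n\n'
            '--==BOUNDARY==\n'
            'Content-Type: text/x-shellscript; charset="us-ascii"\n\n'
            f'{script}\n'
            '--==BOUNDARY==--\n')
    return base64.b64encode(body.encode()).decode()

# one launch template + node group per distinct user-data payload
for name, script in {
        'workers': '#!/bin/bash\necho tuning for workers',
        'locust': '#!/bin/bash\necho tuning for locust'}.items():
    lt = aws.ec2.LaunchTemplate(
        f'{name}-lt', user_data=mime_user_data(script))
    ManagedNodeGroup(
        f'{name}-ng',
        cluster=eks.core,
        node_role=node_role,
        launch_template=aws.eks.NodeGroupLaunchTemplateArgs(
            id=lt.id, version=lt.latest_version.apply(str)),
        scaling_config=aws.eks.NodeGroupScalingConfigArgs(
            min_size=1, desired_size=1, max_size=10))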