icy-lion-8963
03/06/2024, 12:10 PM
Previewing update (staging)
View in Browser (Ctrl+O): https://app.pulumi.com/pressone/pressone-infra/staging/previews/a35a5f5a-debb-4c7a-b325-7e573e5b34e6
Downloading plugin: 36.40 MiB / 36.40 MiB [========================] 100.00% 16s
[resource plugin kubernetes-4.8.1] installing
Downloading plugin: 36.45 MiB / 36.45 MiB [========================] 100.00% 16s
[resource plugin kubernetes-4.9.0] installing
Type Name Plan Info
pulumi:pulumi:Stack pressone-infra-staging 1 error
~ ├─ pulumi:providers:kubernetes pressone-do-k8s update [diff: ~version]
+- └─ kubernetes:cert-manager.io/v1:ClusterIssuer pressone-letsencrypt-staging-cert-issuer replace [diff: ~metadata]
Diagnostics:
pulumi:pulumi:Stack (pressone-infra-staging):
error: Program failed with an unhandled exception:
Traceback (most recent call last):
File "/Users/theoluwanifemi/pulumi/pulumi-language-python-exec", line 197, in <module>
loop.run_until_complete(coro)
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/stack.py", line 141, in run_in_stack
await run_pulumi_func(run)
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/stack.py", line 51, in run_pulumi_func
await wait_for_rpcs()
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/stack.py", line 83, in wait_for_rpcs
raise exn from cause
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/rpc_manager.py", line 71, in rpc_wrapper
result = await rpc
^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/resource.py", line 1067, in do_register_resource_outputs
serialized_props = await rpc.serialize_properties(outputs or {}, {})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/rpc.py", line 215, in serialize_properties
result = await serialize_property(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/rpc.py", line 468, in serialize_property
value = await serialize_property(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/rpc.py", line 451, in serialize_property
future_return = await asyncio.ensure_future(awaitable)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/output.py", line 129, in get_value
val = await self._future
^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/output.py", line 212, in run
return await transformed.future(with_unknowns=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/output.py", line 129, in get_value
val = await self._future
^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/output.py", line 175, in run
value = await self._future
^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/output.py", line 200, in run
transformed: Input[U] = func(value)
^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi_kubernetes/helm/v3/helm.py", line 618, in invoke_helm_template
inv = pulumi.runtime.invoke('kubernetes:helm:template', {'jsonOpts': opts}, invoke_opts)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/invoke.py", line 192, in invoke
raise invoke_error
Exception: invoke of kubernetes:helm:template failed: invocation of kubernetes:helm:template returned an error: failed to generate YAML for specified Helm chart: could not get server version from Kubernetes: the server has asked for the client to provide credentials
I’ve tried to remove the Kubernetes plugin and re-install it, no luck. Please help?
bumpy-glass-30283
03/06/2024, 12:19 PM
“Helm chart: could not get server version from Kubernetes: the server has asked for the client to provide credentials”
which means authentication is turned on for the k8s cluster, or your credentials expired?
icy-lion-8963
03/06/2024, 12:30 PMbumpy-glass-30283
03/06/2024, 12:31 PMlimited-rainbow-51650
03/06/2024, 12:31 PM
pulumi cli on.
bumpy-glass-30283
03/06/2024, 12:32 PM
kubectl get pods? if it doesn't succeed, then pulumi won't either
limited-rainbow-51650
03/06/2024, 12:32 PMicy-lion-8963
03/06/2024, 12:33 PMkubectl
and helm
commands work fineicy-lion-8963
03/06/2024, 12:33 PMlimited-rainbow-51650
03/06/2024, 12:34 PM
pressone-do-k8s
icy-lion-8963
03/06/2024, 12:36 PM
import pulumi
import pulumi_kubernetes as k8s
from pulumi_kubernetes.helm.v3 import Chart, ChartOpts, FetchOpts
from digitalocean.config import do_settings
from digitalocean.k8s.provider import get_k8s_opts
from digitalocean.k8s.utils.transformations import metadata_annotations
nginx_ingress_controller_name = f"{do_settings.CURRENT_ENV}-ingress-controller"
namespace_identifier = "pressone-nginx-ingress-ns"
nginx_ingress_namespace = k8s.core.v1.Namespace(
    namespace_identifier, metadata={
        "name": namespace_identifier
    },
    opts=get_k8s_opts()
)
pod_annotations = {
    # "service.beta.kubernetes.io/do-loadbalancer-enable-proxy-protocol": "true",
    # "service.beta.kubernetes.io/do-loadbalancer-hostname": get_pressone_cloud_domain()
}
# commented out the loadbalancer annotations because they removed the ip from output
nginx_ingress = Chart(
    nginx_ingress_controller_name,
    ChartOpts(
        chart="ingress-nginx",
        version="4.10.0",
        namespace=namespace_identifier,
        fetch_opts=FetchOpts(
            repo="https://kubernetes.github.io/ingress-nginx",
        ),
        transformations=[metadata_annotations(pod_annotations)],
        values={
            "controller": {
                "metrics": {
                    "enabled": True,
                },
                "publishService": {
                    "enabled": True,
                }
            },
        },
    ),
    opts=get_k8s_opts(
        depends_on=nginx_ingress_namespace,
    )
)
ingress_service_ip = nginx_ingress.get_resource(
    "v1/Service", f"{namespace_identifier}/{nginx_ingress_controller_name}-ingress-nginx-controller"
).status.apply(lambda status: status.load_balancer.ingress[0].ip)
pulumi.export("load_balancer_ip", ingress_service_ip)
icy-lion-8963
03/06/2024, 12:36 PM
provider.py
from typing import Any
import pulumi_kubernetes as k8s
from pulumi import ResourceOptions
from digitalocean.k8s.cluster import cluster
k8s_provider = k8s.Provider(
    "pressone-do-k8s", kubeconfig=cluster.kube_configs[0].raw_config
)
def get_k8s_opts(**kwargs: Any) -> ResourceOptions:
    return ResourceOptions(
        provider=k8s_provider,
        **kwargs
    )
limited-rainbow-51650
03/06/2024, 12:42 PM
Can you comment out the Chart resource temporarily, and can you add these lines in your __main__.py?
from digitalocean.k8s.cluster import cluster
pulumi.export("kubeconfig", cluster.kube_configs[0].raw_config)
If you run pulumi up like this, you should get the value of the kubeconfig of your DigitalOcean cluster as a stack output for debugging purposes.
If the value seems legit, can you drop that value in a file and then try this:
KUBECONFIG=<file containing export kubeconfig value> kubectl get pods
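That last check can also be scripted. A minimal sketch (the helper name and approach are mine, not from the thread) that writes the exported kubeconfig value to a throwaway file and runs a command against it:

```python
import os
import subprocess
import tempfile

def run_with_kubeconfig(kubeconfig: str, cmd: list[str]) -> subprocess.CompletedProcess:
    # Dump the kubeconfig text to a temp file and point KUBECONFIG
    # at it for the child process only, leaving ~/.kube/config alone.
    with tempfile.NamedTemporaryFile("w", suffix=".yaml", delete=False) as f:
        f.write(kubeconfig)
        path = f.name
    env = {**os.environ, "KUBECONFIG": path}
    return subprocess.run(cmd, env=env, capture_output=True, text=True)
```

For example, `run_with_kubeconfig(exported_value, ["kubectl", "get", "pods"])` reproduces the manual test above without touching the default kubeconfig.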
icy-lion-8963
03/06/2024, 12:52 PMicy-lion-8963
03/06/2024, 1:31 PM
Previewing update (staging)
View in Browser (Ctrl+O): https://app.pulumi.com/pressone/pressone-infra/staging/previews/5f6ddfcf-5078-46bc-9a49-4e1ee3364afc
Type Name Plan Info
pulumi:pulumi:Stack pressone-infra-staging 1 error
~ ├─ pulumi:providers:kubernetes pressone-do-k8s update [diff: ~version]
+- ├─ kubernetes:cert-manager.io/v1:ClusterIssuer pressone-letsencrypt-staging-cert-issuer replace [diff: ~metadata]
+- └─ kubernetes:core/v1:Secret pressone-k8s-gitlab-registry-secret replace [diff: ~data,provider]
Diagnostics:
pulumi:pulumi:Stack (pressone-infra-staging):
error: Program failed with an unhandled exception:
Traceback (most recent call last):
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/rpc_manager.py", line 71, in rpc_wrapper
result = await rpc
^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/resource.py", line 1067, in do_register_resource_outputs
serialized_props = await rpc.serialize_properties(outputs or {}, {})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/rpc.py", line 215, in serialize_properties
result = await serialize_property(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/rpc.py", line 468, in serialize_property
value = await serialize_property(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/rpc.py", line 451, in serialize_property
future_return = await asyncio.ensure_future(awaitable)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/output.py", line 129, in get_value
val = await self._future
^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/output.py", line 212, in run
return await transformed.future(with_unknowns=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/output.py", line 129, in get_value
val = await self._future
^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/output.py", line 175, in run
value = await self._future
^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/output.py", line 200, in run
transformed: Input[U] = func(value)
^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi_kubernetes/helm/v3/helm.py", line 618, in invoke_helm_template
inv = pulumi.runtime.invoke('kubernetes:helm:template', {'jsonOpts': opts}, invoke_opts)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/invoke.py", line 192, in invoke
raise invoke_error
Exception: invoke of kubernetes:helm:template failed: invocation of kubernetes:helm:template returned an error: failed to generate YAML for specified Helm chart: could not get server version from Kubernetes: the server has asked for the client to provide credentials
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/theoluwanifemi/pulumi/pulumi-language-python-exec", line 197, in <module>
loop.run_until_complete(coro)
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/stack.py", line 141, in run_in_stack
await run_pulumi_func(run)
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/stack.py", line 51, in run_pulumi_func
await wait_for_rpcs()
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/stack.py", line 83, in wait_for_rpcs
raise exn from cause
File "/Users/theoluwanifemi/Library/Caches/pypoetry/virtualenvs/pressone-infra-_cUdUW92-py3.11/lib/python3.11/site-packages/pulumi/runtime/stack.py", line 75, in wait_for_rpcs
await rpc_manager.rpcs.pop()
Exception: invoke of kubernetes:helm:template failed: invocation of kubernetes:helm:template returned an error: failed to generate YAML for specified Helm chart: could not get server version from Kubernetes: the server has asked for the client to provide credentials
Similar error
icy-lion-8963
03/06/2024, 1:32 PM
limited-rainbow-51650
03/06/2024, 1:33 PM
kubeconfig shown, uncommenting all the code that is now failing due to this problem.
03/06/2024, 2:54 PM
Outputs:
- cluster_issuer_name : "pressone-staging-letsencrypt"
- load_balancer_ip : "<REPLACED BY ME>"
- tls_private_key_secret_ref: "pressone-staging-letsencrypt-private-key"
Resources:
- 110 to delete
16 unchanged
Do you want to perform this update? [Use arrows to move, type to filter]
yes
> no
details
limited-rainbow-51650
03/06/2024, 3:00 PMicy-lion-8963
03/06/2024, 3:01 PM
import pulumi
import pulumi_digitalocean as do
from pulumi_digitalocean import KubernetesClusterNodePoolArgs
from digitalocean.config import do_settings
from digitalocean.providers import get_do_opts
from digitalocean.registry import pressone_container_registry
from digitalocean.vpc import pressone_vpc
cluster_name = f"pressone-{do_settings.CURRENT_ENV}-k8s-cluster"
cluster = do.KubernetesCluster(
    cluster_name,
    version="1.29.1-do.0",
    region=do.Region.LON1,
    node_pool=KubernetesClusterNodePoolArgs(
        name=f"pressone-{do_settings.CURRENT_ENV}-pool",
        size=do.DropletSlug.DROPLET_S4_VCPU8_G_B_INTEL,
        # https://www.pulumi.com/registry/packages/digitalocean/api-docs/droplet/#supporting-types
        auto_scale=True,
        min_nodes=1,
        max_nodes=2,
    ),
    tags=[do_settings.CURRENT_ENV],
    vpc_uuid=pressone_vpc.id,
    registry_integration=True,
    opts=get_do_opts(depends_on=pressone_container_registry),
    name=cluster_name
)
pulumi.export("kubeconfig", cluster.kube_configs[0].raw_config)
limited-rainbow-51650
03/06/2024, 3:03 PM
__main__.py?
icy-lion-8963
03/06/2024, 3:06 PMlimited-rainbow-51650
03/06/2024, 3:11 PM
Your __main__.py should have pulumi.export lines for the existing outputs:
- cluster_issuer_name : "pressone-staging-letsencrypt"
- load_balancer_ip : "<REPLACED BY ME>"
- tls_private_key_secret_ref: "pressone-staging-letsencrypt-private-key"
That file should also contain the additional pulumi.export line I suggested before.
icy-lion-8963
03/06/2024, 3:13 PM
If I name the export kubeconfig1, then I see it like this:
icy-lion-8963
03/06/2024, 3:14 PM
When I name it kubeconfig, I don’t see the output anymore. Are there like any auto exports? Maybe I’m overriding something
icy-lion-8963
03/06/2024, 3:14 PM
limited-rainbow-51650
03/06/2024, 3:14 PM
I’m not sure where that kubeconfig export comes from, but I suspect there is already an export named kubeconfig somewhere in your code.
icy-lion-8963
03/06/2024, 3:16 PMlimited-rainbow-51650
03/06/2024, 3:18 PM
Did you search your code for "kubeconfig"?
icy-lion-8963
03/06/2024, 3:20 PM
icy-lion-8963
03/06/2024, 3:21 PM
I removed the kubeconfig export statement completely, and the kubeconfig output still came. So, it’s being exported from somewhere, but I definitely do not know where
limited-rainbow-51650
03/06/2024, 3:27 PM
Wherever that kubeconfig comes from, see if the command pulumi stack output kubeconfig --show-secrets shows you a valid kubeconfig configuration. Don't paste it here as it is a secret.
03/06/2024, 3:46 PMicy-lion-8963
03/06/2024, 3:48 PMlimited-rainbow-51650
03/06/2024, 3:50 PM
Run pulumi refresh to bring it back in sync.
icy-lion-8963
03/06/2024, 3:50 PMicy-lion-8963
03/06/2024, 3:52 PM
kubectl get pods returns Unauthorized
limited-rainbow-51650
03/06/2024, 3:52 PM
Then we still need to find where that kubeconfig stack output comes from. It is possible that value is not connected to the actual cluster output (cluster.kube_configs[0].raw_config).
icy-lion-8963
03/06/2024, 3:54 PM
There are Pulumi.yaml configs with different names. Is it possible this has anything to do with it?
limited-rainbow-51650
03/06/2024, 3:56 PMicy-lion-8963
03/06/2024, 4:01 PMicy-lion-8963
03/06/2024, 4:01 PMdry-keyboard-94795
03/06/2024, 4:14 PM
Have you tried pulumi refresh?
dry-keyboard-94795
03/06/2024, 4:16 PMdry-keyboard-94795
03/06/2024, 4:17 PM
You could add cluster.kube_configs[0].expires_at as an output to help debug
icy-lion-8963
03/06/2024, 4:23 PM
I haven’t run pulumi refresh yet
icy-lion-8963
03/06/2024, 4:23 PMicy-lion-8963
03/06/2024, 5:17 PM
kubernetes:rbac.authorization.k8s.io/v1:ClusterRoleBinding (staging-cert-manager-cainjector):
warning: configured Kubernetes cluster is unreachable: unable to load schema information from the API server: the server has asked for the client to provide credentials
error: Preview failed: failed to read resource state due to unreachable cluster. If the cluster was deleted, you can remove this resource from Pulumi state by rerunning the operation with the PULUMI_K8S_DELETE_UNREACHABLE environment variable set to "true"
kubernetes:opentelemetry.io/v1alpha1:Instrumentation (openobserve-collector/openobserve-java):
warning: configured Kubernetes cluster is unreachable: unable to load schema information from the API server: the server has asked for the client to provide credentials
error: Preview failed: failed to read resource state due to unreachable cluster. If the cluster was deleted, you can remove this resource from Pulumi state by rerunning the operation with the PULUMI_K8S_DELETE_UNREACHABLE environment variable set to "true"
kubernetes:rbac.authorization.k8s.io/v1:ClusterRoleBinding (staging-cert-manager-controller-challenges):
warning: configured Kubernetes cluster is unreachable: unable to load schema information from the API server: the server has asked for the client to provide credentials
error: Preview failed: failed to read resource state due to unreachable cluster. If the cluster was deleted, you can remove this resource from Pulumi state by rerunning the operation with the PULUMI_K8S_DELETE_UNREACHABLE environment variable set to "true"
kubernetes:rbac.authorization.k8s.io/v1:ClusterRoleBinding (staging-cert-manager-controller-certificatesigningrequests):
warning: configured Kubernetes cluster is unreachable: unable to load schema information from the API server: the server has asked for the client to provide credentials
error: Preview failed: failed to read resource state due to unreachable cluster. If the cluster was deleted, you can remove this resource from Pulumi state by rerunning the operation with the PULUMI_K8S_DELETE_UNREACHABLE environment variable set to "true"
icy-lion-8963
03/06/2024, 5:17 PMdry-keyboard-94795
03/06/2024, 5:20 PM
You can use --target to only refresh the cluster
dry-keyboard-94795
03/06/2024, 5:21 PM
--target '*-k8s-cluster'
icy-lion-8963
03/06/2024, 5:25 PMicy-lion-8963
03/06/2024, 5:26 PMdry-keyboard-94795
03/06/2024, 5:26 PM
pulumi stack --show-urns
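If eyeballing the --show-urns output gets tedious, the same lookup can be scripted from a pulumi stack export dump (the helper and the sample URNs in the test are illustrative, not from this stack):

```python
import json

def urns_matching(stack_export_json: str, suffix: str) -> list[str]:
    # A Pulumi URN ends in "::<resource name>", so filter on the final
    # name segment, mirroring a glob like --target '*-k8s-cluster'.
    state = json.loads(stack_export_json)
    resources = state.get("deployment", {}).get("resources", [])
    return [r["urn"] for r in resources if r["urn"].rsplit("::", 1)[-1].endswith(suffix)]
```

The matching URN can then be passed verbatim to pulumi refresh --target.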
icy-lion-8963
03/06/2024, 5:37 PMdry-keyboard-94795
03/06/2024, 5:38 PMdry-keyboard-94795
03/06/2024, 5:38 PM
If you run up again, does it work?
icy-lion-8963
03/06/2024, 5:43 PMicy-lion-8963
03/06/2024, 5:43 PMicy-lion-8963
03/06/2024, 5:43 PMdry-keyboard-94795
03/06/2024, 5:45 PMdry-keyboard-94795
03/06/2024, 5:45 PMicy-lion-8963
03/06/2024, 5:45 PM
The kubeconfig output is being removed.
dry-keyboard-94795
03/06/2024, 5:46 PMicy-lion-8963
03/06/2024, 5:49 PM
“You can make it part of your process to run the refresh command with URN targeted, or have the token be updated on every apply using this function.”
This makes sense then
dry-keyboard-94795
03/06/2024, 5:50 PM
kubeconfig = get_kubernetes_cluster_output(cluster.name).kube_configs[0].raw_config
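Slotting that line into the provider.py shown earlier would look roughly like this. This is a sketch under the assumption that do.get_kubernetes_cluster_output re-reads the cluster credentials on every deployment, rather than replaying the kubeconfig captured in state when the cluster was created; it is not verified against this stack:

from digitalocean.k8s.cluster import cluster
import pulumi_digitalocean as do
import pulumi_kubernetes as k8s

# Re-resolve the kubeconfig on every `pulumi up` so the embedded
# token is freshly issued instead of replayed from old state.
fresh_kubeconfig = do.get_kubernetes_cluster_output(
    name=cluster.name
).kube_configs[0].raw_config

k8s_provider = k8s.Provider("pressone-do-k8s", kubeconfig=fresh_kubeconfig)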
icy-lion-8963
03/06/2024, 6:20 PM