# kubernetes
b
Anyone else getting this issue? Not sure if it's related to the latest provider release. We don't use an implicit provider `kubeconfig` anywhere, only grabbing from `eks.Cluster.provider`:
```
Error: invocation of kubernetes:yaml:decode returned an error: unable to load Kubernetes client configuration from kubeconfig file: invalid configuration: no configuration has been provided
        at monitor.invoke (/root/project/node_modules/@pulumi/pulumi/runtime/invoke.js:172:33)
        at Object.onReceiveStatus (/root/project/node_modules/grpc/src/client_interceptors.js:1210:9)
        at InterceptingListener._callNext (/root/project/node_modules/grpc/src/client_interceptors.js:568:42)
        at InterceptingListener.onReceiveStatus (/root/project/node_modules/grpc/src/client_interceptors.js:618:8)
        at callback (/root/project/node_modules/grpc/src/client_interceptors.js:847:24)
```
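(For context, a minimal sketch of the explicit-provider pattern described above, assuming `@pulumi/eks` and `@pulumi/kubernetes`; the resource names are hypothetical, not the poster's actual code.)

```typescript
import * as eks from "@pulumi/eks";
import * as k8s from "@pulumi/kubernetes";

// Hypothetical names for illustration only.
const cluster = new eks.Cluster("my-cluster");

// Pass the cluster's provider explicitly instead of relying on an
// ambient kubeconfig, which is the pattern described above.
const ns = new k8s.core.v1.Namespace(
    "apps",
    { metadata: { name: "apps" } },
    { provider: cluster.provider },
);
```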
g
Yes, that appears to be related to the changes in the latest release. Can you file an issue with a repro if possible?
a
That’s the error we started getting on multiple projects today.
We've been using `k8s.Provider`, which gets populated from a top-level project and consumed as a stack reference in underlying projects. Should we target and lock the previous version to get around the issue, or do you recommend another way to handle it until it's been fixed? @gorgeous-egg-16927
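(A minimal sketch of the stack-reference pattern described above; the stack name and the `kubeconfig` output key are hypothetical, since the actual project layout isn't shown.)

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as k8s from "@pulumi/kubernetes";

// Hypothetical stack name; the top-level project is assumed to export
// its cluster's kubeconfig as a stack output named "kubeconfig".
const infra = new pulumi.StackReference("myorg/infra/prod");

// Build an explicit k8s.Provider from the referenced kubeconfig.
const provider = new k8s.Provider("k8s", {
    kubeconfig: infra.getOutput("kubeconfig"),
});

// Downstream resources consume the provider explicitly.
const ns = new k8s.core.v1.Namespace("apps", {}, { provider });
```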
b
Version-locking the NPM package at 1.4.1 resolved it for me
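(A sketch of that workaround: pin the exact version in package.json, with no ^ or ~ range, so npm cannot resolve to the broken release. Running `npm install @pulumi/kubernetes@1.4.1 --save-exact` writes the same pin.)

```json
{
  "dependencies": {
    "@pulumi/kubernetes": "1.4.1"
  }
}
```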
g
Thanks, looking into it now
@busy-umbrella-36067 I haven't been able to reproduce the problem locally. Could you update the issue with the package versions you're using, and more details on the workflow where you're seeing the problem (e.g. using existing stacks, only on new stacks, only on preview, etc.)? Thanks!
b
@gorgeous-egg-16927 updated the issue, `--skip-preview` does seem to get around it
I lied, getting the issue with `--skip-preview` as well
Not sure if you're able to view user stacks on the console: https://app.pulumi.com/mazamats/test/dev/updates/5
g
Tracked down the problem and am reverting the relevant changes until we can release a proper fix. The 1.4.3 release should be out this afternoon with the problematic changes reverted.
🙏 2