#aws

lemon-salesclerk-6224

10/25/2022, 10:32 PM
Hey folks - we've been using Pulumi EKS, and seemingly suddenly today we're getting a panic when doing a pulumi up. It also causes memory usage on a couple of folks' Macs to steadily climb before freezing their computers. I'm pasting the panic in the thread, plus some extra info.
goroutine 776 [running]:
github.com/pulumi/pulumi/sdk/v3/go/common/util/contract.failfast(...)
	/Users/runner/work/pulumi/pulumi/sdk/go/common/util/contract/failfast.go:23
github.com/pulumi/pulumi/sdk/v3/go/common/util/contract.Assert(...)
	/Users/runner/work/pulumi/pulumi/sdk/go/common/util/contract/assert.go:26
github.com/pulumi/pulumi/sdk/v3/go/common/resource/plugin.(*provider).Construct(0x14001a3ed20, {{0x140007cb180, 0x7}, {0x140004f43ad, 0x7}, 0x14000b48210, {0x0, 0x0, 0x0}, 0x1, ...}, ...)
	/Users/runner/work/pulumi/pulumi/sdk/go/common/resource/plugin/provider_plugin.go:1096 +0x1110
github.com/pulumi/pulumi/pkg/v3/resource/deploy.(*resmon).RegisterResource(0x14000b4e000, {0x102567e18, 0x14001f13380}, 0x14000857cc0)
	/Users/runner/work/pulumi/pulumi/pkg/resource/deploy/source_eval.go:1002 +0x163c
github.com/pulumi/pulumi/sdk/v3/proto/go._ResourceMonitor_RegisterResource_Handler.func1({0x102567e18, 0x14001f13380}, {0x1024cece0, 0x14000857cc0})
	/Users/runner/work/pulumi/pulumi/sdk/proto/go/resource.pb.go:1124 +0x7c
github.com/grpc-ecosystem/grpc-opentracing/go/otgrpc.OpenTracingServerInterceptor.func1({0x102567e18, 0x14001f12f90}, {0x1024cece0, 0x14000857cc0}, 0x14000af9900, 0x14000fd59e0)
	/Users/runner/go/pkg/mod/github.com/grpc-ecosystem/grpc-opentracing@v0.0.0-20180507213350-8e809c8a8645/go/otgrpc/server.go:57 +0x3bc
github.com/pulumi/pulumi/sdk/v3/go/common/util/rpcutil.OpenTracingServerInterceptor.func1({0x102567e18, 0x14001f12f90}, {0x1024cece0, 0x14000857cc0}, 0x14000af9900, 0x14000fd59e0)
	/Users/runner/work/pulumi/pulumi/sdk/go/common/util/rpcutil/interceptor.go:79 +0x1e0
github.com/pulumi/pulumi/sdk/v3/proto/go._ResourceMonitor_RegisterResource_Handler({0x10240b320, 0x14000b4e000}, {0x102567e18, 0x14001f12f90}, 0x14000be6ea0, 0x14000af8260)
	/Users/runner/work/pulumi/pulumi/sdk/proto/go/resource.pb.go:1126 +0x150
google.golang.org/grpc.(*Server).processUnaryRPC(0x14000b401c0, {0x102589220, 0x140003bbe00}, 0x14001b937a0, 0x14000b483f0, 0x10328e7c0, 0x0)
	/Users/runner/go/pkg/mod/google.golang.org/grpc@v1.37.0/server.go:1217 +0xc38
google.golang.org/grpc.(*Server).handleStream(0x14000b401c0, {0x102589220, 0x140003bbe00}, 0x14001b937a0, 0x0)
	/Users/runner/go/pkg/mod/google.golang.org/grpc@v1.37.0/server.go:1540 +0xa34
google.golang.org/grpc.(*Server).serveStreams.func1.2(0x14000446470, 0x14000b401c0, {0x102589220, 0x140003bbe00}, 0x14001b937a0)
	/Users/runner/go/pkg/mod/google.golang.org/grpc@v1.37.0/server.go:878 +0x94
created by google.golang.org/grpc.(*Server).serveStreams.func1
	/Users/runner/go/pkg/mod/google.golang.org/grpc@v1.37.0/server.go:876 +0x1f0
@boundless-engineer-23836 vis and extra info if i missed anything
I saw the thread above regarding Pulumi EKS using version 3.22.0, but we don't think this is necessarily related. FWIW, I did try the workaround steps in case it was.
It fails on the preview portion of the up.
Looking to get some help with triaging this or understanding what happened.
Essentially, we had a branch with EKS Pulumi work functioning, and suddenly today, running the exact same processes, we're getting a panic. So it feels like some underlying dependencies changed, but we're not sure which.
The other concerning thing is that there are symptoms that look like a memory leak.

boundless-engineer-23836

10/25/2022, 10:54 PM
seems to be due to "@pulumi/kubernetes-cert-manager" and the new eks version from pulumi

limited-rainbow-51650

10/26/2022, 8:19 AM
@lemon-salesclerk-6224 did you upgrade any of the versions of the components involved?
• Pulumi CLI
• Pulumi runtime library in your program
• Pulumi EKS package
• Pulumi Kubernetes provider
Can you post the output of
pulumi about
?

boundless-engineer-23836

10/26/2022, 2:18 PM
There was no upgrade explicitly specified; however, the package.json does use the
^
notation, so it looks like the latest versions of the EKS package and Kubernetes provider got pulled in:
"@pulumi/eks": "^0.42.0",
"@pulumi/kubernetes": "^3.21.4",
"@pulumi/kubernetes-cert-manager": "^0.0.3"

limited-rainbow-51650

10/26/2022, 2:43 PM
@boundless-engineer-23836 can you pin the versions to what you show here by dropping the caret, remove your
package-lock.json
or
yarn.lock
file along with
node_modules
, and install the dependencies again?
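A minimal sketch of that pinning step, assuming npm and a POSIX sed; the package.json here is a stand-in for the project's real one, using the versions posted above:

```shell
# Hypothetical package.json mirroring the dependencies from the thread:
cat > package.json <<'EOF'
{
  "dependencies": {
    "@pulumi/eks": "^0.42.0",
    "@pulumi/kubernetes": "^3.21.4",
    "@pulumi/kubernetes-cert-manager": "^0.0.3"
  }
}
EOF

# Drop the caret so the package manager resolves exactly these versions:
sed -i.bak -E 's/"\^/"/g' package.json
cat package.json

# Rebuild the dependency tree from scratch against the pinned versions:
# rm -rf node_modules package-lock.json yarn.lock
# npm install
```

Dropping the caret prevents npm from silently resolving a newer minor or patch release on the next install, which is the suspected cause here.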

boundless-engineer-23836

10/26/2022, 2:44 PM
I can, but im hesitant to run pulumi up. When we did, it generated a memory leak/fork bomb, put memory pressure and crashed my computer. Can install them though

kind-hamburger-15227

10/27/2022, 1:05 PM
If you are on a Mac, disable IPv6 - set the network to link-local.
Go's DNS resolver has a very annoying bug going around - it affects Terraform and other tools (SeaweedFS in my case, a home distributed filesystem running on 3 RPi4s): on an IPv6-enabled network, all your DNS resolution goes to ::1. Macs can't recover (both Intel and ARM) - the network crashes, then a hard restart. The solution is to disable IPv6 and pin addresses to IPv4.
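On macOS, disabling IPv6 can be done from the terminal with networksetup; a hedged sketch, assuming the active service is named "Wi-Fi" (the first command lists the actual names on your machine):

```shell
# List network services to find the active one (e.g. "Wi-Fi", "Ethernet"):
networksetup -listallnetworkservices

# Disable IPv6 on that service (requires admin rights):
sudo networksetup -setv6off Wi-Fi

# To restore the default behaviour later:
# sudo networksetup -setv6automatic Wi-Fi
```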

limited-rainbow-51650

11/03/2022, 1:20 PM