#general

magnificent-lifeguard-15082

03/29/2022, 2:46 PM
Has anyone else experienced issues with pulumi.Command running within github actions?
+  command:local:Command pro-sto-site-sync creating error: transport is closing
 +  command:local:Command pro-sto-site-sync **creating failed** error: transport is closing
    pulumi:pulumi:Stack platform-production running panic: runtime error: invalid memory address or nil pointer dereference
    pulumi:pulumi:Stack platform-production running [signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0xc07034]
    pulumi:pulumi:Stack platform-production running goroutine 9 [running]:
    pulumi:pulumi:Stack platform-production running github.com/pulumi/pulumi-command/provider/pkg/provider.(*command).run(0xc00007aae0, 0xf219e0, 0xc000382e40, 0xc000038600, 0x57, 0xc0000ad3b0, 0xc00003c2a0, 0x6a, 0x0, 0x0, ...)
    pulumi:pulumi:Stack platform-production running 	/home/runner/work/pulumi-command/pulumi-command/provider/pkg/provider/command.go:115 +0x794
    pulumi:pulumi:Stack platform-production running github.com/pulumi/pulumi-command/provider/pkg/provider.(*command).RunCreate(0xc00007aae0, 0xf219e0, 0xc000382e40, 0xc0000ad3b0, 0xc00003c2a0, 0x6a, 0x0, 0x0, 0x0, 0x0)
    pulumi:pulumi:Stack platform-production running 	/home/runner/work/pulumi-command/pulumi-command/provider/pkg/provider/command.go:50 +0x93
    pulumi:pulumi:Stack platform-production running github.com/pulumi/pulumi-command/provider/pkg/provider.(*commandProvider).Create(0xc000109b00, 0xf21a88, 0xc00007f560, 0xc00007cb90, 0x0, 0x0, 0x0)
    pulumi:pulumi:Stack platform-production running 	/home/runner/work/pulumi-command/pulumi-command/provider/pkg/provider/provider.go:197 +0xe85
    pulumi:pulumi:Stack platform-production running github.com/pulumi/pulumi/sdk/v3/proto/go._ResourceProvider_Create_Handler.func1(0xf21a88, 0xc00007f560, 0xdad2e0, 0xc00007cb90, 0xdbbac0, 0x1471400, 0xf21a88, 0xc00007f560)
    pulumi:pulumi:Stack platform-production running 	/home/runner/go/pkg/mod/github.com/pulumi/pulumi/sdk/v3@v3.19.0/proto/go/provider.pb.go:2602 +0x89
    pulumi:pulumi:Stack platform-production running github.com/grpc-ecosystem/grpc-opentracing/go/otgrpc.OpenTracingServerInterceptor.func1(0xf21a88, 0xc00007f230, 0xdad2e0, 0xc00007cb90, 0xc0000305a0, 0xc00000c8d0, 0x0, 0x0, 0xf0f3a0, 0xc00009d6e0)
    pulumi:pulumi:Stack platform-production running 	/home/runner/go/pkg/mod/github.com/grpc-ecosystem/grpc-opentracing@v0.0.0-20180507213350-8e809c8a8645/go/otgrpc/server.go:57 +0x30a
    pulumi:pulumi:Stack platform-production running github.com/pulumi/pulumi/sdk/v3/proto/go._ResourceProvider_Create_Handler(0xdce940, 0xc000109b00, 0xf21a88, 0xc00007f230, 0xc00007aa80, 0xc0000c38c0, 0xf21a88, 0xc00007f230, 0xc0003aca80, 0x34a)
    pulumi:pulumi:Stack platform-production running 	/home/runner/go/pkg/mod/github.com/pulumi/pulumi/sdk/v3@v3.19.0/proto/go/provider.pb.go:2604 +0x150
    pulumi:pulumi:Stack platform-production running google.golang.org/grpc.(*Server).processUnaryRPC(0xc000249880, 0xf29e38, 0xc0000bf500, 0xc0003965a0, 0xc000109b90, 0x14344a0, 0x0, 0x0, 0x0)
    pulumi:pulumi:Stack platform-production running 	/home/runner/go/pkg/mod/google.golang.org/grpc@v1.37.0/server.go:1217 +0x52b
    pulumi:pulumi:Stack platform-production running google.golang.org/grpc.(*Server).handleStream(0xc000249880, 0xf29e38, 0xc0000bf500, 0xc0003965a0, 0x0)
    pulumi:pulumi:Stack platform-production running 	/home/runner/go/pkg/mod/google.golang.org/grpc@v1.37.0/server.go:1540 +0xd0c
    pulumi:pulumi:Stack platform-production running google.golang.org/grpc.(*Server).serveStreams.func1.2(0xc0002fc260, 0xc000249880, 0xf29e38, 0xc0000bf500, 0xc0003965a0)
    pulumi:pulumi:Stack platform-production running 	/home/runner/go/pkg/mod/google.golang.org/grpc@v1.37.0/server.go:878 +0xab
    pulumi:pulumi:Stack platform-production running created by google.golang.org/grpc.(*Server).serveStreams.func1
    pulumi:pulumi:Stack platform-production running 	/home/runner/go/pkg/mod/google.golang.org/grpc@v1.37.0/server.go:876 +0x1fd
    pulumi:pulumi:Stack platform-production running 	/home/runner/go/pkg/mod/google.golang.org/grpc@v1.37.0/server.go:876 +0x1fd
 +  command:local:Command pro-dev-docs-sync creating error: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:38829: connect: connection refused"
 +  command:local:Command pro-dev-docs-sync **creating failed** error: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:38829: connect: connection refused"

billowy-army-68599

03/29/2022, 2:50 PM
@magnificent-lifeguard-15082 you're trying to connect to localhost? what are you using it for?

magnificent-lifeguard-15082

03/29/2022, 2:52 PM
Err, not that I'm aware of
This is all I'm doing with local.Command: ``pulumi.interpolate`aws s3 sync ./ s3://${bucket.bucket} --cache-control "public, max-age=3600;"` ``
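[Editor's note: for context, a minimal sketch of how a `local.Command` like this is typically wired up with the Pulumi Node.js SDK. The resource and bucket names here are illustrative, not taken from the thread; it assumes `@pulumi/aws` and `@pulumi/command` are installed.]

```typescript
import * as aws from "@pulumi/aws";
import * as pulumi from "@pulumi/pulumi";
import { local } from "@pulumi/command";

// Bucket name is hypothetical; the thread only shows `bucket.bucket`.
const bucket = new aws.s3.Bucket("site-bucket");

// Runs `aws s3 sync` on the machine executing `pulumi up`,
// after the bucket's name is known.
const sync = new local.Command("site-sync", {
    create: pulumi.interpolate`aws s3 sync ./ s3://${bucket.bucket} --cache-control "public, max-age=3600"`,
}, { dependsOn: [bucket] });
```

The `127.0.0.1` dial in the error is not from this program; the Pulumi engine talks to each provider plugin (including `pulumi-command`) over gRPC on a local loopback port, which is why a provider crash surfaces as "connection refused" to localhost.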

billowy-army-68599

03/29/2022, 2:53 PM
transport: Error while dialing dial tcp 127.0.0.1:38829
This is connecting to localhost

magnificent-lifeguard-15082

03/29/2022, 2:54 PM
I was imagining that's some Pulumi [engine] internals
I'm not knowingly trying to connect to anything locally
Also, interestingly, this is only failing in our production deployment; the same Pulumi program deployed fine in a staging workflow
I've managed to work around it for now by running the deployment from my own machine, but the above is definitely unexpected and seems likely to crop up again