brave-analyst-91177
05/19/2024, 10:47 PM
When I run pulumi up, I get the following error and I'm not sure how to troubleshoot/fix it. Any ideas?
View in Browser (Ctrl+O): <https://app.pulumi.com/smantei/aws-py-langserve/dev/updates/5>
Type Name Status Info
pulumi:pulumi:Stack aws-py-langserve-dev failed 1 error
+ ├─ aws:ec2:Subnet langserve-subnet2 created (12s)
+ ├─ aws:ec2:Subnet langserve-subnet1 created (12s)
+ ├─ docker:index:Image langserve-ecr-image creating failed 1 error
+ ├─ aws:ssm:Parameter langserve-ssm-parameter created (0.82s)
+ ├─ aws:iam:Role langserve-execution-role created (1s)
+ ├─ aws:ec2:RouteTableAssociation langserve-subnet1-rt-assoc created (0.64s)
+ ├─ aws:ec2:RouteTableAssociation langserve-subnet2-rt-assoc created (0.65s)
+ └─ aws:lb:LoadBalancer langserve-load-balancer created (152s)
Diagnostics:
pulumi:pulumi:Stack (aws-py-langserve-dev):
error: update failed
docker:index:Image (langserve-ecr-image):
error: error reading build output: failed to solve with frontend dockerfile.v0: failed to build LLB: executor failed running [/bin/sh -c pip install poetry==1.6.1]: runc did not terminate sucessfully
little-cartoon-10569
05/19/2024, 11:02 PM
delightful-salesclerk-16161
many-telephone-49025
05/20/2024, 7:58 AM
Setting
export DOCKER_BUILDKIT=0
export COMPOSE_DOCKER_CLI_BUILD=0
helped. Or setting:
{
"builder": {
"gc": {
"defaultKeepStorage": "20GB",
"enabled": true
}
},
"experimental": false,
"features": {
"buildkit": false
}
}
But this assumes you are using Docker Desktop (the JSON goes into the daemon configuration under Settings > Docker Engine).
(source: https://stackoverflow.com/questions/64221861/an-error-failed-to-solve-with-frontend-dockerfile-v0)
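A minimal sketch of applying that workaround for a single deployment, assuming a bash-like shell and that the local Docker build honors these environment variables:
# Fall back to the legacy (non-BuildKit) builder for this shell session only
export DOCKER_BUILDKIT=0
export COMPOSE_DOCKER_CLI_BUILD=0
# ...then retry the deployment
pulumi up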
many-telephone-49025
05/20/2024, 8:02 AM
We use eu-central-1 as the default region. If you are going to use a different region, please replace the string eu-central-1 with the region of your choice!
I will mark this in the README.md
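A small sketch of doing that replacement from the shell; __main__.py is only a guess at where the hard-coded region lives, and us-east-1 stands in for whichever region you actually want:
# Find every hard-coded occurrence of the default region
grep -rn "eu-central-1" .
# Swap it for your region (GNU sed shown; on macOS use: sed -i '' 's/old/new/g' file)
sed -i 's/eu-central-1/us-east-1/g' __main__.py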
many-telephone-49025
05/20/2024, 9:13 AM
brave-analyst-91177
05/21/2024, 2:03 AM
I set
export DOCKER_BUILDKIT=0
export COMPOSE_DOCKER_CLI_BUILD=0
And pulumi up worked perfectly.
Also, I had to change my region, so your PR will be helpful for others.
Thanks again all!