# pulumi-deployments
I'm guessing my configuration is wrong. The Action looks like this:
```yaml
jobs:
  deploy:
    name: Deploy Development
    runs-on: ubuntu-latest
    environment: dev
  
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
         
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}
      
      - name: Install Dependencies
        run: |
          cd sync-driver-auth-service
          yarn install 
          cd ../sync-service-cr-creator
          yarn install
          cd ../sync-service-cr-resolver
          yarn install
          cd ../sync-service-streamos-sync
          yarn install
          cd ../sync-service-webhook-service
          yarn install
      - run: npm install
        working-directory: infrastructure
      - uses: pulumi/actions@v3
        with:
          command: up
          stack-name: dev 
          work-dir: infrastructure
        env:
          PULUMI_ACCESS_TOKEN: ${{ secrets.PULUMI_ACCESS_TOKEN }}
```
Only one `pulumi up` can run on a stack at a time. There could be two things going on:
1. Your actions aren't queueing and are trying to run in parallel, so you're attempting concurrent updates on the same stack, which will fail.
2. Sometimes, if a job crashes mid-update, a stack can get stuck in that state for a few minutes before the update times out. You can run `pulumi cancel` if that is the case to remove the lock.
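For the first cause, one common fix (not from this thread) is a GitHub Actions `concurrency` group on the deploy job, so that only one run targeting the stack executes at a time. A minimal sketch, assuming the group name `pulumi-dev` is an arbitrary placeholder chosen per stack:

```yaml
jobs:
  deploy:
    name: Deploy Development
    runs-on: ubuntu-latest
    environment: dev
    # Only one run in this group executes at a time; with
    # cancel-in-progress: false, a newer run waits instead of
    # killing the in-flight pulumi up.
    concurrency:
      group: pulumi-dev   # placeholder name; use one group per stack
      cancel-in-progress: false
    steps:
      # ... existing steps unchanged ...
```

For the second cause, once you're sure no update is actually running, `pulumi cancel --yes` against the stuck stack releases the lock.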
Thanks @lemon-agent-27707