# pulumi-deployments
Hello, I don't know if this is the best place to ask my question, but I'm running into an issue with the Pulumi GitHub Action's `preview` command against our own backend (an AWS S3 bucket).
It looks like I'm hitting the bug described in https://github.com/pulumi/actions/issues/698
Here's my action.yml:
```yaml
name: Pulumi
on:
  # Allow to manually run workflow
  workflow_dispatch:

  pull_request:
      branches:
        - 'develop'
jobs:
  preview:
    strategy:
      matrix:
        stack: [eks-cluster-go]
    name: Pulumi Preview
    runs-on: ubuntu-latest
    env:
      STACK_DIR: pulumi/aws/${{ matrix.stack }}
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - uses: actions/setup-go@v4
        with:
          go-version: 'stable'
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-region: ${{ secrets.AWS_REGION }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      - run: go mod download
        working-directory: ${{ env.STACK_DIR }}
      - uses: pulumi/actions@v4
        with:
          command: preview
          cloud-url: s3://${{ vars.PULUMI_STATE_BUCKET }}/${{ matrix.stack }}
          stack-name: dev
          comment-on-pr: true
          #github-token: ${{ secrets.GITHUB_TOKEN }}
          work-dir: ${{ env.STACK_DIR }}
```
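For what it's worth, one variation I'm considering (I'm not sure whether it actually sidesteps the bug, and putting the env variable on the step like this is my own guess) is to hand the backend to the CLI via `PULUMI_BACKEND_URL` in addition to `cloud-url`:

```yaml
      # Hypothetical variant of the step above: expose the self-managed backend
      # through PULUMI_BACKEND_URL so every internal pulumi invocation (including
      # `pulumi stack select`) sees it. Untested against this bug.
      - uses: pulumi/actions@v4
        env:
          PULUMI_BACKEND_URL: s3://${{ vars.PULUMI_STATE_BUCKET }}/${{ matrix.stack }}
        with:
          command: preview
          stack-name: dev
          comment-on-pr: true
          work-dir: ${{ env.STACK_DIR }}
```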
And this is where it fails:
```
Run pulumi/actions@v4
  with:
    command: preview
    cloud-url: s3://<valid-bucket-name-here>/eks-cluster-go
    stack-name: dev
    comment-on-pr: true
    work-dir: pulumi/aws/eks-cluster-go
    pulumi-version: ^3
    comment-on-summary: false
    github-token: ***
    expect-no-changes: false
    diff: false
    target-dependents: false
    refresh: false
    upsert: false
    remove: false
    edit-pr-comment: true
    color: auto
    exclude-protected: false
  env:
    STACK_DIR: pulumi/aws/eks-cluster-go
    AWS_DEFAULT_REGION: ***
    AWS_REGION: ***
    AWS_ACCESS_KEY_ID: ***
    AWS_SECRET_ACCESS_KEY: ***
  
##[debug]Configuration is loaded
##[debug]Platform: linux-x64
Configured range: ^3
/usr/local/bin/pulumi version
v3.86.0
warning: A new version of Pulumi is available. To upgrade from version '3.86.0' to '3.87.0', visit https://pulumi.com/docs/install/ for manual instructions and release notes.
Pulumi version 3.86.0 is already installed on this machine. Skipping download
Logging into s3://<valid-bucket-name-here>/eks-cluster-go
##[debug]Working directory resolved at /home/runner/work/<repo-name-redacted>/<repo-name-redacted>/pulumi/aws/eks-cluster-go

/home/runner/work/_actions/pulumi/actions/v4/webpack:/pulumi-github-action/node_modules/@pulumi/pulumi/automation/errors.js:77
                    : new CommandError(result);
^
CommandError: code: -2
 stdout: 
 stderr: Command failed with exit code 255: pulumi stack select --stack dev --non-interactive
error: PULUMI_ACCESS_TOKEN must be set for login during non-interactive CLI sessions
 err?: Error: Command failed with exit code 255: pulumi stack select --stack dev --non-interactive
error: PULUMI_ACCESS_TOKEN must be set for login during non-interactive CLI sessions

    at Object.createCommandError (/home/runner/work/_actions/pulumi/actions/v4/webpack:/pulumi-github-action/node_modules/@pulumi/pulumi/automation/errors.js:77:1)
    at Object.<anonymous> (/home/runner/work/_actions/pulumi/actions/v4/webpack:/pulumi-github-action/node_modules/@pulumi/pulumi/automation/cmd.js:76:1)
    at Generator.throw (<anonymous>)
    at rejected (/home/runner/work/_actions/pulumi/actions/v4/webpack:/pulumi-github-action/node_modules/@pulumi/pulumi/automation/cmd.js:19:1)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
##[debug]Node Action run completed with exit code 1
##[debug]Finishing: Run pulumi/actions@v4
```
I realized there's a newer issue open for the same bug, https://github.com/pulumi/actions/issues/1014, and also https://github.com/pulumi/actions/issues/1010, where a workaround is provided.
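For anyone else landing here, my reading of that workaround (I haven't verified it matches the issue word for word) is to log into the S3 backend in its own step before `pulumi/actions` runs, so the internal `pulumi stack select` never falls back to the Pulumi Cloud login. Roughly:

```yaml
      # Sketch of the workaround as I understand it: explicit pulumi login against
      # the self-managed backend before the action step. The step name and bucket
      # expression are mine; the CLI preinstalled on ubuntu-latest is assumed.
      - name: Pulumi login (S3 backend)
        run: pulumi login s3://${{ vars.PULUMI_STATE_BUCKET }}/${{ matrix.stack }}
      - uses: pulumi/actions@v4
        with:
          command: preview
          stack-name: dev
          comment-on-pr: true
          work-dir: ${{ env.STACK_DIR }}
```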