From 9a785e9c327fa734984e4419fe2cd7a55438c3ca Mon Sep 17 00:00:00 2001
From: Hongming Wang
Date: Fri, 24 Apr 2026 22:37:13 -0700
Subject: [PATCH] ci(canary): inject E2E_OPENAI_API_KEY so A2A turn doesn't 500

The canary workflow has been failing for ~30 consecutive runs (issue
#1500, opened 2026-04-21) on the same line:

    [hermes-agent error 500] No LLM provider configured. Run `hermes model`
    to select a provider, or run `hermes setup` for first-time configuration.

Root cause: the canary's env block was missing E2E_OPENAI_API_KEY.
Without it:

  * tests/e2e/test_staging_full_saas.sh provisions the workspace with
    empty secrets;
  * template-hermes start.sh seeds ~/.hermes/.env with no provider keys;
  * derive-provider.sh resolves the model slug `openai/gpt-4o` to
    PROVIDER=openrouter (hermes has no native openai provider in its
    registry);
  * the A2A request at step 8/11 fails with the "No LLM provider
    configured" error from hermes-agent.

The full-lifecycle workflow (e2e-staging-saas.yml line 84) carries the
same secret correctly. Mirror its pattern and add a fail-fast preflight
so future regressions surface in <5s instead of after 8 min of
provision-then-die.

Co-Authored-By: Claude Opus 4.7 (1M context)
---
 .github/workflows/canary-staging.yml | 19 +++++++++++++++++++
 1 file changed, 19 insertions(+)

diff --git a/.github/workflows/canary-staging.yml b/.github/workflows/canary-staging.yml
index 32cba939..0c4bae19 100644
--- a/.github/workflows/canary-staging.yml
+++ b/.github/workflows/canary-staging.yml
@@ -43,6 +43,17 @@ jobs:
         env:
           MOLECULE_CP_URL: https://staging-api.moleculesai.app
           MOLECULE_ADMIN_TOKEN: ${{ secrets.MOLECULE_STAGING_ADMIN_TOKEN }}
+          # Without an LLM key the test_staging_full_saas.sh script provisions
+          # the workspace with empty secrets, hermes derive-provider.sh resolves
+          # `openai/gpt-4o` to PROVIDER=openrouter, no OPENROUTER_API_KEY is
+          # found in env, and A2A returns "No LLM provider configured" at
+          # request time (canary step 8/11). The full-lifecycle workflow
+          # (e2e-staging-saas.yml) has carried this secret since launch — the
+          # canary regressed when it was first split out and lost the env
+          # block. Issue #1500 had ~30 consecutive failures before this was
+          # spotted; do NOT remove without re-reading the script's secrets-
+          # injection block.
+          E2E_OPENAI_API_KEY: ${{ secrets.MOLECULE_STAGING_OPENAI_KEY }}
           E2E_MODE: canary
           E2E_RUNTIME: hermes
           E2E_RUN_ID: "canary-${{ github.run_id }}"
@@ -57,6 +68,14 @@ jobs:
             exit 2
           fi
 
+      - name: Verify OpenAI key present
+        run: |
+          if [ -z "$E2E_OPENAI_API_KEY" ]; then
+            echo "::error::MOLECULE_STAGING_OPENAI_KEY secret not set — A2A will fail at request time with 'No LLM provider configured'"
+            exit 2
+          fi
+          echo "OpenAI key present ✓ (len=${#E2E_OPENAI_API_KEY})"
+
       - name: Canary run
         id: canary
         run: bash tests/e2e/test_staging_full_saas.sh
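
Note on the root-cause chain: the fallthrough described in the commit message (slug `openai/gpt-4o` → PROVIDER=openrouter → missing OPENROUTER_API_KEY → "No LLM provider configured") can be sketched as below. This is a hypothetical reconstruction, not the real contents of derive-provider.sh; the `derive_provider` and `require_key` names and the registry cases are assumptions for illustration only.

```shell
#!/bin/sh
# Sketch of the failure chain (assumed names; derive-provider.sh's real
# internals are not part of this patch).
derive_provider() {
  slug="$1"
  case "$slug" in
    anthropic/*) echo "anthropic" ;;
    # No native "openai" entry in the registry, so openai/* slugs fall
    # through to the openrouter aggregator, per the commit message.
    *) echo "openrouter" ;;
  esac
}

require_key() {
  provider="$1"
  # Map provider name to its expected env var, e.g. openrouter -> OPENROUTER_API_KEY.
  key_var="$(printf '%s' "$provider" | tr '[:lower:]' '[:upper:]')_API_KEY"
  eval "key=\${$key_var:-}"
  if [ -z "$key" ]; then
    echo "No LLM provider configured ($key_var is empty)"
    return 1
  fi
  echo "provider=$provider"
}

# With no OPENROUTER_API_KEY in the environment, the openai/* slug dies
# at request time, mirroring the canary's step-8 failure:
derive_provider "openai/gpt-4o"                       # prints "openrouter"
require_key "$(derive_provider "openai/gpt-4o")" || true
```

The fail-fast preflight in the patch checks `E2E_OPENAI_API_KEY` before provisioning precisely so this chain is cut at step 0 rather than at step 8/11.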