Compare commits


1 Commit

Author SHA1 Message Date
Hongming Wang
227787bbbd test: bash coverage for entrypoint.sh log_boot_context()
Some checks failed
CI / validate (push) Failing after 0s
CI / Adapter unit tests (push) Failing after 6s
The Python adapter audit (test_adapter_logging.py) pins the
adapter.py side, but the entrypoint shell function fires earlier and
twice (pre-gosu + post-gosu). When the SDK import wedge keeps the
adapter from running at all, the shell emission is the operator's
only visibility into the boot env.

Eight new tests cover:
- env NAME=set / env NAME=unset shape for every audited var
- value-leak guard: secret strings never appear in output
- WORKSPACE_ID + PLATFORM_URL passthrough by value (not secret)
- <unset> fallback for missing platform identifiers
- uid/gid line shape (used to verify the privilege drop)
- dated boot banner shape (used to count restarts in a crash loop)
- cross-file gate: shell for-loop names == fixture tuple, mirroring
  test_audit_env_list_matches_entrypoint_sh's adapter.py↔shell gate
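The cross-file gate can be sketched as a small pytest check. This is a minimal sketch, not the shipped test: the fixture tuple contents and the exact `for`-loop shape inside entrypoint.sh are assumptions.

```python
import re

# Hypothetical fixture: the env names the Python-side audit pins.
AUDITED_ENV_VARS = ("WORKSPACE_ID", "PLATFORM_URL", "ANTHROPIC_API_KEY")

def shell_audited_names(entrypoint_text: str) -> tuple:
    """Extract the names iterated by log_boot_context()'s for-loop.

    Assumes a POSIX-sh loop shaped like: `for name in A B C; do`.
    Returns an empty tuple when no such loop is present.
    """
    m = re.search(r"for\s+\w+\s+in\s+([A-Z0-9_ ]+);\s*do", entrypoint_text)
    return tuple(m.group(1).split()) if m else ()

def test_shell_names_match_fixture():
    # In the real test the text would be read from entrypoint.sh.
    text = "for name in WORKSPACE_ID PLATFORM_URL ANTHROPIC_API_KEY; do"
    assert shell_audited_names(text) == AUDITED_ENV_VARS
```

Because both sides compare against one fixture tuple, adding a variable to only the shell loop (or only the adapter audit) fails CI instead of drifting silently.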

Strategy: regex-extract the function body from entrypoint.sh and run
it in a fresh /bin/sh with controlled env. We never source the whole
entrypoint because it would chown /workspace and exec molecule-runtime.

Closes the gap from task #251 (follow-up to PR #32 boot-debug logging).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-02 22:26:35 -07:00
12 changed files with 269 additions and 1110 deletions

View File

@@ -2,7 +2,7 @@ name: CI
on: [push, pull_request]
jobs:
validate:
uses: molecule-ai/molecule-ci/.github/workflows/validate-workspace-template.yml@main
uses: Molecule-AI/molecule-ci/.github/workflows/validate-workspace-template.yml@main
tests:
name: Adapter unit tests

View File

@@ -32,47 +32,14 @@ permissions:
packages: write
jobs:
# The `.runtime-version` file is the push-mode cascade signal post-
# 2026-05-06: when molecule-core/publish-runtime.yml ships a new
# version to PyPI, it does NOT call repository_dispatch (Gitea 1.22.6
# has no such endpoint — empirically verified molecule-core#20).
# Instead it git-pushes an updated `.runtime-version` to each template,
# which trips this workflow's `on: push: branches: [main]` trigger.
# This job reads that file and forwards the version to the reusable
# build workflow so the Dockerfile pip-installs the exact published
# version, not whatever requirements.txt currently bounds.
resolve-version:
runs-on: ubuntu-latest
timeout-minutes: 2
outputs:
version: ${{ steps.read.outputs.version }}
steps:
- uses: actions/checkout@v4
- id: read
run: |
if [ -f .runtime-version ]; then
v="$(head -n1 .runtime-version | tr -d '[:space:]')"
echo "version=$v" >> "$GITHUB_OUTPUT"
echo "resolved runtime version: $v"
else
echo "no .runtime-version file present — falling through to Dockerfile default"
fi
publish:
needs: resolve-version
uses: molecule-ai/molecule-ci/.github/workflows/publish-template-image.yml@main
uses: Molecule-AI/molecule-ci/.github/workflows/publish-template-image.yml@main
secrets: inherit
with:
# Resolution chain (highest priority first):
# 1. client_payload.runtime_version — legacy GitHub
# repository_dispatch path (will return if Gitea ever adds
# the dispatch API; left in place for forward-compat).
# 2. inputs.runtime_version — manual workflow_dispatch run from
# the Actions UI for ad-hoc rebuilds against a specific
# version.
# 3. needs.resolve-version.outputs.version — the
# `.runtime-version` file in this repo, written by
# molecule-core/publish-runtime.yml's push-mode cascade.
# 4. '' — fall through to the Dockerfile default
# (requirements.txt pin).
runtime_version: ${{ github.event.client_payload.runtime_version || inputs.runtime_version || needs.resolve-version.outputs.version || '' }}
# When the cascade fires, client_payload.runtime_version is the
# exact version PyPI just published. Forwarded to the reusable
# workflow as a docker --build-arg so the cache key changes
# per-version and pip install resolves freshly.
# On other events (push to main / manual without input), this is
# empty and the Dockerfile's default (requirements.txt pin) applies.
runtime_version: ${{ github.event.client_payload.runtime_version || inputs.runtime_version || '' }}

View File

@@ -1,201 +1,22 @@
name: Secret scan
# Hard CI gate. Refuses any PR / push whose diff additions contain a
# recognisable credential. Defense-in-depth for the #2090-class incident
# (2026-04-24): GitHub's hosted Copilot Coding Agent leaked a ghs_*
# installation token into tenant-proxy/package.json via `npm init`
# slurping the URL from a token-embedded origin remote. We can't fix
# upstream's clone hygiene, so we gate here.
# Calls the canonical reusable workflow in molecule-core. Defense
# against the #2090-class leak (a hosted-agent commit slipping a
# credential-shaped string into a PR). Pattern set lives in
# molecule-core so we do not maintain a parallel copy here.
#
# Inlined copy from molecule-ai/molecule-core/.github/workflows/secret-scan.yml.
# Cross-repo workflow_call to a private repo doesn't fully work on Gitea 1.22.6
# (workflow file fails parse-time at 0s with no logs); inline keeps the gate
# functional until Gitea is upgraded or the canonical scanner moves to a public
# repo. When that lands, this file reverts to the 3-line wrapper:
#
# jobs:
# secret-scan:
# uses: Molecule-AI/molecule-core/.github/workflows/secret-scan.yml@staging
#
# Pin to @staging not @main — staging is the active default branch,
# main lags via the staging-promotion workflow. Updates ride along
# automatically on the next consumer workflow run.
#
# Same regex set as the runtime's bundled pre-commit hook
# (molecule-ai-workspace-runtime: molecule_runtime/scripts/pre-commit-checks.sh).
# Keep the two sides aligned when adding patterns.
# Pinned to @staging because that is the active default branch on the
# upstream repo (main lags behind via the staging-promotion workflow).
# Updates ride along automatically as the upstream regex set evolves.
on:
pull_request:
types: [opened, synchronize, reopened]
push:
branches: [main, staging]
branches: [main, staging, master]
merge_group:
types: [checks_requested]
jobs:
scan:
name: Scan diff for credential-shaped strings
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 2 # need previous commit to diff against on push events
# For pull_request events the diff base may be many commits behind
# HEAD and absent from the shallow clone. Fetch it explicitly.
- name: Fetch PR base SHA (pull_request events only)
if: github.event_name == 'pull_request'
run: git fetch --depth=1 origin ${{ github.event.pull_request.base.sha }}
# For merge_group events the queue's pre-merge ref is a commit on
# `gh-readonly-queue/...` whose parent is the queue's base_sha.
# That parent isn't part of the queue branch's shallow clone, so
# we fetch it explicitly. Without this the diff falls through to
# "no BASE → scan entire tree" mode and false-positives on legit
# test fixtures (e.g. canvas/src/lib/validation/__tests__/secret-formats.test.ts).
- name: Refuse if credential-shaped strings appear in diff additions
env:
# Plumb event-specific SHAs through env so the script doesn't
# need conditional `${{ ... }}` interpolation per event type.
# github.event.before/after only exist on push events;
# merge_group has its own base_sha/head_sha; pull_request has
# pull_request.base.sha / pull_request.head.sha.
PR_BASE_SHA: ${{ github.event.pull_request.base.sha }}
PR_HEAD_SHA: ${{ github.event.pull_request.head.sha }}
PUSH_BEFORE: ${{ github.event.before }}
PUSH_AFTER: ${{ github.event.after }}
run: |
# Pattern set covers GitHub family (the actual #2090 vector),
# Anthropic / OpenAI / Slack / AWS. Anchored on prefixes with low
# false-positive rates against agent-generated content. Mirror of
# molecule-ai-workspace-runtime/molecule_runtime/scripts/pre-commit-checks.sh
# — keep aligned.
SECRET_PATTERNS=(
'ghp_[A-Za-z0-9]{36,}' # GitHub PAT (classic)
'ghs_[A-Za-z0-9]{36,}' # GitHub App installation token
'gho_[A-Za-z0-9]{36,}' # GitHub OAuth user-to-server
'ghu_[A-Za-z0-9]{36,}' # GitHub OAuth user
'ghr_[A-Za-z0-9]{36,}' # GitHub OAuth refresh
'github_pat_[A-Za-z0-9_]{82,}' # GitHub fine-grained PAT
'sk-ant-[A-Za-z0-9_-]{40,}' # Anthropic API key
'sk-proj-[A-Za-z0-9_-]{40,}' # OpenAI project key
'sk-svcacct-[A-Za-z0-9_-]{40,}' # OpenAI service-account key
'sk-cp-[A-Za-z0-9_-]{60,}' # MiniMax API key (F1088 vector — caught only after the fact)
'xox[baprs]-[A-Za-z0-9-]{20,}' # Slack tokens
'AKIA[0-9A-Z]{16}' # AWS access key ID
'ASIA[0-9A-Z]{16}' # AWS STS temp access key ID
)
# Determine the diff base. Each event type stores its SHAs in
# a different place — see the env block above.
case "${{ github.event_name }}" in
pull_request)
BASE="$PR_BASE_SHA"
HEAD="$PR_HEAD_SHA"
;;
*)
BASE="$PUSH_BEFORE"
HEAD="$PUSH_AFTER"
;;
esac
# On push events with shallow clones, BASE may be present in
# the event payload but absent from the local object DB
# (fetch-depth=2 doesn't always reach the previous commit
# across true merges). Try fetching it on demand. If the
# fetch fails — e.g. the SHA was force-overwritten — we fall
# through to the empty-BASE branch below, which scans the
# entire tree as if every file were new. Correct, just slow.
if [ -n "$BASE" ] && ! echo "$BASE" | grep -qE '^0+$'; then
if ! git cat-file -e "$BASE" 2>/dev/null; then
git fetch --depth=1 origin "$BASE" 2>/dev/null || true
fi
fi
# Files added or modified in this change.
if [ -z "$BASE" ] || echo "$BASE" | grep -qE '^0+$' || ! git cat-file -e "$BASE" 2>/dev/null; then
# New branch / no previous SHA / BASE unreachable — check the
# entire tree as added content. Slower, but correct on first
# push.
CHANGED=$(git ls-tree -r --name-only HEAD)
DIFF_RANGE=""
else
CHANGED=$(git diff --name-only --diff-filter=AM "$BASE" "$HEAD")
DIFF_RANGE="$BASE $HEAD"
fi
if [ -z "$CHANGED" ]; then
echo "No changed files to inspect."
exit 0
fi
# Self-exclude: this workflow file legitimately contains the
# pattern strings as regex literals. Without an exclude it would
# block its own merge.
SELF=".github/workflows/secret-scan.yml"
OFFENDING=""
# `while IFS= read -r` (not `for f in $CHANGED`) so filenames
# containing whitespace don't word-split silently — a path
# with a space would otherwise produce two iterations on
# tokens that aren't real filenames, breaking the
# self-exclude + diff lookup.
while IFS= read -r f; do
[ -z "$f" ] && continue
[ "$f" = "$SELF" ] && continue
if [ -n "$DIFF_RANGE" ]; then
ADDED=$(git diff --no-color --unified=0 "$BASE" "$HEAD" -- "$f" 2>/dev/null | grep -E '^\+[^+]' || true)
else
# No diff range (new branch first push) — scan the full file
# contents as if every line were new.
ADDED=$(cat "$f" 2>/dev/null || true)
fi
[ -z "$ADDED" ] && continue
for pattern in "${SECRET_PATTERNS[@]}"; do
if echo "$ADDED" | grep -qE "$pattern"; then
OFFENDING="${OFFENDING}${f} (matched: ${pattern})\n"
break
fi
done
done <<< "$CHANGED"
if [ -n "$OFFENDING" ]; then
echo "::error::Credential-shaped strings detected in diff additions:"
# `printf '%b' "$OFFENDING"` interprets backslash escapes
# (the literal `\n` we appended above becomes a newline)
# WITHOUT treating OFFENDING as a format string. Plain
# `printf "$OFFENDING"` is a format-string sink: a filename
# containing `%` would be interpreted as a conversion
# specifier, corrupting the error message (or printing
# `%(missing)` artifacts).
printf '%b' "$OFFENDING"
echo ""
echo "The actual matched values are NOT echoed here, deliberately —"
echo "round-tripping a leaked credential into CI logs widens the blast"
echo "radius (logs are searchable + retained)."
echo ""
echo "Recovery:"
echo " 1. Remove the secret from the file. Replace with an env var"
echo " reference (e.g. \${{ secrets.GITHUB_TOKEN }} in workflows,"
echo " process.env.X in code)."
echo " 2. If the credential was already pushed (this PR's commit"
echo " history reaches a public ref), treat it as compromised —"
echo " ROTATE it immediately, do not just remove it. The token"
echo " remains valid in git history forever and may be in any"
echo " log/cache that consumed this branch."
echo " 3. Force-push the cleaned commit (or stack a revert) and"
echo " re-run CI."
echo ""
echo "If the match is a false positive (test fixture, docs example,"
echo "or this workflow's own regex literals): use a clearly-fake"
echo "placeholder like ghs_EXAMPLE_DO_NOT_USE that doesn't satisfy"
echo "the length suffix, OR add the file path to the SELF exclude"
echo "list in this workflow with a short reason."
echo ""
echo "Mirror of the regex set lives in the runtime's bundled"
echo "pre-commit hook (molecule-ai-workspace-runtime:"
echo "molecule_runtime/scripts/pre-commit-checks.sh) — keep aligned."
exit 1
fi
echo "✓ No credential-shaped strings in this change."
secret-scan:
uses: Molecule-AI/molecule-core/.github/workflows/secret-scan.yml@staging

View File

@@ -1 +0,0 @@
0.1.129

View File

@@ -43,19 +43,6 @@ RUN pip install --no-cache-dir -r requirements.txt && \
# Copy adapter code
COPY adapter.py .
COPY __init__.py .
# Provider registry. The adapter's _load_providers walks 4 paths:
# 1. /opt/adapter/config.yaml — provisioner-managed canonical
# 2. os.path.dirname(__file__)/config.yaml — alongside adapter.py (this image)
# 3. ${WORKSPACE_CONFIG_PATH}/config.yaml — workspace per-instance overrides
# 4. _BUILTIN_PROVIDERS — oauth + anthropic-api only
# On this image /opt/adapter/ is never populated by the platform
# provisioner, so path 2 (/app/config.yaml) is the load-bearing one.
# Without this COPY the file isn't in the image, all 3 file paths fail,
# and _load_providers falls through to _BUILTIN_PROVIDERS — every
# MiniMax/GLM/Kimi/DeepSeek model silently routes to anthropic-oauth →
# "Not logged in. Please run /login" at first LLM call. Caused the
# canary's 38h chronic red on 2026-05-07/08 (molecule-core#129).
COPY config.yaml .
# Adapter-specific executor — owned by THIS template (universal-runtime
# refactor, molecule-core task #87). Lives alongside adapter.py so
# Python's import system picks the local /app/claude_sdk_executor.py

View File

@@ -147,118 +147,36 @@ def _normalize_provider(entry: dict):
}
# Canonical install path the platform provisioner is contracted to clone
# the template repo into. Hardcoded so the adapter's config.yaml lookup
# is invariant across Docker (mounted /app→/opt/adapter) and EC2-host
# (cloned by molecule-controlplane's ec2.go) install paths — robust
# against the site-packages copy that bit us 2026-05-04 11:08Z.
_CANONICAL_ADAPTER_DIR = "/opt/adapter"
# Adjacent-to-adapter.py path. Module-level so tests can monkeypatch it
# to redirect the path-2 lookup at a controlled tmp dir. Production code
# resolves this once at import time and never touches it again — same
# semantics as before.
_TEMPLATE_DIR = os.path.dirname(os.path.abspath(__file__))
def _load_providers(config_path: str) -> tuple:
"""Load the provider registry from the template's bundled config.yaml.
"""Load the provider registry from /configs/config.yaml.
The providers list is a TEMPLATE concern: it describes which
models/auth-modes this runtime image supports and ships in the
template's own config.yaml alongside adapter.py. The per-workspace
``${WORKSPACE_CONFIG_PATH}/config.yaml`` (default ``/configs/``)
only contains workspace-specific overrides (model, runtime, skills,
prompt files) and does NOT carry a providers section.
The YAML's top-level ``providers:`` list is the canonical source —
canvas Config tab reads the same list to populate its Provider
dropdown so the UI and the adapter never disagree on what's
available. Falls back to ``_BUILTIN_PROVIDERS`` (oauth + anthropic-api)
if the file is missing, malformed, or has no providers section, so a
bare-bones workspace still boots with the historical defaults.
Incident history:
Pre-2026-05-04 09:00Z: only checked ``config_path``, fell back
to ``_BUILTIN_PROVIDERS`` (oauth + anthropic-api). Every
MiniMax / GLM / Kimi / DeepSeek model resolved to
``anthropic-oauth`` and crashed at first LLM call with
"Not logged in. Please run /login". Fixed by adding a
template-bundled lookup using
``os.path.dirname(os.path.abspath(__file__))``.
2026-05-04 11:08Z: that ``__file__`` lookup misses on EC2-host
installs because the provisioner copies adapter.py to
``/opt/molecule-venv/lib/python3.12/site-packages/``;
site-packages wins over PYTHONPATH=/opt/adapter (which the
host install doesn't set), so __file__ resolves to the venv
path WITHOUT an adjacent config.yaml. Same silent fallback
to anthropic-oauth + same "Not logged in" symptom.
2026-05-08 (#129): the multi-path lookup that fixed both of
the above was lost in a post-suspension migration cycle (the
Gitea main branch never carried the fix even though the
:latest image had it baked in from a prior build). Canary
chronic red for 38h before this commit restored the lookup.
Resolution order:
1. ``/opt/adapter/config.yaml``: the canonical provisioner-managed
install dir. Hardcoded because the platform contract is
"provisioner clones template repo into /opt/adapter"; this
is invariant across Docker (mounted /app→/opt/adapter) and
EC2-host (cloned by ec2.go) install paths. Robust against
site-packages copy.
2. Adjacent to ``adapter.__file__``: works in dev/test where
the canonical path doesn't exist. Also covers the Docker
image's /app/config.yaml (bundled by Dockerfile #6).
3. Per-workspace ``${config_path}/config.yaml``: fallback for
operator-shipped overrides on a private deployment that
wants a custom providers list.
4. ``_BUILTIN_PROVIDERS``: oauth + anthropic-api defaults so a
bare-bones workspace still boots even with no config.yaml
anywhere.
Per-entry isolation: a single bad provider entry is dropped with
a warning; the rest of the registry survives.
Per-entry isolation: a single bad provider entry is dropped with a
warning; the rest of the registry survives. Used to be a generator
inside tuple(...) that propagated any AttributeError out and reverted
the whole registry to builtins, exactly the silent-fallback failure
mode this file's existence was meant to fix.
"""
canonical_yaml = os.path.join(_CANONICAL_ADAPTER_DIR, "config.yaml")
template_yaml = os.path.join(_TEMPLATE_DIR, "config.yaml")
workspace_yaml = os.path.join(config_path, "config.yaml")
# Deduplicate while preserving order — _CANONICAL_ADAPTER_DIR and
# the __file__ dir collide in dev/test (when imported from
# /opt/adapter directly), and workspace_yaml may also collide if
# config_path == /opt/adapter in tests.
seen = set()
candidates = []
for path in (canonical_yaml, template_yaml, workspace_yaml):
if path not in seen:
seen.add(path)
candidates.append(path)
raw = None
chosen_path = None
yaml_path = os.path.join(config_path, "config.yaml")
try:
import yaml # transitive dep via molecule-ai-workspace-runtime
except ImportError:
logger.warning("providers: yaml import failed; using builtins")
return _BUILTIN_PROVIDERS
with open(yaml_path, "r") as f:
data = yaml.safe_load(f) or {}
except FileNotFoundError:
logger.info("providers: %s not found, using builtin defaults", yaml_path)
return _BUILTIN_PROVIDERS
except Exception as exc: # noqa: BLE001 — defensive: never block boot on YAML
logger.warning("providers: failed to load from %s (%s); using builtins", yaml_path, exc)
return _BUILTIN_PROVIDERS
for yaml_path in candidates:
try:
with open(yaml_path, "r") as f:
data = yaml.safe_load(f) or {}
except FileNotFoundError:
logger.info("providers: %s not found, trying next candidate", yaml_path)
continue
except Exception as exc: # noqa: BLE001 — defensive: never block boot on YAML
logger.warning(
"providers: failed to load from %s (%s); trying next candidate",
yaml_path, exc,
)
continue
candidate_raw = data.get("providers") if isinstance(data, dict) else None
if isinstance(candidate_raw, list) and candidate_raw:
raw = candidate_raw
chosen_path = yaml_path
break
if raw is None:
logger.info(
"providers: no providers section found in %s; using builtin defaults",
" or ".join(candidates),
)
raw = data.get("providers") if isinstance(data, dict) else None
if not isinstance(raw, list) or not raw:
return _BUILTIN_PROVIDERS
parsed = []
@@ -272,121 +190,11 @@ def _load_providers(config_path: str) -> tuple:
parsed.append(normalized)
if not parsed:
logger.warning("providers: no valid entries in %s; using builtins", chosen_path)
logger.warning("providers: no valid entries in %s; using builtins", yaml_path)
return _BUILTIN_PROVIDERS
logger.info("providers: loaded %d entries from %s", len(parsed), chosen_path)
return tuple(parsed)
# Aliases for `MODEL_PROVIDER` env values that should map to a registry
# provider name. The persona env files use shorter / friendlier slugs
# than the registry's canonical names — without this alias map a value
# like ``MODEL_PROVIDER=claude-code`` would fall through to YAML-based
# resolution and (when the YAML doesn't pin a provider) hit the
# model-prefix matcher with the operator-picked MODEL, mis-routing a
# lead workspace through MiniMax even though its CLAUDE_CODE_OAUTH_TOKEN
# was clearly meant to be used.
#
# Maintain this list in sync with the persona env file convention:
# - ``claude-code`` → ``anthropic-oauth`` (Claude Code subscription path)
# - ``anthropic`` → ``anthropic-api`` (direct Anthropic API key)
# Provider names already in the registry alias to themselves implicitly
# (the ``in registry`` check catches them before this map is consulted).
_PROVIDER_SLUG_ALIASES = {
"claude-code": "anthropic-oauth",
"anthropic": "anthropic-api",
}
def _resolve_model_and_provider_from_env(
yaml_model: str,
yaml_provider: str,
providers: tuple,
) -> tuple:
"""Reconcile model + provider from env vars vs YAML, with the persona-env
convention winning over the legacy ``MODEL_PROVIDER``-as-model-id usage.
The persona env files (``~/.molecule-ai/personas/<name>/env`` on the host,
sourced into each workspace container at provision time) declare TWO env
vars with distinct semantics:
* ``MODEL``: the model id (e.g. ``MiniMax-M2.7-highspeed``, ``opus``).
* ``MODEL_PROVIDER``: the provider slug (e.g. ``minimax``,
``claude-code``, ``anthropic``).
The legacy ``workspace/config.py`` (in molecule-ai-workspace-runtime)
historically interpreted ``MODEL_PROVIDER`` as the *model id*, a name
chosen before there was a separate ``MODEL`` env var. When both env vars
are set with the persona convention, the legacy code reads
``MODEL_PROVIDER=minimax`` into ``runtime_config.model``, which then
fails to match any registry prefix (``minimax-`` requires a hyphen
suffix) and silently falls through to providers[0] (``anthropic-oauth``).
OAuth-token-less workspaces then wedge at ``query.initialize()`` because
the claude CLI can't authenticate. This is the 2026-05-08 dev-tree
incident: 22/27 non-lead workspaces stuck in ``degraded``.
Resolution order (this function):
1. ``MODEL`` env var → picked_model. Authoritative when set; the
persona env always sets it alongside ``MODEL_PROVIDER`` so the
model id never has to be inferred.
2. ``MODEL_PROVIDER`` env var → explicit_provider, BUT only when the
value matches a known provider name in the registry. This guards
against the legacy case where some callers still set
``MODEL_PROVIDER`` to a model id (e.g. canvas Save+Restart prior to
this fix). If the value isn't a registered provider name and YAML
didn't supply a model, treat it as a model id for back-compat.
3. YAML ``runtime_config.model`` / ``provider`` → used for any field
the env didn't supply. Carries the operator's canvas selection
on workspaces that haven't yet adopted the persona env shape.
Returns ``(picked_model, explicit_provider_name)``. Either may be
empty/None; the caller (``setup``) handles the empty cases via
``_resolve_provider``'s registry fallback.
"""
env_model = (os.environ.get("MODEL") or "").strip()
env_provider = (os.environ.get("MODEL_PROVIDER") or "").strip()
provider_names_lower = {p.get("name", "").lower() for p in providers}
# Detect whether MODEL_PROVIDER carries the persona-convention slug
# (provider name) vs. the legacy convention (model id). Persona-
# convention wins when the value matches a registered provider; we
# fall back to legacy interpretation only when it doesn't.
#
# First, apply the alias map so persona-friendly slugs like
# ``claude-code`` resolve to the canonical registry name
# ``anthropic-oauth``. Without this, a lead workspace's
# ``MODEL_PROVIDER=claude-code`` env would fall through to the model-
# prefix matcher, see ``MODEL=MiniMax-M2.7`` and mis-route to MiniMax
# even though the operator's intent (and the OAuth token they set)
# was the OAuth subscription path.
env_provider_resolved = _PROVIDER_SLUG_ALIASES.get(
env_provider.lower(), env_provider,
) if env_provider else ""
env_provider_is_slug = (
bool(env_provider_resolved)
and env_provider_resolved.lower() in provider_names_lower
)
# Picked model resolution
if env_model:
picked_model = env_model
elif env_provider and not env_provider_is_slug:
# Legacy: MODEL_PROVIDER env carried the model id. Honor it so
# canvas Save+Restart workflows that predate this fix keep working.
picked_model = env_provider
else:
picked_model = yaml_model or ""
# Explicit provider resolution — env wins when it's a registered slug
# (after alias mapping), otherwise fall back to YAML.
if env_provider_is_slug:
explicit_provider = env_provider_resolved
else:
explicit_provider = yaml_provider or None
return picked_model, explicit_provider
def _strip_provider_prefix(model: str) -> str:
"""Strip LangChain-style "<provider>:<model>" prefix from a model id.
@@ -472,28 +280,13 @@ def _project_vendor_auth(provider: dict) -> None:
return
def _resolve_provider(
model: str,
providers: tuple,
explicit_provider: str = None,
) -> dict:
def _resolve_provider(model: str, providers: tuple) -> dict:
"""Return the provider entry matching this model id.
If ``explicit_provider`` is given (set via the ``provider:`` field in
workspace config.yaml or runtime_config), look up by name first. If the
named provider is not in the registry, RAISE ``ValueError`` with an
actionable message: silent fallback to ``providers[0]`` is the bug
that motivated #180 (workspace operator picks ``provider: minimax``
in the canvas Config tab, the adapter ignores it, the Claude SDK
silently keeps using ``CLAUDE_CODE_OAUTH_TOKEN`` and the operator has
no way to tell from the canvas that their provider switch did
nothing).
Without an explicit name: match is case-insensitive, prefix wins over
alias when both could apply, and unknown ids fall back to the first
provider in the registry (by convention, the OAuth/safest default
``anthropic-oauth`` in both _BUILTIN_PROVIDERS and the shipped
config.yaml).
Match is case-insensitive: prefix wins over alias when both could
apply. Unknown ids fall back to the first provider in the registry
(by convention, the OAuth/safest default anthropic-oauth in both
_BUILTIN_PROVIDERS and the shipped config.yaml).
Pre-condition: ``providers`` is non-empty. _load_providers always
returns at least one entry (built-ins when YAML is missing or every
@@ -505,44 +298,6 @@ def _resolve_provider(
"_load_providers must always return at least one entry "
"(falling back to _BUILTIN_PROVIDERS when needed)"
)
# Explicit provider name takes precedence — fail fast if it's not in
# the registry. Anything else would silently route the operator's
# picked provider through the wrong auth/base_url path. The error
# message tells them exactly which two paths fix it.
if explicit_provider:
ep_lower = explicit_provider.lower()
for provider in providers:
if provider["name"].lower() == ep_lower:
return provider
names = ", ".join(p["name"] for p in providers)
raise ValueError(
f"claude-code adapter: workspace config picks "
f"provider='{explicit_provider}' but it is not in the "
f"providers registry.\n"
f"\n"
f"Known providers: {names}\n"
f"\n"
f"Two ways to fix:\n"
f" (a) Add '{explicit_provider}' to /configs/config.yaml as a "
f"providers: entry. Required keys:\n"
f" providers:\n"
f" - name: {explicit_provider}\n"
f" auth_mode: third_party_anthropic_compat\n"
f" base_url: https://... # provider's Anthropic-compat endpoint\n"
f" auth_env: [{explicit_provider.upper()}_API_KEY]\n"
f" model_prefixes: [...]\n"
f" (b) Switch the workspace runtime template to one that "
f"natively supports {explicit_provider} (CrewAI, LangGraph, or "
f"DeepAgents read provider/model from runtime_config and route "
f"directly without needing an Anthropic-compat shim).\n"
f"\n"
f"Note: claude-code SDK speaks the Anthropic API protocol. "
f"Providers that only expose OpenAI-compatible endpoints "
f"(MiniMax, GLM, Kimi, DeepSeek native APIs) need either an "
f"Anthropic-compat proxy in front, or option (b)."
)
if not model:
return providers[0]
m = model.lower()
@@ -645,52 +400,9 @@ class ClaudeCodeAdapter(BaseAdapter):
# validation + ANTHROPIC_BASE_URL routing from that single decision.
rc = config.runtime_config
if isinstance(rc, dict):
yaml_model = rc.get("model") or ""
yaml_provider_name = rc.get("provider") or ""
picked_model = rc.get("model") or "sonnet"
else:
yaml_model = getattr(rc, "model", None) or ""
yaml_provider_name = getattr(rc, "provider", None) or ""
# Also honor the top-level `provider:` field in /configs/config.yaml.
# The canvas Config-tab Provider dropdown writes there (not into
# runtime_config) on some legacy paths. Either source is canonical;
# whichever is set wins. Root cause of #180: the adapter used to
# ignore both, silently routing every non-Anthropic provider pick
# through anthropic-oauth.
if not yaml_provider_name:
yaml_path = os.path.join(config.config_path, "config.yaml")
try:
import yaml # transitive dep via molecule-ai-workspace-runtime
with open(yaml_path, "r") as f:
data = yaml.safe_load(f) or {}
if isinstance(data, dict):
val = data.get("provider")
if isinstance(val, str) and val.strip():
yaml_provider_name = val.strip()
except FileNotFoundError:
pass
except Exception as exc: # noqa: BLE001 — defensive: never block boot
logger.warning(
"providers: failed to read top-level provider: from %s (%s); "
"falling back to model-based resolution",
yaml_path, exc,
)
# Reconcile env vars (persona convention: MODEL=<id>,
# MODEL_PROVIDER=<slug>) against YAML. Env wins over YAML — the
# persona env files are the canonical per-agent provider mapping
# (Phase 2 mapping 2026-05-08), and the workspace-runtime wheel's
# legacy ``MODEL_PROVIDER``-as-model-id reading would otherwise
# silently route non-leads to providers[0] = anthropic-oauth.
# Documented in detail at _resolve_model_and_provider_from_env.
picked_model, explicit_provider_name = _resolve_model_and_provider_from_env(
yaml_model=yaml_model,
yaml_provider=yaml_provider_name,
providers=providers,
)
if not picked_model:
picked_model = "sonnet"
picked_model = getattr(rc, "model", None) or "sonnet"
# NOTE: do NOT strip the provider prefix here. The pre-fix routing
# behavior — `anthropic:claude-opus-4-7` falls through to
# providers[0] (anthropic-oauth) when no model_prefixes match — is
@@ -699,15 +411,7 @@ class ClaudeCodeAdapter(BaseAdapter):
# `anthropic-api` provider and the CLI then hangs at `initialize`
# because ANTHROPIC_API_KEY isn't set. The strip belongs only at
# the CLI invocation site (create_executor below).
#
# Pass the explicit provider name through so _resolve_provider
# raises ValueError with an actionable message (instead of silently
# routing to providers[0]) when an operator picks a provider that
# isn't in the registry. See #180.
provider = _resolve_provider(
picked_model, providers,
explicit_provider=explicit_provider_name,
)
provider = _resolve_provider(picked_model, providers)
auth_env_options = provider["auth_env"]
# Project the per-vendor API key (MINIMAX_API_KEY, GLM_API_KEY,
@@ -818,26 +522,9 @@ class ClaudeCodeAdapter(BaseAdapter):
# RuntimeConfig dataclass. Read `model` defensively from either shape.
rc = config.runtime_config
if isinstance(rc, dict):
yaml_model = rc.get("model") or ""
yaml_provider = rc.get("provider") or ""
explicit_model = rc.get("model") or ""
else:
yaml_model = getattr(rc, "model", None) or ""
yaml_provider = getattr(rc, "provider", None) or ""
# Reconcile against env vars (persona convention: MODEL=<id>,
# MODEL_PROVIDER=<slug>) using the same helper that ``setup`` uses,
# so the executor and the boot banner agree on the picked model.
# Without this, a workspace whose env says ``MODEL=MiniMax-M2.7``
# but whose runtime wheel pre-dates the persona-env fix would set
# runtime_config.model="minimax" (the slug, mistakenly read by the
# legacy ``MODEL_PROVIDER``-as-model-id path); this helper restores
# the correct model id before it reaches the SDK.
providers = _load_providers(config.config_path)
explicit_model, _ = _resolve_model_and_provider_from_env(
yaml_model=yaml_model,
yaml_provider=yaml_provider,
providers=providers,
)
explicit_model = getattr(rc, "model", None) or ""
explicit_model = _strip_provider_prefix(explicit_model)
# Pre-validation: detect the misconfiguration combo that drove the
@ -868,7 +555,7 @@ class ClaudeCodeAdapter(BaseAdapter):
"The default fallback ('sonnet') is an Anthropic-native "
"alias; non-Anthropic shims (MiniMax, OpenAI gateways, "
"etc.) won't recognize it and the SDK --print probe will "
"hang for 30s before timing out. Fix: set MODEL "
"hang for 30s before timing out. Fix: set MODEL_PROVIDER "
"as a workspace secret (canvas: Save+Restart with model "
"picked) or set runtime_config.model in /configs/config.yaml."
)

View File

@ -24,7 +24,7 @@ common problems.
## Step 1 — Clone the Repository
```bash
git clone https://git.moleculesai.app/molecule-ai/molecule-ai-workspace-template-claude-code.git
git clone https://github.com/Molecule-AI/molecule-ai-workspace-template-claude-code.git
cd molecule-ai-workspace-template-claude-code
```

View File

@ -1,89 +0,0 @@
"""Shared pytest fixtures + import shims for the adapter test suite.
`adapter.py` imports at module load:
- molecule_runtime.adapters.base (BaseAdapter, AdapterConfig, RuntimeCapabilities)
- molecule_runtime.plugins (lazy in setup(), but stubbed proactively)
- a2a.server.agent_execution (AgentExecutor)
- claude_sdk_executor (lazy in create_executor(), stubbed proactively)
In production those arrive transitively via molecule-ai-workspace-runtime.
The CI runner only installs `pytest pytest-asyncio pyyaml`, so the import
chain would fail with ModuleNotFoundError before any test collects:
exactly the failure that broke CI on the #180 fix branch (PR #4) and
caused the merge wall to block on a green local run but a red Gitea CI.
Putting the stub installer here (collected before any test module is
imported, per pytest semantics) means every test file can do
`from adapter import ...` at module top without a per-file boilerplate
copy. It also forces a single shape for the stubs so two files can't
silently disagree on whether `BaseAdapter` has
`install_plugins_via_registry` (see test_adapter_prevalidate's
async-setup tests, which need the method to exist on the parent class).
"""
import os
import sys
import types
from dataclasses import dataclass
from unittest.mock import MagicMock
@dataclass
class _StubRuntimeCapabilities:
provides_native_session: bool = False
@dataclass
class _StubAdapterConfig:
runtime_config: object = None
config_path: str = "/tmp/configs"
system_prompt: str = ""
heartbeat: object = None
class _StubBaseAdapter:
async def install_plugins_via_registry(self, *_args, **_kwargs):
pass
def _install_stubs() -> None:
"""Install the smallest set of import shims that adapter.py needs."""
if "molecule_runtime" not in sys.modules:
mr = types.ModuleType("molecule_runtime")
mr.adapters = types.ModuleType("molecule_runtime.adapters")
mr.adapters.base = types.ModuleType("molecule_runtime.adapters.base")
mr.adapters.base.BaseAdapter = _StubBaseAdapter
mr.adapters.base.AdapterConfig = _StubAdapterConfig
mr.adapters.base.RuntimeCapabilities = _StubRuntimeCapabilities
mr.plugins = types.ModuleType("molecule_runtime.plugins")
mr.plugins.load_plugins = lambda **_kwargs: []
sys.modules["molecule_runtime"] = mr
sys.modules["molecule_runtime.adapters"] = mr.adapters
sys.modules["molecule_runtime.adapters.base"] = mr.adapters.base
sys.modules["molecule_runtime.plugins"] = mr.plugins
if "a2a" not in sys.modules:
a2a = types.ModuleType("a2a")
a2a.server = types.ModuleType("a2a.server")
a2a.server.agent_execution = types.ModuleType("a2a.server.agent_execution")
a2a.server.agent_execution.AgentExecutor = type("AgentExecutor", (), {})
sys.modules["a2a"] = a2a
sys.modules["a2a.server"] = a2a.server
sys.modules["a2a.server.agent_execution"] = a2a.server.agent_execution
if "claude_sdk_executor" not in sys.modules:
mod = types.ModuleType("claude_sdk_executor")
mod.ClaudeSDKExecutor = MagicMock(name="ClaudeSDKExecutor")
sys.modules["claude_sdk_executor"] = mod
# Run at conftest import time — pytest collects conftest.py before any
# test module, so the stubs are in sys.modules before `from adapter
# import ...` ever executes.
_install_stubs()
# adapter.py lives in the parent dir of tests/ (template root). pytest's
# `--import-mode=importlib` + tests/pytest.ini anchoring rootdir at
# tests/ means the parent isn't on sys.path automatically. Add it here
# once so every test file can do `from adapter import ...` cleanly.
_PARENT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if _PARENT_DIR not in sys.path:
sys.path.insert(0, _PARENT_DIR)
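The shim pattern above generalizes: register a stub module tree in `sys.modules` before the import that needs it, and the normal import machinery resolves against the stubs. A minimal self-contained illustration (the `fake_pkg` name is made up for this sketch, not part of the template):

```python
import sys
import types

# Build a two-level stub package entirely in memory: no files on disk.
pkg = types.ModuleType("fake_pkg")
pkg.sub = types.ModuleType("fake_pkg.sub")
pkg.sub.Thing = type("Thing", (), {})  # stand-in for a class the code under test imports

# Registering BOTH the package and the submodule is what lets the
# `from fake_pkg.sub import ...` form succeed without any installation.
sys.modules["fake_pkg"] = pkg
sys.modules["fake_pkg.sub"] = pkg.sub

from fake_pkg.sub import Thing  # resolves against the stub in sys.modules
```

Because pytest collects conftest.py before any test module, running such an installer at conftest import time guarantees the stubs exist before the first `from adapter import ...` executes.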

View File

@ -514,15 +514,8 @@ async def test_setup_auth_token_alone_satisfies_third_party_check(
# ---- _load_providers / _resolve_provider unit tests ----
def test_load_providers_returns_builtin_when_yaml_missing(tmp_path, monkeypatch):
"""FileNotFoundError path returns the in-code defaults verbatim.
Monkeypatches the canonical + template paths to a non-existent dir
so only the workspace config_path is in scope. Without this, the
multi-path lookup picks up the repo-root config.yaml that ships
with the template (path 2 finds the bundled providers list and
returns it instead of falling through to builtins).
"""
def test_load_providers_returns_builtin_when_yaml_missing(tmp_path):
"""FileNotFoundError path returns the in-code defaults verbatim."""
_install_stubs()
parent_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if parent_dir not in sys.path:
@ -530,10 +523,6 @@ def test_load_providers_returns_builtin_when_yaml_missing(tmp_path, monkeypatch)
sys.modules.pop("adapter", None)
import adapter as adapter_module
nonexistent = str(tmp_path / "_isolate_canonical")
monkeypatch.setattr(adapter_module, "_CANONICAL_ADAPTER_DIR", nonexistent)
monkeypatch.setattr(adapter_module, "_TEMPLATE_DIR", nonexistent)
result = adapter_module._load_providers(str(tmp_path))
assert result == adapter_module._BUILTIN_PROVIDERS
@ -587,12 +576,8 @@ async def test_setup_routes_extra_providers(
assert os.environ.get("ANTHROPIC_BASE_URL") == expected_url
def test_load_providers_falls_back_on_malformed_yaml(tmp_path, caplog, monkeypatch):
"""Malformed YAML → log warning + fallback (don't kill boot).
Isolated from the multi-path lookup by pinning canonical + template
dirs at a non-existent path; only the workspace config_path is read.
"""
def test_load_providers_falls_back_on_malformed_yaml(tmp_path, caplog):
"""Malformed YAML → log warning + fallback (don't kill boot)."""
_install_stubs()
parent_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if parent_dir not in sys.path:
@ -600,10 +585,6 @@ def test_load_providers_falls_back_on_malformed_yaml(tmp_path, caplog, monkeypat
sys.modules.pop("adapter", None)
import adapter as adapter_module
nonexistent = str(tmp_path / "_isolate_canonical")
monkeypatch.setattr(adapter_module, "_CANONICAL_ADAPTER_DIR", nonexistent)
monkeypatch.setattr(adapter_module, "_TEMPLATE_DIR", nonexistent)
(tmp_path / "config.yaml").write_text("providers: [not valid yaml: {{{")
import logging
@ -641,7 +622,7 @@ def test_resolve_provider_minimax_prefix_matches_minimax_provider():
assert result2["name"] == "minimax"
def test_load_providers_drops_bad_entry_keeps_rest(tmp_path, caplog, monkeypatch):
def test_load_providers_drops_bad_entry_keeps_rest(tmp_path, caplog):
"""Per-entry isolation: one malformed entry shouldn't nuke the registry.
Pre-fix: ``_load_providers`` built the registry via a generator inside
@ -653,9 +634,6 @@ def test_load_providers_drops_bad_entry_keeps_rest(tmp_path, caplog, monkeypatch
Post-fix: per-entry try/except drops the bad entry with a warning,
rest of the registry survives.
Isolated from the multi-path lookup so only the test's tmp config.yaml
is read.
"""
_install_stubs()
parent_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
@ -664,10 +642,6 @@ def test_load_providers_drops_bad_entry_keeps_rest(tmp_path, caplog, monkeypatch
sys.modules.pop("adapter", None)
import adapter as adapter_module
nonexistent = str(tmp_path / "_isolate_canonical")
monkeypatch.setattr(adapter_module, "_CANONICAL_ADAPTER_DIR", nonexistent)
monkeypatch.setattr(adapter_module, "_TEMPLATE_DIR", nonexistent)
yaml_with_typo = textwrap.dedent("""
providers:
- name: good-zai
@ -716,7 +690,7 @@ def test_load_providers_drops_bad_entry_keeps_rest(tmp_path, caplog, monkeypatch
)
def test_load_providers_string_as_prefix_does_not_split_into_chars(tmp_path, caplog, monkeypatch):
def test_load_providers_string_as_prefix_does_not_split_into_chars(tmp_path, caplog):
"""A YAML field declared as list-of-strings but written as a bare
string (operator forgot brackets) used to silently iterate over
characters ``('m','i','m','o','-')``. Post-fix: non-list value
@ -731,10 +705,6 @@ def test_load_providers_string_as_prefix_does_not_split_into_chars(tmp_path, cap
sys.modules.pop("adapter", None)
import adapter as adapter_module
nonexistent = str(tmp_path / "_isolate_canonical")
monkeypatch.setattr(adapter_module, "_CANONICAL_ADAPTER_DIR", nonexistent)
monkeypatch.setattr(adapter_module, "_TEMPLATE_DIR", nonexistent)
yaml_str_prefix = textwrap.dedent("""
providers:
- name: typo-prefix
@ -753,7 +723,7 @@ def test_load_providers_string_as_prefix_does_not_split_into_chars(tmp_path, cap
)
def test_load_providers_drops_entry_without_name(tmp_path, caplog, monkeypatch):
def test_load_providers_drops_entry_without_name(tmp_path, caplog):
"""An entry without ``name`` is operator error — no silent fallback
to ``<unnamed>``. Drop the entry with a warning so the boot log
surfaces the typo.
@ -765,10 +735,6 @@ def test_load_providers_drops_entry_without_name(tmp_path, caplog, monkeypatch):
sys.modules.pop("adapter", None)
import adapter as adapter_module
nonexistent = str(tmp_path / "_isolate_canonical")
monkeypatch.setattr(adapter_module, "_CANONICAL_ADAPTER_DIR", nonexistent)
monkeypatch.setattr(adapter_module, "_TEMPLATE_DIR", nonexistent)
yaml_no_name = textwrap.dedent("""
providers:
- name: good

View File

@ -0,0 +1,206 @@
r"""Tests for entrypoint.sh's log_boot_context() shell function.
The Python-side audit (test_adapter_logging.py) pins what `_audit_auth_env_presence`
in adapter.py emits. But the shell function fires FIRST, and twice at that (once
pre-gosu as root, once post-gosu as agent). When the adapter never runs at
all because the SDK import fails, the entrypoint emission is the operator's
ONLY visibility into the boot env. So this contract needs its own test.
The cross-file gate `test_audit_env_list_matches_entrypoint_sh` proves the
NAME LIST matches; this file proves the SHELL CODE actually emits the
right lines for those names. Without this, a typo in the for-loop body
(e.g. `eval "val=\$$var"` mistyped as `val=$var`, which would print the literal
name, not its value) silently breaks the audit.
Strategy: extract the `log_boot_context()` function body from entrypoint.sh
and run it in a fresh subprocess with controlled env. Asserts on stdout.
We never source entrypoint.sh wholesale because it would chown /workspace
and exec molecule-runtime; neither is appropriate in a test sandbox.
"""
from __future__ import annotations
import os
import re
import subprocess
from pathlib import Path
import pytest
TEMPLATE_DIR = Path(__file__).resolve().parent.parent
ENTRYPOINT = TEMPLATE_DIR / "entrypoint.sh"
def _extract_function() -> str:
"""Pull just the log_boot_context() function definition out of entrypoint.sh.
Returns the literal function definition (`log_boot_context() { ... }`) as
a string, suitable for `sh -c "<func>; log_boot_context"`. Bails with a
clear message if the function can't be located — that itself is a
regression worth a loud test failure.
"""
text = ENTRYPOINT.read_text()
# `log_boot_context() {` on its own line, then everything up to the
# matching closing `}` at column 0. The function is small and shape-stable;
# we don't try to be a full shell parser.
match = re.search(r"^log_boot_context\(\)\s*\{.*?^\}\s*$", text, re.DOTALL | re.MULTILINE)
if not match:
pytest.fail("Could not locate log_boot_context() in entrypoint.sh")
return match.group(0)
def _run_function(env: dict[str, str]) -> str:
"""Run log_boot_context() in a fresh /bin/sh with the given env. Returns stdout."""
func = _extract_function()
script = f"{func}\nlog_boot_context\n"
# Empty base env so PATH lookups (`id`, `hostname`, `date`, `ls`) still work
# but no inherited auth vars leak into the test. We restore PATH explicitly.
safe_env = {"PATH": os.environ.get("PATH", "/usr/bin:/bin")}
safe_env.update(env)
result = subprocess.run(
["/bin/sh", "-c", script],
env=safe_env,
capture_output=True,
text=True,
timeout=10,
check=False,
)
assert result.returncode == 0, (
f"log_boot_context exited rc={result.returncode}\n"
f"stdout:\n{result.stdout}\nstderr:\n{result.stderr}"
)
return result.stdout
# Audit names — kept in lockstep with adapter.py's _AUTH_ENV_AUDIT and the
# entrypoint.sh for-loop. test_audit_env_list_matches_entrypoint_sh and
# test_loop_var_list_matches_audit (below) gate any drift across the three
# locations.
_AUDIT_NAMES = (
"CLAUDE_CODE_OAUTH_TOKEN",
"ANTHROPIC_API_KEY",
"ANTHROPIC_AUTH_TOKEN",
"ANTHROPIC_BASE_URL",
"MINIMAX_API_KEY",
"GLM_API_KEY",
"KIMI_API_KEY",
"DEEPSEEK_API_KEY",
)
def test_emits_set_for_present_env():
"""A set var must produce `env NAME=set` — proves the eval-deref works."""
out = _run_function({"MINIMAX_API_KEY": "secret-MUST-NOT-LEAK"})
assert "env MINIMAX_API_KEY=set" in out
def test_emits_unset_for_absent_env():
"""An unset var must produce `env NAME=unset` — proves the empty-string branch."""
out = _run_function({})
for name in _AUDIT_NAMES:
assert f"env {name}=unset" in out, (
f"missing `env {name}=unset` line — for-loop body may be miscoded"
)
def test_never_leaks_value():
"""The audit prints NAMES, not VALUES. Regression here = secret leak.
Same threat model as the Python-side test: an operator-visible boot log
that contains the actual key would defeat the whole point of the audit
(the audit exists so we can answer 'is the key present' WITHOUT exposing
the key). An `eval "val=\\$$var"` typo collapsing to `echo $var` would
trip this test.
"""
secret = "sk-FAKE-MUST-NEVER-APPEAR-IN-BOOT-LOG"
out = _run_function({
"MINIMAX_API_KEY": secret,
"CLAUDE_CODE_OAUTH_TOKEN": secret,
"ANTHROPIC_BASE_URL": "https://api.example.com",
})
assert secret not in out, f"boot-context log leaked the env VALUE:\n{out}"
# ANTHROPIC_BASE_URL is the most-likely-to-be-logged-by-mistake field
# because operators sometimes WANT to see it; pin that it's still
# name-only.
assert "https://api.example.com" not in out
def test_emits_workspace_id_and_platform_url():
"""WORKSPACE_ID and PLATFORM_URL appear by VALUE — these are not secrets.
They're the operator-visible identifiers a support engineer needs to
correlate logs with platform records. Pinning the field shape so a
later refactor doesn't accidentally redact them.
"""
out = _run_function({
"WORKSPACE_ID": "ws-test-1234",
"PLATFORM_URL": "https://test.example.com",
})
assert "workspace_id=ws-test-1234" in out
assert "platform_url=https://test.example.com" in out
def test_emits_unset_marker_when_workspace_id_missing():
"""Missing WORKSPACE_ID falls back to the literal `<unset>` placeholder.
A support engineer reading the boot log must be able to distinguish
'WORKSPACE_ID was empty string' from 'WORKSPACE_ID was never injected
by the platform'. The shell `${VAR:-<unset>}` default handles that.
"""
out = _run_function({})
assert "workspace_id=<unset>" in out
assert "platform_url=<unset>" in out
def test_emits_uid_and_gid():
"""uid/gid line is critical — answers 'did the privilege drop happen?'
The two-emission pattern (pre-gosu as root, post-gosu as agent) only
works as a diagnostic if uid/gid is in every emission. Pin the field
shape; we don't pin the literal value because CI runs vary.
"""
out = _run_function({})
assert re.search(r"uid=\d+\s+gid=\d+", out), (
f"missing or malformed uid/gid line:\n{out}"
)
def test_emits_boot_marker():
"""Each emission starts with the dated `entrypoint boot` banner.
Operators grep for this to count restarts in a crash loop.
"""
out = _run_function({})
# Format: "----- entrypoint boot 2026-05-02T12:34:56Z -----"
assert re.search(
r"-----\s+entrypoint boot \d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z\s+-----",
out,
), f"missing boot banner:\n{out}"
def test_loop_var_list_matches_audit():
"""The for-loop's literal NAME list must match _AUDIT_NAMES (this file).
Companion to test_audit_env_list_matches_entrypoint_sh in
test_adapter_logging.py: that test cross-checks adapter.py vs
entrypoint.sh; this one cross-checks entrypoint.sh vs the test
fixture above. If a maintainer adds a vendor to entrypoint.sh
without updating the audit name tuple in this file, the existing
`test_emits_unset_for_absent_env` would still pass (because all
audited names also appear in the loop), but the maintainer would
have a false sense of coverage. This test catches that.
"""
text = ENTRYPOINT.read_text()
loop_line = next(
(line for line in text.splitlines()
if "for var in" in line and "CLAUDE_CODE_OAUTH_TOKEN" in line),
None,
)
assert loop_line, "entrypoint.sh missing the auth-env for-loop"
names_in_shell = tuple(
loop_line.split("for var in", 1)[1].split(";", 1)[0].split()
)
assert set(names_in_shell) == set(_AUDIT_NAMES), (
f"_AUDIT_NAMES in this file ({set(_AUDIT_NAMES)}) and the for-loop "
f"in entrypoint.sh ({set(names_in_shell)}) disagree — update the "
"test fixture or the shell loop to bring them back in sync."
)
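For orientation, here is a hypothetical sketch of the `log_boot_context()` shape these assertions imply. The line formats and variable names below are inferred from the tests in this file; the real entrypoint.sh remains the source of truth.

```shell
log_boot_context() {
    # dated banner: operators grep this to count restarts in a crash loop
    echo "----- entrypoint boot $(date -u +%Y-%m-%dT%H:%M:%SZ) -----"
    # uid/gid in EVERY emission so pre-gosu vs post-gosu is distinguishable
    echo "uid=$(id -u) gid=$(id -g)"
    # platform identifiers by value (not secrets); <unset> disambiguates
    # "never injected" from "empty string"
    echo "workspace_id=${WORKSPACE_ID:-<unset>}"
    echo "platform_url=${PLATFORM_URL:-<unset>}"
    # auth audit: NAMES only, never values
    for var in CLAUDE_CODE_OAUTH_TOKEN ANTHROPIC_API_KEY ANTHROPIC_AUTH_TOKEN ANTHROPIC_BASE_URL MINIMAX_API_KEY GLM_API_KEY KIMI_API_KEY DEEPSEEK_API_KEY; do
        eval "val=\$$var"
        if [ -n "$val" ]; then
            echo "env $var=set"
        else
            echo "env $var=unset"
        fi
    done
}
```

The for-loop is kept on a single line so the `test_loop_var_list_matches_audit` extraction (split on `for var in` and `;`) sees the whole name list.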

View File

@ -1,239 +0,0 @@
"""Tests for ``_resolve_model_and_provider_from_env`` — the env-vs-YAML
reconciliation that fixes the 2026-05-08 dev-tree wedge incident.
Symptom: 22/27 non-lead workspaces (minimax tier) wedged on
``Control request timeout: initialize`` because the runtime wheel's
``workspace/config.py`` interpreted ``MODEL_PROVIDER=minimax`` as the
*model id* instead of the provider slug. ``model="minimax"`` failed to
match the ``minimax-`` registry prefix, fell through to providers[0]
(anthropic-oauth), demanded ``CLAUDE_CODE_OAUTH_TOKEN`` (unset on
non-leads), and the claude CLI hung at SDK init.
The persona env files (``~/.molecule-ai/personas/<name>/env``) declare
the new convention:
* ``MODEL`` → model id (e.g. ``MiniMax-M2.7-highspeed``)
* ``MODEL_PROVIDER`` → provider slug (e.g. ``minimax``)
These tests cover the matrix of (env shape) × (YAML shape) so a future
contributor can't silently regress the wedge fix.
"""
import pytest
from adapter import (
_BUILTIN_PROVIDERS,
_resolve_model_and_provider_from_env,
)
# A registry that contains both anthropic-oauth (providers[0]) and
# minimax/zai (third-party slugs) — matches the shipped config.yaml.
_REGISTRY = _BUILTIN_PROVIDERS + (
{
"name": "minimax",
"auth_mode": "third_party_anthropic_compat",
"model_prefixes": ("minimax-",),
"model_aliases": (),
"base_url": "https://api.minimax.io/anthropic",
"auth_env": ("MINIMAX_API_KEY",),
},
{
"name": "zai",
"auth_mode": "third_party_anthropic_compat",
"model_prefixes": ("glm-",),
"model_aliases": (),
"base_url": "https://api.z.ai/api/anthropic",
"auth_env": ("GLM_API_KEY",),
},
)
def _clear_env(monkeypatch):
monkeypatch.delenv("MODEL", raising=False)
monkeypatch.delenv("MODEL_PROVIDER", raising=False)
# ------------------------------------------------------------------
# Persona env convention: MODEL=<id>, MODEL_PROVIDER=<slug>
# ------------------------------------------------------------------
def test_persona_env_minimax_resolves_correctly(monkeypatch):
"""The 2026-05-08 wedge regression test: persona env shape must
yield model=MiniMax-M2.7-highspeed (not "minimax") and explicit
provider=minimax."""
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL", "MiniMax-M2.7-highspeed")
monkeypatch.setenv("MODEL_PROVIDER", "minimax")
model, provider = _resolve_model_and_provider_from_env(
yaml_model="", yaml_provider="", providers=_REGISTRY,
)
assert model == "MiniMax-M2.7-highspeed"
assert provider == "minimax"
def test_persona_env_lead_claude_code_resolves_correctly(monkeypatch):
"""Lead persona env (MODEL=opus, MODEL_PROVIDER=claude-code) —
``claude-code`` is the persona-friendly alias for the canonical
``anthropic-oauth`` registry name. Must resolve via the alias map
so the lead boots through the OAuth subscription path even when
MODEL is a non-Anthropic model id (e.g. an operator who picked
MiniMax in canvas but whose persona env still pins claude-code)."""
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL", "opus")
monkeypatch.setenv("MODEL_PROVIDER", "claude-code")
model, provider = _resolve_model_and_provider_from_env(
yaml_model="", yaml_provider="", providers=_REGISTRY,
)
assert model == "opus"
# claude-code → anthropic-oauth via the alias map
assert provider == "anthropic-oauth"
def test_persona_env_lead_with_minimax_model_routes_via_oauth(monkeypatch):
"""Lead workspace whose persona pins MODEL_PROVIDER=claude-code but
whose YAML/canvas selection happens to be a MiniMax model still
routes via OAuth: the persona's provider pin wins over the
model-prefix matcher. Without the alias map, the fall-through
mis-routed leads to MiniMax even when their CLAUDE_CODE_OAUTH_TOKEN
was set."""
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL", "MiniMax-M2.7")
monkeypatch.setenv("MODEL_PROVIDER", "claude-code")
model, provider = _resolve_model_and_provider_from_env(
yaml_model="", yaml_provider="", providers=_REGISTRY,
)
assert model == "MiniMax-M2.7"
assert provider == "anthropic-oauth"
def test_anthropic_alias_resolves_to_anthropic_api(monkeypatch):
"""``MODEL_PROVIDER=anthropic`` alias → ``anthropic-api`` (direct
Anthropic API key path)."""
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL", "claude-opus-4-7")
monkeypatch.setenv("MODEL_PROVIDER", "anthropic")
model, provider = _resolve_model_and_provider_from_env(
yaml_model="", yaml_provider="", providers=_REGISTRY,
)
assert model == "claude-opus-4-7"
assert provider == "anthropic-api"
def test_persona_env_glm_resolves_correctly(monkeypatch):
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL", "GLM-4.6")
monkeypatch.setenv("MODEL_PROVIDER", "zai")
model, provider = _resolve_model_and_provider_from_env(
yaml_model="", yaml_provider="", providers=_REGISTRY,
)
assert model == "GLM-4.6"
assert provider == "zai"
def test_env_provider_slug_case_insensitive(monkeypatch):
"""Operator typos like ``MiniMax`` (mixed case) still resolve."""
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL", "MiniMax-M2.7-highspeed")
monkeypatch.setenv("MODEL_PROVIDER", "MiniMax") # mixed case
_, provider = _resolve_model_and_provider_from_env(
yaml_model="", yaml_provider="", providers=_REGISTRY,
)
assert provider == "MiniMax" # caller compares case-insensitively
# ------------------------------------------------------------------
# Legacy convention: MODEL_PROVIDER=<model-id>, MODEL unset
# ------------------------------------------------------------------
def test_legacy_model_provider_as_model_id_still_works(monkeypatch):
"""Pre-2026-05-08 canvas Save+Restart shape: MODEL_PROVIDER carried
the model id directly (e.g. ``MODEL_PROVIDER=MiniMax-M2.7``) and
no MODEL env. Must keep working so existing canvas users don't
break overnight."""
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL_PROVIDER", "MiniMax-M2.7-highspeed")
model, provider = _resolve_model_and_provider_from_env(
yaml_model="", yaml_provider="", providers=_REGISTRY,
)
# MiniMax-M2.7-highspeed is not a registered provider name, so
# it's treated as a legacy model-id-in-MODEL_PROVIDER value.
assert model == "MiniMax-M2.7-highspeed"
assert provider is None
# ------------------------------------------------------------------
# Env wins over YAML
# ------------------------------------------------------------------
def test_env_model_wins_over_yaml_model(monkeypatch):
"""When both env MODEL and YAML model are set, env wins."""
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL", "GLM-4.6")
model, _ = _resolve_model_and_provider_from_env(
yaml_model="MiniMax-M2.7", yaml_provider="", providers=_REGISTRY,
)
assert model == "GLM-4.6"
def test_env_provider_wins_over_yaml_provider(monkeypatch):
"""Env MODEL_PROVIDER (when a registered slug) wins over YAML provider."""
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL", "GLM-4.6")
monkeypatch.setenv("MODEL_PROVIDER", "zai")
_, provider = _resolve_model_and_provider_from_env(
yaml_model="", yaml_provider="minimax", providers=_REGISTRY,
)
assert provider == "zai"
# ------------------------------------------------------------------
# YAML fallback (no env)
# ------------------------------------------------------------------
def test_no_env_falls_back_to_yaml(monkeypatch):
"""Workspace whose env doesn't set MODEL/MODEL_PROVIDER falls back
to the YAML config, which preserves existing operator workflows."""
_clear_env(monkeypatch)
model, provider = _resolve_model_and_provider_from_env(
yaml_model="claude-sonnet-4-6",
yaml_provider="anthropic-api",
providers=_REGISTRY,
)
assert model == "claude-sonnet-4-6"
assert provider == "anthropic-api"
def test_no_env_no_yaml_returns_empty(monkeypatch):
"""Pure default path — caller (setup) substitutes ``sonnet``."""
_clear_env(monkeypatch)
model, provider = _resolve_model_and_provider_from_env(
yaml_model="", yaml_provider="", providers=_REGISTRY,
)
assert model == ""
assert provider is None
# ------------------------------------------------------------------
# Whitespace / empty-value defensive cases
# ------------------------------------------------------------------
def test_whitespace_only_env_treated_as_unset(monkeypatch):
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL", " ")
monkeypatch.setenv("MODEL_PROVIDER", " ")
model, provider = _resolve_model_and_provider_from_env(
yaml_model="opus", yaml_provider="", providers=_REGISTRY,
)
assert model == "opus"
assert provider is None
def test_empty_env_value_treated_as_unset(monkeypatch):
_clear_env(monkeypatch)
monkeypatch.setenv("MODEL", "")
monkeypatch.setenv("MODEL_PROVIDER", "")
model, provider = _resolve_model_and_provider_from_env(
yaml_model="sonnet", yaml_provider="", providers=_REGISTRY,
)
assert model == "sonnet"
assert provider is None

View File

@ -1,146 +0,0 @@
"""Tests for the provider-resolution path that was silent-failing on #180.
Regression coverage: when an operator picks a provider in the canvas Config
tab that isn't in the registry, the adapter must raise ValueError with an
actionable message, NOT silently fall through to providers[0]
(anthropic-oauth) and then have the Claude SDK hit the user's OAuth quota
under a different name.
These tests mirror the production failure mode reported by Hongming
2026-05-07 17:35: workspace config.yaml had `provider: minimax` set, the
adapter ignored it entirely, the SDK kept calling the Anthropic API with
CLAUDE_CODE_OAUTH_TOKEN, hit the OAuth quota, and the canvas surfaced
"Agent error (Exception)" with no clue why.
Import-shim setup (sys.path + molecule_runtime / a2a / claude_sdk_executor
stubs) lives in tests/conftest.py, shared with test_adapter_prevalidate,
so the two stub installers can't disagree on shape (e.g. BaseAdapter
having install_plugins_via_registry).
"""
import pytest
from adapter import (
_BUILTIN_PROVIDERS,
_resolve_provider,
)
def test_resolve_with_no_explicit_provider_falls_back_to_model_match():
"""No explicit provider → model-based prefix/alias matching, default to providers[0]."""
p = _resolve_provider("claude-opus-4-7", _BUILTIN_PROVIDERS)
assert p["name"] == "anthropic-api" # matches model_prefixes=("claude-",)
def test_resolve_with_no_explicit_provider_falls_back_to_default():
"""Unknown model + no explicit provider → providers[0] (anthropic-oauth)."""
p = _resolve_provider("unknown-model", _BUILTIN_PROVIDERS)
assert p["name"] == "anthropic-oauth"
def test_resolve_with_explicit_provider_in_registry_returns_match():
"""Explicit name lookup wins over model-based resolution."""
# Even though "claude-opus-4-7" would normally resolve to anthropic-api
# via prefix matching, the explicit provider name wins.
p = _resolve_provider(
"claude-opus-4-7", _BUILTIN_PROVIDERS,
explicit_provider="anthropic-oauth",
)
assert p["name"] == "anthropic-oauth"
def test_resolve_with_explicit_provider_case_insensitive():
"""Provider name match is case-insensitive (operators write 'Anthropic-OAuth' etc)."""
p = _resolve_provider(
"sonnet", _BUILTIN_PROVIDERS,
explicit_provider="ANTHROPIC-OAUTH",
)
assert p["name"] == "anthropic-oauth"
def test_resolve_with_explicit_provider_not_in_registry_raises():
"""The #180 regression test: explicit non-registry provider must raise, not fall through."""
with pytest.raises(ValueError) as exc_info:
_resolve_provider(
"MiniMax-M2.7-highspeed", _BUILTIN_PROVIDERS,
explicit_provider="minimax",
)
msg = str(exc_info.value)
# Must name the bad provider so operator knows what they typed
assert "minimax" in msg
# Must list known providers so operator knows what's available
assert "anthropic-oauth" in msg
assert "anthropic-api" in msg
# Must give actionable next steps — NOT just "not found"
assert "providers:" in msg or "Add" in msg
assert "Switch" in msg or "runtime" in msg
def test_resolve_with_explicit_provider_does_not_silent_fallback():
"""Specifically: must not return providers[0] when explicit_provider is bogus.
This is the exact silent-fallback path that caused the user-visible
bug: operator picks 'minimax' adapter returns anthropic-oauth
SDK uses CLAUDE_CODE_OAUTH_TOKEN hits quota.
"""
with pytest.raises(ValueError):
result = _resolve_provider(
"anything", _BUILTIN_PROVIDERS,
explicit_provider="minimax",
)
# If the implementation regresses to silent fallback, this would
# have returned providers[0] (anthropic-oauth) instead of raising.
# Defense-in-depth: guard against accidental "return" inside the
# error path.
assert result["name"] not in {"anthropic-oauth", "anthropic-api"}, (
"REGRESSION: silent fallback to default provider when explicit "
"provider name is not in registry — this is the #180 bug."
)
def test_resolve_with_explicit_provider_in_custom_registry():
"""When operator adds a third-party provider to the registry, explicit lookup finds it."""
custom_registry = _BUILTIN_PROVIDERS + (
{
"name": "minimax",
"auth_mode": "third_party_anthropic_compat",
"model_prefixes": ("minimax-",),
"model_aliases": (),
"base_url": "https://api.minimaxi.com/anthropic-compat",
"auth_env": ("MINIMAX_API_KEY",),
},
)
p = _resolve_provider(
"MiniMax-M2.7-highspeed", custom_registry,
explicit_provider="minimax",
)
assert p["name"] == "minimax"
assert p["base_url"] == "https://api.minimaxi.com/anthropic-compat"
assert "MINIMAX_API_KEY" in p["auth_env"]
def test_resolve_empty_providers_raises():
"""Pre-condition: providers must be non-empty (existing behavior preserved)."""
with pytest.raises(ValueError, match="empty providers tuple"):
_resolve_provider("anything", ())
def test_resolve_explicit_empty_string_treated_as_no_explicit():
"""`provider: ''` (empty string) → fall back to model-based resolution, not raise."""
# This shape can happen when the canvas writes an empty provider field.
# Treating it as "no explicit pick" is more forgiving than raising,
# since the user clearly didn't intend to break their workspace.
p = _resolve_provider(
"claude-opus-4-7", _BUILTIN_PROVIDERS,
explicit_provider="",
)
assert p["name"] == "anthropic-api" # fell through to model-based
def test_resolve_explicit_none_treated_as_no_explicit():
"""`explicit_provider=None` (default) → fall back to model-based resolution."""
p = _resolve_provider(
"claude-opus-4-7", _BUILTIN_PROVIDERS,
explicit_provider=None,
)
assert p["name"] == "anthropic-api"