fix(validator): address post-merge review findings on #17 + #18 (#19)

Independent code review of #17 (adapter runtime-load) and #18 (schema
versioning) surfaced four Required and three Optional findings worth
fixing before the patterns harden into the codebase.

Required:

  R1: Delete .molecule-ci/scripts/{validate-workspace-template,
      migrate-template}.py — dead-vendored mirror. The new validator
      workflow invokes .molecule-ci-canonical/scripts/ (the canonical
      clone), not .molecule-ci/scripts/. The mirror was the exact drift
      class #90 is supposed to eliminate: the next contributor would edit
      one copy and silently diverge. Other workflows (validate-plugin,
      validate-org-template) still use the legacy path and keep their
      own scripts there — so removing OUR two files is asymmetric but
      correct, and the legacy path can phase out organically.

  R2: validate-workspace-template.yml's `cache-dependency-path` pointed
      at the validator's own deps file (just `pyyaml>=6.0`), so the
      pip cache key never invalidated when the template added crewai,
      langgraph, etc. Repoint it at the calling repo's
      `requirements.txt`, the file the heavy install actually uses
      one step later.

  R3: `_check_schema_v1` looped over `SCHEMA_V1_REQUIRED_KEYS` and re-emitted
      "missing required key `template_schema_version`" — but the
      dispatcher already verified the field is present + int before
      reaching v1, so that branch was dead defensive code. Skip it
      explicitly with a comment, but keep the field in the constant for
      contract documentation + the unknown-keys filter.

  R4: `_template_adapter_under_validation` was a fixed sys.modules key,
      meaning back-to-back invocations in the same Python process
      shared the slot. Use a unique module name keyed on the hash of
      the adapter's absolute path, so validating several templates in
      one process no longer reuses the slot. No observed bug today;
      defensive-only. (Minimal sketch below.)
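
      A minimal, self-contained sketch of the collision R4 guards
      against. `load_under` and the `_fixed_slot` / `_slot_*` names
      are illustrative, not the validator's real identifiers:

        import importlib.util, os, sys, tempfile

        def load_under(name, path):
            spec = importlib.util.spec_from_file_location(name, path)
            mod = importlib.util.module_from_spec(spec)
            sys.modules[name] = mod  # same registration the validator does
            spec.loader.exec_module(mod)
            return mod

        with tempfile.TemporaryDirectory() as d:
            a = os.path.join(d, "a.py"); open(a, "w").write("WHO = 'a'\n")
            b = os.path.join(d, "b.py"); open(b, "w").write("WHO = 'b'\n")
            # Fixed key: the second load silently replaces the first.
            m1 = load_under("_fixed_slot", a)
            m2 = load_under("_fixed_slot", b)
            assert sys.modules["_fixed_slot"] is m2
            # Per-path key (the fix): distinct adapters, distinct slots.
            n1 = f"_slot_{abs(hash(os.path.abspath(a))):x}"
            n2 = f"_slot_{abs(hash(os.path.abspath(b))):x}"
            assert n1 != n2 and load_under(n1, a).WHO == "a"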

Optional:

  O1: The class-discovery filter now also requires `__module__ ==
      module_name`. Without this, a `from molecule_runtime.adapters.base
      import AbstractCLIAdapter` re-export would count as a "real"
      adapter, masking the genuine "no concrete subclass" case the
      gate exists to catch. Cheap, and it future-proofs against any
      abstract intermediate the runtime might expose later. Added a
      sibling test pinning the new behavior. (See the O1 sketch after
      this list.)

  O2: migrate-template.py's docstring claimed "uses ruamel.yaml when
      available," but the implementation only ever calls `yaml.safe_dump`.
      Replaced the lie with a clearer caveat block plus a forward-pointer
      to ruamel-when-comments-detected as a future enhancement. (See
      the O2 sketch after this list.)

  O3: Reordered the workflow so the secret-scan step runs BEFORE
      `pip install -r requirements.txt`. Same threat surface as the
      Docker build smoke test (whose RUN steps previously executed
      before the scan), but cheap defense-in-depth: a template PR
      that adds a malicious dep to requirements.txt now has its
      post-install hook execute only AFTER the secret scanner has
      inspected the diff.
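
  Two sketches for the Optional findings, with stand-in names (Base,
  AbstractIntermediate) rather than the runtime's real classes:

      O1 sketch: a bare re-export passes issubclass() but fails the
      `__module__` filter, so only classes DEFINED in the module count.

        import types

        class Base: ...                        # stands in for BaseAdapter
        class AbstractIntermediate(Base): ...  # abstract subclass a runtime might expose

        # Simulate an adapter.py that only RE-EXPORTS the abstract class.
        mod = types.ModuleType("_adapter_under_validation")
        mod.AbstractIntermediate = AbstractIntermediate

        hits = [o for o in vars(mod).values()
                if isinstance(o, type) and o is not Base and issubclass(o, Base)]
        defined_here = [o for o in hits if o.__module__ == mod.__name__]
        print(len(hits), len(defined_here))  # 1 0: unfiltered, the
                                             # re-export masks the error

      O2 sketch: PyYAML's safe_dump re-emits clean YAML, so comments
      never survive a migration (ruamel.yaml's round-trip mode would
      keep them, hence the forward-pointer):

        import yaml

        src = "runtime: langgraph  # pinned until v2\n"
        print(yaml.safe_dump(yaml.safe_load(src)), end="")
        # prints "runtime: langgraph" without the comment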

Test changes:

  - test_adapter_with_no_baseadapter_subclass_errors updated for the
    new error message ("no concrete class inheriting from").
  - New test_only_imported_baseadapter_subclass_does_not_count pins
    the O1 __module__-filter behavior.
  - 43/43 tests pass (was 42/42 before the new test).
  - Real langgraph template still passes the full validator end-to-end.

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
10 changed files with 109 additions and 618 deletions

validate-workspace-template.yml

@@ -23,24 +23,18 @@ jobs:
       - uses: actions/setup-python@v5
         with:
           python-version: "3.11"
+          # Cache pip against the calling repo's own requirements.txt
+          # (the file we install one step below). Pointing the cache key
+          # at the validator's own deps was decorative — pyyaml never
+          # changes, so the key never invalidated even when the template
+          # added a heavy dep like crewai.
           cache: "pip"
-          cache-dependency-path: .molecule-ci-canonical/.molecule-ci/scripts/requirements.txt
+          cache-dependency-path: requirements.txt
-      - run: pip install pyyaml -q
-      # Install the template's runtime dependencies so the validator's
-      # `check_adapter_runtime_load()` can import adapter.py the same way
-      # the workspace container does at boot. Without this, a
-      # syntactically-valid adapter that ImportErrors on a missing
-      # transitive dep would build clean and crash on first user prompt.
-      # The fallback (no requirements.txt) installs the runtime alone so
-      # BaseAdapter is at least importable for the class-discovery check.
-      - if: hashFiles('requirements.txt') != ''
-        run: pip install -q -r requirements.txt
-      - if: hashFiles('requirements.txt') == ''
-        run: pip install -q molecule-ai-workspace-runtime
-      - run: python3 .molecule-ci-canonical/scripts/validate-workspace-template.py
-      - name: Docker build smoke test
-        if: hashFiles('Dockerfile') != ''
-        run: docker build -t template-test . --no-cache 2>&1 | tail -5 && echo "✓ Docker build succeeded"
+      # Secret-scan runs FIRST, before any `pip install` could execute a
+      # post-install hook from a malicious dep added by a template PR.
+      # Same threat surface the Docker build smoke test already runs
+      # against (Dockerfile RUN steps execute before the secret-scan if
+      # the docker step were earlier), but cheap to keep ordered safely.
       - name: Check for secrets
         run: |
           python3 - << 'PYEOF'
@@ -90,3 +84,19 @@ jobs:
             else:
                 print("::notice::No secrets detected")
           PYEOF
+      - run: pip install pyyaml -q
+      # Install the template's runtime dependencies so the validator's
+      # `check_adapter_runtime_load()` can import adapter.py the same way
+      # the workspace container does at boot. Without this, a
+      # syntactically-valid adapter that ImportErrors on a missing
+      # transitive dep would build clean and crash on first user prompt.
+      # The fallback (no requirements.txt) installs the runtime alone so
+      # BaseAdapter is at least importable for the class-discovery check.
+      - if: hashFiles('requirements.txt') != ''
+        run: pip install -q -r requirements.txt
+      - if: hashFiles('requirements.txt') == ''
+        run: pip install -q molecule-ai-workspace-runtime
+      - run: python3 .molecule-ci-canonical/scripts/validate-workspace-template.py
+      - name: Docker build smoke test
+        if: hashFiles('Dockerfile') != ''
+        run: docker build -t template-test . --no-cache 2>&1 | tail -5 && echo "✓ Docker build succeeded"

.molecule-ci/scripts/migrate-template.py (deleted)

@@ -1,212 +0,0 @@
#!/usr/bin/env python3
"""Migrate a workspace template's config.yaml across schema versions.

Companion to validate-workspace-template.py. Whenever the validator
adds a new schema version, this script gets a corresponding entry in
MIGRATIONS so each consumer template can mechanically upgrade rather
than every maintainer figuring out the field changes by hand.

Discipline (matches the validator's header):

  1. Validator gets a SCHEMA_V<N+1> block + KNOWN_SCHEMA_VERSIONS bump.
  2. This script gets `MIGRATIONS[N]` defined — a function that takes
     a v<N> dict and returns a v<N+1> dict. Pure, deterministic, no
     I/O — that way migrations compose: v1 → v2 → v3 just chains them.
  3. Each migration is FROZEN once shipped. If a v2 migration needs
     fixing post-ship, ship it as v3 with the corrective migration.
  4. Consumers run this script (one PR per template repo) before the
     deprecation window for v<N> closes.

Usage:

  # Migrate the template in cwd from its current version to the latest
  python3 scripts/migrate-template.py .

  # Migrate to a specific version (bounded; useful when a deprecation
  # window is closing and you want to skip-ahead)
  python3 scripts/migrate-template.py --to 3 .

  # Force the source version (override config.yaml's declared version)
  python3 scripts/migrate-template.py --from 1 --to 2 .

  # Dry-run: print the diff without writing
  python3 scripts/migrate-template.py --dry-run .

The script preserves YAML round-trip fidelity for keys it doesn't
touch (using ruamel.yaml when available; falling back to PyYAML's
default representer otherwise). Migrations should ONLY mutate keys
they're explicitly versioning — leave everything else alone so a
consumer template's customizations survive.
"""
from __future__ import annotations

import argparse
import sys
from copy import deepcopy
from pathlib import Path
from typing import Callable

import yaml

# ──────────────────────────────────────────── migrations registry
# Each entry maps a SOURCE version to the function that produces the
# next version's dict. Currently empty — no v2 yet. The first time a
# real schema bump lands, MIGRATIONS[1] gets defined alongside the
# validator's SCHEMA_V2 block.
MIGRATIONS: dict[int, Callable[[dict], dict]] = {}

# ──────────────────────────────────────────── version detection

def _detect_current_version(config: dict) -> int:
    sv = config.get("template_schema_version")
    if sv is None:
        sys.exit(
            "error: config.yaml has no `template_schema_version`. "
            "Add it (likely 1 for legacy templates) before migrating."
        )
    if not isinstance(sv, int):
        sys.exit(
            f"error: template_schema_version must be int, got "
            f"{type(sv).__name__}={sv!r}."
        )
    return sv


def _latest_known_version() -> int:
    """Maximum version reachable by chaining MIGRATIONS from any
    starting point. With an empty registry, this is 1 (the floor:
    every existing template is at v1)."""
    if not MIGRATIONS:
        return 1
    return max(MIGRATIONS.keys()) + 1

# ──────────────────────────────────────────── core

def migrate_config(config: dict, from_version: int, to_version: int) -> dict:
    """Apply migrations sequentially from `from_version` to `to_version`.
    Returns a NEW dict — does not mutate the input.

    Errors loudly when there's no migration registered for an
    intermediate step: forward-only, never silently skip a hop. If the
    user asks for a backward migration, error too — schema versions
    are append-only and we don't ship downgrades."""
    if to_version < from_version:
        sys.exit(
            f"error: cannot migrate backward (from v{from_version} to "
            f"v{to_version}). Schema versions are append-only — file a "
            f"new bug + ship a forward migration instead."
        )
    current = from_version
    out = deepcopy(config)
    while current < to_version:
        step = MIGRATIONS.get(current)
        if step is None:
            sys.exit(
                f"error: no migration registered for v{current} → "
                f"v{current + 1}. Either add it to MIGRATIONS in "
                f"scripts/migrate-template.py or pick a different --to."
            )
        out = step(out)
        # Every migration MUST stamp the new version on its output —
        # this assertion catches a class of bugs where a migration
        # forgets to bump template_schema_version.
        if out.get("template_schema_version") != current + 1:
            sys.exit(
                f"error: MIGRATIONS[{current}] did not stamp "
                f"template_schema_version={current + 1} on its output. "
                f"This is a bug in the migration function itself."
            )
        current += 1
    return out


def _read_yaml(path: Path) -> dict:
    with open(path) as f:
        data = yaml.safe_load(f)
    if not isinstance(data, dict):
        sys.exit(f"error: {path} root is not a mapping (got {type(data).__name__})")
    return data


def _write_yaml(path: Path, data: dict) -> None:
    # Sort keys for stable diffs across migrations. This matches what
    # `yaml.safe_dump` does when we write — consumer repos with
    # custom orderings will see their config.yaml re-ordered, which is
    # one of those round-trip lossy tradeoffs that's worth accepting:
    # the migration moment is rare and the diff is reviewable.
    with open(path, "w") as f:
        yaml.safe_dump(data, f, sort_keys=True, default_flow_style=False)

# ──────────────────────────────────────────── CLI

def main(argv: list[str] | None = None) -> int:
    parser = argparse.ArgumentParser(
        description="Migrate a workspace template's config.yaml across schema versions."
    )
    parser.add_argument(
        "template_dir",
        type=Path,
        help="Path to the template repo root (must contain config.yaml).",
    )
    parser.add_argument(
        "--from",
        dest="from_version",
        type=int,
        default=None,
        help="Source schema version (defaults to whatever config.yaml declares).",
    )
    parser.add_argument(
        "--to",
        dest="to_version",
        type=int,
        default=None,
        help="Target schema version (defaults to the highest reachable from MIGRATIONS).",
    )
    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Print the migrated YAML to stdout without modifying the file.",
    )
    args = parser.parse_args(argv)

    config_path = args.template_dir / "config.yaml"
    if not config_path.is_file():
        sys.exit(f"error: {config_path} does not exist")
    config = _read_yaml(config_path)

    from_version = args.from_version
    if from_version is None:
        from_version = _detect_current_version(config)
    to_version = args.to_version
    if to_version is None:
        to_version = _latest_known_version()

    if from_version == to_version:
        print(
            f"nothing to do: config.yaml is already at v{from_version}.",
            file=sys.stderr,
        )
        return 0

    migrated = migrate_config(config, from_version, to_version)
    if args.dry_run:
        yaml.safe_dump(migrated, sys.stdout, sort_keys=True, default_flow_style=False)
        return 0

    _write_yaml(config_path, migrated)
    print(
        f"✓ migrated {config_path} from v{from_version} → v{to_version}",
        file=sys.stderr,
    )
    return 0


if __name__ == "__main__":
    sys.exit(main())

.molecule-ci/scripts/validate-workspace-template.py (deleted)

@@ -1,373 +0,0 @@
#!/usr/bin/env python3
"""Prototype of the beefed-up validate-workspace-template.py.

Run from a template repo's root. Surfaces hard structural drift in
Dockerfile + config.yaml + requirements.txt against the canonical
contract. Replaces the existing soft-warnings-only validator at
molecule-ci/scripts/validate-workspace-template.py.
"""
import os, re, sys

import yaml

ERRORS: list[str] = []
WARNINGS: list[str] = []


def err(msg: str) -> None:
    ERRORS.append(msg)


def warn(msg: str) -> None:
    WARNINGS.append(msg)

# ───────────────────────────────────────────────────────────── Dockerfile

def check_dockerfile() -> None:
    if not os.path.isfile("Dockerfile"):
        warn("no Dockerfile — skipping container drift checks (library-only template?)")
        return
    df = open("Dockerfile").read()
    if not re.search(r"^FROM python:3\.11-slim\b", df, re.MULTILINE):
        err("Dockerfile: must base on `FROM python:3.11-slim` — see contract doc")
    if not re.search(r"^ARG RUNTIME_VERSION", df, re.MULTILINE):
        err(
            "Dockerfile: missing `ARG RUNTIME_VERSION=`. "
            "This arg invalidates the pip-install cache when the cascade "
            "publishes a new wheel; without it, the cascade silently ships "
            "the previous runtime (cache trap observed 2026-04-27, 5x in a row)."
        )
    if "molecule-ai-workspace-runtime" not in df and not (
        os.path.isfile("requirements.txt")
        and "molecule-ai-workspace-runtime" in open("requirements.txt").read()
    ):
        err("Dockerfile + requirements.txt: must install `molecule-ai-workspace-runtime`")
    if "${RUNTIME_VERSION}" not in df and "$RUNTIME_VERSION" not in df:
        err(
            "Dockerfile: must reference `${RUNTIME_VERSION}` in a pip install RUN block. "
            'Pattern: `if [ -n "${RUNTIME_VERSION}" ]; then '
            'pip install --no-cache-dir --upgrade "molecule-ai-workspace-runtime==${RUNTIME_VERSION}"; fi`'
        )
    if not re.search(r"useradd[^\n]*\bagent\b", df):
        err(
            "Dockerfile: must create the `agent` user "
            "(`RUN useradd -u 1000 -m -s /bin/bash agent`). "
            "Runtime drops to uid 1000; without it, claude-code refuses "
            "`--dangerously-skip-permissions` for safety."
        )
    has_direct_entrypoint = bool(
        re.search(r'(ENTRYPOINT|CMD)\s*\[?\s*"?molecule-runtime"?', df)
    )
    has_custom_entrypoint = bool(
        re.search(r'ENTRYPOINT\s*\[?\s*"?(/?[\w./-]*entrypoint\.sh|/?[\w./-]*start\.sh)', df)
    )
    if not has_direct_entrypoint and not has_custom_entrypoint:
        err(
            "Dockerfile: must end at `molecule-runtime` "
            "(`ENTRYPOINT [\"molecule-runtime\"]` or via custom "
            "entrypoint.sh / start.sh that exec's molecule-runtime)"
        )
    if has_custom_entrypoint:
        m = re.search(r'ENTRYPOINT\s*\[?\s*"?(/?[\w./-]+)', df)
        if m:
            ep_in_image = m.group(1).lstrip("/")
            ep_local = os.path.basename(ep_in_image)
            if os.path.isfile(ep_local):
                if "molecule-runtime" not in open(ep_local).read():
                    err(
                        f"Dockerfile uses ENTRYPOINT [{ep_in_image}] but "
                        f"{ep_local} does not exec `molecule-runtime`"
                    )
            else:
                warn(
                    f"Dockerfile points ENTRYPOINT at {ep_in_image} but "
                    f"{ep_local} not found in repo root — verify it's COPYed in"
                )

# ───────────────────────────────────────────────────────────── config.yaml

KNOWN_RUNTIMES = {
    "langgraph",
    "claude-code",
    "crewai",
    "autogen",
    "deepagents",
    "hermes",
    "gemini-cli",
    "openclaw",
}

# ──────────────────────────────────────────── schema versioning
#
# `template_schema_version: int` in each template's config.yaml selects
# which contract this validator enforces. Versions are FROZEN once
# shipped — never edit a SCHEMA_V* constant in place. To bump:
#
#   1. Add `SCHEMA_V<N+1>_REQUIRED_KEYS` / `SCHEMA_V<N+1>_OPTIONAL_KEYS`
#      describing the new contract.
#   2. Add `_check_schema_v<N+1>(config)` that enforces it.
#   3. Add the entry to SCHEMA_CHECKS below.
#   4. Move version N from KNOWN_SCHEMA_VERSIONS to
#      DEPRECATED_SCHEMA_VERSIONS so existing v<N> templates warn but
#      still pass — buys a deprecation window.
#   5. Ship a corresponding migration in scripts/migrate-template.py's
#      MIGRATIONS table (key = N, value = callable that produces the
#      v<N+1> dict from a v<N> dict).
#   6. Run migrate-template.py on each consumer template repo as a PR.
#   7. After all consumers migrate, drop version N from
#      DEPRECATED_SCHEMA_VERSIONS in a follow-up PR.
#
# This discipline means a schema version always has exactly one valid
# enforcement function, never "branch on minor variants" — the whole
# point of versioning is to avoid that drift.
KNOWN_SCHEMA_VERSIONS: set[int] = {1}
DEPRECATED_SCHEMA_VERSIONS: set[int] = set()

SCHEMA_V1_REQUIRED_KEYS = ["name", "runtime", "template_schema_version"]
SCHEMA_V1_OPTIONAL_KEYS = [
    "description",
    "version",
    "tier",
    "model",
    "models",
    "runtime_config",
    "env",
    "skills",
    "tools",
    "a2a",
    "delegation",
    "prompt_files",
    "bridge",
    "governance",
]


def _check_schema_v1(config: dict) -> None:
    """v1 contract — the keys frozen as of monorepo task #90's Phase 2.
    Currently every production template runs this version. Do NOT edit
    in place; add v2 instead and migrate consumers (see header)."""
    for key in SCHEMA_V1_REQUIRED_KEYS:
        if key not in config:
            err(f"config.yaml: missing required key `{key}`")
    runtime = config.get("runtime")
    if runtime and runtime not in KNOWN_RUNTIMES:
        warn(
            f"config.yaml: runtime `{runtime}` not in known set "
            f"{sorted(KNOWN_RUNTIMES)} — OK for custom runtimes; "
            f"if canonical, add it to KNOWN_RUNTIMES in validate-workspace-template.py"
        )
    unknown = set(config.keys()) - set(SCHEMA_V1_REQUIRED_KEYS) - set(SCHEMA_V1_OPTIONAL_KEYS)
    if unknown:
        warn(
            f"config.yaml: unknown top-level keys {sorted(unknown)} — "
            f"may be drift. If intentional, add them to SCHEMA_V1_OPTIONAL_KEYS."
        )


SCHEMA_CHECKS = {
    1: _check_schema_v1,
}


def check_config_yaml() -> None:
    if not os.path.isfile("config.yaml"):
        err("config.yaml: missing at repo root")
        return
    with open("config.yaml") as f:
        try:
            config = yaml.safe_load(f)
        except yaml.YAMLError as e:
            err(f"config.yaml: invalid YAML — {e}")
            return
    if not isinstance(config, dict):
        err(f"config.yaml: root must be a mapping, got {type(config).__name__}")
        return

    # Schema-version dispatch. Validate the version field shape first
    # so error messages are actionable.
    sv = config.get("template_schema_version")
    if sv is None:
        err("config.yaml: missing required key `template_schema_version`")
        # Can't dispatch without a version. Don't fall through to v1
        # checks — that would mask the missing-version error.
        return
    if not isinstance(sv, int):
        err(
            f"config.yaml: template_schema_version must be int, "
            f"got {type(sv).__name__}={sv!r}"
        )
        return
    if sv in DEPRECATED_SCHEMA_VERSIONS:
        latest = max(KNOWN_SCHEMA_VERSIONS)
        warn(
            f"config.yaml: template_schema_version={sv} is deprecated; "
            f"migrate to v{latest} via "
            f"`python3 scripts/migrate-template.py --to {latest} .`. "
            f"Support for v{sv} will be removed in a future cycle."
        )
    elif sv not in KNOWN_SCHEMA_VERSIONS:
        valid = sorted(KNOWN_SCHEMA_VERSIONS | DEPRECATED_SCHEMA_VERSIONS)
        err(
            f"config.yaml: template_schema_version={sv} is unknown — "
            f"this validator understands {valid}. Either bump the "
            f"validator (add a SCHEMA_V{sv} block) or correct the version."
        )
        return
    SCHEMA_CHECKS[sv](config)

# ───────────────────────────────────────────────────────────── requirements.txt

def check_requirements() -> None:
    if not os.path.isfile("requirements.txt"):
        warn("no requirements.txt — Dockerfile must install runtime by other means")
        return
    reqs = open("requirements.txt").read()
    if "molecule-ai-workspace-runtime" not in reqs:
        err("requirements.txt: must declare `molecule-ai-workspace-runtime` as a dependency")

# ───────────────────────────────────────────────────────────── adapter.py

def check_adapter() -> None:
    """Static-text adapter checks. Fast — no imports."""
    if not os.path.isfile("adapter.py"):
        warn("no adapter.py — runtime will use the default langgraph executor from the wheel")
        return
    content = open("adapter.py").read()
    # The original validator's warning ("don't import molecule_runtime") was
    # backwards — that's the canonical package name. The previous check shipped
    # for ~2 weeks producing false-positive warnings. Removed.
    if re.search(r"\bfrom molecule_ai\b|\bimport molecule_ai\b", content):
        warn(
            "adapter.py imports `molecule_ai` — that's a pre-#87 package name; "
            "use `molecule_runtime`"
        )


def check_adapter_runtime_load() -> None:
    """Strong adapter contract: import adapter.py the same way the runtime
    does at workspace boot, and assert at least one class in it inherits
    from molecule_runtime.adapters.base.BaseAdapter.

    The Docker build smoke test in validate-workspace-template.yml builds
    the image but doesn't RUN it — adapter.py is only imported at
    container startup. So a template with a syntactically-valid Dockerfile
    + a broken adapter.py (wrong base class, ImportError on a missing
    framework dep, typo) builds clean and fails on first user prompt.
    This check exercises the same class-resolution path the runtime uses,
    so a passing validator means a passing workspace boot for the
    adapter-load step.

    Skip conditions:
      - No adapter.py exists. Templates without one inherit the default
        langgraph executor from the wheel (intentional, not drift).
      - molecule-ai-workspace-runtime not importable in the validator
        environment. That's a CI-config bug — the workflow that runs
        this validator must `pip install molecule-ai-workspace-runtime`
        first. Warn loudly so the misconfiguration surfaces, but don't
        hard-fail (we'd be saying "your adapter is broken" when the
        actual cause is missing infra). The `pip install -r
        requirements.txt` step in validate-workspace-template.yml
        normally satisfies this transitively.

    Hard-error conditions:
      - adapter.py raises any exception during import. The same
        exception would crash workspace boot.
      - No class in the module inherits from BaseAdapter. The runtime's
        adapter-discovery would silently fall through to the default
        executor, ignoring this file — exactly the kind of human-error
        mode this contract is supposed to eliminate.
    """
    if not os.path.isfile("adapter.py"):
        return  # check_adapter() already warned; don't double-warn
    try:
        from molecule_runtime.adapters.base import BaseAdapter  # noqa: PLC0415
    except ImportError:
        warn(
            "adapter.py: skipping runtime-load check — "
            "`molecule-ai-workspace-runtime` not installed in the validator "
            "environment. The CI workflow that invokes this script must "
            "`pip install molecule-ai-workspace-runtime` (or `pip install "
            "-r requirements.txt`) first; otherwise this critical check is "
            "silently bypassed."
        )
        return
    # Load adapter.py as a module under a unique name so it doesn't
    # collide with any installed `adapter` package or with a previous
    # invocation in the same Python process.
    import importlib.util  # noqa: PLC0415
    import sys  # noqa: PLC0415
    module_name = "_template_adapter_under_validation"
    spec = importlib.util.spec_from_file_location(module_name, "adapter.py")
    if spec is None or spec.loader is None:
        err("adapter.py: cannot construct an import spec — file may be unreadable")
        return
    mod = importlib.util.module_from_spec(spec)
    sys.modules[module_name] = mod  # required so dataclass / pydantic refs resolve
    try:
        spec.loader.exec_module(mod)
    except Exception as e:
        err(
            f"adapter.py: failed to import — `{type(e).__name__}: {e}`. "
            f"This is the same failure mode that crashes workspace boot at "
            f"runtime; the cure is to fix the adapter, not skip this check. "
            f"If the import fails because a transitive dep isn't installed in "
            f"this CI env, add it to the template's requirements.txt — that's "
            f"what the workspace container does, and the validator job "
            f"installs requirements.txt before running this check."
        )
        sys.modules.pop(module_name, None)
        return
    adapter_classes = [
        obj
        for name, obj in vars(mod).items()
        if isinstance(obj, type)
        and obj is not BaseAdapter
        and issubclass(obj, BaseAdapter)
    ]
    sys.modules.pop(module_name, None)
    if not adapter_classes:
        err(
            "adapter.py: no class inheriting from "
            "`molecule_runtime.adapters.base.BaseAdapter` found. "
            "The runtime resolves the adapter via class discovery — "
            "without a BaseAdapter subclass, workspace boot falls "
            "through to the default langgraph executor and ignores "
            "this file silently. If that's intentional, delete adapter.py."
        )


def main() -> None:
    check_dockerfile()
    check_config_yaml()
    check_requirements()
    check_adapter()
    check_adapter_runtime_load()
    for w in WARNINGS:
        print(f"::warning::{w}")
    for e in ERRORS:
        print(f"::error::{e}")
    if ERRORS:
        sys.exit(1)
    print(f"✓ Template validation passed ({len(WARNINGS)} warning(s))")


if __name__ == "__main__":
    main()

Binary file not shown.

migrate-template.py (canonical copy)

@@ -32,11 +32,23 @@ Usage:
   # Dry-run: print the diff without writing
   python3 scripts/migrate-template.py --dry-run .
 
-The script preserves YAML round-trip fidelity for keys it doesn't
-touch (using ruamel.yaml when available; falling back to PyYAML's
-default representer otherwise). Migrations should ONLY mutate keys
-they're explicitly versioning — leave everything else alone so a
-consumer template's customizations survive.
+YAML round-trip caveats:
+  - PyYAML's safe_dump is used for output. Comments + anchor/alias
+    forms in the consumer's config.yaml are NOT preserved across
+    migrations — the migrated file is a clean re-emit. Templates
+    rarely have inline comments in config.yaml; on the rare occasion
+    they do, the maintainer needs to re-add them after migration.
+  - Keys are sorted alphabetically on output. This trades a one-time
+    re-ordering diff (reviewable) for stable diffs across future
+    migrations.
+  - Migrations should ONLY mutate keys they're explicitly
+    versioning — leave everything else alone so a consumer
+    template's customizations survive.
+
+A future enhancement could detect comments in the original file and
+opt into ruamel.yaml for round-trip-preserving emission. Not done
+today; flag in the migrator's stderr if comments are detected so the
+maintainer knows what they're losing.
 """
 from __future__ import annotations

@@ -375,7 +375,34 @@ def test_adapter_with_no_baseadapter_subclass_errors(validator, tmp_path, monkeypatch):
     monkeypatch.chdir(tmp_path)
     validator.check_adapter_runtime_load()
     assert any(
-        "no class inheriting from" in e and "BaseAdapter" in e
+        "no concrete class inheriting from" in e and "BaseAdapter" in e
         for e in validator.ERRORS
     ), validator.ERRORS
+
+
+@_skip_no_runtime
+def test_only_imported_baseadapter_subclass_does_not_count(validator, tmp_path, monkeypatch):
+    """Re-exported imports do not satisfy the contract. If the only
+    BaseAdapter subclass in adapter.py is something `from
+    molecule_runtime.adapters.base import BaseAdapter` re-exports (or
+    a future abstract intermediate), the runtime's class-discovery
+    would correctly skip it and the validator must too. Without
+    this check, an `__module__`-filter regression would mask the
+    'no concrete subclass' case the gate exists to catch.
+    """
+    adapter = (
+        # This file imports BaseAdapter but never SUBCLASSES it.
+        # `BaseAdapter` itself is in vars(mod) but it's already
+        # filtered by `obj is not BaseAdapter`. The new __module__
+        # filter ensures no third-party class slipping in via import
+        # is counted either.
+        "from molecule_runtime.adapters.base import BaseAdapter  # noqa: F401\n"
+    )
+    _materialise(tmp_path, adapter_py=adapter)
+    monkeypatch.chdir(tmp_path)
+    validator.check_adapter_runtime_load()
+    assert any(
+        "no concrete class inheriting from" in e
+        for e in validator.ERRORS
+    ), validator.ERRORS

validate-workspace-template.py (canonical copy)

@@ -129,6 +129,13 @@ KNOWN_RUNTIMES = {
 KNOWN_SCHEMA_VERSIONS: set[int] = {1}
 DEPRECATED_SCHEMA_VERSIONS: set[int] = set()
 
+# `template_schema_version` is part of the v1 contract and listed
+# here for documentation, but the top-level `check_config_yaml`
+# already verifies it's present and is an int before dispatching
+# here — `_check_schema_v1` does NOT re-check it (would be dead
+# defensive code). The key DOES need to appear in the union of
+# required + optional so it isn't flagged as unknown drift in the
+# `unknown top-level keys` warning at the end of `_check_schema_v1`.
 SCHEMA_V1_REQUIRED_KEYS = ["name", "runtime", "template_schema_version"]
 SCHEMA_V1_OPTIONAL_KEYS = [
     "description",
@@ -153,6 +160,10 @@ def _check_schema_v1(config: dict) -> None:
     Currently every production template runs this version. Do NOT edit
     in place; add v2 instead and migrate consumers (see header)."""
     for key in SCHEMA_V1_REQUIRED_KEYS:
+        if key == "template_schema_version":
+            # Already verified present + int by the dispatcher; skip
+            # to avoid emitting a duplicate or contradictory error.
+            continue
         if key not in config:
             err(f"config.yaml: missing required key `{key}`")
     runtime = config.get("runtime")
@@ -303,13 +314,18 @@ def check_adapter_runtime_load() -> None:
         )
         return
 
-    # Load adapter.py as a module under a unique name so it doesn't
-    # collide with any installed `adapter` package or with a previous
-    # invocation in the same Python process.
+    # Load adapter.py as a module under a per-call-unique name so it
+    # doesn't collide with any installed `adapter` package OR with a
+    # previous invocation in the same Python process. The hash of the
+    # cwd-anchored absolute path is sufficient — we just need
+    # different invocations to land on different sys.modules keys so
+    # one invocation's lingering references can't bleed into the
+    # next's adapter discovery.
     import importlib.util  # noqa: PLC0415
     import sys  # noqa: PLC0415
-    module_name = "_template_adapter_under_validation"
+    abs_path = os.path.abspath("adapter.py")
+    module_name = f"_template_adapter_under_validation_{abs(hash(abs_path)):x}"
     spec = importlib.util.spec_from_file_location(module_name, "adapter.py")
     if spec is None or spec.loader is None:
         err("adapter.py: cannot construct an import spec — file may be unreadable")
@@ -333,23 +349,34 @@ def check_adapter_runtime_load() -> None:
         sys.modules.pop(module_name, None)
         return
 
+    # Class discovery: only count classes DEFINED in adapter.py, not
+    # re-exported imports. Without the `__module__` filter, a template
+    # that does `from molecule_runtime.adapters.base import
+    # AbstractCLIAdapter` (or any future abstract intermediate the
+    # runtime exposes) would have that import counted as a "real"
+    # adapter — masking the genuine "no concrete subclass" case the
+    # whole check is meant to catch.
     adapter_classes = [
         obj
         for name, obj in vars(mod).items()
         if isinstance(obj, type)
         and obj is not BaseAdapter
         and issubclass(obj, BaseAdapter)
+        and getattr(obj, "__module__", None) == module_name
     ]
     sys.modules.pop(module_name, None)
    if not adapter_classes:
         err(
-            "adapter.py: no class inheriting from "
-            "`molecule_runtime.adapters.base.BaseAdapter` found. "
-            "The runtime resolves the adapter via class discovery — "
-            "without a BaseAdapter subclass, workspace boot falls "
-            "through to the default langgraph executor and ignores "
-            "this file silently. If that's intentional, delete adapter.py."
+            "adapter.py: no concrete class inheriting from "
+            "`molecule_runtime.adapters.base.BaseAdapter` defined "
+            "in this file. The runtime resolves the adapter via "
+            "class discovery on adapter.py's own definitions — "
+            "imports of base classes from molecule_runtime do not "
+            "count. Without a concrete subclass DEFINED here, "
+            "workspace boot falls through to the default langgraph "
+            "executor and ignores this file silently. If that's "
+            "intentional, delete adapter.py."
         )