forked from molecule-ai/molecule-core

Compare commits: feat/plugi... → main (208 commits; the compare table listed only abbreviated commit SHAs, fedfb49c0a through 501d07b0f2, with the author and date columns empty)
.gitea/scripts/audit-force-merge.sh (Executable file, 118 lines)
@@ -0,0 +1,118 @@
#!/usr/bin/env bash
# audit-force-merge — detect a §SOP-6 force-merge after PR close, emit
# `incident.force_merge` to stdout as structured JSON.
#
# Vector's docker_logs source picks up runner stdout; the JSON gets
# shipped to Loki on molecule-canonical-obs, indexable by event_type.
# Query example:
#
#   {host="operator"} |= "event_type" |= "incident.force_merge" | json
#
# A force-merge is detected when a PR closed-with-merged=true had at
# least one of the repo's required-status-check contexts in a state
# other than "success" at the merge commit's SHA. That's exactly what
# the Gitea force_merge:true API call lets through, so it's a faithful
# detector of the override path.
#
# Triggers on `pull_request_target: closed` (loaded from base branch
# per §SOP-6 security model). No-op when merged=false.
#
# Required env (set by the workflow):
#   GITEA_TOKEN, GITEA_HOST, REPO, PR_NUMBER, REQUIRED_CHECKS
#
# REQUIRED_CHECKS is a newline-separated list of status-check context
# names that branch protection requires. Declared in the workflow YAML
# rather than fetched from /branch_protections (which needs admin
# scope — sop-tier-bot has read-only). Trade dynamism for simplicity:
# when the required-check set changes, update both branch protection
# AND this env. Keeping them in sync is less complexity than granting
# the audit bot admin perms on every repo.

set -euo pipefail

: "${GITEA_TOKEN:?required}"
: "${GITEA_HOST:?required}"
: "${REPO:?required}"
: "${PR_NUMBER:?required}"
: "${REQUIRED_CHECKS:?required (newline-separated context names)}"

OWNER="${REPO%%/*}"
NAME="${REPO##*/}"
API="https://${GITEA_HOST}/api/v1"
AUTH="Authorization: token ${GITEA_TOKEN}"

# 1. Fetch the PR. If not merged, no-op.
PR=$(curl -sS -H "$AUTH" "${API}/repos/${OWNER}/${NAME}/pulls/${PR_NUMBER}")
MERGED=$(echo "$PR" | jq -r '.merged // false')
if [ "$MERGED" != "true" ]; then
  echo "::notice::PR #${PR_NUMBER} closed without merge — no audit emission."
  exit 0
fi

MERGE_SHA=$(echo "$PR" | jq -r '.merge_commit_sha // empty')
MERGED_BY=$(echo "$PR" | jq -r '.merged_by.login // "unknown"')
TITLE=$(echo "$PR" | jq -r '.title // ""')
BASE_BRANCH=$(echo "$PR" | jq -r '.base.ref // "main"')
HEAD_SHA=$(echo "$PR" | jq -r '.head.sha // empty')

if [ -z "$MERGE_SHA" ]; then
  echo "::warning::PR #${PR_NUMBER} merged=true but no merge_commit_sha — cannot evaluate force-merge."
  exit 0
fi

# 2. Required status checks declared in the workflow env.
REQUIRED="$REQUIRED_CHECKS"
if [ -z "${REQUIRED//[[:space:]]/}" ]; then
  echo "::notice::REQUIRED_CHECKS empty — force-merge not applicable."
  exit 0
fi

# 3. Status-check state at the PR HEAD (where checks ran). The merge
#    commit doesn't get its own checks; we evaluate the PR's last
#    commit, which is what branch protection compared against.
STATUS=$(curl -sS -H "$AUTH" \
  "${API}/repos/${OWNER}/${NAME}/commits/${HEAD_SHA}/status")
declare -A CHECK_STATE
while IFS=$'\t' read -r ctx state; do
  [ -n "$ctx" ] && CHECK_STATE[$ctx]="$state"
done < <(echo "$STATUS" | jq -r '.statuses // [] | .[] | "\(.context)\t\(.status)"')

# 4. For each required check, was it green at merge? YAML block scalars
#    (`|`) leave a trailing newline; skip blank/whitespace-only lines.
FAILED_CHECKS=()
while IFS= read -r req; do
  trimmed="${req#"${req%%[![:space:]]*}"}"          # ltrim
  trimmed="${trimmed%"${trimmed##*[![:space:]]}"}"  # rtrim
  [ -z "$trimmed" ] && continue
  state="${CHECK_STATE[$trimmed]:-missing}"
  if [ "$state" != "success" ]; then
    FAILED_CHECKS+=("${trimmed}=${state}")
  fi
done <<< "$REQUIRED"

if [ "${#FAILED_CHECKS[@]}" -eq 0 ]; then
  echo "::notice::PR #${PR_NUMBER} merged with all required checks green — not a force-merge."
  exit 0
fi

# 5. Emit structured audit event.
NOW=$(date -u +%Y-%m-%dT%H:%M:%SZ)
FAILED_JSON=$(printf '%s\n' "${FAILED_CHECKS[@]}" | jq -R . | jq -s .)

# Print as a single-line JSON so Vector's parse_json transform can pick
# it up cleanly from docker_logs.
jq -nc \
  --arg event_type "incident.force_merge" \
  --arg ts "$NOW" \
  --arg repo "$REPO" \
  --argjson pr "$PR_NUMBER" \
  --arg title "$TITLE" \
  --arg base "$BASE_BRANCH" \
  --arg merged_by "$MERGED_BY" \
  --arg merge_sha "$MERGE_SHA" \
  --argjson failed_checks "$FAILED_JSON" \
  '{event_type: $event_type, ts: $ts, repo: $repo, pr: $pr, title: $title,
    base_branch: $base, merged_by: $merged_by, merge_sha: $merge_sha,
    failed_checks: $failed_checks}'

echo "::warning::FORCE-MERGE detected on PR #${PR_NUMBER} by ${MERGED_BY}: ${#FAILED_CHECKS[@]} required check(s) not green at merge time."
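Step 4's whitespace trim is pure parameter expansion, with no `sed` or `awk` fork per line. A minimal standalone sketch (the `trim` function name is ours, not the script's):

```shell
#!/usr/bin/env bash
# Nested parameter expansion: the inner ${s%%[![:space:]]*} captures the
# run of leading whitespace, and the outer ${s#...} strips exactly that
# run; the rtrim line mirrors the logic at the other end.
trim() {
  local s="$1"
  s="${s#"${s%%[![:space:]]*}"}"   # ltrim
  s="${s%"${s##*[![:space:]]}"}"   # rtrim
  printf '%s' "$s"
}

trim "  sop-tier-check / tier-check (pull_request)  "
# → sop-tier-check / tier-check (pull_request)
```

An all-whitespace line trims to the empty string, which is what lets the `[ -z "$trimmed" ] && continue` guard skip the trailing newline a YAML block scalar leaves behind.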
.gitea/scripts/sop-tier-check.sh (Executable file, 149 lines)
@@ -0,0 +1,149 @@
#!/usr/bin/env bash
# sop-tier-check — verify a Gitea PR satisfies the §SOP-6 approval gate.
#
# Reads the PR's tier label, walks approving reviewers, and checks each
# approver's Gitea team membership against the tier's eligible-team set.
# Marks pass only when at least one non-author approver is in an eligible
# team.
#
# Invoked from `.gitea/workflows/sop-tier-check.yml`. The workflow sets
# the env vars below; this script does no IO outside of stdout/stderr +
# the Gitea API.
#
# Required env:
#   GITEA_TOKEN — bot PAT with read:organization,read:user,
#                 read:issue,read:repository scopes
#   GITEA_HOST  — e.g. git.moleculesai.app
#   REPO        — owner/name (from github.repository)
#   PR_NUMBER   — int (from github.event.pull_request.number)
#   PR_AUTHOR   — login (from github.event.pull_request.user.login)
#
# Optional:
#   SOP_DEBUG=1 — print per-API-call diagnostic lines (HTTP codes,
#                 raw response bodies). Default: off.
#
# Stale-status caveat: Gitea Actions does not always re-fire workflows
# on `labeled` / `pull_request_review:submitted` events. If the
# sop-tier-check status is stale (e.g. red after labels/approvals were
# added), push an empty commit to the PR branch to force a synchronize
# event, OR re-request reviews. Tracked: internal#46.

set -euo pipefail

debug() {
  if [ "${SOP_DEBUG:-}" = "1" ]; then
    echo " [debug] $*" >&2
  fi
}

# Validate env
: "${GITEA_TOKEN:?GITEA_TOKEN required}"
: "${GITEA_HOST:?GITEA_HOST required}"
: "${REPO:?REPO required (owner/name)}"
: "${PR_NUMBER:?PR_NUMBER required}"
: "${PR_AUTHOR:?PR_AUTHOR required}"

OWNER="${REPO%%/*}"
NAME="${REPO##*/}"
API="https://${GITEA_HOST}/api/v1"
AUTH="Authorization: token ${GITEA_TOKEN}"
echo "::notice::tier-check start: repo=$OWNER/$NAME pr=$PR_NUMBER author=$PR_AUTHOR"

# Sanity: token resolves to a user
WHOAMI=$(curl -sS -H "$AUTH" "${API}/user" | jq -r '.login // ""')
if [ -z "$WHOAMI" ]; then
  echo "::error::GITEA_TOKEN cannot resolve a user via /api/v1/user — check the token scope and that the secret is wired correctly."
  exit 1
fi
echo "::notice::token resolves to user: $WHOAMI"

# 1. Read tier label
LABELS=$(curl -sS -H "$AUTH" "${API}/repos/${OWNER}/${NAME}/issues/${PR_NUMBER}/labels" | jq -r '.[].name')
TIER=""
for L in $LABELS; do
  case "$L" in
    tier:low|tier:medium|tier:high)
      if [ -n "$TIER" ]; then
        echo "::error::Multiple tier labels: $TIER + $L. Apply exactly one."
        exit 1
      fi
      TIER="$L"
      ;;
  esac
done
if [ -z "$TIER" ]; then
  echo "::error::PR has no tier:low|tier:medium|tier:high label. Apply one before merge."
  exit 1
fi
debug "tier=$TIER"

# 2. Tier → eligible teams
case "$TIER" in
  tier:low)    ELIGIBLE="engineers managers ceo" ;;
  tier:medium) ELIGIBLE="managers ceo" ;;
  tier:high)   ELIGIBLE="ceo" ;;
esac
debug "eligible_teams=$ELIGIBLE"

# Resolve team-name → team-id once. /orgs/{org}/teams/{slug}/... endpoints
# don't exist on Gitea 1.22; we have to use /teams/{id}.
ORG_TEAMS_FILE=$(mktemp)
trap 'rm -f "$ORG_TEAMS_FILE"' EXIT
HTTP_CODE=$(curl -sS -o "$ORG_TEAMS_FILE" -w '%{http_code}' -H "$AUTH" \
  "${API}/orgs/${OWNER}/teams")
debug "teams-list HTTP=$HTTP_CODE size=$(wc -c <"$ORG_TEAMS_FILE")"
if [ "${SOP_DEBUG:-}" = "1" ]; then
  echo " [debug] teams-list body (first 300 chars):" >&2
  head -c 300 "$ORG_TEAMS_FILE" >&2; echo >&2
fi
if [ "$HTTP_CODE" != "200" ]; then
  echo "::error::GET /orgs/${OWNER}/teams returned HTTP $HTTP_CODE — token likely lacks read:org scope. Add a SOP_TIER_CHECK_TOKEN secret with read:organization scope at the org level."
  exit 1
fi
declare -A TEAM_ID
for T in $ELIGIBLE; do
  ID=$(jq -r --arg t "$T" '.[] | select(.name==$t) | .id' <"$ORG_TEAMS_FILE" | head -1)
  if [ -z "$ID" ] || [ "$ID" = "null" ]; then
    VISIBLE=$(jq -r '.[]?.name? // empty' <"$ORG_TEAMS_FILE" 2>/dev/null | tr '\n' ' ')
    echo "::error::Team \"$T\" not found in org $OWNER. Teams visible: $VISIBLE"
    exit 1
  fi
  TEAM_ID[$T]="$ID"
  debug "team-id: $T → $ID"
done

# 3. Read approving reviewers
REVIEWS=$(curl -sS -H "$AUTH" "${API}/repos/${OWNER}/${NAME}/pulls/${PR_NUMBER}/reviews")
APPROVERS=$(echo "$REVIEWS" | jq -r '[.[] | select(.state=="APPROVED") | .user.login] | unique | .[]')
if [ -z "$APPROVERS" ]; then
  echo "::error::No approving reviews. Tier $TIER requires approval from {$ELIGIBLE} (non-author)."
  exit 1
fi
debug "approvers: $(echo "$APPROVERS" | tr '\n' ' ')"

# 4. For each approver: check non-author + team membership (by id)
OK=""
for U in $APPROVERS; do
  if [ "$U" = "$PR_AUTHOR" ]; then
    debug "skip self-review by $U"
    continue
  fi
  for T in $ELIGIBLE; do
    ID="${TEAM_ID[$T]}"
    CODE=$(curl -sS -o /dev/null -w '%{http_code}' -H "$AUTH" \
      "${API}/teams/${ID}/members/${U}")
    debug "probe: $U in team $T (id=$ID) → HTTP $CODE"
    if [ "$CODE" = "200" ] || [ "$CODE" = "204" ]; then
      echo "::notice::approver $U is in team $T (eligible for $TIER)"
      OK="yes"
      break
    fi
  done
  [ -n "$OK" ] && break
done

if [ -z "$OK" ]; then
  echo "::error::Tier $TIER requires approval from a non-author member of {$ELIGIBLE}. Got approvers: $APPROVERS — none of them satisfied team membership. Set SOP_DEBUG=1 to see per-probe HTTP codes."
  exit 1
fi
echo "::notice::sop-tier-check passed: $TIER, approver in {$ELIGIBLE}"
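Both scripts derive OWNER and NAME from REPO with greedy prefix/suffix removal; a minimal sketch (the REPO value here is illustrative):

```shell
#!/usr/bin/env bash
# Split "owner/name" with parameter expansion, no cut/awk needed.
REPO="molecule-ai/molecule-core"
OWNER="${REPO%%/*}"  # remove the longest '/*' suffix: everything before the first '/'
NAME="${REPO##*/}"   # remove the longest '*/' prefix: everything after the last '/'
echo "$OWNER $NAME"
# → molecule-ai molecule-core
```

Note the asymmetry: `%%/*` keeps the text before the *first* slash and `##*/` keeps the text after the *last* slash. For a two-segment `owner/name` value they agree; a nested path such as `a/b/c` would split into `a` and `c`.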
.gitea/workflows/audit-force-merge.yml (Normal file, 58 lines)
@@ -0,0 +1,58 @@
# audit-force-merge — emit `incident.force_merge` to runner stdout when
# a PR is merged with required-status-checks not green. Vector picks
# the JSON line off docker_logs and ships to Loki on
# molecule-canonical-obs (per `reference_obs_stack_phase1`); query as:
#
#   {host="operator"} |= "event_type" |= "incident.force_merge" | json
#
# Closes the §SOP-6 audit gap (the doc says force-merges write to
# `structure_events`, but that table lives in the platform DB, not
# Gitea-side; Loki is the practical equivalent for Gitea Actions
# events). When the credential / observability stack converges later,
# this can sync into structure_events from Loki via a backfill job —
# the structured JSON shape is forward-compatible.
#
# Logic in `.gitea/scripts/audit-force-merge.sh` per the same script-
# extract pattern as sop-tier-check.

name: audit-force-merge

# pull_request_target loads from the base branch — same security model
# as sop-tier-check. Without this, an attacker could rewrite the
# workflow on a PR and skip the audit emission for their own
# force-merge. See `.gitea/workflows/sop-tier-check.yml` for the full
# rationale.
on:
  pull_request_target:
    types: [closed]

jobs:
  audit:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: read
    # Skip when PR is closed without merge — saves a runner.
    if: github.event.pull_request.merged == true
    steps:
      - name: Check out base branch (for the script)
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          ref: ${{ github.event.pull_request.base.sha }}
      - name: Detect force-merge + emit audit event
        env:
          # Same org-level secret the sop-tier-check workflow uses.
          GITEA_TOKEN: ${{ secrets.SOP_TIER_CHECK_TOKEN || secrets.GITHUB_TOKEN }}
          GITEA_HOST: git.moleculesai.app
          REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
          # Required-status-check contexts to evaluate at merge time.
          # Newline-separated. Mirror this against branch protection
          # (settings → branches → protected branch → required checks).
          # Declared here rather than fetched from /branch_protections
          # because that endpoint requires admin write — sop-tier-bot is
          # read-only by design (least-privilege).
          REQUIRED_CHECKS: |
            sop-tier-check / tier-check (pull_request)
            Secret scan / Scan diff for credential-shaped strings (pull_request)
        run: bash .gitea/scripts/audit-force-merge.sh
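Because `REQUIRED_CHECKS` is a YAML block scalar (`|`), the value the script receives ends in a newline, so the consuming loop must tolerate a blank final line. A minimal sketch of that iteration, using the two contexts above:

```shell
#!/usr/bin/env bash
# Iterate a newline-separated check list, skipping whitespace-only
# entries (the trailing newline a YAML block scalar leaves behind).
REQUIRED_CHECKS='sop-tier-check / tier-check (pull_request)
Secret scan / Scan diff for credential-shaped strings (pull_request)
'
COUNT=0
while IFS= read -r ctx; do
  [ -z "${ctx//[[:space:]]/}" ] && continue  # skip blank lines
  COUNT=$((COUNT + 1))
done <<< "$REQUIRED_CHECKS"
echo "$COUNT"
# → 2
```

Without the blank-line guard, the trailing newline would register as a phantom required check in the `missing` state and flag every merge as a force-merge.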
.gitea/workflows/secret-scan.yml (Normal file, 191 lines)
@@ -0,0 +1,191 @@
name: Secret scan

# Hard CI gate. Refuses any PR / push whose diff additions contain a
# recognisable credential. Defense-in-depth for the #2090-class incident
# (2026-04-24): GitHub's hosted Copilot Coding Agent leaked a ghs_*
# installation token into tenant-proxy/package.json via `npm init`
# slurping the URL from a token-embedded origin remote. We can't fix
# upstream's clone hygiene, so we gate here.
#
# Same regex set as the runtime's bundled pre-commit hook
# (molecule-ai-workspace-runtime: molecule_runtime/scripts/pre-commit-checks.sh).
# Keep the two sides aligned when adding patterns.
#
# Ported from .github/workflows/secret-scan.yml so the gate actually
# fires on Gitea Actions. Differences from the GitHub version:
#   - drops `merge_group` event (Gitea has no merge queue)
#   - drops `workflow_call` (no cross-repo reusable invocation on Gitea)
#   - SELF path updated to .gitea/workflows/secret-scan.yml
# The job name + step name are identical to the GitHub workflow so the
# status-check context (`Secret scan / Scan diff for credential-shaped
# strings (pull_request)`) matches branch protection on molecule-core/main.

on:
  pull_request:
    types: [opened, synchronize, reopened]
  push:
    branches: [main, staging]

jobs:
  scan:
    name: Scan diff for credential-shaped strings
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          fetch-depth: 2 # need previous commit to diff against on push events

      # For pull_request events the diff base may be many commits behind
      # HEAD and absent from the shallow clone. Fetch it explicitly.
      - name: Fetch PR base SHA (pull_request events only)
        if: github.event_name == 'pull_request'
        run: git fetch --depth=1 origin ${{ github.event.pull_request.base.sha }}

      - name: Refuse if credential-shaped strings appear in diff additions
        env:
          # Plumb event-specific SHAs through env so the script doesn't
          # need conditional `${{ ... }}` interpolation per event type.
          # github.event.before/after only exist on push events;
          # pull_request has pull_request.base.sha / pull_request.head.sha.
          PR_BASE_SHA: ${{ github.event.pull_request.base.sha }}
          PR_HEAD_SHA: ${{ github.event.pull_request.head.sha }}
          PUSH_BEFORE: ${{ github.event.before }}
          PUSH_AFTER: ${{ github.event.after }}
        run: |
          # Pattern set covers GitHub family (the actual #2090 vector),
          # Anthropic / OpenAI / Slack / AWS. Anchored on prefixes with low
          # false-positive rates against agent-generated content. Mirror of
          # molecule-ai-workspace-runtime/molecule_runtime/scripts/pre-commit-checks.sh
          # — keep aligned.
          SECRET_PATTERNS=(
            'ghp_[A-Za-z0-9]{36,}'           # GitHub PAT (classic)
            'ghs_[A-Za-z0-9]{36,}'           # GitHub App installation token
            'gho_[A-Za-z0-9]{36,}'           # GitHub OAuth user-to-server
            'ghu_[A-Za-z0-9]{36,}'           # GitHub OAuth user
            'ghr_[A-Za-z0-9]{36,}'           # GitHub OAuth refresh
            'github_pat_[A-Za-z0-9_]{82,}'   # GitHub fine-grained PAT
            'sk-ant-[A-Za-z0-9_-]{40,}'      # Anthropic API key
            'sk-proj-[A-Za-z0-9_-]{40,}'     # OpenAI project key
            'sk-svcacct-[A-Za-z0-9_-]{40,}'  # OpenAI service-account key
            'sk-cp-[A-Za-z0-9_-]{60,}'       # MiniMax API key (F1088 vector — caught only after the fact)
            'xox[baprs]-[A-Za-z0-9-]{20,}'   # Slack tokens
            'AKIA[0-9A-Z]{16}'               # AWS access key ID
            'ASIA[0-9A-Z]{16}'               # AWS STS temp access key ID
          )

          # Determine the diff base. Each event type stores its SHAs in
          # a different place — see the env block above.
          case "${{ github.event_name }}" in
            pull_request)
              BASE="$PR_BASE_SHA"
              HEAD="$PR_HEAD_SHA"
              ;;
            *)
              BASE="$PUSH_BEFORE"
              HEAD="$PUSH_AFTER"
              ;;
          esac

          # On push events with shallow clones, BASE may be present in
          # the event payload but absent from the local object DB
          # (fetch-depth=2 doesn't always reach the previous commit
          # across true merges). Try fetching it on demand. If the
          # fetch fails — e.g. the SHA was force-overwritten — we fall
          # through to the empty-BASE branch below, which scans the
          # entire tree as if every file were new. Correct, just slow.
          if [ -n "$BASE" ] && ! echo "$BASE" | grep -qE '^0+$'; then
            if ! git cat-file -e "$BASE" 2>/dev/null; then
              git fetch --depth=1 origin "$BASE" 2>/dev/null || true
            fi
          fi

          # Files added or modified in this change.
          if [ -z "$BASE" ] || echo "$BASE" | grep -qE '^0+$' || ! git cat-file -e "$BASE" 2>/dev/null; then
            # New branch / no previous SHA / BASE unreachable — check the
            # entire tree as added content. Slower, but correct on first
            # push.
            CHANGED=$(git ls-tree -r --name-only HEAD)
            DIFF_RANGE=""
          else
            CHANGED=$(git diff --name-only --diff-filter=AM "$BASE" "$HEAD")
            DIFF_RANGE="$BASE $HEAD"
          fi

          if [ -z "$CHANGED" ]; then
            echo "No changed files to inspect."
            exit 0
          fi

          # Self-exclude: this workflow file legitimately contains the
          # pattern strings as regex literals. Without an exclude it would
          # block its own merge. Both the .github/ original and this
          # .gitea/ port are excluded so a sync between them stays clean.
          SELF_GITHUB=".github/workflows/secret-scan.yml"
          SELF_GITEA=".gitea/workflows/secret-scan.yml"

          OFFENDING=""
          # `while IFS= read -r` (not `for f in $CHANGED`) so filenames
          # containing whitespace don't word-split silently — a path
          # with a space would otherwise produce two iterations on
          # tokens that aren't real filenames, breaking the
          # self-exclude + diff lookup.
          while IFS= read -r f; do
            [ -z "$f" ] && continue
            [ "$f" = "$SELF_GITHUB" ] && continue
            [ "$f" = "$SELF_GITEA" ] && continue
            if [ -n "$DIFF_RANGE" ]; then
              ADDED=$(git diff --no-color --unified=0 "$BASE" "$HEAD" -- "$f" 2>/dev/null | grep -E '^\+[^+]' || true)
            else
              # No diff range (new branch first push) — scan the full file
              # contents as if every line were new.
              ADDED=$(cat "$f" 2>/dev/null || true)
            fi
            [ -z "$ADDED" ] && continue
            for pattern in "${SECRET_PATTERNS[@]}"; do
              if echo "$ADDED" | grep -qE "$pattern"; then
                OFFENDING="${OFFENDING}${f} (matched: ${pattern})\n"
                break
              fi
            done
          done <<< "$CHANGED"

          if [ -n "$OFFENDING" ]; then
            echo "::error::Credential-shaped strings detected in diff additions:"
            # `printf '%b' "$OFFENDING"` interprets backslash escapes
            # (the literal `\n` we appended above becomes a newline)
            # WITHOUT treating OFFENDING as a format string. Plain
            # `printf "$OFFENDING"` is a format-string sink: a filename
            # containing `%` would be interpreted as a conversion
            # specifier, corrupting the error message (or printing
            # `%(missing)` artifacts).
            printf '%b' "$OFFENDING"
            echo ""
            echo "The actual matched values are NOT echoed here, deliberately —"
            echo "round-tripping a leaked credential into CI logs widens the blast"
            echo "radius (logs are searchable + retained)."
            echo ""
            echo "Recovery:"
            echo "  1. Remove the secret from the file. Replace with an env var"
            echo "     reference (e.g. \${{ secrets.GITHUB_TOKEN }} in workflows,"
            echo "     process.env.X in code)."
            echo "  2. If the credential was already pushed (this PR's commit"
            echo "     history reaches a public ref), treat it as compromised —"
            echo "     ROTATE it immediately, do not just remove it. The token"
            echo "     remains valid in git history forever and may be in any"
            echo "     log/cache that consumed this branch."
            echo "  3. Force-push the cleaned commit (or stack a revert) and"
            echo "     re-run CI."
            echo ""
            echo "If the match is a false positive (test fixture, docs example,"
            echo "or this workflow's own regex literals): use a clearly-fake"
            echo "placeholder like ghs_EXAMPLE_DO_NOT_USE that doesn't satisfy"
            echo "the length suffix, OR add the file path to the SELF exclude"
            echo "list in this workflow with a short reason."
            echo ""
            echo "Mirror of the regex set lives in the runtime's bundled"
            echo "pre-commit hook (molecule-ai-workspace-runtime:"
            echo "molecule_runtime/scripts/pre-commit-checks.sh) — keep aligned."
            exit 1
          fi

          echo "✓ No credential-shaped strings in this change."
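Any single pattern from the set can be probed with `grep -E` directly. The length bound (`{36,}`) is what lets short, clearly-fake placeholders through; the sketch below uses a shaped-but-fake token of 36 `x` characters, which is obviously not a real credential:

```shell
#!/usr/bin/env bash
# Probe the classic-PAT pattern against a token-shaped fake string and
# against a short placeholder that fails the {36,} length requirement.
pattern='ghp_[A-Za-z0-9]{36,}'
fake="ghp_$(printf 'x%.0s' {1..36})"        # ghp_ + 36 x's: matches the shape
echo "$fake" | grep -qE "$pattern" && echo "match"
echo "ghp_EXAMPLE" | grep -qE "$pattern" || echo "no match"
# → match
# → no match
```

This is the same reason the recovery text suggests placeholders like `ghs_EXAMPLE_DO_NOT_USE`: they keep the recognisable prefix for documentation value while staying under the length suffix.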
.gitea/workflows/sop-tier-check.yml (Normal file, 81 lines)
@@ -0,0 +1,81 @@
|
||||
# sop-tier-check — canonical Gitea Actions workflow for §SOP-6 enforcement.
|
||||
#
|
||||
# Logic lives in `.gitea/scripts/sop-tier-check.sh` (extracted 2026-05-09
|
||||
# from the previous inline-bash version). The script is the single source
|
||||
# of truth; this workflow file just sets env + invokes it.
|
||||
#
|
||||
# Copy BOTH files (`.gitea/workflows/sop-tier-check.yml` +
|
||||
# `.gitea/scripts/sop-tier-check.sh`) into any repo that wants the
|
||||
# §SOP-6 PR gate enforced. Pair with branch protection on the protected
|
||||
# branch:
|
||||
# required_status_checks: ["sop-tier-check / tier-check (pull_request)"]
|
||||
# required_approving_reviews: 1
|
||||
# approving_review_teams: ["ceo", "managers", "engineers"]
|
||||
#
|
||||
# Tier → eligible-team mapping (mirror of dev-sop §SOP-6):
|
||||
# tier:low → engineers, managers, ceo
|
||||
# tier:medium → managers, ceo
|
||||
# tier:high → ceo
|
||||
#
|
||||
# Force-merge: Owners-team override remains available out-of-band via
|
||||
# the Gitea merge API; force-merge writes `incident.force_merge` to
|
||||
# `structure_events` per §Persistent structured logging gate (Phase 3).
|
||||
#
|
||||
# Set `SOP_DEBUG: '1'` in the env block to enable per-API-call diagnostic
|
||||
# lines — useful when diagnosing token-scope or team-id-resolution
|
||||
# issues. Default off.
|
||||
|
||||
name: sop-tier-check
|
||||
|
||||
# SECURITY: triggers MUST use `pull_request_target`, not `pull_request`.
|
||||
# `pull_request_target` loads the workflow definition from the BASE
|
||||
# branch (i.e. `main`), not the PR's HEAD. With `pull_request`, anyone
|
||||
# with write access to a feature branch could rewrite this file in
|
||||
# their PR to dump SOP_TIER_CHECK_TOKEN (org-read scope) to logs and
|
||||
# exfiltrate it. Verified 2026-05-09 against Gitea 1.22.6 —
|
||||
# `pull_request_target` (added in Gitea 1.21 via go-gitea/gitea#25229)
# is the documented mitigation.
#
# This workflow does NOT call `actions/checkout` of PR HEAD code, so no
# untrusted code is ever executed in the runner — we only HTTP-call the
# Gitea API. If a future change adds a checkout step, it MUST pin to
# `${{ github.event.pull_request.base.sha }}` (NOT `head.sha`) to keep
# the trust boundary.
on:
  pull_request_target:
    types: [opened, edited, synchronize, reopened, labeled, unlabeled]
  pull_request_review:
    types: [submitted, dismissed, edited]

jobs:
  tier-check:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: read
    steps:
      - name: Check out base branch (for the script)
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          # Pin to base.sha — pull_request_target's protection only
          # works if we never check out PR HEAD. Same SHA the workflow
          # itself was loaded from.
          ref: ${{ github.event.pull_request.base.sha }}

      - name: Verify tier label + reviewer team membership
        env:
          # SOP_TIER_CHECK_TOKEN is the org-level secret for the
          # sop-tier-bot PAT (read:organization,read:user,read:issue,
          # read:repository). Stored at the org level
          # (/api/v1/orgs/molecule-ai/actions/secrets) so per-repo
          # configuration is unnecessary — every repo in the org
          # picks it up automatically.
          # Falls back to GITHUB_TOKEN with a clear error if missing.
          GITEA_TOKEN: ${{ secrets.SOP_TIER_CHECK_TOKEN || secrets.GITHUB_TOKEN }}
          GITEA_HOST: git.moleculesai.app
          REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
          PR_AUTHOR: ${{ github.event.pull_request.user.login }}
          # Set to '1' for diagnostic per-API-call output. Off by default
          # so production logs aren't noisy.
          SOP_DEBUG: '0'
        run: bash .gitea/scripts/sop-tier-check.sh
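`sop-tier-check.sh` itself is not part of this diff, so the following is a hypothetical sketch of just its label-parsing half, under the assumption that the tier is encoded as a single `sop-tier-<n>` label on the PR (the label naming scheme is our assumption, not confirmed by this repo):

```shell
# Hypothetical sketch only — sop-tier-check.sh is not shown in this diff.
# Assume the Gitea issue API returned these label names for the PR:
labels='review/needed
sop-tier-2
area/ci'

# Extract the numeric tier from the single sop-tier-<n> label, failing
# loudly when zero or multiple tier labels are present.
tiers=$(printf '%s\n' "$labels" | sed -n 's/^sop-tier-\([0-9][0-9]*\)$/\1/p')
count=$(printf '%s\n' "$tiers" | grep -c . || true)
if [ "$count" -ne 1 ]; then
  echo "::error::expected exactly one sop-tier-<n> label, found $count" >&2
  exit 1
fi
echo "tier=$tiers"
```

Enforcing "exactly one tier label" rather than "at least one" matches the fail-loud posture of the rest of this pipeline: an ambiguous tier should block, not silently pick a bucket.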
467  .github/workflows/auto-promote-on-e2e.yml  (vendored)
@@ -1,467 +0,0 @@
name: Auto-promote :latest after main image build

# Retags `ghcr.io/molecule-ai/{platform,platform-tenant}:staging-<sha>`
# → `:latest` after either the image build or E2E completes on a `main`
# push, gated on E2E Staging SaaS not being red for that SHA.
#
# Why two triggers:
#
# `publish-workspace-server-image` and `e2e-staging-saas` are both
# paths-filtered, but with DIFFERENT path sets:
#
#   publish-workspace-server-image:
#     workspace-server/**, canvas/**, manifest.json
#
#   e2e-staging-saas (full lifecycle):
#     workspace-server/internal/handlers/{registry,workspace_provision,
#     a2a_proxy}.go, workspace-server/internal/middleware/**,
#     workspace-server/internal/provisioner/**, tests/e2e/test_staging_full_saas.sh
#
# The E2E set is a strict SUBSET of the publish set. So:
#   - canvas/** changes → publish fires, E2E does not
#   - workspace-server/cmd/** changes → publish fires, E2E does not
#   - workspace-server/internal/sweep/** → publish fires, E2E does not
#
# The previous version triggered ONLY on E2E completion, which meant
# non-E2E-path changes (canvas, cmd, sweep, etc.) rebuilt the image
# but never advanced `:latest`. Result: as of 2026-04-28 this workflow
# had run zero times since merge despite eight main pushes — `:latest`
# was ~7 hours / 9 PRs behind main with no human realising. See
# `molecule-core` Slack discussion 2026-04-28.
#
# Adding `publish-workspace-server-image` as a second trigger closes
# the gap: any eligible image rebuild on main advances `:latest`.
#
# Why E2E remains a kill-switch (not the trigger):
#
# When E2E DID run for this SHA and ended red, we abort — `:latest`
# stays on the prior known-good digest. When E2E didn't run (paths
# filtered out), we proceed: pre-merge gates already validated this
# SHA on staging via auto-promote-staging requiring CI + E2E Canvas +
# E2E API + CodeQL all green. Image content for non-E2E paths
# (canvas, cmd, sweep) is exercised by those staging gates.
#
# Why `main` only:
#
# `:latest` is what prod tenants pull. We only want SHAs that have
# reached main (via auto-promote-staging) to advance `:latest`.
# Triggering on staging would let a staging-only revert advance
# `:latest` to a SHA that never reaches main, breaking the "production
# runs what's on main" invariant.
#
# Idempotency:
#
# When a SHA touches paths that match BOTH publish and E2E, both
# workflows fire and complete. Both trigger this workflow on
# completion → two runs race. Both retag `:staging-<sha>` →
# `:latest`. `crane tag` is idempotent (re-tagging the same digest is
# a no-op), so the second run is harmless. The concurrency group
# serializes them anyway.
on:
  workflow_run:
    workflows:
      - 'E2E Staging SaaS (full lifecycle)'
      - 'publish-workspace-server-image'
    types: [completed]
    branches: [main]
  workflow_dispatch:
    inputs:
      sha:
        description: 'Short sha to promote (override; defaults to upstream workflow_run head_sha)'
        required: false
        type: string

permissions:
  contents: read
  packages: write

concurrency:
  # Serialize promotes per-SHA so the publish+E2E both-fired race lands
  # cleanly. Different SHAs can promote in parallel.
  group: auto-promote-latest-${{ github.event.workflow_run.head_sha || github.event.inputs.sha || github.sha }}
  cancel-in-progress: false

env:
  IMAGE_NAME: ghcr.io/molecule-ai/platform
  TENANT_IMAGE_NAME: ghcr.io/molecule-ai/platform-tenant

jobs:
  promote:
    # Proceed if upstream succeeded OR manual dispatch. Upstream-failure
    # paths are filtered here; the E2E-was-red kill-switch lives in the
    # gate-check step below (covers the case where upstream is publish
    # success but E2E for the same SHA failed).
    if: |
      github.event_name == 'workflow_dispatch' ||
      (github.event_name == 'workflow_run' && github.event.workflow_run.conclusion == 'success')
    runs-on: ubuntu-latest
    steps:
      - name: Compute short sha
        id: sha
        run: |
          set -euo pipefail
          if [ -n "${{ github.event.inputs.sha }}" ]; then
            FULL="${{ github.event.inputs.sha }}"
          else
            FULL="${{ github.event.workflow_run.head_sha }}"
          fi
          echo "short=${FULL:0:7}" >> "$GITHUB_OUTPUT"
          echo "full=${FULL}" >> "$GITHUB_OUTPUT"
      - name: Gate — E2E Staging SaaS state for this SHA
        # When upstream IS E2E success, we know it's green (filtered by
        # the job-level `if` already). When upstream is publish, look up
        # E2E state for the same SHA. Four buckets:
        #
        # - completed/success: E2E confirmed safe → proceed
        # - completed/failure|cancelled|timed_out: E2E found a
        #   regression → ABORT (exit 1), `:latest` stays put
        # - in_progress|queued|requested: E2E is RACING with publish
        #   for a runtime-touching SHA. publish typically completes
        #   ~5-10 min before E2E (~10-15 min). If we promote on the
        #   publish signal here, a later E2E failure can't roll back
        #   `:latest` — it'd already be wrongly advanced. So we DEFER:
        #   skip subsequent steps (proceed=false) and let E2E's own
        #   completion event re-fire this workflow, which then takes
        #   the upstream-is-E2E path. exit 0 so the run shows as
        #   success rather than a noisy fake-failure.
        # - none/none: E2E was paths-filtered out for this SHA (the
        #   change touched canvas/cmd/sweep/etc. — paths covered by
        #   publish but not by E2E). Pre-merge gates on staging
        #   already validated this SHA → proceed.
        #
        # Manual dispatch skips this check — operator override.
        id: gate
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          REPO: ${{ github.repository }}
          SHA: ${{ steps.sha.outputs.full }}
          UPSTREAM_NAME: ${{ github.event.workflow_run.name }}
          EVENT_NAME: ${{ github.event_name }}
        run: |
          set -euo pipefail

          if [ "$EVENT_NAME" = "workflow_dispatch" ]; then
            echo "proceed=true" >> "$GITHUB_OUTPUT"
            echo "::notice::Manual dispatch — skipping E2E gate (operator override)"
            exit 0
          fi

          if [ "$UPSTREAM_NAME" = "E2E Staging SaaS (full lifecycle)" ]; then
            echo "proceed=true" >> "$GITHUB_OUTPUT"
            echo "::notice::Upstream is E2E itself (success per job-level if) — gate trivially satisfied"
            exit 0
          fi

          # Upstream is publish-workspace-server-image. Check E2E state
          # for the same SHA via Gitea's commit-status API.
          #
          # GitHub-era this was `gh run list --workflow=X --commit=SHA
          # --json status,conclusion` returning either `[]` (no run on
          # this SHA) or `[{status, conclusion}]` (the run's state).
          # Gitea has NO workflow-runs API at all — `/api/v1/repos/.../
          # actions/runs` returns 404 (verified 2026-05-07, issue #75).
          # However Gitea Actions DOES emit a commit status per workflow
          # job, with `context = "<Workflow Name> / <Job Name> (<event>)"`,
          # which is exactly what we need: each E2E run leg becomes one
          # status row on the SHA, and the aggregate state encodes the
          # run's outcome.
          #
          # Mapping:
          #   0 matched contexts          → "none/none"         (E2E paths-filtered out — same semantic as before)
          #   any context = pending       → "in_progress/none"  (defer)
          #   any context = error|failure → "completed/failure" (abort)
          #   all contexts = success      → "completed/success" (proceed)
          #
          # The "completed/cancelled" and "completed/timed_out" buckets
          # don't have direct Gitea analogs (Gitea statuses are
          # success / failure / error / pending / warning). Per-SHA
          # concurrency cancellation surfaces as `error` on Gitea, which
          # we map to "completed/failure" rather than "completed/cancelled"
          # — losing the soft-defer semantic of the cancelled bucket on
          # this fleet. Tradeoff: the staleness alarm (auto-promote-stale-
          # alarm.yml) still catches a stuck :latest within 4h, and a
          # legitimate cancel is rare enough that aborting + manual
          # re-dispatch is acceptable. If we measure cancel frequency
          # > 1/week, revisit by reading the run-step-summary text via
          # a follow-up script.
          #
          # Network or auth blips collapse to "none/none" via the curl
          # `|| true` fallback, matching the pre-Gitea behaviour where
          # an empty list also degenerated to none/none.
          GITEA_API_URL="${GITHUB_SERVER_URL:-https://git.moleculesai.app}/api/v1"
          STATUSES_JSON=$(curl --fail-with-body -sS \
            -H "Authorization: token ${GH_TOKEN}" \
            -H "Accept: application/json" \
            "${GITEA_API_URL}/repos/${REPO}/commits/${SHA}/statuses?limit=100" \
            2>/dev/null || echo "[]")
          RESULT=$(printf '%s' "$STATUSES_JSON" | jq -r '
            # Filter to E2E Staging SaaS (full lifecycle) statuses.
            # Match by leading workflow-name prefix so the "<job>
            # (<event>)" tail is irrelevant. Gitea emits the workflow
            # name verbatim from the YAML `name:` field.
            [.[] | select(.context | startswith("E2E Staging SaaS (full lifecycle) /"))] as $rows
            | if ($rows | length) == 0 then
                "none/none"
              elif any($rows[]; .status == "pending") then
                "in_progress/none"
              elif any($rows[]; .status == "failure" or .status == "error") then
                "completed/failure"
              elif all($rows[]; .status == "success") then
                "completed/success"
              else
                # Mixed / unknown — falls through to the *) bucket below.
                "completed/" + ($rows[0].status // "unknown")
              end
          ' 2>/dev/null || echo "none/none")

          echo "E2E Staging SaaS for ${SHA:0:7}: $RESULT"

          case "$RESULT" in
            completed/success)
              echo "proceed=true" >> "$GITHUB_OUTPUT"
              echo "::notice::E2E green for this SHA — proceeding with promote"
              ;;
            completed/failure|completed/timed_out)
              echo "proceed=false" >> "$GITHUB_OUTPUT"
              {
                echo "## ❌ Auto-promote aborted — E2E Staging SaaS failed"
                echo
                echo "E2E Staging SaaS for \`${SHA:0:7}\`: \`$RESULT\`"
                echo "\`:latest\` stays on the prior known-good digest."
                echo
                echo "If the failure was a flake, manually dispatch this workflow with the same sha to override."
              } >> "$GITHUB_STEP_SUMMARY"
              exit 1
              ;;
            completed/cancelled)
              # GitHub-era only: cancelled ≠ failure. Gitea statuses
              # don't expose a "cancelled" state — a per-SHA concurrency
              # cancellation surfaces as `failure` or `error` on Gitea
              # and is now handled by the failure branch above. This
              # arm is kept for backwards compatibility / dual-host
              # operation (if we ever add a non-Gitea fallback) but
              # under the post-#75 flow it's unreachable.
              echo "proceed=false" >> "$GITHUB_OUTPUT"
              {
                echo "## ⏭ Auto-promote deferred — E2E Staging SaaS was cancelled"
                echo
                echo "E2E Staging SaaS for \`${SHA:0:7}\`: \`$RESULT\`"
                echo "Likely per-SHA concurrency (newer push superseded this E2E run)."
                echo "The newer SHA's E2E will fire its own promote when it lands."
                echo "If you need this specific SHA promoted, manually dispatch."
              } >> "$GITHUB_STEP_SUMMARY"
              ;;
            in_progress/*|queued/*|requested/*|waiting/*|pending/*)
              echo "proceed=false" >> "$GITHUB_OUTPUT"
              {
                echo "## ⏳ Auto-promote deferred — E2E Staging SaaS still running"
                echo
                echo "Publish completed before E2E for \`${SHA:0:7}\` (state: \`$RESULT\`)."
                echo "Skipping retag here — E2E's own completion event will re-fire this workflow."
                echo "If E2E ends green, that run promotes \`:latest\`. If red, it aborts."
              } >> "$GITHUB_STEP_SUMMARY"
              ;;
            none/none)
              echo "proceed=true" >> "$GITHUB_OUTPUT"
              echo "::notice::E2E paths-filtered out for this SHA — pre-merge staging gates carry"
              ;;
            *)
              echo "proceed=false" >> "$GITHUB_OUTPUT"
              {
                echo "## ❓ Auto-promote aborted — unexpected E2E state"
                echo
                echo "E2E Staging SaaS for \`${SHA:0:7}\`: \`$RESULT\` (unhandled)"
                echo "Manual investigation needed; re-dispatch with the same sha once resolved."
              } >> "$GITHUB_STEP_SUMMARY"
              exit 1
              ;;
          esac
      - if: steps.gate.outputs.proceed == 'true'
        uses: imjasonh/setup-crane@6da1ae018866400525525ce74ff892880c099987 # v0.5

      - name: GHCR login
        if: steps.gate.outputs.proceed == 'true'
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | \
            crane auth login ghcr.io -u "${{ github.actor }}" --password-stdin

      - name: Verify :staging-<sha> exists for both images
        # Better to fail fast with a clear message than to half-tag
        # (platform retagged but platform-tenant missing → tenants pull
        # a stale image).
        if: steps.gate.outputs.proceed == 'true'
        run: |
          set -euo pipefail
          for img in "${IMAGE_NAME}" "${TENANT_IMAGE_NAME}"; do
            tag="${img}:staging-${{ steps.sha.outputs.short }}"
            if ! crane manifest "$tag" >/dev/null 2>&1; then
              echo "::error::Missing tag: $tag"
              echo "::error::publish-workspace-server-image must complete on this SHA before auto-promote can retag :latest."
              exit 1
            fi
            echo "  ok: $tag exists"
          done
      - name: Ancestry check — refuse to promote :latest backwards
        # #2244: workflow_run completions arrive in arbitrary order. If
        # SHA-A and SHA-B both reach main within ~10 min and SHA-B's E2E
        # completes before SHA-A's, this workflow can fire for SHA-A
        # AFTER it already promoted SHA-B → :latest goes backwards. The
        # orphan-reconciler "next run corrects it" logic doesn't apply:
        # there's no auto-corrective re-promote, so :latest stays wrong
        # until the next main push lands.
        #
        # Detection: read current :latest's `org.opencontainers.image.revision`
        # label (set by publish-workspace-server-image.yml at build time)
        # and ask the GitHub compare API whether the candidate SHA is
        # ahead-of / identical-to / behind / diverged-from current.
        # Hard-fail on `behind` and `diverged` per the approved design —
        # silent bypass is the failure class we're moving away from. The
        # workflow goes red, oncall sees it, and the operator decides how
        # to recover (manual dispatch with the right SHA, force-promote,
        # etc.).
        #
        # Manual dispatch skips this check — operator-override semantics
        # match the gate-check step above.
        #
        # Backward-compat: when current :latest carries no revision
        # label (legacy image pre-publish-with-label), skip with a warning.
        # All :latest images on main are post-label as of 2026-04-29, so
        # this branch will be dead within 90 days; remove it then.
        if: steps.gate.outputs.proceed == 'true' && github.event_name != 'workflow_dispatch'
        id: ancestry
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          REPO: ${{ github.repository }}
          TARGET_SHA: ${{ steps.sha.outputs.full }}
        run: |
          set -euo pipefail

          # Read the current :latest config and pull the revision label.
          # `crane config` returns the OCI image config blob (not the manifest);
          # labels live under `.config.Labels`. `// empty` makes jq return ""
          # rather than the literal "null" so the test below works.
          CURRENT_REVISION=$(crane config "${IMAGE_NAME}:latest" 2>/dev/null \
            | jq -r '.config.Labels["org.opencontainers.image.revision"] // empty' \
            || true)

          if [ -z "$CURRENT_REVISION" ]; then
            echo "decision=skip-no-label" >> "$GITHUB_OUTPUT"
            {
              echo "## ⚠ Ancestry check skipped — current :latest has no revision label"
              echo
              echo "Likely a legacy image built before \`org.opencontainers.image.revision\` was set."
              echo "Falling through to retag. After all \`:latest\` images are post-label (TODO 90 days), this branch is dead and should be removed."
            } >> "$GITHUB_STEP_SUMMARY"
            echo "::warning::Current :latest carries no revision label — skipping ancestry check (legacy image)"
            exit 0
          fi

          if [ "$CURRENT_REVISION" = "$TARGET_SHA" ]; then
            echo "decision=identical" >> "$GITHUB_OUTPUT"
            echo "::notice:::latest already at ${TARGET_SHA:0:7} — retag will be a no-op"
            exit 0
          fi

          # Ask GitHub which side of the merge graph TARGET_SHA sits on
          # relative to CURRENT_REVISION. Returns one of: ahead | identical
          # | behind | diverged. Network or auth errors collapse to "error"
          # via the explicit fallback so the case below always matches.
          STATUS=$(gh api \
            "repos/${REPO}/compare/${CURRENT_REVISION}...${TARGET_SHA}" \
            --jq '.status' 2>/dev/null || echo "error")

          echo "ancestry compare ${CURRENT_REVISION:0:7} → ${TARGET_SHA:0:7}: $STATUS"

          case "$STATUS" in
            ahead)
              echo "decision=ahead" >> "$GITHUB_OUTPUT"
              echo "::notice::Target ${TARGET_SHA:0:7} is ahead of current :latest (${CURRENT_REVISION:0:7}) — proceeding with retag"
              ;;
            identical)
              echo "decision=identical" >> "$GITHUB_OUTPUT"
              echo "::notice::Target identical to :latest — retag will be a no-op"
              ;;
            behind)
              echo "decision=behind" >> "$GITHUB_OUTPUT"
              {
                echo "## ❌ Auto-promote refused — target is BEHIND current :latest"
                echo
                echo "| Field | Value |"
                echo "|---|---|"
                echo "| Target SHA | \`$TARGET_SHA\` |"
                echo "| Current :latest revision | \`$CURRENT_REVISION\` |"
                echo "| GitHub compare status | \`behind\` |"
                echo
                echo "This guard catches the workflow_run-completion-order race (#2244):"
                echo "two rapid main pushes whose E2Es complete out-of-order can otherwise"
                echo "promote \`:latest\` backwards. \`:latest\` stays on \`${CURRENT_REVISION:0:7}\`."
                echo
                echo "**Recovery:** if this is a legitimate revert that should land on \`:latest\`,"
                echo "manually dispatch this workflow with the target sha as input — the manual-dispatch"
                echo "path skips the ancestry check (operator override)."
              } >> "$GITHUB_STEP_SUMMARY"
              exit 1
              ;;
            diverged)
              echo "decision=diverged" >> "$GITHUB_OUTPUT"
              {
                echo "## ❓ Auto-promote refused — history diverged"
                echo
                echo "| Field | Value |"
                echo "|---|---|"
                echo "| Target SHA | \`$TARGET_SHA\` |"
                echo "| Current :latest revision | \`$CURRENT_REVISION\` |"
                echo "| GitHub compare status | \`diverged\` |"
                echo
                echo "Likely cause: a force-push rewrote main's history, leaving the previous"
                echo "\`:latest\` revision orphaned. Needs human review before \`:latest\` advances."
              } >> "$GITHUB_STEP_SUMMARY"
              exit 1
              ;;
            error|*)
              echo "decision=error" >> "$GITHUB_OUTPUT"
              {
                echo "## ❌ Auto-promote aborted — ancestry-check API error"
                echo
                echo "\`gh api repos/${REPO}/compare/${CURRENT_REVISION}...${TARGET_SHA}\` returned unexpected status: \`$STATUS\`"
                echo
                echo "Manual dispatch with the target sha bypasses this check."
              } >> "$GITHUB_STEP_SUMMARY"
              exit 1
              ;;
          esac
      - name: Retag platform :staging-<sha> → :latest
        if: steps.gate.outputs.proceed == 'true'
        run: |
          crane tag "${IMAGE_NAME}:staging-${{ steps.sha.outputs.short }}" latest

      - name: Retag tenant :staging-<sha> → :latest
        if: steps.gate.outputs.proceed == 'true'
        run: |
          crane tag "${TENANT_IMAGE_NAME}:staging-${{ steps.sha.outputs.short }}" latest

      - name: Summary
        if: steps.gate.outputs.proceed == 'true'
        run: |
          {
            echo "## :latest promoted to ${{ steps.sha.outputs.short }}"
            echo
            if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
              echo "- Trigger: manual dispatch"
            else
              echo "- Upstream: \`${{ github.event.workflow_run.name }}\` ([run](${{ github.event.workflow_run.html_url }}))"
            fi
            echo "- platform:staging-${{ steps.sha.outputs.short }} → :latest"
            echo "- platform-tenant:staging-${{ steps.sha.outputs.short }} → :latest"
            echo
            echo "Tenant fleet auto-pulls within 5 min via IMAGE_AUTO_REFRESH=true."
            echo "Force immediate fanout: dispatch redeploy-tenants-on-main.yml."
          } >> "$GITHUB_STEP_SUMMARY"
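The gate step's jq filter collapses per-job Gitea commit statuses into four promote buckets; the priority order (pending beats failure, failure beats success) is easy to misread in jq. The same collapse, re-implemented as a plain-bash sketch purely for illustration (the workflow itself uses the jq filter; the `bucket` function name is ours, and the mixed/unknown suffix is simplified to a fixed `unknown`):

```shell
# Illustrative re-implementation (not used by the workflow) of the gate
# step's status→bucket mapping. Arguments are the `status` fields of the
# commit-status rows matched for the E2E workflow.
bucket() {
  if [ "$#" -eq 0 ]; then echo "none/none"; return; fi   # E2E paths-filtered out
  local s has_pending=false has_failure=false all_success=true
  for s in "$@"; do
    case "$s" in
      pending)       has_pending=true; all_success=false ;;
      failure|error) has_failure=true; all_success=false ;;
      success)       ;;
      *)             all_success=false ;;                # e.g. Gitea's "warning"
    esac
  done
  # Same priority as the jq filter: pending > failure > all-success.
  if $has_pending; then echo "in_progress/none"          # defer
  elif $has_failure; then echo "completed/failure"       # abort
  elif $all_success; then echo "completed/success"       # proceed
  else echo "completed/unknown"                          # unhandled → *) arm
  fi
}

bucket                      # no E2E rows   → none/none
bucket success success      # all green     → completed/success
bucket failure pending      # still racing  → in_progress/none
bucket success error        # E2E went red  → completed/failure
```

Note that `failure pending` maps to the defer bucket, not abort: while any leg is still pending the run's final conclusion is unknown, so the workflow waits for E2E's own completion event rather than judging early.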
492  .github/workflows/auto-promote-staging.yml  (vendored)
@@ -1,492 +0,0 @@
name: Auto-promote staging → main

# Fires after any of the staging-branch quality gates complete. When ALL
# required gates are green on the same staging SHA, opens (or re-uses)
# a PR `staging → main` and schedules Gitea auto-merge so the PR lands
# automatically once approval + status checks are satisfied.
#
# ============================================================
# What this workflow does
# ============================================================
#
# 1. On a workflow_run completion event for one of the staging gate
#    workflows (CI, E2E Staging Canvas, E2E API Smoke, CodeQL),
#    checks if the combined status on the staging head SHA is green.
# 2. If green, opens (or re-uses) a PR `head: staging → base: main`
#    via Gitea REST `POST /api/v1/repos/.../pulls`.
# 3. Schedules auto-merge via `POST /api/v1/repos/.../pulls/{index}/merge`
#    with `merge_when_checks_succeed: true`. Gitea waits for the
#    approval requirement on `main` (`required_approvals: 1`) and
#    the status-check gates, then merges.
# 4. The merge commit lands on `main` and fires
#    `publish-workspace-server-image.yml` naturally via its
#    `on: push: branches: [main]` trigger — no explicit dispatch
#    needed (see "Why no workflow_dispatch tail" below).
#
# `auto-sync-main-to-staging.yml` is the reverse-direction
# counterpart (main → staging, fast-forward push). Together they
# keep the staging-superset-of-main invariant tight.
#
# ============================================================
# Why Gitea REST (and not `gh pr create`)
# ============================================================
#
# Pre-2026-05-06 this workflow used `gh pr create`, `gh pr merge --auto`,
# `gh run list`, and `gh workflow run` against GitHub. After the
# GitHub→Gitea cutover those calls fail because:
#
# - `gh pr create / merge / view / list` route to GitHub GraphQL
#   (`/api/graphql`). Gitea does not expose a GraphQL endpoint;
#   every call returns `HTTP 405 Method Not Allowed` — same root
#   cause as #65 (auto-sync), which PR #66 fixed by dropping `gh`
#   entirely.
# - `gh run list --workflow=...` is GitHub-shaped; Gitea has the
#   simpler `GET /repos/.../commits/{ref}/status` combined-status
#   endpoint instead.
# - `gh workflow run X.yml` calls `POST /repos/.../actions/workflows/{id}/dispatches`,
#   which does NOT exist on Gitea 1.22.6 (verified via swagger.v1.json).
#
# So this workflow uses direct `curl` calls to Gitea REST. No `gh`
# CLI dependency, no GraphQL, no missing-endpoint footgun.
#
# ============================================================
# Why no workflow_dispatch tail (was load-bearing on GitHub, dead on Gitea)
# ============================================================
#
# The GitHub-era version had a 60-line polling step that waited for
# the promote PR to merge, then explicitly dispatched
# `publish-workspace-server-image.yml` on `--ref main`. That step
# existed because GitHub's GITHUB_TOKEN-initiated merges suppress
# downstream `on: push` workflows (the documented "no recursion" rule
# — https://docs.github.com/en/actions/using-workflows/triggering-a-workflow#triggering-a-workflow-from-a-workflow).
# The explicit dispatch was the workaround.
#
# Gitea Actions does NOT have this no-recursion rule. PR #66's auto-
# sync merge to main fired `auto-promote-staging` on the next push
# trigger naturally. So the cascade fires on the natural push event;
# the explicit dispatch is dead code. (And even if we wanted to
# preserve it, Gitea has no `workflow_dispatch` REST endpoint.)
#
# Removed in this rewrite. If we ever observe the cascade misfire,
# an operator can push an empty commit to `main` to wake it.
#
# ============================================================
# Why open a PR (and not direct push)
# ============================================================
#
# `main` branch protection has `enable_push: false` with NO
# `push_whitelist_usernames`. Direct push is impossible for any
# persona, including admins. PR-mediated merge is the only path,
# which is intentional: prod state mutations (and staging→main IS a
# prod mutation, since the next deploy fans out to tenants) require
# Hongming's approval per `feedback_prod_apply_needs_hongming_chat_go`.
#
# The auto-merge schedule preserves this gate: `merge_when_checks_succeed`
# does NOT bypass `required_approvals: 1`. Gitea waits for BOTH
# approval AND green checks before merging. Hongming reviews via the
# canvas/chat-handle of the PR notification, approves, and Gitea
# auto-merges within seconds.
#
# ============================================================
# Identity + token (anti-bot-ring per saved-memory
# `feedback_per_agent_gitea_identity_default`)
# ============================================================
#
# This workflow uses `secrets.AUTO_SYNC_TOKEN` — a personal access
# token issued to the `devops-engineer` Gitea persona, NOT the
# founder PAT. The bot-ring fingerprint that triggered the GitHub
# org suspension on 2026-05-06 was characterised by the founder PAT
# acting as CI at machine speed.
#
# Token scope: `push: true` (read+write) on this repo. The persona
# can: open PRs, comment on PRs, schedule auto-merge. The persona
# CANNOT bypass main's branch protection (`required_approvals: 1`
# still applies — only Hongming's review unblocks merge).
#
# Authorship: the PR is opened by `devops-engineer`; the merge
# commit credits Hongming-as-approver and `devops-engineer` as
# the merger.
#
# ============================================================
# Failure modes & operational notes
# ============================================================
#
# A — staging gates not all green at trigger time:
#   - The combined-status check returns `state: pending|failure`.
#     Workflow exits 0 with a step-summary "not all green; staying
#     on current main". Re-fires on the next gate completion.
#
# B — Gitea PR-create returns non-201 (e.g. 422 already-exists):
#   - Idempotent: the workflow first GETs the existing open
#     staging→main PR. If found, reuse it; if not, POST a new one.
#     422 should never surface; if it does (race), the step summary
#     captures the body and the next workflow_run picks up.
#
# C — `merge_when_checks_succeed` schedule fails:
#   - 422 with "Pull request is not mergeable" if there are
#     conflicts or a stale base. The step summary surfaces it; the
#     operator (or `auto-sync-main-to-staging`) needs to bring staging
#     up to date with main first. Workflow exits 1 to surface red.
#
# D — `AUTO_SYNC_TOKEN` rotated / wrong scope:
#   - 401/403 on first REST call. The step summary surfaces it.
#     Re-issue the token from `~/.molecule-ai/personas/` on the
#     operator host and update the repo Actions secret.
#
# ============================================================
# Loop safety
# ============================================================
#
# When the promote PR merges to main, `auto-sync-main-to-staging.yml`
# fires (on:push:main) and pushes the merge commit back to staging.
# That push to staging is by `devops-engineer`, NOT this workflow's
# token, and triggers the staging gate workflows. When they all
# complete, we end up back here — but the tree-diff guard catches
# it: staging tree == main tree (the merge commit changes nothing),
# so we skip and the cycle terminates.
on:
|
||||
workflow_run:
|
||||
workflows:
|
||||
- CI
|
||||
- E2E Staging Canvas (Playwright)
|
||||
- E2E API Smoke Test
|
||||
      - CodeQL
    types: [completed]
  workflow_dispatch:
    inputs:
      force:
        description: "Force promote even when AUTO_PROMOTE_ENABLED is unset (manual override)"
        required: false
        default: "false"

permissions:
  contents: read
  pull-requests: write

# Serialize auto-promote runs. Multiple staging gate completions can land
# in quick succession (CI + E2E + CodeQL all finish within seconds of
# each other on a green PR) — without this, two parallel runs would:
#   1. Race the GET-or-POST PR step.
#   2. Both call merge-schedule (idempotent — fine on Gitea).
# cancel-in-progress: false because a second run on a fresh staging
# tip should NOT kill the first, which has already opened the PR.
concurrency:
  group: auto-promote-staging
  cancel-in-progress: false

jobs:
  check-all-gates-green:
    # Only consider staging pushes. PRs into staging don't promote.
    if: >
      (github.event_name == 'workflow_run' &&
       github.event.workflow_run.head_branch == 'staging' &&
       github.event.workflow_run.event == 'push')
      || github.event_name == 'workflow_dispatch'
    runs-on: ubuntu-latest
    outputs:
      all_green: ${{ steps.gates.outputs.all_green }}
      head_sha: ${{ steps.gates.outputs.head_sha }}
    steps:
      # Skip empty-tree promotes (the perpetual auto-promote↔auto-sync
      # cycle observed pre-cutover on GitHub). On Gitea the cycle shape
      # is different (auto-sync uses fast-forward, no merge commit),
      # but the tree-diff guard is cheap insurance and protects against
      # any future merge-style regression.
      - name: Checkout for tree-diff check
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          fetch-depth: 0
          ref: staging

      - name: Skip if staging tree == main tree (cycle-break safety)
        id: tree-diff
        env:
          HEAD_SHA: ${{ github.event.workflow_run.head_sha || github.sha }}
        run: |
          set -eu
          git fetch origin main --depth=50 || { echo "::warning::git fetch main failed — proceeding (fail-open)"; exit 0; }
          if git diff --quiet origin/main "$HEAD_SHA" -- 2>/dev/null; then
            {
              echo "## Skipped — no code to promote"
              echo
              echo "staging tip (\`${HEAD_SHA:0:8}\`) and \`main\` have identical trees."
              echo "Skipping to avoid opening an empty promote PR."
            } >> "$GITHUB_STEP_SUMMARY"
            echo "::notice::auto-promote: staging tree == main tree — no code to promote, skipping"
            echo "skip=true" >> "$GITHUB_OUTPUT"
          else
            echo "skip=false" >> "$GITHUB_OUTPUT"
          fi
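The guard above hinges on `git diff --quiet A B` exiting 0 for identical trees and 1 for differing ones. A minimal local repro of that decision, in a throwaway repo (the branch name `base` and the helper name `decide_skip` are illustrative, not part of the workflow):

```shell
#!/usr/bin/env bash
# Throwaway repo: one empty base commit, then one commit adding a file.
set -eu

repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m base
git branch base                      # marks the "already promoted" tip
echo change > file.txt
git add file.txt
git -c user.email=ci@example.com -c user.name=ci commit -q -m change  # new work

decide_skip() {                      # mirrors the workflow's skip output
  if git diff --quiet "$1" "$2" -- 2>/dev/null; then
    echo "skip=true"                 # identical trees: nothing to promote
  else
    echo "skip=false"
  fi
}

decide_skip base HEAD                # trees differ
decide_skip HEAD HEAD                # same tree
```

Comparing a commit to itself always yields `skip=true`, which is exactly the empty-promote case the workflow refuses to open a PR for.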

      - name: Check combined status on staging head
        if: steps.tree-diff.outputs.skip != 'true'
        id: gates
        env:
          GITEA_TOKEN: ${{ secrets.AUTO_SYNC_TOKEN }}
          HEAD_SHA: ${{ github.event.workflow_run.head_sha || github.sha }}
          REPO: ${{ github.repository }}
          GITEA_HOST: ${{ vars.GITEA_HOST || 'https://git.moleculesai.app' }}
        run: |
          set -euo pipefail

          # Gitea's native combined-status endpoint aggregates every
          # check context attached to a SHA. This is structurally
          # cleaner than the GitHub-era per-workflow `gh run list`
          # loop because:
          #
          # 1. There's no risk of workflow-name collision (the
          #    GitHub-era code had to switch from `--workflow=NAME`
          #    to `--workflow=FILE.YML` to disambiguate "CodeQL"
          #    between the explicit workflow and GitHub's UI-
          #    configured default setup; Gitea has no such
          #    duplicate-name surface).
          # 2. Gitea's combined state already encodes the AND
          #    across all contexts: success only if EVERY context
          #    is success. Pending or failure on any context
          #    produces a non-success state.
          #
          # See https://docs.gitea.com/api/1.22 for the schema —
          # `state` is one of: success, pending, failure, error.

          echo "head_sha=${HEAD_SHA}" >> "$GITHUB_OUTPUT"
          echo "Checking combined status on SHA ${HEAD_SHA}"

          # `set +e` brackets the http-code capture so curl's exit code
          # can be inspected instead of killing the script; restored
          # immediately. Pattern hardened per `feedback_curl_status_capture_pollution`.
          BODY_FILE=$(mktemp)
          set +e
          STATUS=$(curl -sS \
            -H "Authorization: token ${GITEA_TOKEN}" \
            -H "Accept: application/json" \
            -o "${BODY_FILE}" \
            -w "%{http_code}" \
            "${GITEA_HOST}/api/v1/repos/${REPO}/commits/${HEAD_SHA}/status")
          CURL_RC=$?
          set -e

          if [ "${CURL_RC}" -ne 0 ] || [ "${STATUS}" != "200" ]; then
            echo "::error::combined-status fetch failed: curl=${CURL_RC} http=${STATUS}"
            head -c 500 "${BODY_FILE}" || true
            rm -f "${BODY_FILE}"
            echo "all_green=false" >> "$GITHUB_OUTPUT"
            exit 0
          fi

          STATE=$(jq -r '.state // "missing"' < "${BODY_FILE}")
          TOTAL=$(jq -r '.total_count // 0' < "${BODY_FILE}")
          rm -f "${BODY_FILE}"

          echo "Combined status: state=${STATE} total_count=${TOTAL}"

          if [ "${STATE}" = "success" ] && [ "${TOTAL}" -gt 0 ]; then
            echo "all_green=true" >> "$GITHUB_OUTPUT"
            echo "::notice::All gates green on ${HEAD_SHA} (${TOTAL} contexts)"
          else
            echo "all_green=false" >> "$GITHUB_OUTPUT"
            {
              echo "## Not promoting — combined status not green"
              echo
              echo "- SHA: \`${HEAD_SHA:0:8}\`"
              echo "- Combined state: \`${STATE}\`"
              echo "- Context count: ${TOTAL}"
              echo
              echo "Will re-fire on the next gate completion. Investigate any red gate via the Actions UI."
            } >> "$GITHUB_STEP_SUMMARY"
            echo "::notice::auto-promote: combined status is ${STATE} on ${HEAD_SHA} — staying on current main"
          fi
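The promote condition above is the conjunction `state == success AND total_count > 0`: a `total_count` of zero means no gate ever attached a status to the SHA, and treating that as green would promote unchecked code. A minimal sketch of that decision as a standalone helper (the name `all_green` is illustrative, not part of the workflow):

```shell
# all_green STATE TOTAL_COUNT → prints "true" only when the combined
# state is success AND at least one context reported.
all_green() {
  local state="$1" total="$2"
  if [ "$state" = "success" ] && [ "$total" -gt 0 ]; then
    echo "true"
  else
    echo "false"
  fi
}

all_green success 5    # → true
all_green success 0    # → false (no contexts ever attached)
all_green pending 5    # → false
```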

  promote:
    needs: check-all-gates-green
    if: needs.check-all-gates-green.outputs.all_green == 'true'
    runs-on: ubuntu-latest
    steps:
      - name: Check rollout gate
        env:
          AUTO_PROMOTE_ENABLED: ${{ vars.AUTO_PROMOTE_ENABLED }}
          FORCE_INPUT: ${{ github.event.inputs.force }}
        run: |
          set -eu
          # Repo variable AUTO_PROMOTE_ENABLED=true flips this on. While
          # it's unset, the workflow dry-runs (logs what it would have
          # done) but doesn't open the promote PR. Set the variable in
          # Settings → Actions → Variables.
          if [ "${AUTO_PROMOTE_ENABLED:-}" != "true" ] && [ "${FORCE_INPUT:-false}" != "true" ]; then
            {
              echo "## Auto-promote disabled"
              echo
              echo "Repo variable \`AUTO_PROMOTE_ENABLED\` is not set to \`true\`."
              echo "All gates are green on staging; would have opened a promote PR to \`main\`."
              echo
              echo "To enable: Settings → Actions → Variables → \`AUTO_PROMOTE_ENABLED=true\`."
              echo "To test once manually: workflow_dispatch with \`force=true\`."
            } >> "$GITHUB_STEP_SUMMARY"
            echo "::notice::auto-promote disabled — dry run only"
            exit 0
          fi
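The gate logic above reduces to a two-input decision: promote only when the repo variable is `"true"` or the operator forced it via workflow_dispatch; anything else is a dry run. A sketch of that table (the helper name `gate_open` is illustrative):

```shell
# gate_open ENABLED FORCE → "promote" or "dry-run", matching the
# workflow's if-condition (empty/unset values fall back like :- does).
gate_open() {
  local enabled="${1:-}" force="${2:-false}"
  if [ "$enabled" != "true" ] && [ "$force" != "true" ]; then
    echo "dry-run"
  else
    echo "promote"
  fi
}

gate_open ""   ""      # unset + no force  → dry-run
gate_open true ""      # variable enabled  → promote
gate_open ""   true    # manual override   → promote
```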

      - name: Open or reuse promote PR + schedule auto-merge
        if: ${{ vars.AUTO_PROMOTE_ENABLED == 'true' || github.event.inputs.force == 'true' }}
        env:
          GITEA_TOKEN: ${{ secrets.AUTO_SYNC_TOKEN }}
          REPO: ${{ github.repository }}
          TARGET_SHA: ${{ needs.check-all-gates-green.outputs.head_sha }}
          GITEA_HOST: ${{ vars.GITEA_HOST || 'https://git.moleculesai.app' }}
        run: |
          set -euo pipefail

          API="${GITEA_HOST}/api/v1/repos/${REPO}"
          AUTH=(-H "Authorization: token ${GITEA_TOKEN}" -H "Accept: application/json")

          # http_get BODY_FILE URL
          # Writes the response body to BODY_FILE and prints the HTTP
          # status code on stdout. Curl status capture pattern per
          # `feedback_curl_status_capture_pollution`: the http_code is
          # emitted via -w, the body goes to its own tempfile, and a
          # set +e/set -e bracket protects pipeline state.
          http_get() {
            local body_file="$1"; shift
            local url="$1"; shift
            set +e
            local code
            code=$(curl -sS "${AUTH[@]}" -o "${body_file}" -w "%{http_code}" "${url}")
            local rc=$?
            set -e
            if [ "${rc}" -ne 0 ]; then
              echo "::error::curl GET failed (rc=${rc}) on ${url}"
              return 99
            fi
            echo "${code}"
          }
          http_post_json() {
            local body_file="$1"; shift
            local data="$1"; shift
            local url="$1"; shift
            set +e
            local code
            code=$(curl -sS "${AUTH[@]}" -H "Content-Type: application/json" \
              -X POST -d "${data}" -o "${body_file}" -w "%{http_code}" "${url}")
            local rc=$?
            set -e
            if [ "${rc}" -ne 0 ]; then
              echo "::error::curl POST failed (rc=${rc}) on ${url}"
              return 99
            fi
            echo "${code}"
          }
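The point of the `set +e` / `set -e` bracket in those helpers is that, under `set -e`, a failing `code=$(curl ...)` assignment would abort the script before the return code can be inspected. A self-contained sketch of the same shape with a stub standing in for curl (`fake_curl` and `probe` are assumptions for offline illustration, not workflow code):

```shell
# A "curl" that emits an http code on stdout but exits non-zero,
# simulating a transport failure after partial output.
fake_curl() { echo -n "200"; return 7; }

probe() {
  set +e                 # allow the failing substitution through
  local code
  code=$(fake_curl)
  local rc=$?
  set -e                 # restore strict mode immediately
  if [ "$rc" -ne 0 ]; then
    echo "transport-error rc=$rc"   # inspected, not fatal
    return 0
  fi
  echo "http=$code"
}

set -e
probe   # the script keeps running despite set -e and fake_curl's rc=7
```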

          # Step 1: look for an existing open staging→main promote PR
          # (idempotent on workflow re-run). Gitea doesn't have a
          # head/base filter on the list endpoint that's as ergonomic
          # as gh's, but the dedicated `/pulls/{base}/{head}` lookup
          # works.
          BODY=$(mktemp)
          STATUS=$(http_get "${BODY}" "${API}/pulls/main/staging") || true

          PR_NUM=""
          if [ "${STATUS}" = "200" ]; then
            STATE=$(jq -r '.state // "missing"' < "${BODY}")
            if [ "${STATE}" = "open" ]; then
              PR_NUM=$(jq -r '.number // ""' < "${BODY}")
              echo "::notice::Re-using existing open promote PR #${PR_NUM}"
            fi
          fi
          rm -f "${BODY}"

          # Step 2: if no open PR, create one.
          if [ -z "${PR_NUM}" ]; then
            TITLE="staging → main: auto-promote ${TARGET_SHA:0:7}"
            BODY_TEXT=$(cat <<EOFBODY
          Automated promotion of \`staging\` (\`${TARGET_SHA:0:8}\`) to \`main\`. All required staging gates are green at this SHA (combined status reported success).

          This PR is auto-generated by \`.github/workflows/auto-promote-staging.yml\` whenever every required gate completes green on the same staging SHA.

          **Approval gate:** \`main\` branch protection requires 1 approval before this can land. Once approved, Gitea will auto-merge (the workflow scheduled \`merge_when_checks_succeed: true\` immediately after open).

          The reverse-direction sync (the merge commit on \`main\` → \`staging\`) is handled automatically by \`auto-sync-main-to-staging.yml\` after this PR lands.

          ---
          - Source: staging at \`${TARGET_SHA}\`
          - Opened by: \`devops-engineer\` persona (anti-bot-ring; never founder PAT)
          - Refs: #65, #73, #195
          EOFBODY
            )
            REQ=$(jq -n \
              --arg title "${TITLE}" \
              --arg body "${BODY_TEXT}" \
              --arg base "main" \
              --arg head "staging" \
              '{title:$title, body:$body, base:$base, head:$head}')

            BODY=$(mktemp)
            STATUS=$(http_post_json "${BODY}" "${REQ}" "${API}/pulls")

            if [ "${STATUS}" = "201" ]; then
              PR_NUM=$(jq -r '.number // ""' < "${BODY}")
              echo "::notice::Opened promote PR #${PR_NUM}"
            else
              echo "::error::Failed to create promote PR: HTTP ${STATUS}"
              jq -r '.message // .' < "${BODY}" | head -c 500
              rm -f "${BODY}"
              exit 1
            fi
            rm -f "${BODY}"
          fi

          # Step 3: schedule auto-merge. merge_when_checks_succeed
          # tells Gitea to wait for both:
          #   - all required status checks to pass
          #   - the required-approvals gate (1 approval on main)
          # before merging. On approval+green, Gitea merges within
          # seconds. On any check failing or approval being denied,
          # the schedule stays armed but doesn't fire.
          #
          # Idempotent: re-arming on an already-armed PR is a no-op.
          REQ=$(jq -n '{Do:"merge", merge_when_checks_succeed:true}')
          BODY=$(mktemp)
          STATUS=$(http_post_json "${BODY}" "${REQ}" "${API}/pulls/${PR_NUM}/merge")

          # Gitea returns:
          #   - 200/204 on successful immediate merge (gates already green AND approved)
          #   - 405 "Please try again later" when scheduled successfully but waiting
          #   - 422 on "Pull request is not mergeable" (conflict, stale base, etc.)
          #
          # 405 here is benign — Gitea's way of saying "scheduled, not merging now".
          # We treat 200/204/405 as success, anything else as failure.
          case "${STATUS}" in
            200|204)
              MERGE_OUTCOME="merged-immediately"
              echo "::notice::Promote PR #${PR_NUM} merged immediately (gates+approval already green)"
              ;;
            405)
              MERGE_OUTCOME="auto-merge-scheduled"
              echo "::notice::Promote PR #${PR_NUM}: auto-merge scheduled (Gitea will land on approval+green)"
              ;;
            422)
              MERGE_OUTCOME="not-mergeable"
              echo "::warning::Promote PR #${PR_NUM}: not mergeable (conflict, stale base, or already merging)."
              jq -r '.message // .' < "${BODY}" | head -c 500
              ;;
            *)
              echo "::error::Unexpected status ${STATUS} on merge schedule"
              jq -r '.message // .' < "${BODY}" | head -c 500
              rm -f "${BODY}"
              exit 1
              ;;
          esac
          rm -f "${BODY}"

          {
            echo "## Auto-promote PR opened"
            echo
            echo "- Source: staging at \`${TARGET_SHA:0:8}\`"
            echo "- PR: #${PR_NUM}"
            echo "- Outcome: \`${MERGE_OUTCOME}\`"
            echo
            if [ "${MERGE_OUTCOME}" = "auto-merge-scheduled" ]; then
              echo "Gitea will auto-merge once Hongming approves and all checks are green. No human action needed beyond approval."
            elif [ "${MERGE_OUTCOME}" = "merged-immediately" ]; then
              echo "Merged immediately. \`publish-workspace-server-image.yml\` will fire naturally on the resulting \`main\` push."
            else
              echo "PR is not auto-merging. Operator may need to bring staging up to date with main, then re-trigger this workflow via workflow_dispatch."
            fi
          } >> "$GITHUB_STEP_SUMMARY"
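The merge-endpoint handling in Step 3 is a pure status-code mapping (200/204 merged now, 405 scheduled and benign, 422 not mergeable, anything else fatal). Extracted as a standalone function for clarity (`classify_merge_status` is an illustrative name, not workflow code):

```shell
# classify_merge_status HTTP_CODE → outcome label, mirroring the case
# statement in the workflow; unknown codes are the only failure.
classify_merge_status() {
  case "$1" in
    200|204) echo "merged-immediately" ;;
    405)     echo "auto-merge-scheduled" ;;   # benign: armed, waiting
    422)     echo "not-mergeable" ;;          # conflict / stale base
    *)       echo "unexpected"; return 1 ;;
  esac
}

classify_merge_status 204   # → merged-immediately
classify_merge_status 405   # → auto-merge-scheduled
classify_merge_status 422   # → not-mergeable
```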

.github/workflows/auto-promote-stale-alarm.yml (vendored, 83 lines)
@@ -1,83 +0,0 @@

name: auto-promote-stale-alarm

# Hourly cron + on-demand alarm for the silent-block failure mode that
# motivated issue #2975:
#   - The auto-promote-staging.yml workflow opened a PR + armed
#     auto-merge, but main's branch protection requires a human review
#     (reviewDecision=REVIEW_REQUIRED). The PR sat BLOCKED, with nothing
#     surfacing up the stack, for 12+ hours, holding 25 commits hostage,
#     including the Memory v2 redesign and a reno-stars data-loss fix.
#
# This workflow runs `scripts/check-stale-promote-pr.sh` against the
# repo's open auto-promote PRs (base=main head=staging). When a PR has
# been BLOCKED on REVIEW_REQUIRED for >4h, it:
#   1. Emits a workflow-level warning (visible in the run summary + the
#      Actions UI feed).
#   2. Posts a comment on the PR (idempotent — one alarm per PR).
#
# The detection logic lives in scripts/check-stale-promote-pr.sh so
# it's unit-testable with a stubbed `gh` (see test-check-stale-promote-pr.sh).
# This file is the schedule + invocation surface only — the script is
# the SSOT for the detector itself.

on:
  schedule:
    # Hourly. Cheap (one `gh pr list` + jq), and 1h granularity is
    # plenty for a 4h staleness threshold — operators see the alarm
    # within at most 1h of crossing the threshold.
    - cron: "27 * * * *" # at :27 to dodge the cron herd at :00
  workflow_dispatch:
    inputs:
      stale_hours:
        description: "Hours after which a BLOCKED+REVIEW_REQUIRED PR is stale (default 4)"
        required: false
        default: "4"
      post_comment:
        description: "Post a comment on stale PRs (default true)"
        required: false
        default: "true"

permissions:
  contents: read
  pull-requests: write # post comments on stale PRs

# Serialize so the on-demand and scheduled runs don't double-comment
# the same PR. cancel-in-progress=false because the script is idempotent
# (an existing comment marker prevents dupes), and a scheduled run firing
# while a manual one runs would just re-list the same PR set.
concurrency:
  group: auto-promote-stale-alarm
  cancel-in-progress: false

jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout (need scripts/ only)
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          sparse-checkout: |
            scripts/check-stale-promote-pr.sh
          sparse-checkout-cone-mode: false
      - name: Run stale-PR detector
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GITHUB_REPOSITORY: ${{ github.repository }}
          STALE_HOURS: ${{ inputs.stale_hours || '4' }}
          POST_COMMENT: ${{ inputs.post_comment || 'true' }}
        run: |
          # The script's exit code reflects the count of stale PRs.
          # We don't want a stale finding to fail the workflow run —
          # the warning + comment are the signal; the green/red is
          # noise. So convert any non-zero exit to a workflow notice
          # and exit 0.
          set +e
          bash scripts/check-stale-promote-pr.sh
          rc=$?
          set -e
          if [ "$rc" -ne 0 ]; then
            echo "::notice::Stale PR detector found $rc PR(s) needing attention. See warnings above + comments on the PRs."
          fi
          # Always succeed — the operator-facing surface is the warning,
          # not the workflow status.
          exit 0
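The detector script itself is not shown in this diff. A hedged sketch of its core staleness test, assuming it compares the PR's creation time (as a Unix epoch) against the STALE_HOURS threshold; the function `is_stale` and the epoch-based shape are assumptions, not the real `scripts/check-stale-promote-pr.sh`:

```shell
# is_stale CREATED_EPOCH NOW_EPOCH STALE_HOURS
# Succeeds (exit 0) when the PR has been open longer than the
# threshold, i.e. the condition under which the alarm fires.
is_stale() {
  local created_epoch="$1" now_epoch="$2" stale_hours="$3"
  local age_s=$(( now_epoch - created_epoch ))
  [ "$age_s" -gt $(( stale_hours * 3600 )) ]
}

now=$(date +%s)
if is_stale $(( now - 5*3600 )) "$now" 4; then echo stale; else echo fresh; fi  # 5h old → stale
if is_stale $(( now - 1*3600 )) "$now" 4; then echo stale; else echo fresh; fi  # 1h old → fresh
```

With a 1h cron and a 4h threshold, the worst-case alarm latency is the threshold plus one cron interval, which matches the "within at most 1h of crossing the threshold" claim in the header.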

.github/workflows/auto-sync-canary.yml (vendored, 404 lines)
@@ -1,404 +0,0 @@

name: Auto-sync canary — AUTO_SYNC_TOKEN rotation drift

# Synthetic health check for the AUTO_SYNC_TOKEN secret consumed by
# auto-sync-main-to-staging.yml (PR #66) and publish-workspace-server-image.yml.
#
# ============================================================
# Why this workflow exists
# ============================================================
#
# PR #66 fixed auto-sync (replaced the GitHub-era `gh pr create` — which
# 405s on Gitea's GraphQL endpoint — with a direct git push from the
# `devops-engineer` persona's `AUTO_SYNC_TOKEN`). Hostile self-review
# weakest spot #3 of that PR:
#
#   "Token rotation silently breaks auto-sync. If AUTO_SYNC_TOKEN is
#   rotated without updating the repo secret, every push to main
#   fails red on the auto-sync push step. The workflow surfaces the
#   failure mode in the step summary (failure mode B in the header),
#   but there's no proactive monitoring."
#
# Detection latency under the status quo: rotation is only caught on
# the next push to `main`. During quiet periods (no main push for
# many hours) the staging-superset-of-main invariant silently breaks.
#
# This workflow closes the gap: every 6 hours, it exercises the auth
# surface that auto-sync depends on and emits a red workflow status
# if AUTO_SYNC_TOKEN has drifted out of validity.
#
# ============================================================
# What this checks (Option B — read-only verify)
# ============================================================
#
# 1. `GET /api/v1/user` against Gitea with the token → validates that
#    the token authenticates AND resolves to `devops-engineer` (catches
#    the case where the token was regenerated under a different
#    persona by mistake).
# 2. `GET /api/v1/repos/molecule-ai/molecule-core` with the token →
#    validates the token has `read:repository` scope on this repo
#    (the v2 scope contract — see saved memory
#    `reference_persona_token_v2_scope`).
# 3. `git push --dry-run` of the current staging SHA back to
#    `refs/heads/staging` via `https://oauth2:<token>@<gitea>/...`
#    → validates the EXACT HTTPS basic-auth path that
#    `actions/checkout` + `git push origin staging` use inside
#    auto-sync-main-to-staging.yml. A no-op by construction (pushing
#    the current tip to itself = "Everything up-to-date"); auth is
#    checked at the smart-protocol handshake BEFORE the empty-diff
#    computation, so a bad token → exit 128 with "Authentication
#    failed". `git ls-remote` is NOT used here because Gitea
#    falls back to anonymous read on public repos and would
#    silently green-light a rotated token.
#
# Each step exits non-zero with an actionable error message if it
# fails. The workflow status itself is the operator-facing surface.
#
# ============================================================
# What this does NOT check (intentional)
# ============================================================
#
# - **Branch-protection authz** (failure mode C in the auto-sync header):
#   would require an actual write to staging. Already monitored daily
#   by `branch-protection-drift.yml`. Don't duplicate.
# - **Conflict resolution** (failure mode A): a real conflict is data-
#   driven, not auth-driven; we can't synthesise one without polluting
#   staging. It already surfaces immediately on the next main push.
# - **Concurrency** (failure mode D): handled by the workflow concurrency
#   group on auto-sync; not a credential issue.
#
# ============================================================
# Why Option B (read-only) and not the alternatives
# ============================================================
#
# Considered + rejected (see issue #72 for the full write-up):
#
# - **Option A — full auto-sync on schedule**: every run creates a
#   no-op merge commit on staging when main hasn't advanced (4 noise
#   commits/day), and races the real `push:` trigger when main has
#   advanced. Rejected.
#
# - **Option C — push to a dedicated `auto-sync-canary` branch**: would
#   exercise authz too, but adds branch noise on Gitea AND requires
#   maintaining a second branch protection (or expanding staging's
#   whitelist to a junk branch). Authz is already covered by
#   `branch-protection-drift.yml`. Rejected.
#
# Prior art for the chosen Option B shape:
# - Cloudflare's `/user/tokens/verify` endpoint (a read-only auth
#   probe explicitly designed for credential canaries).
# - AWS Secrets Manager rotation Lambda's `testSecret` step (auth
#   probe before promoting AWSPENDING → AWSCURRENT).
# - HashiCorp Vault's `vault token lookup` for renewal canaries.
#
# ============================================================
# Operator runbook — what to do when this workflow goes RED
# ============================================================
#
# 1. **Identify which step failed**:
#    - Step "Verify token authenticates as devops-engineer" red →
#      token is invalid OR resolves to the wrong persona.
#    - Step "Verify token has repo read scope" red → token valid but
#      stripped of `read:repository` scope (or repo perms changed).
#    - Step "Verify git HTTPS auth path via no-op dry-run push to
#      staging" red → token rotated/revoked OR Gitea's git-HTTPS
#      surface is broken (rare). The auth check happens on the
#      smart-protocol handshake, separate from the API path.
#
# 2. **Re-issue the token** on the operator host:
#    ```
#    ssh root@5.78.80.188 'docker exec --user git molecule-gitea-1 \
#      gitea admin user generate-access-token \
#      --username devops-engineer \
#      --token-name persona-devops-engineer-vN \
#      --scopes "read:repository,write:repository,read:user,read:organization,read:issue,write:issue,read:notification,read:misc"'
#    ```
#    Update `/etc/molecule-bootstrap/agent-secrets.env` in place
#    (per `feedback_unified_credentials_file`). The previous token
#    file lands at `.bak.<date>`.
#
# 3. **Update the repo Actions secret** at:
#    Settings → Secrets and variables → Actions → AUTO_SYNC_TOKEN
#    Paste the new token. (Don't echo it in chat — but per
#    `feedback_passwords_in_chat_are_burned`, a paste in a 1:1
#    Claude session is within the trust boundary.)
#
# 4. **Re-run this canary** via workflow_dispatch. Confirm GREEN.
#
# 5. **Backfill any missed main → staging syncs** by re-running
#    `auto-sync-main-to-staging.yml` from its workflow_dispatch
#    surface, OR by pushing an empty commit to main (if you'd
#    rather force a real trigger).
#
# ============================================================
# Security notes
# ============================================================
#
# - Token usage: read-only (`GET /api/v1/user`, `GET /api/v1/repos/...`,
#   `git ls-remote`, plus a dry-run push that writes nothing). No write
#   paths. Same blast-radius profile as `actions/checkout` on a public
#   repo.
# - The token NEVER appears in logs: every `curl` passes it via a header
#   variable, never inline; the git URL builds the
#   `oauth2:$TOKEN@host` form into a single variable that's never
#   echoed. GitHub Actions secret-masking covers anything that does
#   slip through.
# - No new token is introduced — this is the same `AUTO_SYNC_TOKEN` the
#   workflow under monitor uses. Per least-privilege we deliberately do
#   NOT broaden scope for the canary.

on:
  schedule:
    # Every 6 hours at :17 (offsets the cron herd at :00). Justification
    # from issue #72: cheap to run (~5s wall-clock, no quota), 3h average
    # detection latency, 6h max. 1h would be 24× the runs for marginal
    # benefit; daily would be 6× longer latency and worse than the
    # status quo on a quiet-main day.
    - cron: '17 */6 * * *'
  workflow_dispatch:

# No concurrency group needed — the canary is read-only and idempotent.
# Two parallel runs (e.g. an operator dispatch during a scheduled tick)
# are harmless: same result, doubled HTTPS calls, no shared state.

permissions:
  contents: read

jobs:
  verify-token:
    name: Verify AUTO_SYNC_TOKEN validity
    runs-on: ubuntu-latest
    # 2 min surfaces hangs (Gitea API stall, DNS issue) within one
    # cron interval. The realistic worst case is ~10s: 2 curls + 1 git
    # ls-remote, each capped by the explicit timeouts below.
    timeout-minutes: 2

    env:
      # Pinned in env so individual steps can read it without
      # repeating the secret reference. GitHub masks the value in
      # logs automatically.
      AUTO_SYNC_TOKEN: ${{ secrets.AUTO_SYNC_TOKEN }}
      # MUST stay in sync with auto-sync-main-to-staging.yml's
      # `git config user.name "devops-engineer"` line. Renaming the
      # devops-engineer persona requires updating both files (and
      # the staging branch protection's `push_whitelist_usernames`).
      EXPECTED_PERSONA: devops-engineer
      GITEA_HOST: git.moleculesai.app
      REPO_PATH: molecule-ai/molecule-core

    steps:
      - name: Verify AUTO_SYNC_TOKEN secret is configured
        # Schedule-vs-dispatch behaviour split, per
        # `feedback_schedule_vs_dispatch_secrets_hardening`:
        #
        # - schedule: hard-fail when the secret is missing. The
        #   whole point of the canary is to surface drift; soft-
        #   skipping on missing-secret would make the canary
        #   itself drift-invisible (sweep-cf-orphans #2088 lesson).
        # - workflow_dispatch: hard-fail too — there's no scenario
        #   where an operator wants this canary to silently no-op.
        #   The workflow has no other ad-hoc utility; if you ran
        #   it, you wanted the answer.
        run: |
          if [ -z "${AUTO_SYNC_TOKEN}" ]; then
            echo "::error::AUTO_SYNC_TOKEN secret is not set on this repo." >&2
            echo "::error::Set it at Settings → Secrets and variables → Actions." >&2
            echo "::error::Without it, auto-sync-main-to-staging.yml will fail every push to main." >&2
            exit 1
          fi
          echo "AUTO_SYNC_TOKEN is configured (value masked)."

      - name: Verify token authenticates as ${{ env.EXPECTED_PERSONA }}
        # Calls Gitea's `/api/v1/user` — the canonical
        # auth-probe-with-no-side-effects endpoint (mirrors
        # Cloudflare's /user/tokens/verify).
        #
        # Failure surfaces:
        # - HTTP 401: token invalid (rotated, revoked, or never
        #   correctly registered).
        # - HTTP 200 but username != devops-engineer: the token was
        #   regenerated under the wrong persona — auth would pass,
        #   but commit attribution would be wrong, and
        #   branch-protection authz would fail because only
        #   `devops-engineer` is whitelisted.
        run: |
          set -euo pipefail
          response_file="$(mktemp)"
          code_file="$(mktemp)"
          # `--max-time 30`: full-call ceiling. `--connect-timeout 10`:
          # DNS + TCP. `-w "%{http_code}"` routed to a tempfile so curl's
          # exit code can't pollute the captured status — see
          # feedback_curl_status_capture_pollution + the
          # `lint-curl-status-capture.yml` gate that rejects the unsafe
          # `$(curl ... || echo "000")` shape.
          set +e
          curl -sS -o "$response_file" \
            --max-time 30 --connect-timeout 10 \
            -w "%{http_code}" \
            -H "Authorization: token ${AUTO_SYNC_TOKEN}" \
            -H "Accept: application/json" \
            "https://${GITEA_HOST}/api/v1/user" >"$code_file" 2>/dev/null
          set -e
          status=$(cat "$code_file" 2>/dev/null || true)
          [ -z "$status" ] && status="000"

          if [ "$status" != "200" ]; then
            echo "::error::Token rotation suspected: GET /api/v1/user returned HTTP $status (expected 200)." >&2
            echo "::error::Likely cause: AUTO_SYNC_TOKEN has been rotated/revoked on Gitea but the repo Actions secret was not updated." >&2
            echo "::error::Runbook: see the header comment of this workflow file." >&2
            # Print the response body but redact anything that looks like a token.
            sed -E 's/[A-Fa-f0-9]{32,}/<redacted>/g' "$response_file" >&2 || true
            exit 1
          fi

          username=$(python3 -c "import json,sys; print(json.load(open(sys.argv[1])).get('login',''))" "$response_file")
          if [ "$username" != "${EXPECTED_PERSONA}" ]; then
            echo "::error::Token resolves to user '$username', expected '${EXPECTED_PERSONA}'." >&2
            echo "::error::AUTO_SYNC_TOKEN must be the devops-engineer persona PAT (not the founder PAT, not another persona)." >&2
            echo "::error::The auto-sync push will fail because only 'devops-engineer' is whitelisted on staging branch protection." >&2
            exit 1
          fi
          echo "Token authenticates as: $username ✓"
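The `login` extraction above deliberately uses python3 rather than jq. A self-contained repro of that exact parse against a canned `/api/v1/user` response (the JSON body here is a fabricated example shape, not a real Gitea payload):

```shell
# Canned response standing in for what curl wrote to $response_file.
response_file=$(mktemp)
cat > "$response_file" <<'EOF'
{"id": 7, "login": "devops-engineer", "full_name": "DevOps Engineer"}
EOF

# Identical parse line to the workflow step: pull `login`, defaulting
# to the empty string when the key is absent.
username=$(python3 -c "import json,sys; print(json.load(open(sys.argv[1])).get('login',''))" "$response_file")
echo "$username"   # → devops-engineer
```

The `.get('login','')` default matters: a malformed or error-shaped body yields an empty string, which then fails the `!= "${EXPECTED_PERSONA}"` comparison instead of crashing the parse.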
|
||||
|
||||
- name: Verify token has repo read scope
|
||||
# `GET /api/v1/repos/<owner>/<repo>` requires `read:repository`
|
||||
# on the persona's v2 scope contract. If the scope was
|
||||
# narrowed/dropped on rotation we catch it here, before the
|
||||
# next main push reveals it via a checkout failure.
|
||||
run: |
|
||||
set -euo pipefail
|
||||
response_file="$(mktemp)"
|
||||
code_file="$(mktemp)"
|
||||
# See first probe step for the rationale on the tempfile-routed
|
||||
# `-w "%{http_code}"` pattern — the unsafe `|| echo "000"` shape
|
||||
# is rejected by lint-curl-status-capture.yml.
|
||||
set +e
|
||||
curl -sS -o "$response_file" \
--max-time 30 --connect-timeout 10 \
-w "%{http_code}" \
-H "Authorization: token ${AUTO_SYNC_TOKEN}" \
-H "Accept: application/json" \
"https://${GITEA_HOST}/api/v1/repos/${REPO_PATH}" >"$code_file" 2>/dev/null
set -e
status=$(cat "$code_file" 2>/dev/null || true)
[ -z "$status" ] && status="000"

if [ "$status" != "200" ]; then
echo "::error::Token lacks read:repository scope on ${REPO_PATH}: HTTP $status." >&2
echo "::error::Auto-sync's actions/checkout step will fail with this token." >&2
echo "::error::Re-issue with v2 scope contract: read:repository,write:repository,read:user,read:organization,read:issue,write:issue,read:notification,read:misc" >&2
sed -E 's/[A-Fa-f0-9]{32,}/<redacted>/g' "$response_file" >&2 || true
exit 1
fi
echo "Token has read:repository on ${REPO_PATH} ✓"

- name: Verify git HTTPS auth path via no-op dry-run push to staging
# Final probe: exercise the EXACT auth path that
# `actions/checkout` + `git push origin staging` use in
# auto-sync-main-to-staging.yml. Gitea's API and git-HTTPS
# surfaces share the token-lookup code path internally but
# the wire-level error shapes differ — historically (#173)
# the API path was healthy while git-HTTPS rejected, so
# checking only the API would have given false-green.
#
# IMPORTANT: `git ls-remote` on a public repo (which
# molecule-core is) succeeds even with a junk token because
# Gitea falls back to anonymous-read. `ls-remote` therefore
# CANNOT validate auth on this surface. We use
# `git push --dry-run` instead — push is auth-gated even on
# public repos.
#
# NOP shape: read the current staging SHA via authenticated
# ls-remote (the SHA itself is public; auth is incidental
# here, used only to colocate the discovery in one step), then
# `git push --dry-run <SHA>:refs/heads/staging`. Pushing the
# current tip back to itself is "Everything up-to-date" with
# exit 0 when auth succeeds. With a bad token Gitea returns
# HTTP 401 in the smart-protocol handshake and git exits 128
# with "Authentication failed".
#
# The dry-run never reaches Gitea's pre-receive hook (which
# is where branch-protection authz runs), so this probe does
# not validate failure mode C. That's intentional —
# branch-protection-drift.yml owns authz monitoring; this
# canary owns auth.
env:
# Don't hang waiting for password prompt if auth fails on a
# terminal-attached run. (In Actions there's no terminal,
# but the env-var hardens against an interactive runner
# config.)
GIT_TERMINAL_PROMPT: "0"
run: |
set -euo pipefail
# Token is in $AUTO_SYNC_TOKEN (job-level env). Compose the
# URL as a local var that's never echoed.
url="https://oauth2:${AUTO_SYNC_TOKEN}@${GITEA_HOST}/${REPO_PATH}"

# Step a: read current staging SHA. ~1KB; auth-gated only
# on private repos but always works on public — used here
# only to discover the SHA, not to validate auth.
staging_ref=$(timeout 30s git ls-remote --refs "$url" refs/heads/staging 2>&1) || {
redacted=$(echo "$staging_ref" | sed -E "s|oauth2:[^@]+@|oauth2:<redacted>@|g")
echo "::error::ls-remote against staging failed (network/DNS issue):" >&2
echo "$redacted" >&2
exit 1
}
if ! echo "$staging_ref" | grep -qE '^[0-9a-f]{40}[[:space:]]+refs/heads/staging$'; then
echo "::error::ls-remote returned unexpected shape:" >&2
echo "$staging_ref" | sed -E "s|oauth2:[^@]+@|oauth2:<redacted>@|g" >&2
exit 1
fi
staging_sha=$(echo "$staging_ref" | awk '{print $1}')

# Step b: spin up an ephemeral local repo. `git push` always
# requires a local repo even when pushing a remote SHA that
# isn't in the local object DB (the protocol negotiates and
# discovers we don't need to send any objects). We don't use
# `actions/checkout` for this — it would clone the whole
# repo (~hundreds of MB) for what's essentially `git init`.
tmp_repo="$(mktemp -d)"
trap 'rm -rf "$tmp_repo"' EXIT
git -C "$tmp_repo" init -q
# Author config required for any git operation; values are
# arbitrary because nothing gets committed here.
git -C "$tmp_repo" config user.email canary@auto-sync.local
git -C "$tmp_repo" config user.name auto-sync-canary

# Step c: dry-run push the current staging SHA back to
# staging. NOP by construction — the remote tip equals the
# SHA we're pushing, so "Everything up-to-date" is the
# success path.
#
# Authentication is checked at the smart-protocol handshake,
# BEFORE the dry-run can compute an empty diff. Bad token
# → "Authentication failed", exit 128. Good token → exit 0.
set +e
push_out=$(timeout 30s git -C "$tmp_repo" push --dry-run "$url" "${staging_sha}:refs/heads/staging" 2>&1)
push_rc=$?
set -e

if [ "$push_rc" -ne 0 ]; then
redacted=$(echo "$push_out" | sed -E "s|oauth2:[^@]+@|oauth2:<redacted>@|g")
echo "::error::Token rotation suspected: git push --dry-run against staging failed via the AUTO_SYNC_TOKEN HTTPS auth path (exit $push_rc)." >&2
echo "::error::This is the EXACT auth path that actions/checkout + git push use in auto-sync-main-to-staging.yml." >&2
echo "::error::Likely cause: AUTO_SYNC_TOKEN was rotated/revoked on Gitea but the repo Actions secret was not updated. Runbook: see header." >&2
echo "$redacted" >&2
exit 1
fi

echo "git HTTPS auth path: NOP push --dry-run to staging → ${staging_sha:0:8} ✓"
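The no-op property this probe relies on — pushing the remote's current tip back to itself transfers nothing and exits 0 — can be reproduced offline. A minimal sketch, substituting a local bare repository for the Gitea remote (all paths and identities below are made up for illustration):

```shell
# Local bare repo standing in for the Gitea remote (hypothetical).
remote=$(mktemp -d)
work=$(mktemp -d)
git init -q --bare "$remote"
git -C "$work" init -q -b staging        # -b needs git >= 2.28
git -C "$work" -c user.email=c@local -c user.name=canary \
  commit -q --allow-empty -m "tip"
git -C "$work" push -q "$remote" staging
sha=$(git -C "$work" rev-parse HEAD)
# Dry-run push of the remote's current tip back to itself:
# nothing to send, exit code 0 ("Everything up-to-date").
git -C "$work" push --dry-run "$remote" "${sha}:refs/heads/staging" \
  && echo "no-op dry-run push: rc=0"
rm -rf "$remote" "$work"
```

With a wrong-credential HTTPS URL, the same command would instead fail during the handshake — which is exactly the signal the canary converts into a red run.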

- name: Summarise canary result
# Everything passed — surface a green summary. (Failures
# already wrote ::error:: lines and exited above; if we got
# here, all three probes passed.)
run: |
{
echo "## Auto-sync canary: GREEN"
echo ""
echo "AUTO_SYNC_TOKEN is healthy:"
echo "- Authenticates as \`${EXPECTED_PERSONA}\` ✓"
echo "- Has \`read:repository\` scope on \`${REPO_PATH}\` ✓"
echo "- Git HTTPS auth path: no-op dry-run push to \`refs/heads/staging\` succeeds ✓"
echo ""
echo "Auto-sync main → staging will succeed on the next push to main."
echo "If this canary ever goes RED, see the runbook in this workflow's header."
} >> "$GITHUB_STEP_SUMMARY"
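Both redaction idioms used by these probes — the `oauth2:<token>@` URL scrub and the 32-plus-hex-character scrub — are plain `sed -E` one-liners and can be sanity-checked in isolation. A sketch with made-up sample values (nothing here is a real token or SHA):

```shell
# Hypothetical failure output containing an embedded credential.
url_err="fatal: unable to access 'https://oauth2:abc123TOKEN@git.example.com/owner/repo/'"
echo "$url_err" | sed -E "s|oauth2:[^@]+@|oauth2:<redacted>@|g"
# → fatal: unable to access 'https://oauth2:<redacted>@git.example.com/owner/repo/'

# Hypothetical API body containing a 40-hex-char blob.
api_body='{"sha":"0123456789abcdef0123456789abcdef01234567"}'
echo "$api_body" | sed -E 's/[A-Fa-f0-9]{32,}/<redacted>/g'
# → {"sha":"<redacted>"}
```

The second pattern is deliberately greedy: anything 32 hex characters or longer is scrubbed, so commit SHAs get redacted along with tokens — an acceptable trade for log hygiene.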
255
.github/workflows/auto-sync-main-to-staging.yml
vendored
@ -1,255 +0,0 @@
name: Auto-sync main → staging

# Reflects every push to `main` back onto `staging` so the
# staging-as-superset-of-main invariant holds.
#
# ============================================================
# What this workflow does
# ============================================================
#
# On every push to `main`:
# 1. Checks if staging already contains main → no-op.
# 2. Fetches both branches, merges main into staging in the
# runner workspace (fast-forward if possible, else
# `--no-ff` merge commit).
# 3. Pushes staging directly to origin via the
# `devops-engineer` persona's `AUTO_SYNC_TOKEN`.
#
# Authoritative path: a single `git push origin staging` from
# inside this workflow is the SSOT for advancing staging after
# a main push. No PR, no merge queue, no human approval —
# staging is mechanically maintained as a superset of main.
#
# `auto-promote-staging.yml` is the reverse-direction
# counterpart (staging → main, gated on green CI). Together
# they keep the staging-superset-of-main invariant tight.
#
# ============================================================
# Why direct push (and not "open a PR")
# ============================================================
#
# Pre-2026-05-06 the canonical SCM was GitHub.com, where:
# - The `staging` branch had a `merge_queue` ruleset that
# blocked ALL direct pushes (no bypass even for org
# admins or the GitHub Actions integration).
# - Therefore this workflow opened a PR via `gh pr create`
# and let auto-merge land it through the queue.
#
# Post-2026-05-06 the canonical SCM is Gitea
# (`git.moleculesai.app/molecule-ai/molecule-core`). Gitea:
# - Has no `merge_queue` concept.
# - Allows direct push to protected branches via per-user
# `push_whitelist_usernames` on the branch protection.
# - Does not expose a GraphQL endpoint, so `gh pr create`
# returns `HTTP 405 Method Not Allowed
# (https://git.moleculesai.app/api/graphql)` — the
# pre-suspension architecture cannot work on Gitea.
#
# The molecule-ai/molecule-core staging branch protection
# (verified via `GET /api/v1/repos/.../branch_protections`)
# whitelists `devops-engineer` for direct push. So the
# correct Gitea-shape architecture is: authenticate as
# `devops-engineer`, merge locally, push staging directly.
#
# This is structurally simpler than the GitHub-era PR dance
# and removes the dependence on `gh` CLI / GraphQL entirely.
#
# ============================================================
# Identity + token (anti-bot-ring per saved-memory
# `feedback_per_agent_gitea_identity_default`)
# ============================================================
#
# This workflow uses `secrets.AUTO_SYNC_TOKEN`, which is a
# personal access token issued to the `devops-engineer`
# persona on Gitea — NOT the founder PAT. The bot-ring
# fingerprint that triggered the GitHub org suspension on
# 2026-05-06 was characterised by founder PAT acting as CI
# at machine speed; per-persona identities split the
# attribution honestly.
#
# Token scope on Gitea: repo write. Push target restricted
# to `staging` (this workflow is the only writer; main is
# untouched). Compromise blast radius: bounded to staging
# branch + this repo's read surface.
#
# Commits are authored by the persona email
# `devops-engineer@agents.moleculesai.app` so commit history
# reflects which automation produced the merge.
#
# ============================================================
# Failure modes & operational notes
# ============================================================
#
# A — staging has commits main doesn't, and the merge
# conflicts:
# - The `--no-ff` merge step exits non-zero. Workflow
# fails red. Operator (devops-engineer or human)
# resolves manually:
# git fetch origin
# git checkout staging
# git merge --no-ff origin/main
# # resolve conflicts
# git push origin staging
# - Step summary surfaces the conflict so the failed run
# is self-explanatory.
#
# B — `AUTO_SYNC_TOKEN` rotated / wrong scope:
# - `git push` step exits non-zero with `HTTP 401` /
# `403`. Step summary surfaces the failed push.
# - Re-issue the token from `~/.molecule-ai/personas/`
# on the operator host and update the repo Actions
# secret. Re-run the workflow.
#
# C — staging branch protection no longer whitelists
# `devops-engineer`:
# - `git push` exits non-zero with a Gitea protected-
# branch rejection. Step summary surfaces it.
# - Re-add `devops-engineer` to
# `push_whitelist_usernames` on the staging
# protection (Settings → Branches → staging).
#
# D — concurrent push to main while a sync is in flight:
# - The `concurrency` group below serialises runs.
# The second waits for the first; if main advances
# again while we're syncing, the second run picks
# up the new tip on its own fetch.
#
# ============================================================
# Loop safety
# ============================================================
#
# The push to staging from this workflow does NOT itself
# fire a `push: branches: [main]` event (different branch),
# so there's no risk of self-recursion. `auto-promote-staging.yml`
# fires on `workflow_run` of CI etc. — it sees the new
# staging tip on its next gate-completion event, NOT on this
# push directly. No loop.
on:
push:
branches: [main]
# workflow_dispatch lets operators manually backfill a
# missed sync (e.g. if AUTO_SYNC_TOKEN was rotated and a
# main push slipped through while the secret was stale).
workflow_dispatch:

permissions:
contents: write

concurrency:
group: auto-sync-main-to-staging
cancel-in-progress: false

jobs:
sync-staging:
runs-on: ubuntu-latest
steps:
- name: Checkout staging (with devops-engineer push token)
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
ref: staging
# AUTO_SYNC_TOKEN authenticates as the
# `devops-engineer` Gitea persona — the only
# identity whitelisted for direct push to
# staging. See header comment for context.
token: ${{ secrets.AUTO_SYNC_TOKEN }}

- name: Configure git author
run: |
# Per-persona identity, NOT founder PAT.
# `feedback_per_agent_gitea_identity_default`.
git config user.name "devops-engineer"
git config user.email "devops-engineer@agents.moleculesai.app"

- name: Check if staging already contains main
id: check
run: |
set -euo pipefail
git fetch origin main
if git merge-base --is-ancestor origin/main HEAD; then
echo "needs_sync=false" >> "$GITHUB_OUTPUT"
{
echo "## No-op"
echo
echo "staging already contains \`origin/main\` ($(git rev-parse --short=8 origin/main))."
} >> "$GITHUB_STEP_SUMMARY"
else
echo "needs_sync=true" >> "$GITHUB_OUTPUT"
MAIN_SHORT=$(git rev-parse --short=8 origin/main)
echo "main_short=${MAIN_SHORT}" >> "$GITHUB_OUTPUT"
echo "::notice::staging is missing main's tip (${MAIN_SHORT}) — merging in-runner and pushing"
fi

- name: Merge main into staging (in-runner)
if: steps.check.outputs.needs_sync == 'true'
id: merge
run: |
set -euo pipefail
# Already on staging from checkout. Try fast-forward
# first (cleanest history); fall back to merge commit
# if staging has commits main doesn't.
if git merge --ff-only origin/main; then
echo "did_ff=true" >> "$GITHUB_OUTPUT"
echo "::notice::Fast-forwarded staging to origin/main"
else
echo "did_ff=false" >> "$GITHUB_OUTPUT"
if ! git merge --no-ff origin/main \
-m "chore: sync main → staging (auto, ${{ steps.check.outputs.main_short }})"; then
# Hygiene: leave the work tree clean before failing.
git merge --abort || true
{
echo "## Conflict"
echo
echo "Auto-merge \`main → staging\` failed with conflicts."
echo "A human (or devops-engineer persona) needs to resolve manually:"
echo
echo '```'
echo "git fetch origin"
echo "git checkout staging"
echo "git merge --no-ff origin/main"
echo "# resolve conflicts"
echo "git push origin staging"
echo '```'
} >> "$GITHUB_STEP_SUMMARY"
exit 1
fi
fi
- name: Push staging to origin
if: steps.check.outputs.needs_sync == 'true'
run: |
set -euo pipefail
# Direct push to staging. devops-engineer persona is
# whitelisted for direct push on the staging branch
# protection (Settings → Branches → staging).
#
# No --force / --force-with-lease: a fast-forward or
# legitimate merge commit on top of current staging
# is the only thing we'd ever push. If origin/staging
# advanced under us (concurrent merge), the push
# legitimately rejects and the next run picks up the
# new state.
if ! git push origin staging; then
{
echo "## Push rejected"
echo
echo "Direct push to \`staging\` failed. Likely causes:"
echo "- \`AUTO_SYNC_TOKEN\` rotated / wrong scope (HTTP 401/403)"
echo "- \`devops-engineer\` no longer in"
echo " \`push_whitelist_usernames\` on the staging"
echo " branch protection (HTTP 422)"
echo "- staging advanced concurrently — re-running this"
echo " workflow on the new main tip will pick it up"
} >> "$GITHUB_STEP_SUMMARY"
exit 1
fi

{
echo "## Auto-sync succeeded"
echo
echo "- staging advanced to: \`$(git rev-parse --short=8 HEAD)\`"
echo "- main tip: \`${{ steps.check.outputs.main_short }}\`"
echo "- Strategy: $([ "${{ steps.merge.outputs.did_ff }}" = "true" ] && echo "fast-forward" || echo "merge commit")"
echo "- Pushed by: \`devops-engineer\` (per-agent persona, anti-bot-ring)"
} >> "$GITHUB_STEP_SUMMARY"
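The `git merge-base --is-ancestor origin/main HEAD` guard above carries the whole no-op decision; its exit-code semantics can be demonstrated with a throwaway local repo (branch names mirror the workflow, everything else is hypothetical):

```shell
tmp=$(mktemp -d)
git -C "$tmp" init -q -b main            # -b needs git >= 2.28
git -C "$tmp" -c user.email=a@local -c user.name=t \
  commit -q --allow-empty -m "base"
git -C "$tmp" checkout -q -b staging
git -C "$tmp" -c user.email=a@local -c user.name=t \
  commit -q --allow-empty -m "staging-only work"
# staging contains main's tip -> exit 0 -> the workflow no-ops.
if git -C "$tmp" merge-base --is-ancestor main staging; then
  echo "needs_sync=false"
fi
# Advance main past staging -> exit 1 -> the workflow merges and pushes.
git -C "$tmp" checkout -q main
git -C "$tmp" -c user.email=a@local -c user.name=t \
  commit -q --allow-empty -m "new main work"
git -C "$tmp" merge-base --is-ancestor main staging || echo "needs_sync=true"
rm -rf "$tmp"
```

Note the check is purely topological: commits unique to staging don't matter, only whether main's tip is reachable from staging's.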
82
.github/workflows/canary-staging.yml
vendored
@ -20,6 +20,19 @@ on:
# a few minutes under load — that's fine for a canary.
- cron: '*/30 * * * *'
workflow_dispatch:
inputs:
keep_on_failure:
description: >-
Skip teardown when the canary fails (debugging only). The
tenant org + EC2 + CF tunnel + DNS stay alive so an operator
can SSM into the workspace EC2 and capture docker logs of the
failing claude-code container. REMEMBER to manually delete
via DELETE /cp/admin/tenants/<slug> when done so the org
doesn't accumulate cost. Only honored on workflow_dispatch;
cron runs always tear down (we don't want unattended cron
to leak resources).
type: boolean
default: false

# Serialise with the full-SaaS workflow so they don't contend for the
# same org-create quota on staging. Different group key from
@ -80,6 +93,14 @@ jobs:
# is "Token Plan only" but cheap-per-token and fast.
E2E_MODEL_SLUG: MiniMax-M2.7-highspeed
E2E_RUN_ID: "canary-${{ github.run_id }}"
# Debug-only: when an operator dispatches with keep_on_failure=true,
# the canary script's E2E_KEEP_ORG=1 path skips teardown so the
# tenant org + EC2 stay alive for SSM-based log capture. Cron runs
# never set this (the input only exists on workflow_dispatch) so
# unattended cron always tears down. See molecule-core#129
# failure mode #1 — capturing the actual exception requires
# docker logs from the live container.
E2E_KEEP_ORG: ${{ github.event.inputs.keep_on_failure == 'true' && '1' || '0' }}

steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@ -137,27 +158,28 @@ jobs:
id: canary
run: bash tests/e2e/test_staging_full_saas.sh

# Alerting: open an issue only after THREE consecutive failures so
# transient flakes (Cloudflare DNS hiccup, AWS API blip) don't spam
# the issue list. If an issue is already open, we still comment on
# every failure so ops sees the streak. Auto-close on next green.
# Alerting: open a sticky issue on the FIRST failure; comment on
# subsequent failures; auto-close on next green. Comment-on-existing
# de-duplicates so a single open issue accumulates the streak —
# ops sees one issue with N comments rather than N issues.
#
# Threshold rationale: canary fires every 30 min, so 3 failures =
# ~90 min of consecutive red — well past any single-run flake but
# still tight enough that a real outage gets surfaced before the
# next deploy window.
# Why no consecutive-failures threshold (e.g., wait 3 runs before
# filing): the prior threshold check used
# `github.rest.actions.listWorkflowRuns()` which Gitea 1.22.6 does
# not expose (returns 404). On Gitea Actions the threshold call
# ALWAYS failed, breaking the entire alerting step and going days
# silent on real regressions (38h+ chronic red on 2026-05-07/08
# before this fix; tracked in molecule-core#129). Filing on first
# failure is also better UX — we want to know about the first red,
# not wait 90 min for it to "count." Real flakes get one issue +
# a quick close-on-green; persistent reds accumulate comments.
- name: Open issue on failure
if: failure()
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
env:
# Inject the workflow path explicitly — context.workflow is
# the *name*, not the file path the actions API needs.
WORKFLOW_PATH: '.github/workflows/canary-staging.yml'
CONSECUTIVE_THRESHOLD: '3'
with:
script: |
const title = '🔴 Canary failing: staging SaaS smoke';
const runURL = `https://github.com/${context.repo.owner}/${context.repo.repo}/actions/runs/${context.runId}`;
const runURL = `${context.serverUrl}/${context.repo.owner}/${context.repo.repo}/actions/runs/${context.runId}`;

// Find an existing open canary issue (stable title match).
// If one exists, this isn't a "first failure" — comment and exit.
@ -177,32 +199,12 @@ jobs:
return;
}

// No open issue yet — check the last N-1 runs' conclusions.
// We open the issue only if the last (THRESHOLD-1) runs ALSO
// failed (so this is the 3rd consecutive red).
const threshold = parseInt(process.env.CONSECUTIVE_THRESHOLD, 10);
const { data: runs } = await github.rest.actions.listWorkflowRuns({
owner: context.repo.owner, repo: context.repo.repo,
workflow_id: process.env.WORKFLOW_PATH,
status: 'completed',
per_page: threshold,
// Skip the current in-progress run; it isn't 'completed' yet.
});
// listWorkflowRuns returns recent first. We need (threshold-1)
// prior failures (current run is the threshold-th).
const priorFailures = (runs.workflow_runs || [])
.slice(0, threshold - 1)
.filter(r => r.id !== context.runId)
.filter(r => r.conclusion === 'failure')
.length;
if (priorFailures < threshold - 1) {
core.info(`Below threshold: ${priorFailures + 1}/${threshold} consecutive failures — not filing yet`);
return;
}

// No open issue yet — file one on this first failure. The
// comment-on-existing branch above means subsequent failures
// accumulate as comments on this same issue, so we don't
// spam new issues per run.
const body =
`Canary run failed at ${new Date().toISOString()}, ` +
`${threshold} consecutive runs red.\n\n` +
`Canary run failed at ${new Date().toISOString()}.\n\n` +
`Run: ${runURL}\n\n` +
`This issue auto-closes on the next green canary run. ` +
`Consecutive failures add a comment here rather than a new issue.`;
@ -211,7 +213,7 @@ jobs:
title, body,
labels: ['canary-staging', 'bug'],
});
core.info(`Opened canary failure issue (${threshold} consecutive reds)`);
core.info('Opened canary failure issue (first red)');

- name: Auto-close canary issue on success
if: success()
99
.github/workflows/check-merge-group-trigger.yml
vendored
@ -14,6 +14,13 @@ name: Check merge_group trigger on required workflows
# Reasoning for staging-only: main has its own CI gating model (PR review),
# but staging is what the merge queue runs on, so it's the trigger that
# matters.
#
# Gitea stub: Gitea has no merge queue feature and no `merge_group:`
# event type. The linter would find no `merge_group:` triggers to verify
# (they don't exist on Gitea), so the lint is vacuously satisfied.
# Converting to a no-op stub keeps the workflow+job name stable for any
# commit-status context consumers while eliminating the `gh api` call
# that fails against Gitea's REST surface (#75 / PR-D).

on:
pull_request:
@ -25,9 +32,6 @@ on:
paths:
- '.github/workflows/**.yml'
- '.github/workflows/**.yaml'
# Self-listen on merge_group so the linter passes its own queue run.
merge_group:
types: [checks_requested]

jobs:
check:
@ -36,88 +40,9 @@ jobs:
permissions:
contents: read
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Verify merge_group trigger on required-check workflows
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
REPO: ${{ github.repository }}
shell: bash
- name: Gitea no-op (merge queue not applicable)
run: |
set -euo pipefail

# Branch we care about — the one merge queue runs on.
BRANCH=staging

# Pull the list of required status check contexts. If the branch
# has no protection or no required checks, exit clean — nothing
# to lint.
REQUIRED=$(gh api "repos/${REPO}/branches/${BRANCH}/protection/required_status_checks" \
--jq '.contexts[]' 2>/dev/null || true)
if [ -z "$REQUIRED" ]; then
echo "No required status checks on ${BRANCH} — nothing to verify."
exit 0
fi

echo "Required checks on ${BRANCH}:"
echo "${REQUIRED}" | sed 's/^/ - /'
echo

# Build a map: workflow file -> set of job names declared in it.
# We use yq if available, otherwise grep the `name:` lines under
# `jobs:`. Stick with grep for portability — runner image always
# has it; yq isn't in the default image as of 2026-04.
declare -A workflow_jobs
shopt -s nullglob
for wf in .github/workflows/*.yml .github/workflows/*.yaml; do
[ -f "$wf" ] || continue
# Extract the workflow name (the `name:` at file root).
wf_name=$(awk '/^name:[[:space:]]/ {sub(/^name:[[:space:]]+/,""); gsub(/^"|"$/,""); print; exit}' "$wf")
# Extract job step names from the `jobs:` block. A job step is:
# - id under `jobs:` (key with 2-space indent followed by colon)
# - the `name:` field inside that job (4-space indent)
# We collect both because required_status_checks contexts can
# match either, depending on how the workflow was authored.
jobs_block=$(awk '/^jobs:/{flag=1; next} flag' "$wf")
job_names=$(echo "$jobs_block" | awk '/^[[:space:]]{4}name:[[:space:]]/ {sub(/^[[:space:]]+name:[[:space:]]+/,""); gsub(/^["'"'"']|["'"'"']$/,""); print}')
workflow_jobs["$wf"]="${wf_name}"$'\n'"${job_names}"
done

# For each required check, find the workflow that produces it.
# Then verify that workflow lists merge_group as a trigger.
FAILED=0
while IFS= read -r check; do
[ -z "$check" ] && continue
owning_wf=""
for wf in "${!workflow_jobs[@]}"; do
if echo "${workflow_jobs[$wf]}" | grep -Fxq "$check"; then
owning_wf="$wf"
break
fi
done

if [ -z "$owning_wf" ]; then
echo "::warning::Required check '${check}' has no matching workflow in this repo. Skipping (may be from an external app)."
continue
fi

# Does the workflow's trigger list include merge_group?
# Match either bare `merge_group:` line or merge_group with
# subsequent indented config (types: [checks_requested]).
if grep -qE '^[[:space:]]*merge_group:' "$owning_wf"; then
echo "OK: '${check}' (in $owning_wf) — has merge_group trigger"
else
echo "::error file=${owning_wf}::Required check '${check}' is produced by ${owning_wf}, but the workflow does not declare a 'merge_group:' trigger. With merge queue enabled on ${BRANCH}, this will deadlock the queue (every PR sits AWAITING_CHECKS forever). Add this to the workflow's 'on:' block:"
echo "::error file=${owning_wf}:: merge_group:"
echo "::error file=${owning_wf}:: types: [checks_requested]"
FAILED=1
fi
done <<< "$REQUIRED"

if [ "$FAILED" -ne 0 ]; then
echo
echo "::error::Block. See errors above. Reference: $(grep -l 'reference_merge_queue' /dev/null 2>/dev/null || echo 'memory: reference_merge_queue_enablement.md')."
exit 1
fi

echo
echo "All required workflows on ${BRANCH} declare merge_group triggers."
echo "Gitea Actions — merge queue not supported; no-op."
echo "On GitHub this workflow lints that required-check workflows declare"
echo "merge_group: triggers to prevent queue deadlock. On Gitea that"
echo "constraint is inapplicable — all workflows pass vacuously."
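For reference, the core predicate the GitHub-era linter applied — does a given workflow file declare a `merge_group:` trigger? — reduces to the single grep shown in the removed step. A standalone sketch against a hypothetical minimal workflow file:

```shell
wf=$(mktemp)
cat > "$wf" <<'EOF'
name: Example required check
on:
  pull_request:
  merge_group:
    types: [checks_requested]
EOF
# Same pattern as the linter: a bare `merge_group:` line or one
# followed by indented config both match.
if grep -qE '^[[:space:]]*merge_group:' "$wf"; then
  echo "OK: declares merge_group trigger"
else
  echo "missing merge_group trigger — queue would deadlock"
fi
rm -f "$wf"
# → OK: declares merge_group trigger
```

A pure-text grep rather than a YAML parse is a deliberate trade: it can false-match a commented-out trigger, but it needs no tooling beyond the default runner image.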
17
.github/workflows/ci.yml
vendored
@ -304,13 +304,9 @@ jobs:
needs: [changes, canvas-build]
# Only fires on direct pushes to main (i.e. after staging→main promotion).
if: needs.changes.outputs.canvas == 'true' && github.event_name == 'push' && github.ref == 'refs/heads/main'
permissions:
# Required to post commit comments via the GitHub API.
contents: write
steps:
- name: Post deploy reminder as commit comment
- name: Write deploy reminder to step summary
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
COMMIT_SHA: ${{ github.sha }}
RUN_URL: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}
run: |
@ -337,10 +333,13 @@ jobs:
printf '\n> Posted automatically by CI · commit `%s` · [build log](%s)\n' \
"$COMMIT_SHA" "$RUN_URL" >> /tmp/deploy-reminder.md

gh api \
--method POST \
"repos/${{ github.repository }}/commits/${{ github.sha }}/comments" \
--field "body=@/tmp/deploy-reminder.md"
# Gitea has no commit-comments API (no equivalent of
# POST /repos/{owner}/{repo}/commits/{commit_sha}/comments).
# Write to GITHUB_STEP_SUMMARY instead — both GitHub Actions and
# Gitea Actions render this as the workflow run's summary page,
# which is where operators look for post-deploy action items.
# (#75 / PR-D)
cat /tmp/deploy-reminder.md >> "$GITHUB_STEP_SUMMARY"

# Python Lint & Test — required check, always runs. See platform-build
# for the rationale.
8
.github/workflows/e2e-api.yml
vendored
@ -51,7 +51,7 @@ name: E2E API Smoke Test
# * Pre-pull `alpine:latest` so the platform-server's provisioner
# (`internal/handlers/container_files.go`) can stand up its
# ephemeral token-write helper without a daemon.io round-trip.
# * Create `molecule-monorepo-net` bridge network if missing so the
# * Create `molecule-core-net` bridge network if missing so the
# provisioner's container.HostConfig {NetworkMode: ...} attach
# succeeds.
# Item #1 (timeouts) — evidence on recent runs (77/3191, ae/4270, 0e/
@ -163,12 +163,12 @@ jobs:
# when the image is already present.
docker pull alpine:latest >/dev/null
# Provisioner attaches workspace containers to
# molecule-monorepo-net (workspace-server/internal/provisioner/
# molecule-core-net (workspace-server/internal/provisioner/
# provisioner.go::DefaultNetwork). The bridge already exists on
# the operator host's docker daemon — `network create` is
# idempotent via `|| true`.
docker network create molecule-monorepo-net >/dev/null 2>&1 || true
echo "alpine:latest pre-pulled; molecule-monorepo-net ensured."
docker network create molecule-core-net >/dev/null 2>&1 || true
echo "alpine:latest pre-pulled; molecule-core-net ensured."
- name: Start Postgres (docker)
if: needs.detect-changes.outputs.api == 'true'
run: |
@ -34,7 +34,7 @@ name: Handlers Postgres Integration
# So we sidestep `services:` entirely. The job container still uses
# host-net (inherited from runner config; required for cache server
# discovery on the bridge IP 172.18.0.17:42631). We launch a sibling
# postgres on the existing `molecule-monorepo-net` bridge with a
# postgres on the existing `molecule-core-net` bridge with a
# UNIQUE name per run — `pg-handlers-${RUN_ID}-${RUN_ATTEMPT}` — and
# read its bridge IP via `docker inspect`. A host-net job container
# can reach a bridge-net container directly via the bridge IP (verified
@ -44,7 +44,7 @@ name: Handlers Postgres Integration
# + No host-port collision; N parallel runs share the bridge cleanly
|
||||
# + `if: always()` cleanup runs even on test-step failure
|
||||
# - One more step in the workflow (+~3 lines)
|
||||
# - Requires `molecule-monorepo-net` to exist on the operator host
|
||||
# - Requires `molecule-core-net` to exist on the operator host
|
||||
# (it does; declared in docker-compose.yml + docker-compose.infra.yml)
|
||||
#
|
||||
# Class B Hongming-owned CICD red sweep, 2026-05-08.
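The sibling-postgres pattern the comment describes can be sketched as a small standalone script. All names (container, network, image tag, password) are illustrative assumptions mirroring the comment, not the workflow's exact step, and the docker-invoking helper is defined but not executed here:

```shell
# Pure helper: compose the per-run unique container name so N parallel
# runs never collide on a shared docker daemon.
pg_container_name() {
  # $1 = run id, $2 = run attempt
  printf 'pg-handlers-%s-%s' "$1" "$2"
}

# Launch a sibling postgres on the shared bridge (no host port publish),
# then print its bridge IP — a host-net job container reaches it directly
# via that IP instead of a host port. (Assumed image/credentials.)
start_sibling_postgres() {
  # $1 = container name, $2 = bridge network
  docker run -d --rm --name "$1" --network "$2" \
    -e POSTGRES_PASSWORD=postgres postgres:16-alpine >/dev/null
  docker inspect -f "{{(index .NetworkSettings.Networks \"$2\").IPAddress}}" "$1"
}

# Example wiring (the docker helper above is not invoked here):
PG_NAME=$(pg_container_name "1234" "1")
echo "${PG_NAME}"
```

The `docker inspect -f` Go-template reads the container's address on that specific network, which is what lets a host-net job container connect without any published port.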
@ -96,7 +96,7 @@ jobs:
PG_NAME: pg-handlers-${{ github.run_id }}-${{ github.run_attempt }}
# Bridge network already exists on the operator host (declared
# in docker-compose.yml + docker-compose.infra.yml).
-PG_NETWORK: molecule-monorepo-net
+PG_NETWORK: molecule-core-net
defaults:
  run:
    working-directory: workspace-server
11 .github/workflows/harness-replays.yml vendored
@ -119,6 +119,17 @@ jobs:
# symptom, different root cause: staging still has the in-image
# clone path, hits the auth error directly).
#
# 2026-05-08 sub-finding (#192): the clone step ALSO fails when
# any referenced workspace-template repo is private and the
# AUTO_SYNC_TOKEN bearer (devops-engineer persona) lacks read
# access. Root cause: 5 of 9 workspace-template repos
# (openclaw, codex, crewai, deepagents, gemini-cli) had been
# marked private with no team grant. Resolution: flipped them
# to public per `feedback_oss_first_repo_visibility_default`
# (the OSS surface should be public). Layer-3 (customer-private +
# marketplace third-party repos) tracked separately in
# internal#102.
#
# Token shape matches publish-workspace-server-image.yml: AUTO_SYNC_TOKEN
# is the devops-engineer persona PAT, NOT the founder PAT (per
# `feedback_per_agent_gitea_identity_default`). clone-manifest.sh
276 .github/workflows/retarget-main-to-staging.yml vendored
@ -1,276 +0,0 @@
name: Retarget main PRs to staging

# Mechanical enforcement of SHARED_RULES rule 8 ("Staging-first
# workflow, no exceptions"). When a bot opens a PR against `main`,
# retarget it to `staging` automatically and leave an explanatory
# comment. Human / CEO-authored PRs (the staging→main promotion
# PRs, etc.) are left alone — they're the authorised exception
# to the rule.
#
# ============================================================
# What this workflow does
# ============================================================
#
# On `pull_request_target` opened/reopened against `main`:
#   1. If the PR head is `staging`, skip (the auto-promote PRs
#      MUST stay base=main).
#   2. If the PR author is a bot, retarget the PR base to
#      `staging` via Gitea REST `PATCH /pulls/{N}` body
#      `{"base":"staging"}`.
#   3. If the retarget returns 422 "pull request already exists
#      for base branch 'staging'" (issue #1884 case: another PR
#      on the same head already targets staging), close the
#      now-redundant main-PR via Gitea REST instead of failing
#      red.
#   4. Post an explainer comment on the retargeted PR via
#      Gitea REST `POST /issues/{N}/comments`.
#
# ============================================================
# Why Gitea REST (and not `gh api / gh pr close / gh pr comment`)
# ============================================================
#
# Pre-2026-05-06 this workflow used `gh api -X PATCH "repos/{owner}/{repo}/pulls/{N}" -f base=staging`
# plus `gh pr close` and `gh pr comment`. After the GitHub→Gitea
# cutover those calls fail because:
#
# - `gh` CLI defaults to `api.github.com`. Even with `GH_HOST`
#   pointing at Gitea, `gh pr close / comment` route through
#   GraphQL (`/api/graphql`) which Gitea does not expose.
#   Empirical: every `gh pr *` call returns
#   `HTTP 405 Method Not Allowed (https://git.moleculesai.app/api/graphql)`
#   — same root cause as #65 (auto-sync, fixed in PR #66) and
#   #73/#195 (auto-promote, fixed in PR #78).
# - `gh api -X PATCH /pulls/{N}` happens to use a REST path
#   that Gitea also has, but the `gh` host-resolution layer
#   and pagination/retry logic don't always hit Gitea cleanly,
#   and the cost of switching to direct `curl` is one extra
#   line of code.
#
# So this workflow uses direct `curl` calls to Gitea REST. No
# `gh` CLI dependency, no GraphQL, no flaky host-resolution.
#
# ============================================================
# Identity + token (anti-bot-ring per saved-memory
# `feedback_per_agent_gitea_identity_default`)
# ============================================================
#
# Pre-fix this workflow used the per-job ephemeral
# `secrets.GITHUB_TOKEN`. On Gitea Actions that token has
# narrow scope and unpredictable cross-PR write capability.
#
# Post-fix: `secrets.AUTO_SYNC_TOKEN` (the `devops-engineer`
# Gitea persona). Same persona used by `auto-sync-main-to-staging.yml`
# (PR #66) and `auto-promote-staging.yml` (PR #78). Token scope:
# `push: true` repo write, sufficient for PR-edit + close + comment.
#
# Why this token does NOT need branch-protection bypass:
# patching a PR's base ref is a PR-level operation that does not
# require push perms on either branch (the PR's own commits stay
# put; only the metadata changes).
#
# ============================================================
# Failure modes & operational notes
# ============================================================
#
# A — PATCH base→staging returns 422 "pull request already exists"
#     (issue #1884 case):
#     - Detected by string-match on response body. Workflow
#       falls through to closing the now-redundant main-PR
#       (Gitea REST `PATCH /pulls/{N}` with `state: closed`)
#       and posts an explanation comment. Step summary surfaces.
#
# B — `AUTO_SYNC_TOKEN` rotated / wrong scope:
#     - First REST call returns 401/403. Step summary surfaces.
#       Re-issue token from `~/.molecule-ai/personas/` on the
#       operator host and update repo Actions secret.
#
# C — PR was deleted between trigger and run:
#     - REST call returns 404. Workflow exits 0 with a notice
#       (the rule was already enforced or the PR is gone).
#
# D — author is not actually a bot but the filter mis-fires:
#     - Filter is conservative: only triggers on
#       `user.type == 'Bot'`, `login` ends with `[bot]`, or
#       known bot logins (`molecule-ai[bot]`, `app/molecule-ai`).
#       Human PRs slip through unaffected. If a NEW bot login
#       starts shipping main-PRs, add it to the filter.

on:
  pull_request_target:
    types: [opened, reopened]
    branches: [main]

permissions:
  pull-requests: write

jobs:
  retarget:
    name: Retarget to staging
    runs-on: ubuntu-latest
    # Only fire for bot-authored PRs. Human CEO PRs (staging→main
    # promotion) are intentional and pass through.
    #
    # Head-ref guard: never retarget a PR whose head IS `staging`
    # — those are the auto-promote staging→main PRs (opened by
    # `devops-engineer` since PR #78 / #195 fix). Retargeting
    # head=staging onto base=staging fails with HTTP 422 "no new
    # commits between base 'staging' and head 'staging'", which
    # would surface as a noisy red workflow run on every
    # auto-promote (caught 2026-05-03 on the GitHub-era PR #2588).
    if: >-
      github.event.pull_request.head.ref != 'staging'
      && (
        github.event.pull_request.user.type == 'Bot'
        || endsWith(github.event.pull_request.user.login, '[bot]')
        || github.event.pull_request.user.login == 'app/molecule-ai'
        || github.event.pull_request.user.login == 'molecule-ai[bot]'
        || github.event.pull_request.user.login == 'devops-engineer'
      )
    steps:
      - name: Retarget PR base to staging via Gitea REST
        id: retarget
        env:
          GITEA_TOKEN: ${{ secrets.AUTO_SYNC_TOKEN }}
          GITEA_HOST: ${{ vars.GITEA_HOST || 'https://git.moleculesai.app' }}
          REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
          PR_AUTHOR: ${{ github.event.pull_request.user.login }}
        # Issue #1884 case: when the bot opens a PR against main
        # and there's already another PR on the same head branch
        # targeting staging, Gitea's PATCH returns 422 with a
        # body mentioning "pull request already exists for base
        # branch 'staging'" (the Gitea message wording is
        # slightly different from GitHub's; the substring match
        # below covers both for forward/back compat).
        # The retarget can't proceed — but the right response is
        # to close the now-redundant main-PR, not to fail the
        # workflow noisily. Detect that specific 422 and close
        # instead.
        run: |
          set -euo pipefail

          API="${GITEA_HOST}/api/v1/repos/${REPO}"
          AUTH=(-H "Authorization: token ${GITEA_TOKEN}" -H "Accept: application/json")

          echo "Retargeting PR #${PR_NUMBER} (author: ${PR_AUTHOR}) from main → staging"

          # Curl-status-capture pattern per `feedback_curl_status_capture_pollution`:
          # http_code via -w to its own scalar, body to a tempfile, set +e/-e
          # bracket so curl's non-zero-on-4xx doesn't pollute the script's exit chain.
          BODY_FILE=$(mktemp)
          REQ='{"base":"staging"}'

          set +e
          STATUS=$(curl -sS "${AUTH[@]}" -H "Content-Type: application/json" \
            -X PATCH -d "${REQ}" \
            -o "${BODY_FILE}" -w "%{http_code}" \
            "${API}/pulls/${PR_NUMBER}")
          CURL_RC=$?
          set -e

          if [ "${CURL_RC}" -ne 0 ]; then
            echo "::error::curl PATCH failed (rc=${CURL_RC})"
            rm -f "${BODY_FILE}"
            exit 1
          fi

          if [ "${STATUS}" = "201" ] || [ "${STATUS}" = "200" ]; then
            NEW_BASE=$(jq -r '.base.ref // "?"' < "${BODY_FILE}")
            rm -f "${BODY_FILE}"
            if [ "${NEW_BASE}" = "staging" ]; then
              echo "::notice::Retargeted PR #${PR_NUMBER} → staging"
              echo "outcome=retargeted" >> "$GITHUB_OUTPUT"
              exit 0
            fi
            echo "::error::PATCH returned ${STATUS} but base.ref is '${NEW_BASE}', not 'staging'"
            exit 1
          fi

          # Specifically match the 422 duplicate-base/head error so
          # any OTHER PATCH failure (auth, deleted PR, etc.) still
          # surfaces as a real workflow failure.
          BODY=$(cat "${BODY_FILE}" || true)
          rm -f "${BODY_FILE}"

          if [ "${STATUS}" = "422" ] && echo "${BODY}" | grep -qE "(pull request already exists for base branch 'staging'|already exists.*base.*staging)"; then
            echo "::notice::PR #${PR_NUMBER}: duplicate target-staging PR exists on same head — closing this main-PR as redundant."

            # Close the now-redundant main-PR via Gitea REST
            # (PATCH state=closed). Post comment explaining
            # rationale BEFORE close so the comment lands on the
            # PR (commenting on a closed PR works on Gitea, but
            # historically caused notification ordering surprises).

            CLOSE_BODY_FILE=$(mktemp)
            CMT_REQ=$(jq -n '{body:"[retarget-bot] Closing — another PR on the same head branch already targets `staging`. This PR is redundant. See issue #1884 for the rationale."}')
            set +e
            CMT_STATUS=$(curl -sS "${AUTH[@]}" -H "Content-Type: application/json" \
              -X POST -d "${CMT_REQ}" \
              -o "${CLOSE_BODY_FILE}" -w "%{http_code}" \
              "${API}/issues/${PR_NUMBER}/comments")
            set -e
            if [ "${CMT_STATUS}" != "201" ]; then
              echo "::warning::dup-close comment POST returned ${CMT_STATUS}; continuing to close anyway"
              cat "${CLOSE_BODY_FILE}" | head -c 300 || true
            fi
            rm -f "${CLOSE_BODY_FILE}"

            CLOSE_REQ='{"state":"closed"}'
            CLOSE_RESP=$(mktemp)
            set +e
            CL_STATUS=$(curl -sS "${AUTH[@]}" -H "Content-Type: application/json" \
              -X PATCH -d "${CLOSE_REQ}" \
              -o "${CLOSE_RESP}" -w "%{http_code}" \
              "${API}/pulls/${PR_NUMBER}")
            set -e
            if [ "${CL_STATUS}" = "201" ] || [ "${CL_STATUS}" = "200" ]; then
              echo "::notice::Closed PR #${PR_NUMBER} as redundant"
              echo "outcome=closed-as-duplicate" >> "$GITHUB_OUTPUT"
              rm -f "${CLOSE_RESP}"
              exit 0
            fi
            echo "::error::Failed to close redundant PR: HTTP ${CL_STATUS}"
            cat "${CLOSE_RESP}" | head -c 300 || true
            rm -f "${CLOSE_RESP}"
            exit 1
          fi

          echo "::error::Retarget PATCH failed and was NOT a duplicate-base error: HTTP ${STATUS}"
          echo "${BODY}" | head -c 500 >&2
          exit 1

      - name: Post explainer comment
        if: steps.retarget.outputs.outcome == 'retargeted'
        env:
          GITEA_TOKEN: ${{ secrets.AUTO_SYNC_TOKEN }}
          GITEA_HOST: ${{ vars.GITEA_HOST || 'https://git.moleculesai.app' }}
          REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        run: |
          set -euo pipefail

          API="${GITEA_HOST}/api/v1/repos/${REPO}"
          AUTH=(-H "Authorization: token ${GITEA_TOKEN}" -H "Accept: application/json")

          # PR comments live on the issue endpoint in Gitea
          # (PRs ARE issues — same endpoint, different sub-resources
          # for diffs/files/etc.). The body uses jq to safely
          # encode the multi-line markdown without shell-quote
          # nightmares.
          REQ=$(jq -n '{body:"[retarget-bot] This PR was opened against `main` and has been retargeted to `staging` automatically.\n\n**Why:** per [SHARED_RULES rule 8](https://git.moleculesai.app/molecule-ai/molecule-ai-org-template-molecule-dev/src/branch/main/SHARED_RULES.md), all feature work targets `staging` first; the CEO promotes `staging → main` separately.\n\n**What changed:** just the base branch — no code change. CI will re-run against `staging`. If you get merge conflicts, rebase on `staging`.\n\n**If this PR is the CEO`s staging→main promotion:** the Action skipped you (only bot-authored PRs are retargeted, head=staging is also exempted). If you see this comment on your CEO PR, that`s a bug — please tag @hongmingwang."}')

          BODY_FILE=$(mktemp)
          set +e
          STATUS=$(curl -sS "${AUTH[@]}" -H "Content-Type: application/json" \
            -X POST -d "${REQ}" \
            -o "${BODY_FILE}" -w "%{http_code}" \
            "${API}/issues/${PR_NUMBER}/comments")
          set -e

          if [ "${STATUS}" = "201" ]; then
            echo "::notice::Posted explainer comment on PR #${PR_NUMBER}"
          else
            echo "::warning::Failed to post explainer (HTTP ${STATUS}) — retarget itself succeeded"
            cat "${BODY_FILE}" | head -c 300 || true
          fi
          rm -f "${BODY_FILE}"
@ -284,7 +284,7 @@ cp .env.example .env
./infra/scripts/setup.sh
# Boots Postgres (:5432), Redis (:6379), Langfuse (:3001),
# and Temporal (:7233 gRPC, :8233 UI) on the shared
-# `molecule-monorepo-net` Docker network. Temporal runs with
+# `molecule-core-net` Docker network. Temporal runs with
# no auth on localhost — dev-only; production must gate it.
#
# Also populates the template/plugin registry by cloning every repo
@ -283,7 +283,7 @@ cp .env.example .env
./infra/scripts/setup.sh
# Starts Postgres (:5432), Redis (:6379), Langfuse (:3001),
# and Temporal (:7233 gRPC, :8233 UI), all on the shared
-# `molecule-monorepo-net` Docker network. Temporal has no auth by default;
+# `molecule-core-net` Docker network. Temporal has no auth by default;
# local development only — production must add mTLS / an API key.
#
# Also clones every template/plugin repo listed in manifest.json into
10 canvas/.dockerignore Normal file
@ -0,0 +1,10 @@
# Excluded from `docker build` context. Without this, the COPY . . step in
# canvas/Dockerfile clobbers the freshly-installed node_modules with the
# host's (potentially broken / wrong-arch) copy — the @tailwindcss/oxide
# native binary disagreed and broke `next build`.
node_modules
.next
.git
*.log
.env*
!.env.example
@ -1,7 +1,11 @@
-FROM node:22-alpine AS builder
+FROM node:22-alpine@sha256:cb15fca92530d7ac113467696cf1001208dac49c3c64355fd1348c11a88ddf8f AS builder
WORKDIR /app
COPY package.json package-lock.json* ./
-RUN npm install
+# `npm ci` (not `install`) for lockfile-exact reproducibility.
+# `--include=optional` ensures the platform-specific @tailwindcss/oxide
+# native binary lands — without it, postcss fails with "Cannot read
+# properties of undefined (reading 'All')" at build time.
+RUN npm ci --include=optional
COPY . .
ARG NEXT_PUBLIC_PLATFORM_URL=http://localhost:8080
ARG NEXT_PUBLIC_WS_URL=ws://localhost:8080/ws
@ -11,7 +15,7 @@ ENV NEXT_PUBLIC_WS_URL=$NEXT_PUBLIC_WS_URL
ENV NEXT_PUBLIC_ADMIN_TOKEN=$NEXT_PUBLIC_ADMIN_TOKEN
RUN npm run build

-FROM node:22-alpine
+FROM node:22-alpine@sha256:cb15fca92530d7ac113467696cf1001208dac49c3c64355fd1348c11a88ddf8f
WORKDIR /app
COPY --from=builder /app/.next/standalone ./
COPY --from=builder /app/.next/static ./.next/static
@ -17,6 +17,24 @@ import { dirname, join } from "node:path";
// update one heuristic. Production is unaffected: `output: "standalone"`
// bakes resolved env into the build, and the marker file isn't shipped.
loadMonorepoEnv();
+// Boot-time matched-pair guard for ADMIN_TOKEN / NEXT_PUBLIC_ADMIN_TOKEN.
+// When ADMIN_TOKEN is set on the workspace-server (server-side bearer
+// gate, wsauth_middleware.go ~L245), the canvas MUST send the matching
+// NEXT_PUBLIC_ADMIN_TOKEN as `Authorization: Bearer ...` on every API
+// call. If only one is set, every workspace API call 401s silently —
+// the canvas hydrates with empty data and the user sees a broken page
+// with no console hint about the auth-config mismatch.
+//
+// Pre-fix the matched-pair contract was descriptive only (a comment in
+// .env): future devs/agents could re-misconfigure with one of the two
+// unset and silently 401. Closes the post-PR-#174 self-review gap.
+//
+// Warn-only (not exit) — production canvas Docker images bake these
+// vars into the build at image-build time, and a missed pair there
+// would still emit the warning at runtime via the standalone server's
+// startup. Killing the process on misconfiguration would turn a
+// recoverable auth issue into a hard crashloop.
+checkAdminTokenPair();

const nextConfig: NextConfig = {
  output: "standalone",
@ -57,6 +75,43 @@ function loadMonorepoEnv() {
  );
}

+// Boot-time matched-pair guard. Runs after .env has been loaded so the
+// check sees the post-load state. The two env vars must be set or
+// unset together; one-without-the-other is the silent-401 footgun.
+//
+// Treats empty string ("") as unset. An explicitly-empty `KEY=` in
+// .env counts as set-to-empty in `process.env`, but for auth purposes
+// an empty bearer token is equivalent to no token — so both
+// `ADMIN_TOKEN=` and an unset ADMIN_TOKEN are equivalent relative to
+// the matched-pair invariant.
+//
+// Returns void; side effect is the console.error warning. Kept as a
+// separate function (exported) so a future test can reset env, call
+// this, and assert on captured stderr.
+export function checkAdminTokenPair(): void {
+  const serverSet = !!process.env.ADMIN_TOKEN;
+  const clientSet = !!process.env.NEXT_PUBLIC_ADMIN_TOKEN;
+  if (serverSet === clientSet) return;
+  // Distinct messages so the operator can tell which half is missing
+  // — the fix is symmetric (set the other one) but the diagnostic
+  // mentions which side is currently set so they don't have to grep.
+  if (serverSet && !clientSet) {
+    // eslint-disable-next-line no-console
+    console.error(
+      "[next.config] ADMIN_TOKEN is set but NEXT_PUBLIC_ADMIN_TOKEN is not — " +
+        "canvas will 401 against workspace-server because the bearer header " +
+        "is never attached. Set both to the same value, or unset both.",
+    );
+  } else {
+    // eslint-disable-next-line no-console
+    console.error(
+      "[next.config] NEXT_PUBLIC_ADMIN_TOKEN is set but ADMIN_TOKEN is not — " +
+        "workspace-server will reject the bearer because no AdminAuth gate " +
+        "is configured. Set both to the same value, or unset both.",
+    );
+  }
+}

function findMonorepoRoot(start: string): string | null {
  let dir = start;
  for (let i = 0; i < 6; i++) {
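The guard above is exported precisely so it can be exercised the way its comment suggests: reset env, call it, assert on captured stderr. A hedged standalone sketch — the function body is restated from the hunk with abbreviated messages, and the console.error swap is assumed test plumbing, not the repo's actual harness:

```typescript
// Restated (abbreviated) from the diff: warn when exactly one of the
// ADMIN_TOKEN / NEXT_PUBLIC_ADMIN_TOKEN pair is set.
function checkAdminTokenPair(): void {
  const serverSet = !!process.env.ADMIN_TOKEN;
  const clientSet = !!process.env.NEXT_PUBLIC_ADMIN_TOKEN;
  if (serverSet === clientSet) return; // both set or both unset: OK
  if (serverSet && !clientSet) {
    console.error("[next.config] ADMIN_TOKEN is set but NEXT_PUBLIC_ADMIN_TOKEN is not");
  } else {
    console.error("[next.config] NEXT_PUBLIC_ADMIN_TOKEN is set but ADMIN_TOKEN is not");
  }
}

// Capture stderr by swapping console.error (illustrative harness).
const captured: string[] = [];
const origError = console.error;
console.error = (msg?: unknown) => { captured.push(String(msg)); };

delete process.env.ADMIN_TOKEN;
delete process.env.NEXT_PUBLIC_ADMIN_TOKEN;
checkAdminTokenPair();           // neither set: silent

process.env.ADMIN_TOKEN = "s3cret";
checkAdminTokenPair();           // one half set: warns

process.env.NEXT_PUBLIC_ADMIN_TOKEN = "s3cret";
checkAdminTokenPair();           // matched pair: silent again

console.error = origError;
```

Note the `!!` coercion is what makes `ADMIN_TOKEN=""` behave as unset, matching the empty-string note in the comments.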
@ -354,7 +354,7 @@ function OrgCTA({ org }: { org: Org }) {
  );
}
// provisioning / unknown — non-interactive
-return <span className="text-sm text-ink-soft">{org.status}…</span>;
+return <span className="text-sm text-ink-mid">{org.status}…</span>;
}

function EmptyState({ banner }: { banner?: React.ReactNode }) {
@ -420,7 +420,7 @@ function CreateOrgForm({ onCreated }: { onCreated: (slug: string) => void }) {
aria-describedby="org-slug-hint"
className="mt-1 w-full rounded border border-line bg-surface-card px-3 py-2 text-sm text-ink"
/>
-<p id="org-slug-hint" className="mt-1 text-xs text-ink-soft">
+<p id="org-slug-hint" className="mt-1 text-xs text-ink-mid">
Lowercase letters, numbers, and hyphens only. Cannot be changed later.
</p>
</div>
@ -56,7 +56,7 @@ export default function Home() {
<div className="fixed inset-0 flex items-center justify-center bg-surface">
<div role="status" aria-live="polite" className="flex flex-col items-center gap-3">
<Spinner size="lg" />
-<span className="text-xs text-ink-soft">Loading canvas...</span>
+<span className="text-xs text-ink-mid">Loading canvas...</span>
</div>
</div>
);
@ -119,11 +119,11 @@ function PlatformDownDiagnostic() {
Most common cause on a dev host: one of those services stopped.
</p>
<div className="bg-surface-sunken/80 border border-line/50 rounded-lg px-4 py-3 max-w-lg w-full">
-<div className="text-[10px] uppercase tracking-wider text-ink-soft mb-2">Try first</div>
+<div className="text-[10px] uppercase tracking-wider text-ink-mid mb-2">Try first</div>
<pre className="text-[12px] text-ink-mid font-mono whitespace-pre-wrap leading-relaxed">{`brew services start postgresql@14
brew services start redis`}</pre>
</div>
-<p className="text-[11px] text-ink-soft max-w-lg text-center">
+<p className="text-[11px] text-ink-mid max-w-lg text-center">
If both are running, check <code className="font-mono">/tmp/molecule-server.log</code> for
the underlying error. If you're on hosted SaaS, this is a platform incident — try again in a moment.
</p>
@ -55,13 +55,13 @@ export default function PricingPage() {
</a>
.
</p>
-<p className="mt-6 text-sm text-ink-soft">
+<p className="mt-6 text-sm text-ink-mid">
Prices shown in USD. Flat-rate per org — no per-seat fees on any paid tier.
Enterprise / self-hosted licensing available — contact us.
</p>
</section>

-<footer className="mx-auto mt-20 max-w-5xl border-t border-line px-6 py-6 text-center text-sm text-ink-soft">
+<footer className="mx-auto mt-20 max-w-5xl border-t border-line px-6 py-6 text-center text-sm text-ink-mid">
<p>
© {new Date().getFullYear()} Molecule AI, Inc. ·{" "}
<a href="/legal/terms" className="hover:text-ink-mid">
@ -127,7 +127,7 @@ export function AuditTrailPanel({ workspaceId }: Props) {
if (loading) {
return (
<div className="flex items-center justify-center h-32">
-<span className="text-xs text-ink-soft">Loading audit trail…</span>
+<span className="text-xs text-ink-mid">Loading audit trail…</span>
</div>
);
}
@ -145,7 +145,7 @@ export function AuditTrailPanel({ workspaceId }: Props) {
className={`px-2 py-1 text-[10px] rounded-md font-medium transition-all shrink-0 ${
filter === f.id
? "bg-surface-card text-ink ring-1 ring-zinc-600"
-: "text-ink-soft hover:text-ink-mid hover:bg-surface-card/60"
+: "text-ink-mid hover:text-ink-mid hover:bg-surface-card/60"
}`}
>
{f.label}
@ -174,9 +174,9 @@ export function AuditTrailPanel({ workspaceId }: Props) {
{entries.length === 0 ? (
/* Empty state */
<div className="flex flex-col items-center justify-center py-16 gap-3 text-center">
-<span className="text-4xl text-ink-soft" aria-hidden="true">⊟</span>
+<span className="text-4xl text-ink-mid" aria-hidden="true">⊟</span>
<p className="text-sm font-medium text-ink-mid">No audit events yet</p>
-<p className="text-[11px] text-ink-soft max-w-[200px] leading-relaxed">
+<p className="text-[11px] text-ink-mid max-w-[200px] leading-relaxed">
Delegation, decision, gate, and human-in-the-loop events will appear here.
</p>
</div>
@ -203,7 +203,7 @@ export function AuditTrailPanel({ workspaceId }: Props) {
)}

{/* Entry count footer */}
-<p className="mt-3 text-center text-[9px] text-ink-soft">
+<p className="mt-3 text-center text-[9px] text-ink-mid">
{entries.length} event{entries.length !== 1 ? "s" : ""} loaded
{cursor ? " · more available" : " · all loaded"}
</p>
@ -265,7 +265,7 @@ export function AuditEntryRow({ entry, now }: AuditEntryRowProps) {
)}

{/* Relative timestamp */}
-<span className="shrink-0 text-[9px] text-ink-soft">
+<span className="shrink-0 text-[9px] text-ink-mid">
{formatAuditRelativeTime(entry.created_at, now)}
</span>
</div>
@ -125,7 +125,7 @@ export function BundleDropZone() {
<div className="bg-surface-sunken/95 border border-accent/50 rounded-2xl px-8 py-6 shadow-2xl text-center">
<div className="text-3xl mb-2" aria-hidden="true">📦</div>
<div className="text-sm font-semibold text-ink">Drop Bundle to Import</div>
-<div className="text-xs text-ink-soft mt-1">.bundle.json files only</div>
+<div className="text-xs text-ink-mid mt-1">.bundle.json files only</div>
</div>
</div>
)}
@ -1,6 +1,6 @@
"use client";

-import { useCallback, useMemo } from "react";
+import { useCallback, useEffect, useMemo, useRef } from "react";
import {
  ReactFlow,
  ReactFlowProvider,
@ -187,6 +187,23 @@ function CanvasInner() {
// Pan-to-node / zoom-to-team CustomEvent listeners + viewport save.
const { onMoveEnd } = useCanvasViewport();

+// Screen-reader announcements — read liveAnnouncement from the store and
+// immediately clear it so the same announcement doesn't re-fire on
+// re-render. Using a ref avoids a setState loop while keeping the
+// effect reactive to new announcement strings.
+const liveAnnouncement = useCanvasStore((s) => s.liveAnnouncement);
+const clearAnnouncement = useCanvasStore((s) => s.setLiveAnnouncement);
+const prevAnnouncement = useRef("");
+useEffect(() => {
+  if (liveAnnouncement && liveAnnouncement !== prevAnnouncement.current) {
+    prevAnnouncement.current = liveAnnouncement;
+    // Small delay so the DOM update lands before clearing, giving
+    // screen readers time to pick up the new text.
+    const timer = setTimeout(() => clearAnnouncement(""), 500);
+    return () => clearTimeout(timer);
+  }
+}, [liveAnnouncement, clearAnnouncement]);

// Delete-confirmation lives in the store so the dialog survives ContextMenu
// unmounting — the prior local-in-ContextMenu state raced with the menu's
// outside-click handler.
@ -326,11 +343,21 @@ function CanvasInner() {
<DropTargetBadge />
</ReactFlow>

-{/* Screen-reader live region: announces workspace count on canvas load or change */}
-<div role="status" aria-live="polite" className="sr-only">
-  {nodes.filter((n) => !n.parentId).length === 0
-    ? "No workspaces on canvas"
-    : `${nodes.filter((n) => !n.parentId).length} workspace${nodes.filter((n) => !n.parentId).length !== 1 ? "s" : ""} on canvas`}
+{/* Screen-reader live region — announces workspace count on initial load and
+   live status updates from WebSocket events (online, offline, provisioning, etc.).
+   The liveAnnouncement text is cleared after the screen reader has had time
+   to read it so the same message doesn't re-announce on re-render. */}
+<div
+  role="status"
+  aria-live="polite"
+  aria-atomic="true"
+  className="sr-only"
+>
+  {liveAnnouncement || (
+    nodes.filter((n) => !n.parentId).length === 0
+      ? "No workspaces on canvas"
+      : `${nodes.filter((n) => !n.parentId).length} workspace${nodes.filter((n) => !n.parentId).length !== 1 ? "s" : ""} on canvas`
+  )}
</div>

{nodes.length === 0 && <EmptyState />}
@ -226,7 +226,7 @@ export function CommunicationOverlay() {
type="button"
onClick={() => setVisible(false)}
aria-label="Close communications panel"
-className="text-ink-soft hover:text-ink-mid text-xs"
+className="text-ink-mid hover:text-ink-mid text-xs"
>
<span aria-hidden="true">✕</span>
</button>
@ -268,7 +268,7 @@ export function CommunicationOverlay() {
</div>
</div>
{c.summary && (
-<div className="text-ink-soft truncate mt-0.5 pl-4">{c.summary}</div>
+<div className="text-ink-mid truncate mt-0.5 pl-4">{c.summary}</div>
)}
{c.durationMs && (
<div className="text-ink-mid pl-4">{c.durationMs}ms</div>
@ -103,7 +103,7 @@ export function ConsoleModal({ workspaceId, workspaceName, open, onClose }: Prop
EC2 console output
</h3>
{workspaceName && (
-<div className="text-[11px] text-ink-soft mt-0.5 truncate max-w-[600px]">
+<div className="text-[11px] text-ink-mid mt-0.5 truncate max-w-[600px]">
{workspaceName}
</div>
)}
@ -124,7 +124,7 @@ export function ConsoleModal({ workspaceId, workspaceName, open, onClose }: Prop

<div className="flex-1 overflow-auto bg-black/80 p-4">
{loading && (
-<div className="text-[12px] text-ink-soft" data-testid="console-loading">
+<div className="text-[12px] text-ink-mid" data-testid="console-loading">
Loading console output…
</div>
)}
@ -311,7 +311,7 @@ export function ContextMenu() {
aria-hidden="true"
className={`w-1.5 h-1.5 rounded-full ${statusDotClass(contextMenu.nodeData.status)}`}
/>
-<span className="text-[10px] text-ink-soft">{contextMenu.nodeData.status}</span>
+<span className="text-[10px] text-ink-mid">{contextMenu.nodeData.status}</span>
</div>
</div>
@ -106,7 +106,7 @@ export function ConversationTraceModal({ open, workspaceId: _workspaceId, onClos
|
||||
<Dialog.Title className="text-sm font-semibold text-ink">
|
||||
Conversation Trace
|
||||
</Dialog.Title>
|
||||
<p className="text-[10px] text-ink-soft mt-0.5">
|
||||
<p className="text-[10px] text-ink-mid mt-0.5">
|
||||
{entries.length} events across all workspaces
|
||||
</p>
|
||||
</div>
|
||||
@ -114,7 +114,7 @@ export function ConversationTraceModal({ open, workspaceId: _workspaceId, onClos
|
||||
<button
|
||||
type="button"
|
||||
aria-label="Close conversation trace"
|
||||
className="text-ink-soft hover:text-ink-mid text-lg px-2"
|
||||
className="text-ink-mid hover:text-ink-mid text-lg px-2"
|
||||
>
|
||||
✕
|
||||
</button>
|
||||
@ -124,13 +124,13 @@ export function ConversationTraceModal({ open, workspaceId: _workspaceId, onClos
|
||||
{/* Timeline */}
|
||||
<div className="flex-1 overflow-y-auto px-5 py-4">
|
||||
{loading && (
|
||||
<div className="text-xs text-ink-soft text-center py-8">
|
||||
<div className="text-xs text-ink-mid text-center py-8">
|
||||
Loading trace from all workspaces...
|
||||
</div>
|
||||
)}
|
||||
|
||||
{!loading && entries.length === 0 && (
|
||||
<div className="text-xs text-ink-soft text-center py-8">
|
||||
<div className="text-xs text-ink-mid text-center py-8">
|
||||
No activity found
|
||||
</div>
|
||||
)}
|
||||
@ -250,7 +250,7 @@ export function ConversationTraceModal({ open, workspaceId: _workspaceId, onClos
|
||||
{/* Message content — show request and/or response */}
|
||||
{requestText && (
|
||||
<div className="mt-1.5 bg-surface/60 border border-line/50 rounded-lg px-3 py-2 max-h-32 overflow-y-auto">
|
||||
<div className="text-[8px] text-ink-soft uppercase mb-1">
|
||||
<div className="text-[8px] text-ink-mid uppercase mb-1">
|
||||
{isSend ? "Task" : "Request"}
|
||||
</div>
|
||||
<div className="text-[10px] text-ink-mid whitespace-pre-wrap break-words leading-relaxed">
|
||||
|
||||
@ -338,7 +338,7 @@ export function CreateWorkspaceButton() {
|
||||
<Dialog.Title className="text-base font-semibold text-ink mb-1">
|
||||
Create Workspace
|
||||
</Dialog.Title>
|
||||
<p className="text-xs text-ink-soft mb-5">
|
||||
<p className="text-xs text-ink-mid mb-5">
|
||||
Add a new workspace node to the canvas
|
||||
</p>
|
||||
|
||||
@ -376,7 +376,7 @@ export function CreateWorkspaceButton() {
|
||||
/>
|
||||
<div className="text-xs">
|
||||
<div className="text-ink font-medium">External agent (bring your own compute)</div>
|
||||
<div className="text-ink-soft mt-0.5">
|
||||
<div className="text-ink-mid mt-0.5">
|
||||
Skip the container. We'll return a workspace_id + auth token + ready-to-paste snippet so an agent running on your laptop / server / CI can register via A2A.
|
||||
</div>
|
||||
</div>
|
||||
@ -456,7 +456,7 @@ export function CreateWorkspaceButton() {
|
||||
<p className="text-[11px] font-semibold text-violet-400 uppercase tracking-wide">
|
||||
Hermes Provider
|
||||
</p>
|
||||
<p className="text-[11px] text-ink-soft -mt-1">
|
||||
<p className="text-[11px] text-ink-mid -mt-1">
|
||||
Choose the AI provider and paste your API key. The key is
|
||||
stored as an encrypted workspace secret.
|
||||
</p>
|
||||
@ -534,7 +534,7 @@ export function CreateWorkspaceButton() {
|
||||
(m) => <option key={m} value={m} />,
|
||||
)}
|
||||
</datalist>
|
||||
<p className="text-[10px] text-ink-soft mt-1">
|
||||
<p className="text-[10px] text-ink-mid mt-1">
|
||||
Slug determines which provider hermes routes to at install time.
|
||||
</p>
|
||||
</div>
|
||||
@ -626,7 +626,7 @@ function InputField({
|
||||
className={`w-full bg-surface-card/60 border border-line/50 rounded-lg px-3 py-2 text-sm text-ink placeholder-ink-soft focus:outline-none focus:border-accent/60 focus:ring-1 focus:ring-accent/20 transition-colors ${mono ? "font-mono text-xs" : ""}`}
|
||||
/>
|
||||
{helper && (
|
||||
<p className="mt-1 text-xs text-ink-soft">{helper}</p>
|
||||
<p className="mt-1 text-xs text-ink-mid">{helper}</p>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
|
||||
@ -129,11 +129,11 @@ export function EmptyState() {
|
||||
T{t.tier}
|
||||
</span>
|
||||
</div>
|
||||
<p className="text-[11px] text-ink-soft line-clamp-2 leading-relaxed">
|
||||
<p className="text-[11px] text-ink-mid line-clamp-2 leading-relaxed">
|
||||
{t.description || "No description"}
|
||||
</p>
|
||||
{t.skill_count > 0 && (
|
||||
<p className="text-[9px] text-ink-soft mt-1.5">
|
||||
<p className="text-[9px] text-ink-mid mt-1.5">
|
||||
{t.skill_count} skill{t.skill_count !== 1 ? "s" : ""}
|
||||
{t.model ? ` · ${t.model}` : ""}
|
||||
</p>
|
||||
@ -174,10 +174,10 @@ export function EmptyState() {
|
||||
<div className="mt-5 pt-4 border-t border-line/50">
|
||||
<div className="flex items-center justify-center gap-6 text-[10px] text-ink-mid">
|
||||
<span>Drag to nest workspaces into teams</span>
|
||||
<span className="text-ink-soft">|</span>
|
||||
<span className="text-ink-mid">|</span>
|
||||
<span>Right-click for actions</span>
|
||||
<span className="text-ink-soft">|</span>
|
||||
<span>Press <kbd className="px-1 py-0.5 bg-surface-card rounded text-ink-soft font-mono">⌘K</kbd> to search</span>
|
||||
<span className="text-ink-mid">|</span>
|
||||
<span>Press <kbd className="px-1 py-0.5 bg-surface-card rounded text-ink-mid font-mono">⌘K</kbd> to search</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@ -201,7 +201,7 @@ export function ExternalConnectModal({ info, onClose }: Props) {
|
||||
className={`px-3 py-2 text-sm border-b-2 -mb-px transition-colors ${
|
||||
tab === t
|
||||
? "border-accent text-ink"
|
||||
: "border-transparent text-ink-soft hover:text-ink-mid"
|
||||
: "border-transparent text-ink-mid hover:text-ink-mid"
|
||||
}`}
|
||||
>
|
||||
{t === "claude"
|
||||
@ -335,7 +335,7 @@ function SnippetBlock({
|
||||
return (
|
||||
<div>
|
||||
<div className="flex items-center justify-between pb-1">
|
||||
<span className="text-xs text-ink-soft">{label}</span>
|
||||
<span className="text-xs text-ink-mid">{label}</span>
|
||||
<button
|
||||
type="button"
|
||||
onClick={onCopy}
|
||||
@ -366,7 +366,7 @@ function Field({
|
||||
}) {
|
||||
return (
|
||||
<div className="flex items-center gap-2">
|
||||
<span className="text-xs text-ink-soft w-36 shrink-0">{label}</span>
|
||||
<span className="text-xs text-ink-mid w-36 shrink-0">{label}</span>
|
||||
<code
|
||||
className={`flex-1 text-xs bg-surface border border-line rounded px-2 py-1 text-ink break-all ${mono ? "font-mono" : ""}`}
|
||||
>
|
||||
|
||||
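The hunks above are one mechanical sweep: the secondary-label token `text-ink-soft` is demoted to `text-ink-mid` wherever it appears as a whole class, while related tokens such as `placeholder-ink-soft` are left alone. A change of this shape is usually produced by a small codemod rather than by hand; the helper below is a hedged sketch of such a codemod, not part of this repository:

```typescript
// Hypothetical codemod for the token sweep shown in the diff above.
// It replaces "text-ink-soft" only where it stands as a complete
// class name (bounded by whitespace, quotes, backticks, or a variant
// prefix like "hover:"), so "placeholder-ink-soft" is untouched.
function demoteSoftInk(source: string): string {
  return source.replace(
    /(^|[\s"'`:])text-ink-soft(?=$|[\s"'`])/g,
    (_match, pre: string) => `${pre}text-ink-mid`
  );
}
```

Running it over a className string swaps every standalone occurrence, including variant-prefixed ones like `hover:text-ink-soft`, in a single pass.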
canvas/src/components/KeyboardShortcutsDialog.tsx (new file, 235 lines)
@@ -0,0 +1,235 @@
+"use client";
+
+import { useEffect, useRef, useState } from "react";
+import { createPortal } from "react-dom";
+
+interface ShortcutGroup {
+  title: string;
+  shortcuts: Array<{ keys: string[]; description: string }>;
+}
+
+const SHORTCUT_GROUPS: ShortcutGroup[] = [
+  {
+    title: "Canvas",
+    shortcuts: [
+      {
+        keys: ["Esc"],
+        description: "Close context menu, clear selection, or deselect",
+      },
+      {
+        keys: ["↑↓←→"],
+        description: "Nudge selected node 10px; hold Shift for 50px",
+      },
+      {
+        keys: ["Cmd", "↑↓←→"],
+        description: "Resize selected node (↑↓ height, ←→ width); hold Shift for fine control (2px)",
+      },
+      {
+        keys: ["Enter"],
+        description: "Descend into selected node's first child",
+      },
+      {
+        keys: ["Shift", "Enter"],
+        description: "Ascend to selected node's parent",
+      },
+      {
+        keys: ["Cmd", "]"],
+        description: "Bring selected node forward in z-order",
+      },
+      {
+        keys: ["Cmd", "["],
+        description: "Send selected node backward in z-order",
+      },
+      {
+        keys: ["Z"],
+        description: "Zoom to fit the selected team and its sub-workspaces",
+      },
+    ],
+  },
+  {
+    title: "Navigation",
+    shortcuts: [
+      {
+        keys: ["⌘K"],
+        description: "Open workspace search",
+      },
+      {
+        keys: ["Palette"],
+        description: "Open the template palette to deploy a new workspace",
+      },
+      {
+        keys: ["Dbl-click"],
+        description: "Zoom canvas to fit a team node and all its sub-workspaces",
+      },
+      {
+        keys: ["Right-click"],
+        description: "Open the workspace context menu",
+      },
+    ],
+  },
+  {
+    title: "Agent",
+    shortcuts: [
+      {
+        keys: ["Chat"],
+        description: "Send a message or resume a running task",
+      },
+      {
+        keys: ["Config"],
+        description: "Edit skills, model, secrets, and runtime settings",
+      },
+      {
+        keys: ["Audit"],
+        description: "View the activity ledger for the selected workspace",
+      },
+    ],
+  },
+];
+
+interface Props {
+  open: boolean;
+  onClose: () => void;
+}
+
+export function KeyboardShortcutsDialog({ open, onClose }: Props) {
+  const dialogRef = useRef<HTMLDivElement>(null);
+  const [mounted, setMounted] = useState(false);
+
+  useEffect(() => {
+    setMounted(true);
+  }, []);
+
+  // Move focus into the dialog when it opens (WCAG 2.1 SC 2.4.3)
+  useEffect(() => {
+    if (!open || !mounted) return;
+    const raf = requestAnimationFrame(() => {
+      dialogRef.current?.querySelector<HTMLElement>("button")?.focus();
+    });
+    return () => cancelAnimationFrame(raf);
+  }, [open, mounted]);
+
+  // Keyboard: Escape closes, Tab is trapped
+  useEffect(() => {
+    if (!open) return;
+    const handler = (e: KeyboardEvent) => {
+      if (e.key === "Escape") {
+        onClose();
+        return;
+      }
+      if (e.key === "Tab" && dialogRef.current) {
+        const focusable = Array.from(
+          dialogRef.current.querySelectorAll<HTMLElement>(
+            'button, [href], input, select, textarea, [tabindex]:not([tabindex="-1"])'
+          )
+        ).filter((el) => !el.hasAttribute("disabled"));
+        if (focusable.length === 0) {
+          e.preventDefault();
+          return;
+        }
+        const first = focusable[0];
+        const last = focusable[focusable.length - 1];
+        if (e.shiftKey) {
+          if (document.activeElement === first) {
+            e.preventDefault();
+            last.focus();
+          }
+        } else {
+          if (document.activeElement === last) {
+            e.preventDefault();
+            first.focus();
+          }
+        }
+      }
+    };
+    window.addEventListener("keydown", handler);
+    return () => window.removeEventListener("keydown", handler);
+  }, [open, onClose]);
+
+  if (!open || !mounted) return null;
+
+  return createPortal(
+    <div className="fixed inset-0 z-[9999] flex items-center justify-center">
+      {/* Backdrop */}
+      <div
+        className="absolute inset-0 bg-black/60 backdrop-blur-sm"
+        onClick={onClose}
+      />
+
+      {/* Dialog */}
+      <div
+        ref={dialogRef}
+        role="dialog"
+        aria-modal="true"
+        aria-labelledby="keyboard-shortcuts-title"
+        className="relative bg-surface border border-line rounded-xl shadow-2xl shadow-black/60 max-w-[480px] w-full mx-4 overflow-hidden max-h-[80vh] flex flex-col"
+      >
+        {/* Header */}
+        <div className="flex items-center justify-between px-5 py-4 border-b border-line shrink-0">
+          <h2
+            id="keyboard-shortcuts-title"
+            className="text-sm font-semibold text-ink"
+          >
+            Keyboard Shortcuts
+          </h2>
+          <button
+            type="button"
+            onClick={onClose}
+            aria-label="Close keyboard shortcuts"
+            className="w-7 h-7 flex items-center justify-center rounded-lg text-ink-mid hover:text-ink hover:bg-surface-sunken transition-colors focus:outline-none focus-visible:ring-2 focus-visible:ring-accent/40"
+          >
+            ×
+          </button>
+        </div>
+
+        {/* Content */}
+        <div className="overflow-y-auto p-5 space-y-5">
+          {SHORTCUT_GROUPS.map((group) => (
+            <div key={group.title}>
+              <h3 className="text-[10px] font-semibold uppercase tracking-[0.2em] text-ink-mid mb-2.5">
+                {group.title}
+              </h3>
+              <div className="space-y-2">
+                {group.shortcuts.map((shortcut, i) => (
+                  <div
+                    key={i}
+                    className="flex items-center justify-between gap-4"
+                  >
+                    <span className="text-[13px] text-ink-mid">
+                      {shortcut.description}
+                    </span>
+                    <kbd className="flex items-center gap-0.5 shrink-0">
+                      {shortcut.keys.map((k, j) => (
+                        <span key={j} className="flex items-center gap-0.5">
+                          {j > 0 && (
+                            <span className="text-[9px] text-ink-mid mx-0.5">
+                              +
+                            </span>
+                          )}
+                          <span className="inline-flex items-center rounded-md border border-line/70 bg-surface-sunken/70 px-2 py-0.5 text-[11px] font-medium text-ink tabular-nums font-mono">
+                            {k}
+                          </span>
+                        </span>
+                      ))}
+                    </kbd>
+                  </div>
+                ))}
+              </div>
+            </div>
+          ))}
+        </div>
+
+        {/* Footer */}
+        <div className="px-5 py-3 border-t border-line bg-surface-sunken/30 shrink-0">
+          <p className="text-[10px] text-ink-mid text-center">
+            Press{" "}
+            <kbd className="inline-flex items-center rounded border border-line/70 bg-surface-sunken/70 px-1.5 py-0.5 text-[10px] font-medium text-ink font-mono">
+              Esc
+            </kbd>{" "}
+            to close
+          </p>
+        </div>
+      </div>
+    </div>,
+    document.body
+  );
+}
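The Tab handler in the new KeyboardShortcutsDialog is a standard focus trap: Shift+Tab on the first focusable element wraps to the last, Tab on the last wraps to the first, and Tab is swallowed when nothing is focusable. That wrap-around rule can be isolated as a pure function; the helper below is an illustrative reduction, not part of the file above:

```typescript
// Illustrative reduction of the dialog's Tab-trapping rule: given the
// index of the currently focused element among the dialog's focusable
// elements, return the index focus should jump to, or null to let the
// browser's default Tab order run (it stays inside the dialog).
function trapTabTarget(
  activeIndex: number,
  count: number,
  shiftKey: boolean
): number | null {
  if (count === 0) return null; // dialog swallows Tab when nothing is focusable
  if (shiftKey && activeIndex === 0) return count - 1; // wrap backward to last
  if (!shiftKey && activeIndex === count - 1) return 0; // wrap forward to first
  return null; // interior element: default Tab order is fine
}
```

Keeping the wrap logic this small is what makes the component's keydown handler easy to audit: the DOM work (querying focusable elements, calling `focus()`) stays separate from the decision of where focus goes.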
@ -97,7 +97,7 @@ export function Legend() {
|
||||
// 24×24 touch target (was ~10×16, well under WCAG 2.5.5 min).
|
||||
// Negative margin keeps the visual position the same as before
|
||||
// — only the hit area + focus ring are larger.
|
||||
className="-mt-1.5 -mr-1.5 w-6 h-6 inline-flex items-center justify-center rounded text-[14px] leading-none text-ink-soft hover:text-ink hover:bg-surface-card/40 focus:outline-none focus-visible:ring-2 focus-visible:ring-accent/60 transition-colors"
|
||||
className="-mt-1.5 -mr-1.5 w-6 h-6 inline-flex items-center justify-center rounded text-[14px] leading-none text-ink-mid hover:text-ink hover:bg-surface-card/40 focus:outline-none focus-visible:ring-2 focus-visible:ring-accent/60 transition-colors"
|
||||
>
|
||||
×
|
||||
</button>
|
||||
@ -105,7 +105,7 @@ export function Legend() {
|
||||
|
||||
{/* Status */}
|
||||
<div className="mb-2">
|
||||
<div className="text-[11px] text-ink-soft font-medium mb-1">Status</div>
|
||||
<div className="text-[11px] text-ink-mid font-medium mb-1">Status</div>
|
||||
<div className="flex flex-wrap gap-x-3 gap-y-1">
|
||||
{LEGEND_STATUSES.map((s) => (
|
||||
<StatusItem key={s} color={STATUS_CONFIG[s].dot} label={STATUS_CONFIG[s].label} />
|
||||
@ -115,7 +115,7 @@ export function Legend() {
|
||||
|
||||
{/* Tiers */}
|
||||
<div className="mb-2">
|
||||
<div className="text-[11px] text-ink-soft font-medium mb-1">Tier</div>
|
||||
<div className="text-[11px] text-ink-mid font-medium mb-1">Tier</div>
|
||||
<div className="flex flex-wrap gap-x-3 gap-y-1">
|
||||
{LEGEND_TIERS.map(({ tier, label }) => (
|
||||
<TierItem key={tier} tier={tier} label={label} color={TIER_CONFIG[tier].border} />
|
||||
@ -125,7 +125,7 @@ export function Legend() {
|
||||
|
||||
{/* Communication */}
|
||||
<div>
|
||||
<div className="text-[11px] text-ink-soft font-medium mb-1">Communication</div>
|
||||
<div className="text-[11px] text-ink-mid font-medium mb-1">Communication</div>
|
||||
<div className="flex flex-wrap gap-x-3 gap-y-1">
|
||||
<CommItem icon="↗" color="text-cyan-400" label="A2A Out" />
|
||||
<CommItem icon="↙" color="text-accent" label="A2A In" />
|
||||
|
||||
@ -288,7 +288,7 @@ export function MemoryInspectorPanel({ workspaceId }: Props) {
|
||||
if (loading && entries.length === 0 && !error && !pluginUnavailable) {
|
||||
return (
|
||||
<div className="flex items-center justify-center h-32">
|
||||
<span className="text-xs text-ink-soft">Loading memories…</span>
|
||||
<span className="text-xs text-ink-mid">Loading memories…</span>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
@ -311,7 +311,7 @@ export function MemoryInspectorPanel({ workspaceId }: Props) {
|
||||
{/* Namespace dropdown */}
|
||||
<div className="px-4 pt-3 pb-2 border-b border-line/40 shrink-0 space-y-2">
|
||||
<div className="flex items-center gap-2">
|
||||
<label htmlFor="namespace-dropdown" className="text-[10px] text-ink-soft shrink-0">
|
||||
<label htmlFor="namespace-dropdown" className="text-[10px] text-ink-mid shrink-0">
|
||||
Namespace:
|
||||
</label>
|
||||
<select
|
||||
@ -337,7 +337,7 @@ export function MemoryInspectorPanel({ workspaceId }: Props) {
|
||||
height="12"
|
||||
viewBox="0 0 16 16"
|
||||
fill="none"
|
||||
className="absolute left-2.5 text-ink-soft pointer-events-none shrink-0"
|
||||
className="absolute left-2.5 text-ink-mid pointer-events-none shrink-0"
|
||||
aria-hidden="true"
|
||||
>
|
||||
<circle cx="7" cy="7" r="4.5" stroke="currentColor" strokeWidth="1.5" />
|
||||
@ -360,7 +360,7 @@ export function MemoryInspectorPanel({ workspaceId }: Props) {
|
||||
setDebouncedQuery('');
|
||||
}}
|
||||
aria-label="Clear search"
|
||||
className="absolute right-2 text-ink-soft hover:text-ink transition-colors text-sm leading-none"
|
||||
className="absolute right-2 text-ink-mid hover:text-ink transition-colors text-sm leading-none"
|
||||
>
|
||||
×
|
||||
</button>
|
||||
@ -370,7 +370,7 @@ export function MemoryInspectorPanel({ workspaceId }: Props) {
|
||||
|
||||
{/* Toolbar */}
|
||||
<div className="px-4 py-2.5 border-b border-line/40 flex items-center justify-between shrink-0">
|
||||
<span className="text-[11px] text-ink-soft">
|
||||
<span className="text-[11px] text-ink-mid">
|
||||
{debouncedQuery
|
||||
? `${entries.length} result${entries.length !== 1 ? 's' : ''}`
|
||||
: entries.length === 1
|
||||
@ -446,11 +446,11 @@ function EmptyState({
|
||||
// mirror it so the operator sees both signals.
|
||||
return (
|
||||
<div className="flex flex-col items-center justify-center py-16 gap-3 text-center">
|
||||
<span className="text-4xl text-ink-soft" aria-hidden="true">
|
||||
<span className="text-4xl text-ink-mid" aria-hidden="true">
|
||||
◇
|
||||
</span>
|
||||
<p className="text-sm font-medium text-ink-mid">Memory plugin disabled</p>
|
||||
<p className="text-[11px] text-ink-soft max-w-[220px] leading-relaxed">
|
||||
<p className="text-[11px] text-ink-mid max-w-[220px] leading-relaxed">
|
||||
See banner above for the operator-side fix.
|
||||
</p>
|
||||
</div>
|
||||
@ -459,11 +459,11 @@ function EmptyState({
|
||||
if (query) {
|
||||
return (
|
||||
<div className="flex flex-col items-center justify-center py-16 gap-3 text-center">
|
||||
<span className="text-4xl text-ink-soft" aria-hidden="true">
|
||||
<span className="text-4xl text-ink-mid" aria-hidden="true">
|
||||
◇
|
||||
</span>
|
||||
<p className="text-sm font-medium text-ink-mid">No memories match your search</p>
|
||||
<p className="text-[11px] text-ink-soft max-w-[200px] leading-relaxed">
|
||||
<p className="text-[11px] text-ink-mid max-w-[200px] leading-relaxed">
|
||||
Try a different query or clear the search.
|
||||
</p>
|
||||
</div>
|
||||
@ -471,11 +471,11 @@ function EmptyState({
|
||||
}
|
||||
return (
|
||||
<div className="flex flex-col items-center justify-center py-16 gap-3 text-center">
|
||||
<span className="text-4xl text-ink-soft" aria-hidden="true">
|
||||
<span className="text-4xl text-ink-mid" aria-hidden="true">
|
||||
◇
|
||||
</span>
|
||||
<p className="text-sm font-medium text-ink-mid">No memories yet</p>
|
||||
<p className="text-[11px] text-ink-soft max-w-[220px] leading-relaxed">
|
||||
<p className="text-[11px] text-ink-mid max-w-[220px] leading-relaxed">
|
||||
Agents commit memories via MCP tools (commit_memory, commit_summary). They
|
||||
appear here once written.
|
||||
</p>
|
||||
@ -558,7 +558,7 @@ function MemoryEntryRow({ entry, onDelete }: MemoryEntryRowProps) {
|
||||
|
||||
{/* Namespace tag */}
|
||||
<span
|
||||
className="text-[9px] shrink-0 font-mono text-ink-soft truncate max-w-[100px]"
|
||||
className="text-[9px] shrink-0 font-mono text-ink-mid truncate max-w-[100px]"
|
||||
title={entry.namespace}
|
||||
>
|
||||
{entry.namespace}
|
||||
@ -598,10 +598,10 @@ function MemoryEntryRow({ entry, onDelete }: MemoryEntryRowProps) {
|
||||
)}
|
||||
|
||||
|
||||
<span className="text-[9px] text-ink-soft shrink-0">
|
||||
<span className="text-[9px] text-ink-mid shrink-0">
|
||||
{formatRelativeTime(entry.created_at)}
|
||||
</span>
|
||||
<span className="text-[9px] text-ink-soft shrink-0" aria-hidden="true">
|
||||
<span className="text-[9px] text-ink-mid shrink-0" aria-hidden="true">
|
||||
{expanded ? '▼' : '▶'}
|
||||
</span>
|
||||
</button>
|
||||
@ -618,7 +618,7 @@ function MemoryEntryRow({ entry, onDelete }: MemoryEntryRowProps) {
|
||||
{entry.content}
|
||||
</pre>
|
||||
<div className="flex items-center justify-between gap-2">
|
||||
<span className="text-[9px] text-ink-soft">
|
||||
<span className="text-[9px] text-ink-mid">
|
||||
Created: {new Date(entry.created_at).toLocaleString()}
|
||||
{entry.expires_at && ` · Expires: ${new Date(entry.expires_at).toLocaleString()}`}
|
||||
</span>
|
||||
|
||||
@ -421,7 +421,7 @@ function ProviderPickerModal({
|
||||
<div className="text-[11px] text-ink-mid font-medium">
|
||||
{getKeyLabel(entry.key)}
|
||||
</div>
|
||||
<div className="text-[9px] font-mono text-ink-soft">{entry.key}</div>
|
||||
<div className="text-[9px] font-mono text-ink-mid">{entry.key}</div>
|
||||
</div>
|
||||
{entry.saved && (
|
||||
<span className="text-[9px] text-good bg-emerald-900/30 px-1.5 py-0.5 rounded flex items-center gap-1">
|
||||
@ -675,7 +675,7 @@ function AllKeysModal({
|
||||
<div className="text-[11px] text-ink-mid font-medium">
|
||||
{getKeyLabel(entry.key)}
|
||||
</div>
|
||||
<div className="text-[9px] font-mono text-ink-soft">{entry.key}</div>
|
||||
<div className="text-[9px] font-mono text-ink-mid">{entry.key}</div>
|
||||
</div>
|
||||
{entry.saved && (
|
||||
<span className="text-[9px] text-good bg-emerald-900/30 px-1.5 py-0.5 rounded flex items-center gap-1">
|
||||
|
||||
@ -247,7 +247,7 @@ export function OrgImportPreflightModal({
|
||||
<h2 id="org-preflight-title" className="text-sm font-semibold text-ink">
|
||||
Deploy {orgName}
|
||||
</h2>
|
||||
<p className="mt-0.5 text-[11px] text-ink-soft">
|
||||
<p className="mt-0.5 text-[11px] text-ink-mid">
|
||||
{workspaceCount} workspace{workspaceCount === 1 ? "" : "s"}.
|
||||
Review the credentials needed before import.
|
||||
</p>
|
||||
@ -400,7 +400,7 @@ function StrictEnvRow({
|
||||
<li className="flex items-center gap-2 rounded bg-surface-sunken/70 border border-line px-2 py-1.5">
|
||||
<code
|
||||
className={`text-[11px] font-mono flex-1 ${
|
||||
configured ? "text-ink-soft line-through" : "text-ink"
|
||||
configured ? "text-ink-mid line-through" : "text-ink"
|
||||
}`}
|
||||
>
|
||||
{envKey}
|
||||
@ -492,7 +492,7 @@ function AnyOfEnvGroup({
|
||||
>
|
||||
<code
|
||||
className={`text-[11px] font-mono flex-1 ${
|
||||
isConfigured ? "text-ink-soft line-through" : "text-ink"
|
||||
isConfigured ? "text-ink-mid line-through" : "text-ink"
|
||||
}`}
|
||||
>
|
||||
{m}
|
||||
|
||||
@ -356,7 +356,7 @@ export function ProviderModelSelector({
|
||||
<div>
|
||||
<label
|
||||
htmlFor={providerSelectId}
|
||||
className="text-[10px] uppercase tracking-wide text-ink-soft font-semibold mb-1.5 block"
|
||||
className="text-[10px] uppercase tracking-wide text-ink-mid font-semibold mb-1.5 block"
|
||||
>
|
||||
Provider <span aria-hidden="true" className="text-bad">*</span>
|
||||
<span className="sr-only"> (required)</span>
|
||||
@ -382,13 +382,13 @@ export function ProviderModelSelector({
|
||||
{selected?.tooltip && (
|
||||
<p
|
||||
id={`${providerSelectId}-help`}
|
||||
className="text-[9px] text-ink-soft mt-1 leading-relaxed"
|
||||
className="text-[9px] text-ink-mid mt-1 leading-relaxed"
|
||||
>
|
||||
{selected.tooltip}
|
||||
</p>
|
||||
)}
|
||||
{selected && selected.envVars.length > 0 && (
|
||||
<p className="text-[9px] text-ink-soft mt-0.5 font-mono">
|
||||
<p className="text-[9px] text-ink-mid mt-0.5 font-mono">
|
||||
requires: {selected.envVars.join(", ")}
|
||||
</p>
|
||||
)}
|
||||
@ -397,7 +397,7 @@ export function ProviderModelSelector({
|
||||
<div>
|
||||
<label
|
||||
htmlFor={modelSelectId}
|
||||
className="text-[10px] uppercase tracking-wide text-ink-soft font-semibold mb-1.5 block"
|
||||
className="text-[10px] uppercase tracking-wide text-ink-mid font-semibold mb-1.5 block"
|
||||
>
|
||||
Model <span aria-hidden="true" className="text-bad">*</span>
|
||||
<span className="sr-only"> (required)</span>
|
||||
@ -422,7 +422,7 @@ export function ProviderModelSelector({
|
||||
data-testid="model-input"
|
||||
className="w-full bg-surface-sunken border border-line rounded px-2 py-1.5 text-[11px] text-ink font-mono focus:outline-none focus:border-accent focus:ring-1 focus:ring-accent/20 transition-colors disabled:opacity-50"
|
||||
/>
|
||||
<p className="text-[9px] text-ink-soft mt-1 leading-relaxed">
|
||||
<p className="text-[9px] text-ink-mid mt-1 leading-relaxed">
|
||||
{selected?.wildcard
|
||||
? wildcardHelpText(selected)
|
||||
: "Free-text model id. Make sure the provider can resolve it."}
|
||||
|
||||
@ -157,7 +157,7 @@ export function PurchaseSuccessModal() {
|
||||
</div>
|
||||
|
||||
<div className="flex items-center justify-between gap-3 px-6 py-3 border-t border-line bg-surface/50">
|
||||
<span className="font-mono text-[10.5px] uppercase tracking-[0.12em] text-ink-soft">
|
||||
<span className="font-mono text-[10.5px] uppercase tracking-[0.12em] text-ink-mid">
|
||||
auto-dismiss · {AUTO_DISMISS_MS / 1000}s
|
||||
</span>
|
||||
<button
|
||||
|
||||
@ -104,7 +104,7 @@ export function SearchDialog() {
|
||||
>
|
||||
{/* Search input */}
|
||||
<div className="flex items-center gap-3 px-4 py-3 border-b border-line/40">
|
||||
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" className="shrink-0 text-ink-soft" aria-hidden="true">
|
||||
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" className="shrink-0 text-ink-mid" aria-hidden="true">
|
||||
<circle cx="7" cy="7" r="5.5" stroke="currentColor" strokeWidth="1.5" />
|
||||
<path d="M11 11l3.5 3.5" stroke="currentColor" strokeWidth="1.5" strokeLinecap="round" />
|
||||
</svg>
|
||||
@ -156,7 +156,7 @@ export function SearchDialog() {
|
||||
<div className="min-w-0 flex-1">
|
||||
<div className="text-sm text-ink truncate">{node.data.name}</div>
|
||||
{node.data.role && (
|
||||
<div className="text-[10px] text-ink-soft truncate">{node.data.role}</div>
|
||||
<div className="text-[10px] text-ink-mid truncate">{node.data.role}</div>
|
||||
)}
|
||||
</div>
|
||||
<span
|
||||
|
||||
@ -165,12 +165,12 @@ export function SidePanel() {
|
||||
</h2>
|
||||
<div className="flex items-center gap-2 mt-0.5">
|
||||
{node.data.role && (
|
||||
<span className="text-[10px] text-ink-soft truncate">
|
||||
<span className="text-[10px] text-ink-mid truncate">
|
||||
{node.data.role}
|
||||
</span>
|
||||
)}
|
||||
<span className={`text-[9px] px-1.5 py-0.5 rounded-md font-mono ${
|
||||
isOnline ? "text-good bg-emerald-950/30" : "text-ink-soft bg-surface-card/50"
|
||||
isOnline ? "text-good bg-emerald-950/30" : "text-ink-mid bg-surface-card/50"
|
||||
}`}>
|
||||
T{node.data.tier}
|
||||
</span>
|
||||
@ -181,7 +181,7 @@ export function SidePanel() {
|
||||
type="button"
|
||||
onClick={() => selectNode(null)}
|
||||
aria-label="Close workspace panel"
|
||||
className="w-7 h-7 flex items-center justify-center rounded-lg text-ink-soft hover:text-ink hover:bg-surface-card/60 transition-colors"
|
||||
className="w-7 h-7 flex items-center justify-center rounded-lg text-ink-mid hover:text-ink hover:bg-surface-card/60 transition-colors"
|
||||
>
|
||||
<svg width="12" height="12" viewBox="0 0 12 12" fill="none" aria-hidden="true">
|
||||
<path d="M1 1l10 10M11 1L1 11" stroke="currentColor" strokeWidth="1.5" strokeLinecap="round" />
|
||||
@ -296,7 +296,7 @@ export function SidePanel() {
|
||||
|
||||
{/* Footer — workspace ID */}
|
||||
<div className="px-5 py-2 border-t border-line/40 bg-surface-sunken/20">
|
||||
<span className="text-[9px] font-mono text-ink-soft select-all">
|
||||
<span className="text-[9px] font-mono text-ink-mid select-all">
|
||||
{selectedNodeId}
|
||||
</span>
|
||||
</div>
|
||||
|
||||
@ -236,7 +236,7 @@ export function OrgTemplatesSection() {
|
||||
onClick={() => setExpanded((v) => !v)}
|
||||
aria-expanded={expanded}
|
||||
aria-controls="org-templates-body"
|
||||
className="flex items-center gap-1.5 text-[10px] uppercase tracking-wide text-ink-soft hover:text-ink-mid font-semibold transition-colors"
|
||||
className="flex items-center gap-1.5 text-[10px] uppercase tracking-wide text-ink-mid hover:text-ink-mid font-semibold transition-colors"
|
||||
>
|
||||
<span
|
||||
aria-hidden="true"
|
||||
@ -246,7 +246,7 @@ export function OrgTemplatesSection() {
|
||||
</span>
|
||||
Org Templates
|
||||
{orgs.length > 0 && (
|
||||
<span className="text-ink-soft normal-case tracking-normal">
|
||||
<span className="text-ink-mid normal-case tracking-normal">
|
||||
({orgs.length})
|
||||
</span>
|
||||
)}
|
||||
@ -255,7 +255,7 @@ export function OrgTemplatesSection() {
|
||||
type="button"
|
||||
onClick={loadOrgs}
|
||||
aria-label="Refresh org templates"
|
||||
className="text-[10px] text-ink-soft hover:text-ink-mid"
|
||||
className="text-[10px] text-ink-mid hover:text-ink-mid"
|
||||
>
|
||||
↻
|
||||
</button>
|
||||
@ -264,14 +264,14 @@ export function OrgTemplatesSection() {
|
||||
{expanded && (
|
||||
<div id="org-templates-body" className="space-y-2">
|
||||
{loading && (
|
||||
<div role="status" aria-live="polite" className="flex items-center gap-1.5 text-[10px] text-ink-soft">
|
||||
<div role="status" aria-live="polite" className="flex items-center gap-1.5 text-[10px] text-ink-mid">
|
||||
<Spinner size="sm" />
|
||||
Loading…
|
||||
</div>
|
||||
)}
|
||||
|
||||
{!loading && orgs.length === 0 && (
|
||||
<div className="text-[10px] text-ink-soft">
|
||||
<div className="text-[10px] text-ink-mid">
|
||||
No org templates in <code>org-templates/</code>
|
||||
</div>
|
||||
)}
|
||||
@ -298,7 +298,7 @@ export function OrgTemplatesSection() {
|
||||
</span>
|
||||
</div>
|
||||
{o.description && (
|
||||
<p className="text-[10px] text-ink-soft mb-2.5 line-clamp-2 leading-relaxed">
|
||||
<p className="text-[10px] text-ink-mid mb-2.5 line-clamp-2 leading-relaxed">
|
||||
{o.description}
|
||||
</p>
|
||||
)}
|
||||
@ -499,7 +499,7 @@ export function TemplatePalette() {
|
||||
<div className="fixed top-0 left-0 h-full w-[280px] bg-surface-sunken/95 backdrop-blur-md border-r border-line/60 z-30 flex flex-col shadow-2xl shadow-black/40">
|
||||
<div className="px-4 pt-14 pb-3 border-b border-line/60">
|
||||
<h2 className="text-sm font-semibold text-ink">Templates</h2>
|
||||
<p className="text-[10px] text-ink-soft mt-0.5">Click to deploy a workspace</p>
|
||||
<p className="text-[10px] text-ink-mid mt-0.5">Click to deploy a workspace</p>
|
||||
</div>
|
||||
|
||||
<div className="flex-1 overflow-y-auto p-3 space-y-2">
|
||||
@@ -509,14 +509,14 @@ export function TemplatePalette() {
        <OrgTemplatesSection />

        {loading && (
-         <div role="status" aria-live="polite" className="flex items-center justify-center gap-2 text-xs text-ink-soft text-center py-8">
+         <div role="status" aria-live="polite" className="flex items-center justify-center gap-2 text-xs text-ink-mid text-center py-8">
            <Spinner />
            Loading…
          </div>
        )}

        {!loading && templates.length === 0 && (
-         <div role="status" aria-live="polite" className="text-xs text-ink-soft text-center py-8">
+         <div role="status" aria-live="polite" className="text-xs text-ink-mid text-center py-8">
            No templates found in<br />workspace-configs-templates/
          </div>
        )}
@@ -549,7 +549,7 @@ export function TemplatePalette() {
          </div>

          {t.description && (
-           <p className="text-[10px] text-ink-soft mb-2 line-clamp-2 leading-relaxed">
+           <p className="text-[10px] text-ink-mid mb-2 line-clamp-2 leading-relaxed">
              {t.description}
            </p>
          )}
@@ -562,7 +562,7 @@ export function TemplatePalette() {
              </span>
            ))}
            {t.skills.length > 3 && (
-             <span className="text-[8px] text-ink-soft">+{t.skills.length - 3}</span>
+             <span className="text-[8px] text-ink-mid">+{t.skills.length - 3}</span>
            )}
          </div>
        )}
@@ -580,7 +580,7 @@ export function TemplatePalette() {
        <button
          type="button"
          onClick={loadTemplates}
-         className="text-[10px] text-ink-soft hover:text-ink-mid transition-colors block"
+         className="text-[10px] text-ink-mid hover:text-ink-mid transition-colors block"
        >
          Refresh templates
        </button>
@@ -124,7 +124,7 @@ export function TermsGate({ children }: { children: React.ReactNode }) {
          </a>
          . Click agree to continue.
        </p>
-       <p className="mt-3 text-xs text-ink-soft">
+       <p className="mt-3 text-xs text-ink-mid">
          By agreeing you acknowledge that workspace data is stored in AWS us-east-2 (Ohio, United States).
        </p>
      </div>
@@ -57,7 +57,7 @@ export function ThemeToggle({ className = "" }: { className?: string }) {
        "flex h-6 w-6 items-center justify-center rounded transition-colors " +
        (active
          ? "bg-surface-elevated text-ink shadow-sm"
-         : "text-ink-soft hover:text-ink-mid")
+         : "text-ink-mid hover:text-ink-mid")
      }
    >
      <svg
@@ -9,6 +9,7 @@ import { ConfirmDialog } from "@/components/ConfirmDialog";
import { showToast } from "@/components/Toaster";
import { ThemeToggle } from "@/components/ThemeToggle";
import { statusDotClass } from "@/lib/design-tokens";
+import { KeyboardShortcutsDialog } from "@/components/KeyboardShortcutsDialog";

export function Toolbar() {
  const nodes = useCanvasStore((s) => s.nodes);
@@ -33,6 +34,7 @@ export function Toolbar() {
  const [restartingAll, setRestartingAll] = useState(false);
  const [restartConfirmOpen, setRestartConfirmOpen] = useState(false);
  const [helpOpen, setHelpOpen] = useState(false);
+ const [shortcutsOpen, setShortcutsOpen] = useState(false);
  const helpRef = useRef<HTMLDivElement>(null);

  // Suppress toast on the very first connect at page load; only fire on reconnects.
@@ -127,6 +129,29 @@ export function Toolbar() {
    };
  }, []);

+ // Global ? shortcut opens the shortcuts dialog (mirrors the help button).
+ // Skip when the user is typing in an input so ? in a text field doesn't
+ // steal focus. Also skip when a modal/dialog is already open.
+ useEffect(() => {
+   const handler = (e: KeyboardEvent) => {
+     if (e.key !== "?") return;
+     const tag = (e.target as HTMLElement).tagName;
+     const inInput =
+       tag === "INPUT" ||
+       tag === "TEXTAREA" ||
+       tag === "SELECT" ||
+       (e.target as HTMLElement).isContentEditable;
+     if (inInput) return;
+     // Don't fire when a modal/dialog is already mounted (canvas modals,
+     // side panel, etc. use z-50 or above).
+     if (document.querySelector('[role="dialog"][aria-modal="true"]')) return;
+     e.preventDefault();
+     setShortcutsOpen(true);
+   };
+   window.addEventListener("keydown", handler);
+   return () => window.removeEventListener("keydown", handler);
+ }, []);
+
  return (
    <div
      className="fixed top-3 left-1/2 -translate-x-1/2 z-20 flex items-center gap-3 bg-surface-sunken/80 backdrop-blur-md border border-line/60 rounded-xl px-4 py-2 shadow-xl shadow-black/20 transition-[margin-left] duration-200"
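The guard in the hunk above (skip form fields, skip open modals) recurs in the canvas shortcut hook later in this diff. A minimal, DOM-free sketch of the typing check as a pure predicate; the helper name is hypothetical and not part of the diff:

```typescript
// Hypothetical helper, not from the diff: the "is the user typing?" check
// the ? handler performs, expressed over the event target's properties so
// it can be unit-tested without a DOM.
function isTypingContext(tagName: string, isContentEditable: boolean): boolean {
  return (
    tagName === "INPUT" ||
    tagName === "TEXTAREA" ||
    tagName === "SELECT" ||
    isContentEditable
  );
}
```

In the handler this would be called as `isTypingContext((e.target as HTMLElement).tagName, (e.target as HTMLElement).isContentEditable)`.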
@@ -321,6 +346,14 @@ export function Toolbar() {
            <HelpRow shortcut="Config" text="Use the Config tab for skills, model, secrets, and runtime settings." />
            <HelpRow shortcut="Dbl-click / Z" text="Zoom canvas to fit a team node and all its sub-workspaces." />
          </div>
+         {/* Link to the full keyboard shortcuts dialog */}
+         <button
+           type="button"
+           onClick={() => { setHelpOpen(false); setShortcutsOpen(true); }}
+           className="mt-3 w-full text-center text-[10px] text-ink-mid hover:text-accent transition-colors focus:outline-none focus-visible:underline"
+         >
+           See all shortcuts →
+         </button>
        </div>
      )}
    </div>
@@ -340,6 +373,11 @@ export function Toolbar() {
        onConfirm={restartAll}
        onCancel={() => setRestartConfirmOpen(false)}
      />
+
+     <KeyboardShortcutsDialog
+       open={shortcutsOpen}
+       onClose={() => setShortcutsOpen(false)}
+     />
    </div>
  );
}
@@ -1,6 +1,6 @@
"use client";

-import { useCallback, useMemo } from "react";
+import { useCallback, useMemo, type KeyboardEvent } from "react";
import { Handle, NodeResizer, Position, type NodeProps, type Node } from "@xyflow/react";
import { useCanvasStore, type WorkspaceNodeData } from "@/store/canvas";
import { getConfigurationError, getConfigurationStatus } from "@/store/canvas-topology";
@@ -191,7 +191,23 @@ export function WorkspaceNode({ id, data }: NodeProps<Node<WorkspaceNodeData>>)
      <Handle
        type="target"
        position={Position.Top}
-       className="!w-2.5 !h-1 !rounded-full !bg-surface-card/80 !border-0 !-top-0.5 hover:!bg-blue-400 hover:!h-1.5 transition-all"
+       tabIndex={0}
+       role="button"
+       aria-label={`Extract ${data.name} from its parent (Enter or Space)`}
+       onKeyDown={(e: KeyboardEvent<HTMLDivElement>) => {
+         if (e.key === "Enter" || e.key === " ") {
+           e.preventDefault();
+           e.stopPropagation();
+           // Keyboard accessibility for edge anchors: pressing Enter/Space on
+           // the top handle extracts this node from its current parent,
+           // moving it to the root level. Mirrors the Figma/Excalidraw
+           // pattern of using the connector dot as a keyboard affordance.
+           if (data.parentId) {
+             void nestNode(id, null);
+           }
+         }
+       }}
+       className="!w-2.5 !h-1 !rounded-full !bg-surface-card/80 !border-0 !-top-0.5 hover:!bg-blue-400 hover:!h-1.5 focus-visible:!bg-blue-400 focus-visible:!h-1.5 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-blue-400/60 focus-visible:ring-offset-1 focus-visible:ring-offset-zinc-950 transition-all"
      />

      <div className="relative px-3.5 py-2.5">
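The two handle hunks (this one and the bottom-handle hunk that follows) share one decision rule. A hedged, store-free sketch of that rule; `handleKeyAction` and `NestAction` are illustrative names, not from the component:

```typescript
// Hypothetical helper mirroring the two handles' keyboard actions: the top
// handle extracts a node to the root level, the bottom handle nests the
// currently-selected node under the target. Returns the (child, parent)
// pair to pass to nestNode, or null when no action should fire.
type NestAction = { child: string; parent: string | null } | null;

function handleKeyAction(
  which: "top" | "bottom",
  nodeId: string,
  parentId: string | null | undefined,
  selectedId: string | null
): NestAction {
  if (which === "top") {
    // Extract: only meaningful when the node currently has a parent.
    return parentId ? { child: nodeId, parent: null } : null;
  }
  // Nest: requires another node to be selected first.
  return selectedId && selectedId !== nodeId
    ? { child: selectedId, parent: nodeId }
    : null;
}
```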
@@ -358,7 +374,23 @@ export function WorkspaceNode({ id, data }: NodeProps<Node<WorkspaceNodeData>>)
      <Handle
        type="source"
        position={Position.Bottom}
-       className="!w-2.5 !h-1 !rounded-full !bg-surface-card/80 !border-0 !-bottom-0.5 hover:!bg-blue-400 hover:!h-1.5 transition-all"
+       tabIndex={0}
+       role="button"
+       aria-label={`Nest selected workspace inside ${data.name} (Enter or Space)`}
+       onKeyDown={(e: KeyboardEvent<HTMLDivElement>) => {
+         if (e.key === "Enter" || e.key === " ") {
+           e.preventDefault();
+           e.stopPropagation();
+           // Keyboard accessibility for edge anchors: pressing Enter/Space on
+           // the bottom handle nests the currently-selected node as a child
+           // of this node. Requires another node to be selected first.
+           const selected = selectedNodeId;
+           if (selected && selected !== id) {
+             void nestNode(selected, id);
+           }
+         }
+       }}
+       className="!w-2.5 !h-1 !rounded-full !bg-surface-card/80 !border-0 !-bottom-0.5 hover:!bg-blue-400 hover:!h-1.5 focus-visible:!bg-blue-400 focus-visible:!h-1.5 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-blue-400/60 focus-visible:ring-offset-1 focus-visible:ring-offset-zinc-950 transition-all"
      />
    </div>
  </>
@@ -55,7 +55,7 @@ export function WorkspaceUsage({ workspaceId }: WorkspaceUsageProps) {
      </h4>
      {!loading && metrics && (
        <span
-         className="text-[10px] text-ink-soft font-mono"
+         className="text-[10px] text-ink-mid font-mono"
          data-testid="usage-period"
        >
          {formatPeriod(metrics.period_start, metrics.period_end)}
@@ -131,7 +131,7 @@ function StatRow({
}) {
  return (
    <div className="flex justify-between items-center" data-testid={testId}>
-     <span className="text-xs text-ink-soft">{label}</span>
+     <span className="text-xs text-ink-mid">{label}</span>
      <span className="text-xs text-ink-mid font-mono">{value}</span>
    </div>
  );
@@ -0,0 +1,90 @@
// @vitest-environment jsdom
import { describe, it, expect, vi, beforeEach, afterEach } from "vitest";
import { render, screen, fireEvent, cleanup, act, waitFor } from "@testing-library/react";

// ── Component under test — imported AFTER mocks ───────────────────────────────
import { KeyboardShortcutsDialog } from "../KeyboardShortcutsDialog";

afterEach(cleanup);

const onCloseMock = vi.fn();

beforeEach(() => {
  onCloseMock.mockReset();
});

describe("KeyboardShortcutsDialog — a11y render", () => {
  it("renders with role=dialog and aria-modal=true when open", async () => {
    render(<KeyboardShortcutsDialog open={true} onClose={onCloseMock} />);
    await waitFor(() => {
      expect(screen.getByRole("dialog")).toBeTruthy();
    });
    const dialog = screen.getByRole("dialog");
    expect(dialog.getAttribute("aria-modal")).toBe("true");
  });

  it("has aria-labelledby pointing to the dialog title", async () => {
    render(<KeyboardShortcutsDialog open={true} onClose={onCloseMock} />);
    const dialog = await waitFor(() => screen.getByRole("dialog"));
    const labelledby = dialog.getAttribute("aria-labelledby");
    expect(labelledby).toBeTruthy();
    // The labelledby should reference the h2 with id="keyboard-shortcuts-title"
    const title = document.getElementById(labelledby!);
    expect(title?.textContent).toMatch(/keyboard shortcuts/i);
  });

  it("does not render when open=false", () => {
    render(<KeyboardShortcutsDialog open={false} onClose={onCloseMock} />);
    expect(screen.queryByRole("dialog")).toBeNull();
  });

  it("calls onClose when Escape is pressed", async () => {
    render(<KeyboardShortcutsDialog open={true} onClose={onCloseMock} />);
    await waitFor(() => expect(screen.getByRole("dialog")).toBeTruthy());
    act(() => {
      fireEvent.keyDown(window, { key: "Escape" });
    });
    expect(onCloseMock).toHaveBeenCalledTimes(1);
  });

  it("focuses the first focusable element (close button) when dialog opens", async () => {
    render(<KeyboardShortcutsDialog open={true} onClose={onCloseMock} />);
    // The component uses requestAnimationFrame to move focus; wait for it to settle.
    await waitFor(() => expect(screen.getByRole("dialog")).toBeTruthy());
    await act(async () => {
      await new Promise((r) => requestAnimationFrame(() => requestAnimationFrame(r)));
    });
    const closeBtn = screen.getByRole("button", { name: /close/i });
    expect(document.activeElement).toBe(closeBtn);
  });

  it("traps Tab focus within the dialog", async () => {
    render(<KeyboardShortcutsDialog open={true} onClose={onCloseMock} />);
    const dialog = await waitFor(() => screen.getByRole("dialog"));

    // Collect all focusable elements inside the dialog
    const focusableSelectors =
      'button:not([disabled]), [href], input:not([disabled]), select:not([disabled]), textarea:not([disabled]), [tabindex]:not([tabindex="-1"])';
    const focusableEls = Array.from(
      dialog.querySelectorAll<HTMLElement>(focusableSelectors)
    );
    expect(focusableEls.length).toBeGreaterThan(0);

    const onlyFocusable = focusableEls[0];
    act(() => { onlyFocusable.focus(); });

    // Simulate Tab keydown. The dialog's handler should call preventDefault()
    // to stop focus leaving the dialog. Verify via the event's
    // defaultPrevented flag, observed by a window-level listener.
    let tabWasIntercepted = false;
    const tabHandler = (e: KeyboardEvent) => {
      if (e.key === "Tab") tabWasIntercepted = e.defaultPrevented;
    };
    window.addEventListener("keydown", tabHandler);
    act(() => {
      fireEvent.keyDown(onlyFocusable, { key: "Tab", shiftKey: false });
    });
    expect(tabWasIntercepted).toBe(true);
    window.removeEventListener("keydown", tabHandler);
  });
});
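The focus-trap test above only asserts that Tab is intercepted. The wrap-around arithmetic a focus trap typically performs can be sketched separately; this is a hypothetical helper for illustration, not the dialog's actual implementation:

```typescript
// Hypothetical helper: given the index of the currently-focused element
// among a trap's focusable elements, compute the next index with
// wrap-around. `shift` is whether Shift+Tab (backwards) was pressed.
function nextFocusIndex(current: number, count: number, shift: boolean): number {
  if (count === 0) return -1; // nothing focusable inside the trap
  return shift ? (current - 1 + count) % count : (current + 1) % count;
}
```

Tabbing past the last element wraps to the first, and Shift+Tab from the first wraps to the last, which is exactly the behavior the `preventDefault` in the trap makes room for.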
@@ -0,0 +1,436 @@
// @vitest-environment jsdom
/**
 * Tests for canvas keyboard shortcuts (useKeyboardShortcuts hook).
 *
 * Covers: Esc, Enter/Shift+Enter, Cmd+]/[, Z, and Arrow keys.
 *
 * The hook is tested by dispatching KeyboardEvents at the window and
 * asserting the resulting store mutations / dispatched events.
 */
import React from "react";
import { render, cleanup, fireEvent } from "@testing-library/react";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { useKeyboardShortcuts } from "../useKeyboardShortcuts";
import { useCanvasStore } from "@/store/canvas";
// ─── Mock store ──────────────────────────────────────────────────────────────

const mockSavePosition = vi.fn().mockResolvedValue(undefined);

vi.mock("@/store/canvas", () => ({
  useCanvasStore: Object.assign(
    vi.fn((sel) => sel(mockStoreState)),
    {
      getState: () => mockStoreState,
    }
  ),
}));

// Module-level mutable state so tests can mutate between cases
const mockStoreState = {
  selectedNodeId: null as string | null,
  selectedNodeIds: new Set<string>(),
  nodes: [] as Array<{
    id: string;
    position: { x: number; y: number };
    data: { parentId?: string | null };
    width?: number;
    height?: number;
  }>,
  contextMenu: null as { x: number; y: number; nodeId: string } | null,
  closeContextMenu: vi.fn(),
  selectNode: vi.fn(),
  clearSelection: vi.fn(),
  bumpZOrder: vi.fn(),
  savePosition: mockSavePosition,
  moveNode: vi.fn(),
  onNodesChange: vi.fn(),
};

afterEach(() => {
  cleanup();
  vi.clearAllMocks();
  // Reset to default empty state between tests
  mockStoreState.selectedNodeId = null;
  mockStoreState.selectedNodeIds = new Set();
  mockStoreState.nodes = [];
  mockStoreState.contextMenu = null;
  mockStoreState.closeContextMenu.mockClear();
  mockStoreState.selectNode.mockClear();
  mockStoreState.clearSelection.mockClear();
  mockStoreState.bumpZOrder.mockClear();
  mockStoreState.moveNode.mockClear();
  mockStoreState.savePosition.mockClear();
  mockStoreState.onNodesChange.mockClear();
});
// ─── Test wrapper ────────────────────────────────────────────────────────────

function ShortcutTestComponent() {
  useKeyboardShortcuts();
  return <div data-testid="canvas-root" />;
}

function renderWithProvider() {
  return render(<ShortcutTestComponent />);
}

// ─── Tests ───────────────────────────────────────────────────────────────────
describe("Esc — deselect / close context menu", () => {
  it("closes the context menu when one is open", () => {
    mockStoreState.contextMenu = { x: 100, y: 100, nodeId: "n1" };
    renderWithProvider();
    fireEvent.keyDown(window, { key: "Escape" });
    expect(mockStoreState.closeContextMenu).toHaveBeenCalledTimes(1);
  });

  it("clears the batch selection when no context menu is open", () => {
    mockStoreState.contextMenu = null;
    mockStoreState.selectedNodeIds = new Set(["n1", "n2"]);
    renderWithProvider();
    fireEvent.keyDown(window, { key: "Escape" });
    expect(mockStoreState.clearSelection).toHaveBeenCalledTimes(1);
  });

  it("deselects the focused node when no batch selection exists", () => {
    mockStoreState.contextMenu = null;
    mockStoreState.selectedNodeIds = new Set();
    mockStoreState.selectedNodeId = "n1";
    renderWithProvider();
    fireEvent.keyDown(window, { key: "Escape" });
    expect(mockStoreState.selectNode).toHaveBeenCalledWith(null);
  });
});
describe("Enter — hierarchy navigation", () => {
  beforeEach(() => {
    mockStoreState.selectedNodeId = "n1";
    mockStoreState.nodes = [
      { id: "n1", position: { x: 0, y: 0 }, data: { parentId: null } },
      { id: "n2", position: { x: 100, y: 0 }, data: { parentId: "n1" } },
      { id: "n3", position: { x: 200, y: 0 }, data: { parentId: null } },
    ];
  });

  it("navigates to the first child on Enter", () => {
    renderWithProvider();
    fireEvent.keyDown(window, { key: "Enter" });
    expect(mockStoreState.selectNode).toHaveBeenCalledWith("n2");
  });

  it("navigates to the parent on Shift+Enter", () => {
    mockStoreState.nodes = [
      { id: "n1", position: { x: 0, y: 0 }, data: { parentId: null } },
      { id: "n2", position: { x: 100, y: 0 }, data: { parentId: "n1" } },
    ];
    mockStoreState.selectedNodeId = "n2";
    renderWithProvider();
    fireEvent.keyDown(window, { key: "Enter", shiftKey: true });
    expect(mockStoreState.selectNode).toHaveBeenCalledWith("n1");
  });

  it("does NOT navigate when no node is selected", () => {
    mockStoreState.selectedNodeId = null;
    renderWithProvider();
    fireEvent.keyDown(window, { key: "Enter" });
    expect(mockStoreState.selectNode).not.toHaveBeenCalled();
  });
});
describe("Cmd+]/[ — z-order bump", () => {
  beforeEach(() => {
    mockStoreState.selectedNodeId = "n1";
  });

  it("bumps z-order forward on Cmd+]", () => {
    renderWithProvider();
    fireEvent.keyDown(window, { key: "]", metaKey: true });
    expect(mockStoreState.bumpZOrder).toHaveBeenCalledWith("n1", 1);
  });

  it("bumps z-order backward on Cmd+[", () => {
    renderWithProvider();
    fireEvent.keyDown(window, { key: "[", metaKey: true });
    expect(mockStoreState.bumpZOrder).toHaveBeenCalledWith("n1", -1);
  });

  it("accepts Ctrl as the modifier key", () => {
    renderWithProvider();
    fireEvent.keyDown(window, { key: "]", ctrlKey: true });
    expect(mockStoreState.bumpZOrder).toHaveBeenCalledWith("n1", 1);
  });
});
describe("Z — zoom-to-team", () => {
  let dispatchedEvents: CustomEvent[] = [];
  // Keep one stable handler reference so afterEach removes the same
  // listener beforeEach added; removeEventListener with a fresh inline
  // arrow would silently remove nothing and leak listeners across tests.
  const zoomHandler = (e: Event) => {
    dispatchedEvents.push(e as CustomEvent);
  };

  beforeEach(() => {
    dispatchedEvents = [];
    mockStoreState.selectedNodeId = "n1";
    mockStoreState.nodes = [
      { id: "n1", position: { x: 0, y: 0 }, data: { parentId: null } },
      { id: "n2", position: { x: 100, y: 0 }, data: { parentId: "n1" } },
    ];
    window.addEventListener("molecule:zoom-to-team", zoomHandler);
  });

  afterEach(() => {
    window.removeEventListener("molecule:zoom-to-team", zoomHandler);
  });

  it("dispatches zoom-to-team when the selected node has children", () => {
    renderWithProvider();
    fireEvent.keyDown(window, { key: "z" });
    expect(dispatchedEvents).toHaveLength(1);
    expect(dispatchedEvents[0].detail.nodeId).toBe("n1");
  });

  it("does NOT fire when no node is selected", () => {
    mockStoreState.selectedNodeId = null;
    renderWithProvider();
    fireEvent.keyDown(window, { key: "z" });
    expect(dispatchedEvents).toHaveLength(0);
  });

  it("does NOT fire when the node has no children", () => {
    mockStoreState.nodes = [
      { id: "n1", position: { x: 0, y: 0 }, data: { parentId: null } },
    ];
    renderWithProvider();
    fireEvent.keyDown(window, { key: "z" });
    expect(dispatchedEvents).toHaveLength(0);
  });

  it("skips when the target element is an input", () => {
    renderWithProvider();
    const input = document.createElement("input");
    document.body.appendChild(input);
    fireEvent.keyDown(input, { key: "z" });
    expect(dispatchedEvents).toHaveLength(0);
    document.body.removeChild(input);
  });
});
describe("Arrow keys — keyboard node movement", () => {
  beforeEach(() => {
    mockStoreState.selectedNodeId = "n1";
    mockStoreState.nodes = [
      { id: "n1", position: { x: 100, y: 200 }, data: { parentId: null } },
    ];
  });

  it("moves the selected node down on ArrowDown", () => {
    renderWithProvider();
    fireEvent.keyDown(window, { key: "ArrowDown" });
    expect(mockStoreState.moveNode).toHaveBeenCalledWith("n1", 0, 10);
  });

  it("moves the selected node up on ArrowUp", () => {
    renderWithProvider();
    fireEvent.keyDown(window, { key: "ArrowUp" });
    expect(mockStoreState.moveNode).toHaveBeenCalledWith("n1", 0, -10);
  });

  it("moves the selected node right on ArrowRight", () => {
    renderWithProvider();
    fireEvent.keyDown(window, { key: "ArrowRight" });
    expect(mockStoreState.moveNode).toHaveBeenCalledWith("n1", 10, 0);
  });

  it("moves the selected node left on ArrowLeft", () => {
    renderWithProvider();
    fireEvent.keyDown(window, { key: "ArrowLeft" });
    expect(mockStoreState.moveNode).toHaveBeenCalledWith("n1", -10, 0);
  });

  it("moves 50 px when Shift is held", () => {
    renderWithProvider();
    fireEvent.keyDown(window, { key: "ArrowDown", shiftKey: true });
    expect(mockStoreState.moveNode).toHaveBeenCalledWith("n1", 0, 50);
  });

  it("does NOT fire when no node is selected", () => {
    mockStoreState.selectedNodeId = null;
    renderWithProvider();
    fireEvent.keyDown(window, { key: "ArrowDown" });
    expect(mockStoreState.moveNode).not.toHaveBeenCalled();
  });

  it("skips when the target element is an input", () => {
    renderWithProvider();
    const input = document.createElement("input");
    document.body.appendChild(input);
    fireEvent.keyDown(input, { key: "ArrowDown" });
    expect(mockStoreState.moveNode).not.toHaveBeenCalled();
    document.body.removeChild(input);
  });

  it("skips when a modal dialog is already open", () => {
    renderWithProvider();
    const dialog = document.createElement("div");
    dialog.setAttribute("role", "dialog");
    dialog.setAttribute("aria-modal", "true");
    document.body.appendChild(dialog);
    fireEvent.keyDown(window, { key: "ArrowDown" });
    expect(mockStoreState.moveNode).not.toHaveBeenCalled();
    document.body.removeChild(dialog);
  });

  // NOTE: "prevents default browser scroll on arrow keys" was removed.
  // jsdom's KeyboardEvent.initKeyboardEvent does not copy the preventDefault
  // function from eventProperties into the real KeyboardEvent, so a
  // preventDefault mock passed via fireEvent.keyDown(eventProperties) is
  // never called. The guard (selected node required) is covered by
  // "does NOT fire when no node is selected". The e.preventDefault() call
  // itself is verified by code inspection.
});
describe("all shortcuts respect inInput guard", () => {
  it("ArrowDown is skipped in a textarea element", () => {
    mockStoreState.selectedNodeId = "n1";
    renderWithProvider();
    const textarea = document.createElement("textarea");
    document.body.appendChild(textarea);
    fireEvent.keyDown(textarea, { key: "ArrowDown" });
    expect(mockStoreState.moveNode).not.toHaveBeenCalled();
    document.body.removeChild(textarea);
  });

  it("Enter navigation is skipped in an input element", () => {
    mockStoreState.selectedNodeId = "n1";
    mockStoreState.nodes = [
      { id: "n1", position: { x: 0, y: 0 }, data: { parentId: null } },
      { id: "n2", position: { x: 100, y: 0 }, data: { parentId: "n1" } },
    ];
    renderWithProvider();
    const input = document.createElement("input");
    document.body.appendChild(input);
    fireEvent.keyDown(input, { key: "Enter" });
    expect(mockStoreState.selectNode).not.toHaveBeenCalled();
    document.body.removeChild(input);
  });
});
describe("Cmd/Ctrl+Arrow — keyboard node resize", () => {
  beforeEach(() => {
    mockStoreState.nodes = [
      {
        id: "n1",
        position: { x: 0, y: 0 },
        data: { parentId: null },
        width: 210,
        height: 110,
      },
    ];
    mockStoreState.selectedNodeId = "n1";
    renderWithProvider();
  });

  it("resizes height down (smaller) on Cmd/Ctrl+ArrowUp", () => {
    // Node starts at minHeight=110 (no children). Shrinking clamps to min —
    // height stays 110. Width is unchanged.
    fireEvent.keyDown(window, { key: "ArrowUp", metaKey: true });
    expect(mockStoreState.onNodesChange).toHaveBeenCalledWith([
      expect.objectContaining({
        type: "dimensions",
        id: "n1",
        dimensions: { width: 210, height: 110 },
      }),
    ]);
  });

  it("resizes height up (larger) on Cmd/Ctrl+ArrowDown", () => {
    fireEvent.keyDown(window, { key: "ArrowDown", ctrlKey: true });
    expect(mockStoreState.onNodesChange).toHaveBeenCalledWith([
      expect.objectContaining({
        type: "dimensions",
        id: "n1",
        dimensions: { width: 210, height: 120 },
      }),
    ]);
  });

  it("resizes width down (smaller) on Cmd/Ctrl+ArrowLeft", () => {
    // Node starts at minWidth=210 (no children). Shrinking clamps to min —
    // width stays 210. Height is unchanged.
    fireEvent.keyDown(window, { key: "ArrowLeft", metaKey: true });
    expect(mockStoreState.onNodesChange).toHaveBeenCalledWith([
      expect.objectContaining({
        type: "dimensions",
        id: "n1",
        dimensions: { width: 210, height: 110 },
      }),
    ]);
  });

  it("resizes width up (larger) on Cmd/Ctrl+ArrowRight", () => {
    fireEvent.keyDown(window, { key: "ArrowRight", ctrlKey: true });
    expect(mockStoreState.onNodesChange).toHaveBeenCalledWith([
      expect.objectContaining({
        type: "dimensions",
        id: "n1",
        dimensions: { width: 220, height: 110 },
      }),
    ]);
  });

  it("uses 2px step with Shift held", () => {
    // Step is 2px with Shift, but minHeight=110 clamps the result.
    // 110 - 2 = 108, Math.max(110, 108) = 110. Width is unchanged.
    fireEvent.keyDown(window, { key: "ArrowUp", metaKey: true, shiftKey: true });
    expect(mockStoreState.onNodesChange).toHaveBeenCalledWith([
      expect.objectContaining({
        dimensions: { width: 210, height: 110 },
      }),
    ]);
  });
  it("respects min-height constraint (no children)", () => {
    fireEvent.keyDown(window, { key: "ArrowUp", metaKey: true });
    fireEvent.keyDown(window, { key: "ArrowUp", metaKey: true });
    // The hook computes Math.max(minHeight, currentHeight - step) with
    // minHeight=110 and step=10. 110 - 10 = 100 but Math.max(110, 100) = 110,
    // so every press clamps straight back to the minimum and the height
    // never drops below 110.
    expect(mockStoreState.onNodesChange).toHaveBeenLastCalledWith([
      expect.objectContaining({ dimensions: { width: 210, height: 110 } }),
    ]);
  });
  it("does NOT fire when no node is selected", () => {
    mockStoreState.selectedNodeId = null;
    fireEvent.keyDown(window, { key: "ArrowDown", metaKey: true });
    expect(mockStoreState.onNodesChange).not.toHaveBeenCalled();
  });

  it("skips when a modal dialog is open", () => {
    const dialog = document.createElement("div");
    dialog.setAttribute("role", "dialog");
    dialog.setAttribute("aria-modal", "true");
    document.body.appendChild(dialog);
    fireEvent.keyDown(window, { key: "ArrowDown", metaKey: true });
    expect(mockStoreState.onNodesChange).not.toHaveBeenCalled();
    document.body.removeChild(dialog);
  });

  it("skips plain arrow keys (no modifier) — moveNode is called instead", () => {
    fireEvent.keyDown(window, { key: "ArrowUp" });
    expect(mockStoreState.moveNode).toHaveBeenCalled();
    expect(mockStoreState.onNodesChange).not.toHaveBeenCalled();
  });

  it("skips Alt+Arrow (not a resize combo)", () => {
    fireEvent.keyDown(window, { key: "ArrowUp", altKey: true });
    expect(mockStoreState.onNodesChange).not.toHaveBeenCalled();
    expect(mockStoreState.moveNode).not.toHaveBeenCalled();
  });
});
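Every resize expectation in the tests above reduces to one clamp. A sketch of that arithmetic; the hook itself (per the tests' comments) computes `Math.max(min, current + delta)`, and 210/110 are just the childless-node minimum width/height the tests assume, not constants from the hook:

```typescript
// Sketch of the clamp behind the resize expectations: a resize step is
// applied, then the result is floored at the node's minimum size.
function clampResize(current: number, delta: number, min: number): number {
  return Math.max(min, current + delta);
}
```

For example, Cmd+ArrowUp with the default 10 px step is `clampResize(110, -10, 110)`, which stays at the minimum, while Cmd+ArrowDown is `clampResize(110, 10, 110)`, which grows to 120.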
@@ -2,6 +2,13 @@

import { useEffect } from "react";
import { useCanvasStore } from "@/store/canvas";
+import { type NodeChange, type Node } from "@xyflow/react";
+import type { WorkspaceNodeData } from "@/store/canvas";
+
+/** Returns true if the node has any direct child in the node list. */
+function hasChildren(nodeId: string, nodes: Node<WorkspaceNodeData>[]): boolean {
+  return nodes.some((n) => n.data.parentId === nodeId);
+}

/**
 * Canvas-wide keyboard shortcuts. All bound to the document window so
@@ -14,6 +21,9 @@ import { useCanvasStore } from "@/store/canvas";
 * Cmd/Ctrl+] — bump selected node forward in z-order
 * Cmd/Ctrl+[ — bump selected node backward in z-order
 * Z — zoom-to-team if the selected node has children
+ * Arrow keys — move selected node 10px (50px with Shift)
+ * Cmd/Ctrl+Arrow — resize selected node (↑↓ height, ←→ width)
+ * Cmd/Ctrl+Shift+Arrow — resize by 2px per press (fine control)
 */
export function useKeyboardShortcuts() {
  useEffect(() => {
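The movement entry in the doc comment above is a pure key-to-offset mapping. A sketch under an assumed helper name (`arrowDelta` is not in the hook, which inlines this logic in its handler):

```typescript
// Sketch: map an arrow key plus the Shift modifier to the (dx, dy) offset
// the movement shortcut applies: 10 px per press, 50 px with Shift held.
function arrowDelta(key: string, shiftKey: boolean): { dx: number; dy: number } {
  const step = shiftKey ? 50 : 10;
  switch (key) {
    case "ArrowUp":
      return { dx: 0, dy: -step };
    case "ArrowDown":
      return { dx: 0, dy: step };
    case "ArrowLeft":
      return { dx: -step, dy: 0 };
    case "ArrowRight":
      return { dx: step, dy: 0 };
    default:
      return { dx: 0, dy: 0 }; // not an arrow key; no movement
  }
}
```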
@@ -80,6 +90,76 @@ export function useKeyboardShortcuts() {
        );
      }
    }

+   // Arrow-key node movement — Figma-style keyboard drag for keyboard users.
+   // 10 px per press, 50 px with Shift held. Only fires when a node
+   // is selected and the target isn't a form control. Skipped when a
+   // modifier key (Cmd/Ctrl/Alt) is held so those combos can be used
+   // for other shortcuts (e.g. Cmd+Arrow = resize).
+   if (
+     !inInput &&
+     !e.metaKey &&
+     !e.ctrlKey &&
+     !e.altKey &&
+     (e.key === "ArrowUp" ||
+       e.key === "ArrowDown" ||
+       e.key === "ArrowLeft" ||
+       e.key === "ArrowRight")
+   ) {
+     const state = useCanvasStore.getState();
+     const selectedId = state.selectedNodeId;
+     if (!selectedId) return;
+     // Skip when a modal/dialog is already open — dialogs own their own
+     // arrow-key semantics and shouldn't trigger canvas moves.
+     if (document.querySelector('[role="dialog"][aria-modal="true"]')) return;
+     e.preventDefault();
+     const step = e.shiftKey ? 50 : 10;
+     let dx = 0;
+     let dy = 0;
+     if (e.key === "ArrowUp") dy = -step;
+     else if (e.key === "ArrowDown") dy = step;
+     else if (e.key === "ArrowLeft") dx = -step;
+     else dx = step;
+     state.moveNode(selectedId, dx, dy);
+   }

+   // Cmd/Ctrl+Arrow — keyboard-accessible node resize.
+   // ↑/↓ resizes height, ←/→ resizes width.
+   // 10 px per press (2 px with Shift for fine control).
+   // Uses the same onNodesChange('dimensions') path that NodeResizer uses.
+   if (
+     !inInput &&
+     (e.metaKey || e.ctrlKey) &&
+     (e.key === "ArrowUp" ||
+       e.key === "ArrowDown" ||
+       e.key === "ArrowLeft" ||
+       e.key === "ArrowRight")
+   ) {
+     const state = useCanvasStore.getState();
+     const selectedId = state.selectedNodeId;
+     if (!selectedId) return;
+     if (document.querySelector('[role="dialog"][aria-modal="true"]')) return;
+     e.preventDefault();
+     const step = e.shiftKey ? 2 : 10;
+     const node = state.nodes.find((n) => n.id === selectedId);
+     if (!node) return;
|
||||
const currentWidth = (node.width ?? 210) as number;
|
||||
const currentHeight = (node.height ?? 110) as number;
|
||||
const minWidth = hasChildren(node.id, state.nodes) ? 360 : 210;
|
||||
const minHeight = hasChildren(node.id, state.nodes) ? 200 : 110;
|
||||
let newWidth = currentWidth;
|
||||
let newHeight = currentHeight;
|
||||
if (e.key === "ArrowUp") newHeight = Math.max(minHeight, currentHeight - step);
|
||||
else if (e.key === "ArrowDown") newHeight = currentHeight + step;
|
||||
else if (e.key === "ArrowLeft") newWidth = Math.max(minWidth, currentWidth - step);
|
||||
else newWidth = currentWidth + step;
|
||||
const change: NodeChange = {
|
||||
type: "dimensions",
|
||||
id: selectedId,
|
||||
dimensions: { width: newWidth, height: newHeight },
|
||||
};
|
||||
state.onNodesChange([change]);
|
||||
}
|
||||
};
|
||||
window.addEventListener("keydown", handler);
|
||||
return () => window.removeEventListener("keydown", handler);
|
||||
|
||||
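The movement and resize arithmetic in the hook above can be isolated as pure helpers for clarity. This is an illustrative sketch, not code from the hook itself; the helper names `arrowDelta` and `resizeDims` are assumptions introduced here.

```typescript
// Illustrative helpers mirroring the hook's arithmetic (not exported by it).

type Arrow = "ArrowUp" | "ArrowDown" | "ArrowLeft" | "ArrowRight";

// Plain arrows: 10 px per press, 50 px with Shift held.
function arrowDelta(key: Arrow, shift: boolean): { dx: number; dy: number } {
  const step = shift ? 50 : 10;
  switch (key) {
    case "ArrowUp": return { dx: 0, dy: -step };
    case "ArrowDown": return { dx: 0, dy: step };
    case "ArrowLeft": return { dx: -step, dy: 0 };
    default: return { dx: step, dy: 0 };
  }
}

// Cmd/Ctrl arrows: 10 px per press (2 px with Shift), clamped so a node
// never shrinks below its minimum size.
function resizeDims(
  key: Arrow, shift: boolean,
  width: number, height: number,
  minWidth: number, minHeight: number,
): { width: number; height: number } {
  const step = shift ? 2 : 10;
  if (key === "ArrowUp") return { width, height: Math.max(minHeight, height - step) };
  if (key === "ArrowDown") return { width, height: height + step };
  if (key === "ArrowLeft") return { width: Math.max(minWidth, width - step), height };
  return { width: width + step, height };
}

console.log(arrowDelta("ArrowUp", true));  // { dx: 0, dy: -50 }
console.log(resizeDims("ArrowLeft", false, 215, 110, 210, 110)); // width clamps to minWidth 210
```

The clamping mirrors the hook's `Math.max(min…, current − step)` calls, with the 360/210 and 200/110 minimums chosen in the hook depending on whether the node has children.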
@@ -109,7 +109,7 @@ export function OrgTokensTab() {
Organization API Keys
</h3>
</div>
<p className="text-[10px] text-ink-soft leading-relaxed">
<p className="text-[10px] text-ink-mid leading-relaxed">
Full-admin bearer tokens for this organization. Use with external
integrations, CLI tools, or AI agents that need to manage
workspaces, settings, and secrets. Each key has the same
@@ -182,13 +182,13 @@ export function OrgTokensTab() {

{/* Token list */}
{loading ? (
<div role="status" aria-live="polite" className="flex items-center justify-center gap-2 py-6 text-ink-soft text-xs">
<div role="status" aria-live="polite" className="flex items-center justify-center gap-2 py-6 text-ink-mid text-xs">
<Spinner /> Loading keys...
</div>
) : tokens.length === 0 ? (
<div className="text-center py-6">
<p className="text-xs text-ink-soft">No active keys</p>
<p className="text-[10px] text-ink-soft mt-1">
<p className="text-xs text-ink-mid">No active keys</p>
<p className="text-[10px] text-ink-mid mt-1">
Create a key above to authenticate API calls to this organization.
</p>
</div>
@@ -209,7 +209,7 @@ export function OrgTokensTab() {
{t.name}
</span>
)}
<div className="text-[9px] text-ink-soft space-x-3">
<div className="text-[9px] text-ink-mid space-x-3">
<span>Created {formatAge(t.created_at)}</span>
{t.last_used_at && (
<span>Last used {formatAge(t.last_used_at)}</span>

@@ -81,7 +81,7 @@ export function TokensTab({ workspaceId }: TokensTabProps) {
<div className="flex items-center justify-between">
<div>
<h3 className="text-sm font-semibold text-ink">API Tokens</h3>
<p className="text-[10px] text-ink-soft mt-0.5">
<p className="text-[10px] text-ink-mid mt-0.5">
Bearer tokens for authenticating API calls to this workspace.
</p>
</div>
@@ -129,13 +129,13 @@ export function TokensTab({ workspaceId }: TokensTabProps) {

{/* Token list */}
{loading ? (
<div role="status" aria-live="polite" className="flex items-center justify-center gap-2 py-6 text-ink-soft text-xs">
<div role="status" aria-live="polite" className="flex items-center justify-center gap-2 py-6 text-ink-mid text-xs">
<Spinner /> Loading tokens...
</div>
) : tokens.length === 0 ? (
<div className="text-center py-6">
<p className="text-xs text-ink-soft">No active tokens</p>
<p className="text-[10px] text-ink-soft mt-1">
<p className="text-xs text-ink-mid">No active tokens</p>
<p className="text-[10px] text-ink-mid mt-1">
Create a token to authenticate API calls.
</p>
</div>
@@ -150,7 +150,7 @@ export function TokensTab({ workspaceId }: TokensTabProps) {
<code className="text-[11px] font-mono text-ink-mid bg-surface-sunken/60 px-1.5 py-0.5 rounded">
{t.prefix}...
</code>
<div className="text-[9px] text-ink-soft space-x-3">
<div className="text-[9px] text-ink-mid space-x-3">
<span>Created {formatAge(t.created_at)}</span>
{t.last_used_at && (
<span>Last used {formatAge(t.last_used_at)}</span>

@@ -142,7 +142,7 @@ export function ActivityTab({ workspaceId }: Props) {
className={`px-2 py-1 text-[11px] rounded-md font-medium transition-all ${
filter === f.id
? "bg-surface-card text-ink ring-1 ring-zinc-600"
: "text-ink-soft hover:text-ink-mid hover:bg-surface-card/60"
: "text-ink-mid hover:text-ink-mid hover:bg-surface-card/60"
}`}
>
<span className="mr-0.5 opacity-60">{f.icon}</span> {f.label}
@@ -153,7 +153,7 @@ export function ActivityTab({ workspaceId }: Props) {
onClick={() => setAutoRefresh(!autoRefresh)}
aria-pressed={autoRefresh}
className={`text-[11px] px-1.5 py-0.5 rounded ${
autoRefresh ? "text-good bg-emerald-950/30" : "text-ink-soft"
autoRefresh ? "text-good bg-emerald-950/30" : "text-ink-mid"
}`}
title={autoRefresh ? "Auto-refresh ON" : "Auto-refresh OFF"}
>
@@ -177,7 +177,7 @@ export function ActivityTab({ workspaceId }: Props) {
</button>
</div>
</div>
<div className="mt-1.5 text-[10px] text-ink-soft">
<div className="mt-1.5 text-[10px] text-ink-mid">
{activities.length} {filter === "all" ? "activities" : filter.replace("_", " ") + " entries"}
</div>
</div>
@@ -185,7 +185,7 @@ export function ActivityTab({ workspaceId }: Props) {
{/* Activity list */}
<div className="flex-1 overflow-y-auto p-3 space-y-1.5">
{loading && activities.length === 0 && (
<div className="text-xs text-ink-soft text-center py-8">Loading activity...</div>
<div className="text-xs text-ink-mid text-center py-8">Loading activity...</div>
)}

{error && (
@@ -196,8 +196,8 @@ export function ActivityTab({ workspaceId }: Props) {

{!loading && !error && activities.length === 0 && (
<div className="text-center py-8">
<div className="text-ink-soft text-xs">No activity recorded yet</div>
<div className="text-ink-soft text-[9px] mt-1">
<div className="text-ink-mid text-xs">No activity recorded yet</div>
<div className="text-ink-mid text-[9px] mt-1">
Activity logs appear when agents communicate or perform tasks
</div>
</div>
@@ -265,16 +265,16 @@ function ActivityRow({
</span>

{entry.duration_ms != null && (
<span className="text-[8px] text-ink-soft font-mono tabular-nums shrink-0">
<span className="text-[8px] text-ink-mid font-mono tabular-nums shrink-0">
{entry.duration_ms}ms
</span>
)}

<span className="text-[8px] text-ink-soft shrink-0">
<span className="text-[8px] text-ink-mid shrink-0">
{formatTime(entry.created_at)}
</span>

<span className="text-[9px] text-ink-soft">
<span className="text-[9px] text-ink-mid">
{expanded ? "▼" : "▶"}
</span>
</div>
@@ -296,7 +296,7 @@ function ActivityRow({
{resolveName(entry.source_id)}
</span>
)}
<span className="text-[9px] text-ink-soft">→</span>
<span className="text-[9px] text-ink-mid">→</span>
{entry.target_id && (
<span className="text-[9px] text-accent/80 truncate max-w-[140px]" title={entry.target_id}>
{resolveName(entry.target_id)}
@@ -338,7 +338,7 @@ function ActivityRow({
{entry.response_body && (
<JsonBlock label="Response" data={entry.response_body} />
)}
<div className="text-[8px] text-ink-soft font-mono select-all">
<div className="text-[8px] text-ink-mid font-mono select-all">
ID: {entry.id}
</div>
</div>
@@ -386,7 +386,7 @@ function MessagePreview({ label, body }: { label: string; body: Record<string, u
}
return (
<div>
<div className="text-[8px] text-ink-soft uppercase tracking-wider mb-1">{label}</div>
<div className="text-[8px] text-ink-mid uppercase tracking-wider mb-1">{label}</div>
<div className="text-[10px] text-ink-mid bg-surface-sunken/60 rounded p-2 max-h-32 overflow-y-auto whitespace-pre-wrap break-words">
{text.slice(0, 2000)}
</div>
@@ -429,7 +429,7 @@ function MessagePreview({ label, body }: { label: string; body: Record<string, u

return (
<div>
<div className="text-[8px] text-ink-soft uppercase tracking-wider mb-1">{label}</div>
<div className="text-[8px] text-ink-mid uppercase tracking-wider mb-1">{label}</div>
<div className="text-[10px] text-ink-mid bg-surface-sunken/60 rounded p-2 max-h-32 overflow-y-auto whitespace-pre-wrap break-words">
{text.slice(0, 2000)}
</div>
@@ -440,7 +440,7 @@ function MessagePreview({ label, body }: { label: string; body: Record<string, u
function Detail({ label, value, mono, error: isError }: { label: string; value: string; mono?: boolean; error?: boolean }) {
return (
<div className="flex items-start gap-2">
<span className="text-[8px] text-ink-soft uppercase tracking-wider w-14 shrink-0 pt-0.5">{label}</span>
<span className="text-[8px] text-ink-mid uppercase tracking-wider w-14 shrink-0 pt-0.5">{label}</span>
<span className={`text-[9px] break-all ${isError ? "text-bad" : "text-ink-mid"} ${mono ? "font-mono" : ""}`}>
{value}
</span>
@@ -451,7 +451,7 @@ function Detail({ label, value, mono, error: isError }: { label: string; value:
function JsonBlock({ label, data }: { label: string; data: Record<string, unknown> }) {
return (
<div>
<div className="text-[8px] text-ink-soft uppercase tracking-wider mb-1">{label}</div>
<div className="text-[8px] text-ink-mid uppercase tracking-wider mb-1">{label}</div>
<pre className="text-[9px] text-ink-mid bg-surface-sunken/80 rounded p-2 overflow-x-auto max-h-48 font-mono">
{JSON.stringify(data, null, 2)}
</pre>

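The `formatAge` / `relativeTime` helpers referenced throughout these tabs render timestamps as rough ages ("Created 2h ago", "Last: 3d ago"). A minimal sketch of that kind of formatter, assuming the common minutes/hours/days bucketing; the repo's actual helpers may differ in names and thresholds:

```typescript
// Hypothetical relative-age formatter in the spirit of the `relativeTime` /
// `formatAge` helpers used above; the real implementations may differ.
function relativeTime(iso: string | null, now: Date = new Date()): string {
  if (!iso) return "never";
  const deltaMs = now.getTime() - new Date(iso).getTime();
  const minutes = Math.floor(deltaMs / 60_000);
  if (minutes < 1) return "just now";
  if (minutes < 60) return `${minutes}m ago`;
  const hours = Math.floor(minutes / 60);
  if (hours < 24) return `${hours}h ago`;
  return `${Math.floor(hours / 24)}d ago`;
}

const now = new Date("2025-01-02T00:00:00Z");
console.log(relativeTime("2025-01-01T23:30:00Z", now)); // "30m ago"
console.log(relativeTime(null, now));                   // "never"
```

Passing `now` explicitly keeps the function pure and easy to unit-test, which matters for components that re-render on an auto-refresh timer.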
@@ -158,7 +158,7 @@ export function BudgetSection({ workspaceId }: Props) {

{/* Usage stats */}
{loading ? (
<p className="text-xs text-ink-soft" data-testid="budget-loading">
<p className="text-xs text-ink-mid" data-testid="budget-loading">
Loading…
</p>
) : fetchError ? (
@@ -172,7 +172,7 @@ export function BudgetSection({ workspaceId }: Props) {
<span className="text-xs text-ink-mid">Credits used</span>
<span className="text-xs font-mono text-ink-mid">
<span data-testid="budget-used-value">{(budget.budget_used ?? 0).toLocaleString()}</span>
<span className="text-ink-soft mx-1">/</span>
<span className="text-ink-mid mx-1">/</span>
<span data-testid="budget-limit-value">
{budget.budget_limit != null
? budget.budget_limit.toLocaleString()
@@ -201,7 +201,7 @@ export function BudgetSection({ workspaceId }: Props) {

{/* Remaining credits */}
{budget.budget_remaining != null && (
<p className="text-[11px] text-ink-soft" data-testid="budget-remaining">
<p className="text-[11px] text-ink-mid" data-testid="budget-remaining">
{budget.budget_remaining.toLocaleString()} credits remaining
</p>
)}
@@ -227,7 +227,7 @@ export function BudgetSection({ workspaceId }: Props) {
data-testid="budget-limit-input"
className="w-full bg-surface-card border border-line rounded-lg px-3 py-2 text-sm text-ink-mid placeholder-zinc-500 focus:outline-none focus:border-accent focus:ring-1 focus:ring-accent/30 transition-colors"
/>
<p className="text-xs text-ink-soft">Leave blank for unlimited</p>
<p className="text-xs text-ink-mid">Leave blank for unlimited</p>

{saveError && (
<div

@@ -242,7 +242,7 @@ export function ChannelsTab({ workspaceId }: Props) {

if (loading) {
return (
<div className="p-4 text-ink-soft text-xs">Loading channels...</div>
<div className="p-4 text-ink-mid text-xs">Loading channels...</div>
);
}

@@ -271,7 +271,7 @@ export function ChannelsTab({ workspaceId }: Props) {
{showForm && (
<div className="space-y-2 p-3 bg-surface-card/40 rounded border border-line/50">
<div>
<label htmlFor={platformId} className="text-[10px] text-ink-soft block mb-1">Platform</label>
<label htmlFor={platformId} className="text-[10px] text-ink-mid block mb-1">Platform</label>
<select
id={platformId}
value={formType}
@@ -327,7 +327,7 @@ export function ChannelsTab({ workspaceId }: Props) {
className="rounded border-line"
/>
<span className="text-xs text-ink-mid">{chat.name || "Unknown"}</span>
<span className="text-[10px] text-ink-soft ml-auto">{chat.type} {chat.chat_id}</span>
<span className="text-[10px] text-ink-mid ml-auto">{chat.type} {chat.chat_id}</span>
</label>
))}
<button
@@ -347,8 +347,8 @@ export function ChannelsTab({ workspaceId }: Props) {
)}

<div>
<label htmlFor={allowedUsersId} className="text-[10px] text-ink-soft block mb-1">
Allowed Users <span className="text-ink-soft">(optional, comma-separated)</span>
<label htmlFor={allowedUsersId} className="text-[10px] text-ink-mid block mb-1">
Allowed Users <span className="text-ink-mid">(optional, comma-separated)</span>
</label>
<input
id={allowedUsersId}
@@ -357,7 +357,7 @@ export function ChannelsTab({ workspaceId }: Props) {
placeholder="123456789, 987654321"
className="w-full text-xs bg-surface-sunken border border-line rounded px-2 py-1.5 text-ink-mid placeholder-zinc-600"
/>
<p className="text-[11px] text-ink-soft mt-0.5">
<p className="text-[11px] text-ink-mid mt-0.5">
Platform-specific user IDs. Leave empty to allow everyone.
</p>
</div>
@@ -380,8 +380,8 @@ export function ChannelsTab({ workspaceId }: Props) {
{/* Channel list */}
{channels.length === 0 && !showForm && (
<div className="text-center py-8">
<p className="text-ink-soft text-xs">No channels connected</p>
<p className="text-ink-soft text-[10px] mt-1">
<p className="text-ink-mid text-xs">No channels connected</p>
<p className="text-ink-mid text-[10px] mt-1">
Connect Telegram, Slack, Discord, or Lark / Feishu to chat with this agent from social platforms.
</p>
</div>
@@ -402,7 +402,7 @@ export function ChannelsTab({ workspaceId }: Props) {
<span className="text-xs font-medium text-ink">
{ch.channel_type.charAt(0).toUpperCase() + ch.channel_type.slice(1)}
</span>
<span className="text-[10px] text-ink-soft">
<span className="text-[10px] text-ink-mid">
{ch.config.chat_id || ch.config.channel_id || ""}
</span>
</div>
@@ -419,7 +419,7 @@ export function ChannelsTab({ workspaceId }: Props) {
className={`text-[10px] px-2 py-0.5 rounded transition ${
ch.enabled
? "bg-emerald-900/30 text-good hover:bg-emerald-900/50"
: "bg-surface-card/50 text-ink-soft hover:text-ink-mid"
: "bg-surface-card/50 text-ink-mid hover:text-ink-mid"
}`}
>
{ch.enabled ? "On" : "Off"}
@@ -432,7 +432,7 @@ export function ChannelsTab({ workspaceId }: Props) {
</button>
</div>
</div>
<div className="flex items-center gap-4 text-[10px] text-ink-soft">
<div className="flex items-center gap-4 text-[10px] text-ink-mid">
<span>{ch.message_count} messages</span>
<span>Last: {relativeTime(ch.last_message_at)}</span>
{ch.allowed_users.length > 0 && (
@@ -474,9 +474,9 @@ function SchemaField({
"w-full text-xs bg-surface-sunken border border-line rounded px-2 py-1.5 text-ink-mid placeholder-zinc-600";
return (
<div>
<label htmlFor={inputId} className="text-[10px] text-ink-soft block mb-1">
<label htmlFor={inputId} className="text-[10px] text-ink-mid block mb-1">
{field.label}
{!field.required && <span className="text-ink-soft"> (optional)</span>}
{!field.required && <span className="text-ink-mid"> (optional)</span>}
</label>
{field.type === "textarea" ? (
<textarea
@@ -499,7 +499,7 @@ function SchemaField({
)}
{renderExtras?.()}
{field.help && (
<p className="text-[11px] text-ink-soft mt-0.5">{field.help}</p>
<p className="text-[11px] text-ink-mid mt-0.5">{field.help}</p>
)}
</div>
);

@@ -965,7 +965,7 @@ function MyChatPanel({ workspaceId, data }: Props) {
{/* Messages */}
<div ref={containerRef} className="flex-1 overflow-y-auto p-3 space-y-3">
{loading && (
<div className="text-xs text-ink-soft text-center py-4">Loading chat history...</div>
<div className="text-xs text-ink-mid text-center py-4">Loading chat history...</div>
)}
{!loading && loadError !== null && messages.length === 0 && (
<div
@@ -984,7 +984,7 @@ function MyChatPanel({ workspaceId, data }: Props) {
</div>
)}
{!loading && loadError === null && messages.length === 0 && (
<div className="text-xs text-ink-soft text-center py-8">
<div className="text-xs text-ink-mid text-center py-8">
No messages yet. Send a message to start chatting with this agent.
</div>
)}
@@ -1002,7 +1002,7 @@ function MyChatPanel({ workspaceId, data }: Props) {
scroll resting against the top of the conversation IS the
signal. */}
{hasMore && messages.length > 0 && (
<div ref={topRef} className="text-xs text-ink-soft text-center py-1">
<div ref={topRef} className="text-xs text-ink-mid text-center py-1">
{loadingOlder ? "Loading older messages…" : " "}
</div>
)}
@@ -1153,7 +1153,7 @@ function MyChatPanel({ workspaceId, data }: Props) {
{thinkingElapsed}s
</div>
{activityLog.length > 0 && (
<div className="mt-1.5 text-[9px] text-ink-soft space-y-0.5">
<div className="mt-1.5 text-[9px] text-ink-mid space-y-0.5">
<div className="text-ink-mid">Processing with {runtimeDisplayName(data.runtime)}...</div>
{activityLog.map((line, i) => (
<div key={line + i} className="pl-2 border-l border-line">◇ {line}</div>

@@ -97,7 +97,7 @@ function AgentCardSection({ workspaceId }: { workspaceId: string }) {
{JSON.stringify(card, null, 2)}
</pre>
) : (
<div className="text-[10px] text-ink-soft">No agent card</div>
<div className="text-[10px] text-ink-mid">No agent card</div>
)}
{success && <div className="mt-2 px-2 py-1 bg-green-900/30 border border-green-800 rounded text-[10px] text-good">Updated</div>}
<button type="button" onClick={() => { setDraft(JSON.stringify(card || {}, null, 2)); setEditing(true); setError(null); setSuccess(false); }}
@@ -635,16 +635,16 @@ export function ConfigTab({ workspaceId }: Props) {
const isDirty = (rawMode ? rawDraft !== originalYaml : toYaml(config) !== originalYaml) || providerDirty;

if (loading) {
return <div className="p-4 text-xs text-ink-soft">Loading config...</div>;
return <div className="p-4 text-xs text-ink-mid">Loading config...</div>;
}

return (
<div className="flex flex-col h-full">
{/* Mode toggle */}
<div className="flex items-center justify-between px-3 py-1.5 border-b border-line/40 bg-surface-sunken/30">
<span className="text-[10px] text-ink-soft">config.yaml</span>
<span className="text-[10px] text-ink-mid">config.yaml</span>
<label className="flex items-center gap-1.5 cursor-pointer">
<span className="text-[9px] text-ink-soft">Raw YAML</span>
<span className="text-[9px] text-ink-mid">Raw YAML</span>
<input
type="checkbox"
checked={rawMode}
@@ -677,7 +677,7 @@ export function ConfigTab({ workspaceId }: Props) {
<Section title="General">
<TextInput label="Name" value={config.name} onChange={(v) => update("name", v)} />
<div>
<label htmlFor={descriptionId} className="text-[10px] text-ink-soft block mb-1">Description</label>
<label htmlFor={descriptionId} className="text-[10px] text-ink-mid block mb-1">Description</label>
<textarea
id={descriptionId}
value={config.description}
@@ -689,7 +689,7 @@ export function ConfigTab({ workspaceId }: Props) {
<div className="grid grid-cols-2 gap-3">
<TextInput label="Version" value={config.version} onChange={(v) => update("version", v)} mono />
<div>
<label htmlFor={tierId} className="text-[10px] text-ink-soft block mb-1">Tier</label>
<label htmlFor={tierId} className="text-[10px] text-ink-mid block mb-1">Tier</label>
<select
id={tierId}
value={config.tier}
@@ -707,7 +707,7 @@ export function ConfigTab({ workspaceId }: Props) {

<Section title="Runtime">
<div>
<label htmlFor={runtimeId} className="text-[10px] text-ink-soft block mb-1">Runtime</label>
<label htmlFor={runtimeId} className="text-[10px] text-ink-mid block mb-1">Runtime</label>
<select
id={runtimeId}
value={config.runtime || ""}
@@ -791,7 +791,7 @@ export function ConfigTab({ workspaceId }: Props) {
// workspace_secrets MODEL_PROVIDER override.
<div className="space-y-3">
<div>
<label className="text-[10px] text-ink-soft block mb-1">Model</label>
<label className="text-[10px] text-ink-mid block mb-1">Model</label>
<input
type="text"
value={currentModelId}
@@ -808,9 +808,9 @@ export function ConfigTab({ workspaceId }: Props) {
/>
</div>
<div>
<label htmlFor={`${runtimeId}-provider`} className="text-[10px] text-ink-soft block mb-1">
<label htmlFor={`${runtimeId}-provider`} className="text-[10px] text-ink-mid block mb-1">
Provider
<span className="ml-1 text-ink-soft">
<span className="ml-1 text-ink-mid">
(override — leave empty to auto-derive from model slug)
</span>
</label>
@@ -859,7 +859,7 @@ export function ConfigTab({ workspaceId }: Props) {
onChange={(v) => updateNested("runtime_config" as keyof ConfigData, "required_env", v)}
placeholder="variable NAME (e.g. ANTHROPIC_API_KEY) — not the value"
/>
<p className="text-[10px] text-ink-soft mt-1">
<p className="text-[10px] text-ink-mid mt-1">
This declares which env var <em>names</em> the workspace needs.
Set the actual values in the <strong>Secrets</strong> section
below — those are encrypted and mounted into the container at
@@ -867,7 +867,7 @@ export function ConfigTab({ workspaceId }: Props) {
</p>
{currentModelSpec?.required_env?.length &&
!arraysEqual(config.runtime_config?.required_env ?? [], currentModelSpec.required_env) && (
<div className="text-[10px] text-ink-soft mt-1 flex items-center gap-2">
<div className="text-[10px] text-ink-mid mt-1 flex items-center gap-2">
<span>
Template suggests{" "}
<code className="text-ink-mid">{currentModelSpec.required_env.join(", ")}</code>{" "}
@@ -890,9 +890,9 @@ export function ConfigTab({ workspaceId }: Props) {
(config.runtime_config?.model || config.model || "").toLowerCase().includes("anthropic")) && (
<Section title="Claude Settings" defaultOpen={false}>
<div>
<label htmlFor={effortId} className="text-[10px] text-ink-soft block mb-1">
<label htmlFor={effortId} className="text-[10px] text-ink-mid block mb-1">
Effort
<span className="ml-1 text-ink-soft">(output_config.effort — Opus 4.7+)</span>
<span className="ml-1 text-ink-mid">(output_config.effort — Opus 4.7+)</span>
</label>
<select
id={effortId}
@@ -910,9 +910,9 @@ export function ConfigTab({ workspaceId }: Props) {
</select>
</div>
<div>
<label htmlFor={taskBudgetId} className="text-[10px] text-ink-soft block mb-1">
<label htmlFor={taskBudgetId} className="text-[10px] text-ink-mid block mb-1">
Task Budget (tokens)
<span className="ml-1 text-ink-soft">(output_config.task_budget.total — 0 = unset)</span>
<span className="ml-1 text-ink-mid">(output_config.task_budget.total — 0 = unset)</span>
</label>
<input
id={taskBudgetId}
@@ -938,7 +938,7 @@ export function ConfigTab({ workspaceId }: Props) {
showing the misnamed list-input affordance. */}

<Section title="Prompt Files" defaultOpen={false}>
<p className="text-[10px] text-ink-soft px-1 pb-1">
<p className="text-[10px] text-ink-mid px-1 pb-1">
Markdown files that compose this workspace's system prompt.
Loaded in order at boot from the workspace config dir
(e.g. <code className="font-mono">system-prompt.md</code>,{' '}
@@ -966,7 +966,7 @@ export function ConfigTab({ workspaceId }: Props) {

<Section title="Sandbox" defaultOpen={false}>
<div>
<label htmlFor={sandboxBackendId} className="text-[10px] text-ink-soft block mb-1">Backend</label>
<label htmlFor={sandboxBackendId} className="text-[10px] text-ink-mid block mb-1">Backend</label>
<select
id={sandboxBackendId}
value={config.sandbox?.backend || "docker"}

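The required_env copy in the ConfigTab above distinguishes env var names (declared in config) from their values (set in Secrets). As a hypothetical config.yaml fragment, assuming only the `runtime_config.required_env` shape shown in that UI:

```yaml
# Hypothetical config.yaml fragment: declare the env var NAMES the
# workspace needs here; the VALUES live in the encrypted Secrets
# section and are mounted into the container at runtime, never here.
runtime_config:
  required_env:
    - ANTHROPIC_API_KEY
```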
@@ -242,7 +242,7 @@ export function DetailsTab({ workspaceId, data }: Props) {
{data.lastSampleError}
</pre>
) : (
<p className="text-xs text-ink-soft">No error detail recorded.</p>
<p className="text-xs text-ink-mid">No error detail recorded.</p>
)}
<button
type="button"
@@ -268,7 +268,7 @@ export function DetailsTab({ workspaceId, data }: Props) {
<div key={s.id} className="flex items-start gap-2">
<span className="text-xs text-accent font-mono shrink-0">{s.id}</span>
{s.description && (
<span className="text-xs text-ink-soft">{s.description}</span>
<span className="text-xs text-ink-mid">{s.description}</span>
)}
</div>
))}
@@ -281,11 +281,11 @@ export function DetailsTab({ workspaceId, data }: Props) {
{peersError ? (
<p className="text-xs text-bad">{peersError}</p>
) : peers.length === 0 && data.status !== "online" && data.status !== "degraded" ? (
<p className="text-xs text-ink-soft">
<p className="text-xs text-ink-mid">
Peers are only discoverable while the workspace is online.
</p>
) : peers.length === 0 ? (
<p className="text-xs text-ink-soft">No reachable peers</p>
<p className="text-xs text-ink-mid">No reachable peers</p>
) : (
<div className="space-y-1">
{peers.map((p) => (
@@ -297,7 +297,7 @@ export function DetailsTab({ workspaceId, data }: Props) {
>
<StatusDot status={p.status} />
<span className="text-xs text-ink">{p.name}</span>
{p.role && <span className="text-[10px] text-ink-soft">{p.role}</span>}
{p.role && <span className="text-[10px] text-ink-mid">{p.role}</span>}
</button>
))}
</div>
@@ -385,7 +385,7 @@ function Field({ label, children }: { label: string; children: React.ReactNode }
const fieldId = useId();
return (
<div>
<label htmlFor={fieldId} className="text-[10px] text-ink-soft block mb-0.5">{label}</label>
<label htmlFor={fieldId} className="text-[10px] text-ink-mid block mb-0.5">{label}</label>
{cloneElement(children as ReactElement<{ id?: string }>, { id: fieldId })}
</div>
);
@@ -394,7 +394,7 @@ function Field({ label, children }: { label: string; children: React.ReactNode }
function Row({ label, value, mono }: { label: string; value: string; mono?: boolean }) {
return (
<div className="flex justify-between">
<span className="text-xs text-ink-soft">{label}</span>
<span className="text-xs text-ink-mid">{label}</span>
<span className={`text-xs text-ink ${mono ? "font-mono" : ""} text-right max-w-[200px] truncate`}>
{value}
</span>

@@ -62,7 +62,7 @@ export function EventsTab({ workspaceId }: Props) {
}, [loadEvents]);

if (loading && events.length === 0) {
return <div className="p-4 text-xs text-ink-soft">Loading events...</div>;
return <div className="p-4 text-xs text-ink-mid">Loading events...</div>;
}

return (
@@ -88,7 +88,7 @@ export function EventsTab({ workspaceId }: Props) {
)}

{!error && events.length === 0 ? (
<p className="text-xs text-ink-soft text-center py-4">No events yet</p>
<p className="text-xs text-ink-mid text-center py-4">No events yet</p>
) : (
<div className="space-y-1">
{events.map((event) => {
@@ -115,10 +115,10 @@ export function EventsTab({ workspaceId }: Props) {
>
{event.event_type}
</span>
<span className="text-[9px] text-ink-soft ml-auto">
<span className="text-[9px] text-ink-mid ml-auto">
{formatTime(event.created_at)}
</span>
<span aria-hidden="true" className="text-[10px] text-ink-soft">
<span aria-hidden="true" className="text-[10px] text-ink-mid">
{isOpen ? "▼" : "▶"}
</span>
</button>
@@ -128,7 +128,7 @@ export function EventsTab({ workspaceId }: Props) {
<pre className="text-[10px] text-ink-mid bg-surface-sunken rounded p-2 overflow-x-auto max-h-40">
{JSON.stringify(event.payload, null, 2)}
</pre>
<div className="mt-1 text-[9px] text-ink-soft font-mono">
<div className="mt-1 text-[9px] text-ink-mid font-mono">
ID: {event.id}
</div>
</div>

@@ -77,7 +77,7 @@ export function ExternalConnectionSection({ workspaceId }: Props) {
 return (
 <div className="mx-3 mt-3 p-3 bg-surface-sunken/50 border border-line rounded">
 <h3 className="text-xs text-ink-mid font-medium mb-1">External Connection</h3>
-<p className="text-[10px] text-ink-soft mb-2">
+<p className="text-[10px] text-ink-mid mb-2">
 This workspace runs an external agent. Use these controls to
 re-show the setup snippets or rotate the workspace token.
 </p>
@@ -203,7 +203,7 @@ function PlatformOwnedFilesTab({ workspaceId }: { workspaceId: string }) {
 };

 if (loading) {
-return <div className="p-4 text-xs text-ink-soft">Loading files...</div>;
+return <div className="p-4 text-xs text-ink-mid">Loading files...</div>;
 }

 return (

@@ -304,7 +304,7 @@ function PlatformOwnedFilesTab({ workspaceId }: { workspaceId: string }) {
 )}

 {files.length === 0 ? (
-<div className="px-3 py-4 text-[10px] text-ink-soft text-center">
+<div className="px-3 py-4 text-[10px] text-ink-mid text-center">
 {rootDragHover
 ? "Drop to upload to root"
 : root === "/configs"
@@ -36,7 +36,7 @@ export function FileEditor({
 <div className="flex-1 flex items-center justify-center">
 <div className="text-center">
 <div className="text-2xl opacity-20 mb-2">📄</div>
-<p className="text-[10px] text-ink-soft">Select a file to edit</p>
+<p className="text-[10px] text-ink-mid">Select a file to edit</p>
 </div>
 </div>
 );

@@ -56,7 +56,7 @@ export function FileEditor({
 <button
 onClick={onDownload}
 aria-label="Download file"
-className="text-[10px] text-ink-soft hover:text-ink-mid"
+className="text-[10px] text-ink-mid hover:text-ink-mid"
 >
 ↓
 </button>

@@ -74,7 +74,7 @@ export function FileEditor({

 {/* Editor area */}
 {loadingFile ? (
-<div className="p-4 text-xs text-ink-soft">Loading...</div>
+<div className="p-4 text-xs text-ink-mid">Loading...</div>
 ) : (
 <textarea
 ref={editorRef}
@@ -209,7 +209,7 @@ function TreeItem({
 onContextMenu={(e) => openContextMenu(e, node)}
 {...dragProps}
 >
-<span className="text-[9px] text-ink-soft w-3">{isLoading ? "…" : expanded ? "▼" : "▶"}</span>
+<span className="text-[9px] text-ink-mid w-3">{isLoading ? "…" : expanded ? "▼" : "▶"}</span>
 <span className="text-[10px]">📁</span>
 <span className="text-[10px] text-ink-mid flex-1">{node.name}</span>
 <button
@@ -132,7 +132,7 @@ export function FileTreeContextMenu({ x, y, items, onClose }: Props) {
 : "w-full text-left px-3 py-1 text-ink-mid hover:bg-surface-card hover:text-ink focus:bg-surface-card focus:text-ink focus:outline-none disabled:opacity-40 disabled:pointer-events-none transition-colors"
 }
 >
-{item.icon && <span className="inline-block w-4 mr-1.5 text-ink-soft">{item.icon}</span>}
+{item.icon && <span className="inline-block w-4 mr-1.5 text-ink-mid">{item.icon}</span>}
 {item.label}
 </button>
 ))}
@@ -39,7 +39,7 @@ export function FilesToolbar({
 <option value="/workspace">/workspace</option>
 <option value="/plugins">/plugins</option>
 </select>
-<span className="text-[10px] text-ink-soft">{fileCount} files</span>
+<span className="text-[10px] text-ink-mid">{fileCount} files</span>
 </div>
 <div className="flex gap-1.5">
 {root === "/configs" && (

@@ -62,7 +62,7 @@ export function FilesToolbar({
 </button>
 </>
 )}
-<button type="button" onClick={onDownloadAll} aria-label="Download all files" className="text-[10px] text-ink-soft hover:text-ink-mid" title="Download all files">
+<button type="button" onClick={onDownloadAll} aria-label="Download all files" className="text-[10px] text-ink-mid hover:text-ink-mid" title="Download all files">
 Export
 </button>
 {root === "/configs" && (

@@ -70,7 +70,7 @@ export function FilesToolbar({
 Clear
 </button>
 )}
-<button type="button" onClick={onRefresh} aria-label="Refresh file list" className="text-[10px] text-ink-soft hover:text-ink-mid" title="Refresh">
+<button type="button" onClick={onRefresh} aria-label="Refresh file list" className="text-[10px] text-ink-mid hover:text-ink-mid" title="Refresh">
 ↻
 </button>
 </div>
@@ -27,7 +27,7 @@ export function NotAvailablePanel({ runtime }: { runtime: string }) {
 viewBox="0 0 72 72"
 fill="none"
 aria-hidden="true"
-className="text-ink-soft mb-4"
+className="text-ink-mid mb-4"
 >
 {/* Folder body */}
 <path

@@ -47,7 +47,7 @@ export function NotAvailablePanel({ runtime }: { runtime: string }) {
 />
 </svg>
 <h3 className="text-sm font-medium text-ink mb-1.5">Files not available</h3>
-<p className="text-[11px] text-ink-soft max-w-xs leading-relaxed">
+<p className="text-[11px] text-ink-mid max-w-xs leading-relaxed">
 This workspace runs the{" "}
 <span className="font-mono text-ink-mid">{runtime}</span> runtime,
 whose filesystem isn't owned by the platform. Use the Chat tab to
@@ -182,7 +182,7 @@ export function MemoryTab({ workspaceId }: Props) {
 };

 if (loading) {
-return <div className="p-4 text-xs text-ink-soft">Loading memory...</div>;
+return <div className="p-4 text-xs text-ink-mid">Loading memory...</div>;
 }

 return (

@@ -197,7 +197,7 @@ export function MemoryTab({ workspaceId }: Props) {
 <div className="flex items-center justify-between gap-3">
 <div>
 <div className="text-xs font-medium text-ink">Awareness dashboard</div>
-<p className="text-[10px] text-ink-soft">
+<p className="text-[10px] text-ink-mid">
 Embedded view for the local Awareness memory UI. The current workspace id is appended to the URL for workspace-scoped routing or future filtering.
 </p>
 </div>

@@ -230,7 +230,7 @@ export function MemoryTab({ workspaceId }: Props) {
 />
 </div>
 ) : (
-<div className="rounded-xl border border-dashed border-line bg-surface-sunken/40 p-4 text-xs text-ink-soft">
+<div className="rounded-xl border border-dashed border-line bg-surface-sunken/40 p-4 text-xs text-ink-mid">
 Set <code className="font-mono text-ink-mid">NEXT_PUBLIC_AWARENESS_URL</code> to embed the Awareness dashboard here.
 </div>
 )

@@ -238,7 +238,7 @@ export function MemoryTab({ workspaceId }: Props) {
 <div className="rounded-xl border border-line bg-surface-sunken/50 px-4 py-3 flex items-center justify-between gap-3">
 <div className="min-w-0">
 <p className="text-xs text-ink">Awareness dashboard is collapsed</p>
-<p className="text-[10px] text-ink-soft truncate">
+<p className="text-[10px] text-ink-mid truncate">
 Workspace context stays linked through <span className="font-mono text-ink-mid">{workspaceId}</span>.
 </p>
 </div>
@@ -254,15 +254,15 @@ export function MemoryTab({ workspaceId }: Props) {

 <div className="grid gap-2 rounded-xl border border-line bg-surface/40 px-3 py-2 text-[10px] text-ink-mid sm:grid-cols-3">
 <div className="flex items-center justify-between gap-2">
-<span className="uppercase tracking-[0.18em] text-ink-soft">Status</span>
+<span className="uppercase tracking-[0.18em] text-ink-mid">Status</span>
 <span className="font-medium text-good">Connected</span>
 </div>
 <div className="flex items-center justify-between gap-2">
-<span className="uppercase tracking-[0.18em] text-ink-soft">Mode</span>
+<span className="uppercase tracking-[0.18em] text-ink-mid">Mode</span>
 <span className="font-medium text-ink">{awarenessStatus}</span>
 </div>
 <div className="flex items-center justify-between gap-2 min-w-0">
-<span className="uppercase tracking-[0.18em] text-ink-soft">Workspace</span>
+<span className="uppercase tracking-[0.18em] text-ink-mid">Workspace</span>
 <span className="font-mono text-ink-mid truncate">{workspaceId}</span>
 </div>
 </div>

@@ -272,7 +272,7 @@ export function MemoryTab({ workspaceId }: Props) {
 <div className="flex items-center justify-between">
 <div>
 <div className="text-xs font-medium text-ink">Workspace KV memory</div>
-<p className="text-[10px] text-ink-soft">
+<p className="text-[10px] text-ink-mid">
 Native platform key-value memory for workspace <span className="font-mono text-ink-mid">{workspaceId}</span>.
 </p>
 </div>

@@ -350,7 +350,7 @@ export function MemoryTab({ workspaceId }: Props) {

 {showAdvanced ? (
 entries.length === 0 ? (
-<p className="text-xs text-ink-soft text-center py-4">No memory entries</p>
+<p className="text-xs text-ink-mid text-center py-4">No memory entries</p>
 ) : (
 <div className="space-y-1">
 {entries.map((entry) => (
@@ -364,11 +364,11 @@ export function MemoryTab({ workspaceId }: Props) {
 <span className="text-xs font-mono text-accent">{entry.key}</span>
 <div className="flex items-center gap-2">
 {entry.expires_at && (
-<span className="text-[9px] text-ink-soft">
+<span className="text-[9px] text-ink-mid">
 TTL {new Date(entry.expires_at).toLocaleString()}
 </span>
 )}
-<span className="text-[10px] text-ink-soft">
+<span className="text-[10px] text-ink-mid">
 {expanded === entry.key ? "▼" : "▶"}
 </span>
 </div>

@@ -420,7 +420,7 @@ export function MemoryTab({ workspaceId }: Props) {
 </pre>
 )}
 <div className="flex items-center justify-between">
-<span className="text-[9px] text-ink-soft">
+<span className="text-[9px] text-ink-mid">
 Updated: {new Date(entry.updated_at).toLocaleString()}
 </span>
 <div className="flex items-center gap-2">

@@ -452,7 +452,7 @@ export function MemoryTab({ workspaceId }: Props) {
 <div className="rounded-xl border border-line bg-surface/30 px-4 py-3 flex items-center justify-between gap-3">
 <div className="min-w-0">
 <p className="text-xs text-ink">Advanced workspace memory is hidden</p>
-<p className="text-[10px] text-ink-soft truncate">
+<p className="text-[10px] text-ink-mid truncate">
 KV entries remain available if you need the raw platform store.
 </p>
 </div>
@@ -180,7 +180,7 @@ export function ScheduleTab({ workspaceId }: Props) {
 };

 if (loading) {
-return <div className="p-4 text-[10px] text-ink-soft">Loading schedules...</div>;
+return <div className="p-4 text-[10px] text-ink-mid">Loading schedules...</div>;
 }

 return (

@@ -207,11 +207,11 @@ export function ScheduleTab({ workspaceId }: Props) {
 placeholder="Schedule name (e.g., Daily security scan)"
 value={formName}
 onChange={(e) => setFormName(e.target.value)}
-className="w-full text-[10px] bg-surface-card border border-line rounded px-2 py-1 text-ink placeholder:text-ink-soft"
+className="w-full text-[10px] bg-surface-card border border-line rounded px-2 py-1 text-ink placeholder:text-ink-mid"
 />
 <div className="flex gap-2">
 <div className="flex-1">
-<label htmlFor={cronId} className="text-[10px] text-ink-soft block mb-0.5">Cron Expression</label>
+<label htmlFor={cronId} className="text-[10px] text-ink-mid block mb-0.5">Cron Expression</label>
 <input
 id={cronId}
 type="text"

@@ -219,12 +219,12 @@ export function ScheduleTab({ workspaceId }: Props) {
 onChange={(e) => setFormCron(e.target.value)}
 className="w-full text-[10px] bg-surface-card border border-line rounded px-2 py-1 text-ink font-mono"
 />
-<div className="text-[10px] text-ink-soft mt-0.5">
+<div className="text-[10px] text-ink-mid mt-0.5">
 {cronToHuman(formCron)}
 </div>
 </div>
 <div className="w-24">
-<label htmlFor={timezoneId} className="text-[10px] text-ink-soft block mb-0.5">Timezone</label>
+<label htmlFor={timezoneId} className="text-[10px] text-ink-mid block mb-0.5">Timezone</label>
 <select
 id={timezoneId}
 value={formTimezone}
@@ -245,14 +245,14 @@ export function ScheduleTab({ workspaceId }: Props) {
 </div>
 </div>
 <div>
-<label htmlFor={promptId} className="text-[10px] text-ink-soft block mb-0.5">Prompt / Task</label>
+<label htmlFor={promptId} className="text-[10px] text-ink-mid block mb-0.5">Prompt / Task</label>
 <textarea
 id={promptId}
 value={formPrompt}
 onChange={(e) => setFormPrompt(e.target.value)}
 placeholder="What should the agent do on this schedule?"
 rows={3}
-className="w-full text-[10px] bg-surface-card border border-line rounded px-2 py-1 text-ink placeholder:text-ink-soft resize-y"
+className="w-full text-[10px] bg-surface-card border border-line rounded px-2 py-1 text-ink placeholder:text-ink-mid resize-y"
 />
 </div>
 <div className="flex items-center gap-2">

@@ -290,7 +290,7 @@ export function ScheduleTab({ workspaceId }: Props) {
 Cancel
 </button>
 </div>
-<div className="text-[10px] text-ink-soft space-y-0.5">
+<div className="text-[10px] text-ink-mid space-y-0.5">
 <div>Common patterns:</div>
 <div className="font-mono">{"0 9 * * *"} — Daily at 9:00 AM</div>
 <div className="font-mono">{"*/30 * * * *"} — Every 30 minutes</div>

@@ -306,7 +306,7 @@ export function ScheduleTab({ workspaceId }: Props) {
 <div className="p-6 text-center">
 <div className="text-2xl mb-2">⏲</div>
 <div className="text-[10px] text-ink-mid mb-1">No schedules yet</div>
-<div className="text-[9px] text-ink-soft">
+<div className="text-[9px] text-ink-mid">
 Add a schedule to run tasks automatically — daily scans, periodic reports, standup reminders.
 </div>
 </div>
@@ -336,16 +336,16 @@ export function ScheduleTab({ workspaceId }: Props) {
 {sched.name || "Unnamed schedule"}
 </span>
 </div>
-<div className="text-[9px] text-ink-soft mt-0.5 font-mono">
+<div className="text-[9px] text-ink-mid mt-0.5 font-mono">
 {cronToHuman(sched.cron_expr)}
 {sched.timezone !== "UTC" && (
-<span className="text-ink-soft"> ({sched.timezone})</span>
+<span className="text-ink-mid"> ({sched.timezone})</span>
 )}
 </div>
-<div className="text-[9px] text-ink-soft mt-0.5 truncate">
+<div className="text-[9px] text-ink-mid mt-0.5 truncate">
 {sched.prompt.slice(0, 80)}{sched.prompt.length > 80 ? "..." : ""}
 </div>
-<div className="flex items-center gap-3 mt-1 text-[8px] text-ink-soft">
+<div className="flex items-center gap-3 mt-1 text-[8px] text-ink-mid">
 <span>Last: {relativeTime(sched.last_run_at)}</span>
 <span>Next: {relativeTime(sched.next_run_at)}</span>
 <span>Runs: {sched.run_count}</span>
@@ -320,7 +320,7 @@ export function SkillsTab({ workspaceId, data }: Props) {
 aria-label="Plugins (none installed)"
 >
 <div className="flex items-center gap-2">
-<span className="text-[10px] uppercase tracking-[0.2em] text-ink-soft">Plugins</span>
+<span className="text-[10px] uppercase tracking-[0.2em] text-ink-mid">Plugins</span>
 <span className="text-[11px] text-ink-mid">0 installed</span>
 </div>
 <button

@@ -342,7 +342,7 @@ export function SkillsTab({ workspaceId, data }: Props) {
 <div id="plugins-section" className="rounded-xl border border-line bg-surface-sunken/70 p-3">
 <div className="flex items-center justify-between gap-3">
 <div>
-<div className="text-[10px] uppercase tracking-[0.22em] text-ink-soft">Plugins</div>
+<div className="text-[10px] uppercase tracking-[0.22em] text-ink-mid">Plugins</div>
 <h3 className="mt-1 text-sm font-semibold text-ink">
 {installed.length} installed
 </h3>

@@ -379,21 +379,21 @@ export function SkillsTab({ workspaceId, data }: Props) {
 <div className="min-w-0">
 <div className="flex items-center gap-2">
 <span className="text-[11px] font-medium text-ink">{p.name}</span>
-{p.version && <span className="text-[10px] text-ink-soft">v{p.version}</span>}
+{p.version && <span className="text-[10px] text-ink-mid">v{p.version}</span>}
 {inert && (
 <span className="rounded-full border border-amber-700/50 bg-amber-950/30 px-1.5 py-0.5 text-[10px] text-warm">
 inert on this runtime
 </span>
 )}
 </div>
-{p.description && <div className="text-[10px] text-ink-soft truncate">{p.description}</div>}
+{p.description && <div className="text-[10px] text-ink-mid truncate">{p.description}</div>}
 {p.skills && p.skills.length > 0 && (
 <div className="mt-1 flex flex-wrap gap-1">
 {p.skills.slice(0, 4).map((s) => (
 <span key={s} className="rounded-full bg-surface-card/60 px-1.5 py-0.5 text-[10px] text-ink-mid">{s}</span>
 ))}
 {p.skills.length > 4 && (
-<span className="text-[10px] text-ink-soft">+{p.skills.length - 4}</span>
+<span className="text-[10px] text-ink-mid">+{p.skills.length - 4}</span>
 )}
 </div>
 )}
@@ -417,7 +417,7 @@ export function SkillsTab({ workspaceId, data }: Props) {
 {/* Install from any source (github://, clawhub://, …) */}
 <div className="mb-3 rounded-lg border border-line/60 bg-surface/40 p-2.5">
 <div className="flex items-center justify-between gap-2 mb-1.5">
-<div className="text-[10px] uppercase tracking-[0.2em] text-ink-soft">
+<div className="text-[10px] uppercase tracking-[0.2em] text-ink-mid">
 Install from source
 </div>
 {sourceSchemes.length > 0 && (

@@ -425,7 +425,7 @@ export function SkillsTab({ workspaceId, data }: Props) {
 {sourceSchemes.map((s) => (
 <span
 key={s}
-className="rounded-full border border-line/50 bg-surface-sunken/50 px-1.5 py-0.5 text-[10px] text-ink-soft"
+className="rounded-full border border-line/50 bg-surface-sunken/50 px-1.5 py-0.5 text-[10px] text-ink-mid"
 >
 {s}://
 </span>

@@ -444,7 +444,7 @@ export function SkillsTab({ workspaceId, data }: Props) {
 }}
 placeholder="e.g. github://owner/repo#v1.0"
 spellCheck={false}
-className="flex-1 rounded border border-line bg-surface px-2 py-1 text-[10px] text-ink placeholder:text-ink-soft focus:outline-none focus:border-violet-600 focus-visible:ring-2 focus-visible:ring-violet-600/50"
+className="flex-1 rounded border border-line bg-surface px-2 py-1 text-[10px] text-ink placeholder:text-ink-mid focus:outline-none focus:border-violet-600 focus-visible:ring-2 focus-visible:ring-violet-600/50"
 />
 <button
 onClick={handleInstallCustom}

@@ -454,12 +454,12 @@ export function SkillsTab({ workspaceId, data }: Props) {
 {installing === customSource.trim() ? "Installing..." : "Install"}
 </button>
 </div>
-<div className="mt-1 text-[10px] text-ink-soft">
+<div className="mt-1 text-[10px] text-ink-mid">
 Local registry plugins below; paste any scheme URL above for GitHub or other sources.
 </div>
 </div>
 <div className="flex items-center justify-between mb-2">
-<div className="text-[10px] uppercase tracking-[0.2em] text-ink-soft">Available plugins</div>
+<div className="text-[10px] uppercase tracking-[0.2em] text-ink-mid">Available plugins</div>
 {/* Retry visible whenever registry is empty — including
 the loading state — so a stuck fetch (Fast Refresh
 stranded promise, slow server, browser quirk) has a
@@ -486,21 +486,21 @@ export function SkillsTab({ workspaceId, data }: Props) {
 )}
 </div>
 {registryLoading && registry.length === 0 ? (
-<div className="text-[10px] text-ink-soft">Loading registry…</div>
+<div className="text-[10px] text-ink-mid">Loading registry…</div>
 ) : registryError ? (
 <div className="rounded-lg border border-red-800/40 bg-red-950/20 px-2 py-1.5">
 <div className="text-[10px] text-bad font-semibold mb-0.5">
 Couldn't load the plugin registry
 </div>
 <div className="text-[10px] text-bad/80">{registryError}</div>
-<div className="mt-1 text-[10px] text-ink-soft">
+<div className="mt-1 text-[10px] text-ink-mid">
 Check the platform server is reachable at /plugins. The Retry button is in the header above.
 </div>
 </div>
 ) : registry.length === 0 ? (
 <div className="rounded-lg border border-line/40 bg-surface/40 px-2 py-1.5">
 <div className="text-[10px] text-ink-mid mb-0.5">Registry returned 0 plugins.</div>
-<div className="text-[10px] text-ink-soft">
+<div className="text-[10px] text-ink-mid">
 This usually means the platform's plugins/ directory is empty.
 Run scripts/clone-manifest.sh to populate it from the standalone repos.
 </div>

@@ -514,13 +514,13 @@ export function SkillsTab({ workspaceId, data }: Props) {
 <div className="min-w-0">
 <div className="flex items-center gap-2">
 <span className="text-[11px] text-ink-mid">{p.name}</span>
-{p.version && <span className="text-[10px] text-ink-soft">v{p.version}</span>}
+{p.version && <span className="text-[10px] text-ink-mid">v{p.version}</span>}
 </div>
-{p.description && <div className="text-[10px] text-ink-soft truncate">{p.description}</div>}
+{p.description && <div className="text-[10px] text-ink-mid truncate">{p.description}</div>}
 {p.tags && p.tags.length > 0 && (
 <div className="mt-1 flex flex-wrap gap-1">
 {p.tags.map((t) => (
-<span key={t} className="rounded-full border border-line/40 px-1.5 py-0.5 text-[10px] text-ink-soft">{t}</span>
+<span key={t} className="rounded-full border border-line/40 px-1.5 py-0.5 text-[10px] text-ink-mid">{t}</span>
 ))}
 </div>
 )}
@@ -556,7 +556,7 @@ export function SkillsTab({ workspaceId, data }: Props) {
 <div className="rounded-xl border border-line bg-surface-sunken/70 p-3">
 <div className="flex items-center justify-between gap-3">
 <div>
-<div className="text-[10px] uppercase tracking-[0.22em] text-ink-soft">Workspace skills</div>
+<div className="text-[10px] uppercase tracking-[0.22em] text-ink-mid">Workspace skills</div>
 <h3 className="mt-1 text-sm font-semibold text-ink">Installed skills</h3>
 </div>
 <div className="flex flex-wrap gap-2">

@@ -564,7 +564,7 @@ export function SkillsTab({ workspaceId, data }: Props) {
 <MetaPill label="Runtime" value={capability.runtime || "unknown"} />
 </div>
 </div>
-<p className="mt-2 text-[11px] leading-5 text-ink-soft">
+<p className="mt-2 text-[11px] leading-5 text-ink-mid">
 Live skill directory from the Agent Card — updates when the workspace hot-reloads skills.
 </p>
 <div className="mt-3 flex flex-wrap gap-2">

@@ -593,7 +593,7 @@ export function SkillsTab({ workspaceId, data }: Props) {
 {skills.length === 0 ? (
 <div className="rounded-xl border border-dashed border-line bg-surface-sunken/40 p-6 text-center">
 <div className="text-sm text-ink">No skills loaded</div>
-<p className="mt-2 text-[11px] leading-5 text-ink-soft">
+<p className="mt-2 text-[11px] leading-5 text-ink-mid">
 Add skills from the Config tab, install a plugin above, or let the runtime hot-load them.
 </p>
 </div>

@@ -604,7 +604,7 @@ export function SkillsTab({ workspaceId, data }: Props) {
 <div className="flex items-start justify-between gap-3">
 <div>
 <div className="text-xs font-semibold text-ink">{skill.name}</div>
-<div className="mt-0.5 text-[10px] font-mono text-ink-soft">{skill.id}</div>
+<div className="mt-0.5 text-[10px] font-mono text-ink-mid">{skill.id}</div>
 </div>
 {skill.tags.length > 0 && (
 <div className="flex flex-wrap justify-end gap-1.5">

@@ -626,7 +626,7 @@ export function SkillsTab({ workspaceId, data }: Props) {

 {skill.examples.length > 0 && (
 <div className="mt-2">
-<div className="text-[9px] uppercase tracking-[0.2em] text-ink-soft">Examples</div>
+<div className="text-[9px] uppercase tracking-[0.2em] text-ink-mid">Examples</div>
 <div className="mt-1 space-y-1">
 {skill.examples.slice(0, 2).map((example, index) => (
 <div

@@ -666,7 +666,7 @@ function extractSkills(agentCard: Record<string, unknown> | null): SkillEntry[]
 function MetaPill({ label, value }: { label: string; value: string }) {
 return (
 <span className="inline-flex items-center gap-1 rounded-full border border-line/60 bg-surface/60 px-2 py-1 text-[9px] text-ink-mid">
-<span className="uppercase tracking-[0.18em] text-[8px] text-ink-soft">{label}</span>
+<span className="uppercase tracking-[0.18em] text-[8px] text-ink-mid">{label}</span>
 <span className="font-medium">{value}</span>
 </span>
 );
@@ -37,7 +37,7 @@ function NotAvailablePanel({ runtime }: { runtime: string }) {
 viewBox="0 0 72 72"
 fill="none"
 aria-hidden="true"
-className="text-ink-soft mb-4"
+className="text-ink-mid mb-4"
 >
 <rect
 x="10"

@@ -74,7 +74,7 @@ function NotAvailablePanel({ runtime }: { runtime: string }) {
 />
 </svg>
 <h3 className="text-sm font-medium text-ink mb-1.5">Terminal not available</h3>
-<p className="text-[11px] text-ink-soft max-w-xs leading-relaxed">
+<p className="text-[11px] text-ink-mid max-w-xs leading-relaxed">
 This workspace runs the{" "}
 <span className="font-mono text-ink-mid">{runtime}</span> runtime,
 which doesn't expose a shell. Use the Chat tab to interact with the
@@ -48,7 +48,7 @@ export function TracesTab({ workspaceId }: Props) {
 }, [loadTraces]);

 if (loading) {
-return <div className="p-4 text-xs text-ink-soft">Loading traces...</div>;
+return <div className="p-4 text-xs text-ink-mid">Loading traces...</div>;
 }

 return (

@@ -60,7 +60,7 @@ export function TracesTab({ workspaceId }: Props) {
 onClick={loadTraces}
 // Added focus-visible ring; previous version was hover-only,
 // invisible to keyboard users.
-className="text-[10px] text-ink-soft hover:text-ink-mid rounded-sm px-1 transition-colors focus:outline-none focus-visible:ring-2 focus-visible:ring-accent/50"
+className="text-[10px] text-ink-mid hover:text-ink-mid rounded-sm px-1 transition-colors focus:outline-none focus-visible:ring-2 focus-visible:ring-accent/50"
 >
 Refresh
 </button>

@@ -75,9 +75,9 @@ export function TracesTab({ workspaceId }: Props) {
 {traces.length === 0 && !error ? (
 <div className="text-center py-8">
 <div className="text-2xl opacity-20 mb-2" aria-hidden="true">--</div>
-<p className="text-xs text-ink-soft">No traces yet</p>
-<details className="mt-2 text-[10px] text-ink-soft">
-<summary className="cursor-pointer text-ink-soft hover:text-ink-mid">How to enable tracing</summary>
+<p className="text-xs text-ink-mid">No traces yet</p>
+<details className="mt-2 text-[10px] text-ink-mid">
+<summary className="cursor-pointer text-ink-mid hover:text-ink-mid">How to enable tracing</summary>
 <p className="mt-1">
 Set <code className="font-mono text-ink-mid">LANGFUSE_HOST</code>, <code className="font-mono text-ink-mid">LANGFUSE_PUBLIC_KEY</code>, <code className="font-mono text-ink-mid">LANGFUSE_SECRET_KEY</code> as workspace secrets to enable tracing.
 </p>
@@ -108,20 +108,20 @@ export function TracesTab({ workspaceId }: Props) {
 }`} />
 <div className="flex-1 min-w-0">
 <div className="text-[11px] text-ink truncate">{trace.name || "trace"}</div>
-<div className="text-[9px] text-ink-soft">{formatTime(trace.timestamp)}</div>
+<div className="text-[9px] text-ink-mid">{formatTime(trace.timestamp)}</div>
 </div>
 <div className="flex items-center gap-2 shrink-0">
 {trace.latency != null && (
-<span className="text-[9px] text-ink-soft tabular-nums">
+<span className="text-[9px] text-ink-mid tabular-nums">
 {trace.latency > 1000 ? `${(trace.latency / 1000).toFixed(1)}s` : `${trace.latency}ms`}
 </span>
 )}
 {trace.usage?.total != null && (
-<span className="text-[9px] text-ink-soft tabular-nums">
+<span className="text-[9px] text-ink-mid tabular-nums">
 {trace.usage.total} tok
 </span>
 )}
-<span aria-hidden="true" className="text-[9px] text-ink-soft">
+<span aria-hidden="true" className="text-[9px] text-ink-mid">
 {isOpen ? "▼" : "▶"}
 </span>
 </div>

@@ -131,7 +131,7 @@ export function TracesTab({ workspaceId }: Props) {
 <div id={panelId} className="px-3 pb-2 space-y-2 border-t border-line/30">
 {trace.input && (
 <div>
-<div className="text-[9px] text-ink-soft uppercase tracking-wider mt-2 mb-1">Input</div>
+<div className="text-[9px] text-ink-mid uppercase tracking-wider mt-2 mb-1">Input</div>
 <pre className="text-[9px] text-ink-mid bg-surface-sunken rounded p-2 overflow-x-auto max-h-32">
 {String(typeof trace.input === "string" ? trace.input : JSON.stringify(trace.input, null, 2))}
 </pre>

@@ -139,18 +139,18 @@ export function TracesTab({ workspaceId }: Props) {
 )}
 {trace.output && (
 <div>
-<div className="text-[9px] text-ink-soft uppercase tracking-wider mb-1">Output</div>
+<div className="text-[9px] text-ink-mid uppercase tracking-wider mb-1">Output</div>
 <pre className="text-[9px] text-ink-mid bg-surface-sunken rounded p-2 overflow-x-auto max-h-32">
 {String(typeof trace.output === "string" ? trace.output : JSON.stringify(trace.output, null, 2))}
 </pre>
 </div>
 )}
 {trace.totalCost != null && (
-<div className="text-[9px] text-ink-soft">
+<div className="text-[9px] text-ink-mid">
 Cost: ${trace.totalCost.toFixed(6)}
 </div>
 )}
-<div className="text-[8px] text-ink-soft font-mono select-all">
+<div className="text-[8px] text-ink-mid font-mono select-all">
 {trace.id}
 </div>
 </div>
@ -389,7 +389,7 @@ export function AgentCommsPanel({ workspaceId }: { workspaceId: string }) {
|
||||
}, [messages]);
|
||||
|
||||
if (loading) {
|
||||
return <div className="text-xs text-ink-soft text-center py-8">Loading agent communications...</div>;
|
||||
return <div className="text-xs text-ink-mid text-center py-8">Loading agent communications...</div>;
|
||||
}
|
||||
|
||||
if (loadError !== null && messages.length === 0) {
|
||||
@ -415,10 +415,10 @@ export function AgentCommsPanel({ workspaceId }: { workspaceId: string }) {
|
||||
|
||||
if (messages.length === 0) {
|
||||
return (
|
||||
<div className="text-xs text-ink-soft text-center py-8">
|
||||
<div className="text-xs text-ink-mid text-center py-8">
|
||||
No agent-to-agent communications yet.
|
||||
<br />
|
||||
<span className="text-ink-soft">Delegations and peer messages will appear here.</span>
|
||||
<span className="text-ink-mid">Delegations and peer messages will appear here.</span>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
@@ -513,7 +513,20 @@ function GroupedCommsView({
 />
 <div className="flex-1 overflow-y-auto p-3 space-y-2">
 {visible.map((msg) =>
-msg.status === "error" ? (
+// Only render the error UI when there is NO usable response
+// content. An "error" status from the platform means the HTTP
+// transport layer had a problem — but the agent response text
+// may have arrived and been stored in response_body.text.
+// Delegation results set responseText via extractResponseText
+// once that function learned to parse body.text, so checking
+// !msg.responseText here correctly identifies "no actual reply
+// was received" vs. "reply arrived but status=error".
+//
+// Without this guard, successful delegation results were
+// rendered as error banners, PMs saw "restart" prompts and
+// restarted working agents, and retry storms formed as the
+// platform re-delivered the same completed work (issue #159).
+msg.status === "error" && !msg.responseText ? (
 <ErrorMessage key={msg.id} msg={msg} />
 ) : (
 <NormalMessage key={msg.id} msg={msg} />
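The guard above reduces to a tiny predicate worth stating on its own. A minimal sketch — `CommMsgLike` is a hypothetical stand-in for the real `CommMessage` type, holding only the two fields the guard reads:

```typescript
// Hypothetical slice of CommMessage — only what the guard inspects.
interface CommMsgLike {
  status: string;
  responseText?: string;
}

// Show the error UI only when the transport failed AND no usable reply
// text survived. A status="error" row that still carries responseText
// is a successful delegation whose HTTP leg failed (issue #159).
function shouldRenderError(msg: CommMsgLike): boolean {
  return msg.status === "error" && !msg.responseText;
}
```

Keeping the predicate this small makes the regression easy to pin in a unit test without rendering the component.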
@@ -600,10 +613,10 @@ function PeerTabButton({
 className={`shrink-0 px-3 py-1.5 text-[10px] font-medium transition-colors whitespace-nowrap ${
 active
 ? "border-b-2 border-cyan-500 text-cyan-200"
-: "border-b-2 border-transparent text-ink-soft hover:text-ink-mid"
+: "border-b-2 border-transparent text-ink-mid hover:text-ink-mid"
 }`}
 >
-{label} <span className="text-[9px] text-ink-soft">({count})</span>
+{label} <span className="text-[9px] text-ink-mid">({count})</span>
 </button>
 );
 }
@@ -656,7 +669,7 @@ function WaitingBubbles({ visible }: { visible: CommMessage[] }) {
 role="status"
 aria-label={`Waiting for reply from ${m.peerName}`}
 >
-<div className="text-[9px] text-ink-soft mb-1">→ To {m.peerName}</div>
+<div className="text-[9px] text-ink-mid mb-1">→ To {m.peerName}</div>
 <span className="flex items-center gap-2 text-ink-mid">
 <span className="flex gap-0.5" aria-hidden="true">
 <span
@@ -695,7 +708,7 @@ function NormalMessage({ msg }: { msg: CommMessage }) {
 : "bg-surface-card/80 text-ink border border-line/30"
 }`}
 >
-<div className="text-[9px] text-ink-soft mb-1">
+<div className="text-[9px] text-ink-mid mb-1">
 {msg.flow === "out" ? `→ To ${msg.peerName}` : `← From ${msg.peerName}`}
 </div>
 {msg.text ? (
@@ -718,7 +731,7 @@ function NormalMessage({ msg }: { msg: CommMessage }) {
 {msg.responseText}
 </MarkdownBody>
 )}
-<div className="text-[9px] text-ink-soft mt-1">
+<div className="text-[9px] text-ink-mid mt-1">
 {new Date(msg.timestamp).toLocaleTimeString()}
 </div>
 </div>
@@ -791,7 +804,7 @@ function ErrorMessage({ msg }: { msg: CommMessage }) {
 </div>

 {msg.text && (
-<div className="text-[10px] text-ink-soft mb-1.5">
+<div className="text-[10px] text-ink-mid mb-1.5">
 <span className="uppercase tracking-wide">Task</span>
 <MarkdownBody className="text-ink-mid">{msg.text}</MarkdownBody>
 </div>
@@ -828,7 +841,7 @@ function ErrorMessage({ msg }: { msg: CommMessage }) {
 </div>
 )}

-<div className="text-[9px] text-ink-soft mt-1.5">
+<div className="text-[9px] text-ink-mid mt-1.5">
 {new Date(msg.timestamp).toLocaleTimeString()}
 </div>
 </div>
@@ -9,6 +9,7 @@
 // AttachmentLightbox).

 import { useState, useEffect, useRef } from "react";
+import { platformAuthHeaders } from "@/lib/api";
 import type { ChatAttachment } from "./types";
 import { isPlatformAttachment, resolveAttachmentHref } from "./uploads";
 import { AttachmentChip } from "./AttachmentViews";
@@ -43,13 +44,8 @@ export function AttachmentAudio({ workspaceId, attachment, onDownload, tone }: P
 void (async () => {
 try {
 const href = resolveAttachmentHref(workspaceId, attachment.uri);
-const headers: Record<string, string> = {};
-const adminToken = process.env.NEXT_PUBLIC_ADMIN_TOKEN;
-if (adminToken) headers["Authorization"] = `Bearer ${adminToken}`;
-const slug = getTenantSlug();
-if (slug) headers["X-Molecule-Org-Slug"] = slug;
 const res = await fetch(href, {
-headers,
+headers: platformAuthHeaders(),
 credentials: "include",
 signal: AbortSignal.timeout(60_000),
 });
@@ -116,9 +112,5 @@ export function AttachmentAudio({ workspaceId, attachment, onDownload, tone }: P
 );
 }

-function getTenantSlug(): string | null {
-if (typeof window === "undefined") return null;
-const host = window.location.hostname;
-const m = host.match(/^([^.]+)\.moleculesai\.app$/);
-return m ? m[1] : null;
-}
+// Local getTenantSlug() removed — auth-header construction now goes
+// through platformAuthHeaders() from @/lib/api (#178).
@@ -35,6 +35,7 @@
 // downscale via canvas, but defer that to v2.

 import { useState, useEffect, useRef } from "react";
+import { platformAuthHeaders } from "@/lib/api";
 import type { ChatAttachment } from "./types";
 import { isPlatformAttachment, resolveAttachmentHref } from "./uploads";
 import { AttachmentLightbox } from "./AttachmentLightbox";
@@ -75,22 +76,14 @@ export function AttachmentImage({ workspaceId, attachment, onDownload, tone }: P
 }

 // Platform-auth path: identical to downloadChatFile but we keep
-// the blob (don't trigger a Save-As). Use the same headers it does
-// by going through it indirectly — no, downloadChatFile triggers a
-// Save-As. Need a separate fetch.
+// the blob (don't trigger a Save-As). Auth headers come from the
+// shared `platformAuthHeaders()` helper — one source of truth for
+// every authenticated raw fetch in the canvas (#178).
 void (async () => {
 try {
 const href = resolveAttachmentHref(workspaceId, attachment.uri);
-const headers: Record<string, string> = {};
-// Read the same env var downloadChatFile reads — single source
-// of truth would be cleaner; refactor opportunity for PR-2 if
-// we add the same path to AttachmentVideo.
-const adminToken = process.env.NEXT_PUBLIC_ADMIN_TOKEN;
-if (adminToken) headers["Authorization"] = `Bearer ${adminToken}`;
-const slug = getTenantSlug();
-if (slug) headers["X-Molecule-Org-Slug"] = slug;
 const res = await fetch(href, {
-headers,
+headers: platformAuthHeaders(),
 credentials: "include",
 signal: AbortSignal.timeout(30_000),
 });
@@ -184,15 +177,7 @@ export function AttachmentImage({ workspaceId, attachment, onDownload, tone }: P
 );
 }

-// Internal helper — duplicated from uploads.ts (it's not exported
-// there). Kept local so this component doesn't reach into private
-// surface; if AttachmentVideo / AttachmentPDF in PR-2/PR-3 also need
-// it, lift to an exported helper at that point (the third-caller
-// rule).
-function getTenantSlug(): string | null {
-if (typeof window === "undefined") return null;
-const host = window.location.hostname;
-// Tenant subdomain shape: <slug>.moleculesai.app
-const m = host.match(/^([^.]+)\.moleculesai\.app$/);
-return m ? m[1] : null;
-}
+// Local getTenantSlug() removed — auth-header construction now goes
+// through platformAuthHeaders() from @/lib/api which uses the canonical
+// getTenantSlug() from @/lib/tenant. This eliminates the duplicate
+// hostname-regex + the duplicate bearer-token-attach pattern (#178).
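The removed copies above all assembled the same two headers. A sketch of that logic as pure functions — `getTenantSlugFromHost` and `buildPlatformAuthHeaders` are illustrative names, not the real exports; the canonical implementation is `platformAuthHeaders()` in `@/lib/api`:

```typescript
// Tenant subdomain shape: <slug>.moleculesai.app (from the removed helper).
function getTenantSlugFromHost(hostname: string): string | null {
  const m = hostname.match(/^([^.]+)\.moleculesai\.app$/);
  return m ? m[1] : null;
}

// Pure version of the header assembly: bearer token + tenant slug,
// each attached only when present.
function buildPlatformAuthHeaders(
  hostname: string,
  adminToken: string | undefined,
): Record<string, string> {
  const headers: Record<string, string> = {};
  if (adminToken) headers["Authorization"] = `Bearer ${adminToken}`;
  const slug = getTenantSlugFromHost(hostname);
  if (slug) headers["X-Molecule-Org-Slug"] = slug;
  return headers;
}
```

Taking the hostname and token as parameters (instead of reading `window` and `process.env` inline) is what makes the consolidated helper unit-testable.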
@@ -33,6 +33,7 @@
 // timeout, swap to chip. Implemented as a 3-second watchdog.

 import { useState, useEffect, useRef } from "react";
+import { platformAuthHeaders } from "@/lib/api";
 import type { ChatAttachment } from "./types";
 import { isPlatformAttachment, resolveAttachmentHref } from "./uploads";
 import { AttachmentLightbox } from "./AttachmentLightbox";
@@ -69,13 +70,8 @@ export function AttachmentPDF({ workspaceId, attachment, onDownload, tone }: Pro
 void (async () => {
 try {
 const href = resolveAttachmentHref(workspaceId, attachment.uri);
-const headers: Record<string, string> = {};
-const adminToken = process.env.NEXT_PUBLIC_ADMIN_TOKEN;
-if (adminToken) headers["Authorization"] = `Bearer ${adminToken}`;
-const slug = getTenantSlug();
-if (slug) headers["X-Molecule-Org-Slug"] = slug;
 const res = await fetch(href, {
-headers,
+headers: platformAuthHeaders(),
 credentials: "include",
 signal: AbortSignal.timeout(60_000),
 });
@@ -189,9 +185,5 @@ function PdfGlyph() {
 );
 }

-function getTenantSlug(): string | null {
-if (typeof window === "undefined") return null;
-const host = window.location.hostname;
-const m = host.match(/^([^.]+)\.moleculesai\.app$/);
-return m ? m[1] : null;
-}
+// Local getTenantSlug() removed — auth-header construction now goes
+// through platformAuthHeaders() from @/lib/api (#178).
@@ -26,6 +26,7 @@
 // to download the full file.

 import { useState, useEffect } from "react";
+import { platformAuthHeaders } from "@/lib/api";
 import type { ChatAttachment } from "./types";
 import { isPlatformAttachment, resolveAttachmentHref } from "./uploads";
 import { AttachmentChip } from "./AttachmentViews";
@@ -57,13 +58,13 @@ export function AttachmentTextPreview({ workspaceId, attachment, onDownload, ton
 void (async () => {
 try {
 const href = resolveAttachmentHref(workspaceId, attachment.uri);
-const headers: Record<string, string> = {};
-if (isPlatformAttachment(attachment.uri)) {
-const adminToken = process.env.NEXT_PUBLIC_ADMIN_TOKEN;
-if (adminToken) headers["Authorization"] = `Bearer ${adminToken}`;
-const slug = getTenantSlug();
-if (slug) headers["X-Molecule-Org-Slug"] = slug;
-}
+// Only attach platform auth headers for in-platform URIs —
+// off-platform URLs (HTTP/HTTPS attachments) MUST NOT receive
+// our bearer token (it would leak the admin token to a third
+// party). The branch is preserved with the new shared helper.
+const headers: Record<string, string> = isPlatformAttachment(attachment.uri)
+? platformAuthHeaders()
+: {};
 const res = await fetch(href, {
 headers,
 credentials: "include",
@@ -147,7 +148,7 @@ export function AttachmentTextPreview({ workspaceId, attachment, onDownload, ton
 <button
 type="button"
 onClick={() => onDownload(attachment)}
-className="text-ink-soft hover:text-ink"
+className="text-ink-mid hover:text-ink"
 title={`Download ${attachment.name}`}
 aria-label={`Download ${attachment.name}`}
 >
@@ -182,9 +183,5 @@ export function AttachmentTextPreview({ workspaceId, attachment, onDownload, ton
 );
 }

-function getTenantSlug(): string | null {
-if (typeof window === "undefined") return null;
-const host = window.location.hostname;
-const m = host.match(/^([^.]+)\.moleculesai\.app$/);
-return m ? m[1] : null;
-}
+// Local getTenantSlug() removed — auth-header construction now goes
+// through platformAuthHeaders() from @/lib/api (#178).
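The rule in that preserved branch — platform URIs get auth headers, external URLs get none — can be expressed as a single function. A sketch with the collaborators injected; the `molecule://` scheme used in the test is an assumption for illustration only:

```typescript
// Attach platform auth headers only to in-platform URIs; external
// HTTP(S) URLs must never receive the bearer token (it would leak
// the admin token to a third party).
function headersFor(
  uri: string,
  isPlatform: (u: string) => boolean,
  platformHeaders: () => Record<string, string>,
): Record<string, string> {
  return isPlatform(uri) ? platformHeaders() : {};
}
```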
@@ -25,6 +25,7 @@
 // fetch via service worker. v2 if measured-needed.

 import { useState, useEffect, useRef } from "react";
+import { platformAuthHeaders } from "@/lib/api";
 import type { ChatAttachment } from "./types";
 import { isPlatformAttachment, resolveAttachmentHref } from "./uploads";
 import { AttachmentChip } from "./AttachmentViews";
@@ -61,13 +62,8 @@ export function AttachmentVideo({ workspaceId, attachment, onDownload, tone }: P
 void (async () => {
 try {
 const href = resolveAttachmentHref(workspaceId, attachment.uri);
-const headers: Record<string, string> = {};
-const adminToken = process.env.NEXT_PUBLIC_ADMIN_TOKEN;
-if (adminToken) headers["Authorization"] = `Bearer ${adminToken}`;
-const slug = getTenantSlug();
-if (slug) headers["X-Molecule-Org-Slug"] = slug;
 const res = await fetch(href, {
-headers,
+headers: platformAuthHeaders(),
 credentials: "include",
 // Videos are larger than images on average; give the request
 // more headroom. The server's per-request body cap (50MB) is
@@ -147,11 +143,5 @@ export function AttachmentVideo({ workspaceId, attachment, onDownload, tone }: P
 );
 }

-// Internal helper — same shape as AttachmentImage's. Lifted to a
-// shared util in PR-2.5 if a third caller needs it (PDF, audio).
-function getTenantSlug(): string | null {
-if (typeof window === "undefined") return null;
-const host = window.location.hostname;
-const m = host.match(/^([^.]+)\.moleculesai\.app$/);
-return m ? m[1] : null;
-}
+// Local getTenantSlug() removed — auth-header construction now goes
+// through platformAuthHeaders() from @/lib/api (#178).
@@ -29,11 +29,11 @@ export function PendingAttachmentPill({
 <div className="flex items-center gap-1.5 rounded-md border border-line/60 bg-surface-card/80 px-2 py-1 text-[10px] text-ink-mid max-w-[200px]">
 <FileGlyph className="text-ink-mid shrink-0" />
 <span className="truncate" title={file.name}>{file.name}</span>
-<span className="text-ink-soft shrink-0 tabular-nums">{formatSize(file.size)}</span>
+<span className="text-ink-mid shrink-0 tabular-nums">{formatSize(file.size)}</span>
 <button
 onClick={onRemove}
 aria-label={`Remove ${file.name}`}
-className="ml-0.5 text-ink-soft hover:text-ink transition-colors shrink-0"
+className="ml-0.5 text-ink-mid hover:text-ink transition-colors shrink-0"
 >
 <svg width="10" height="10" viewBox="0 0 16 16" fill="none" aria-hidden="true">
 <path d="M4 4l8 8M12 4l-8 8" stroke="currentColor" strokeWidth="1.6" strokeLinecap="round" />
@@ -4,9 +4,11 @@ import { render, screen, fireEvent, waitFor } from "@testing-library/react";

 // API mock — tests can override per case via apiGetMock.mockImplementationOnce.
 const apiGetMock = vi.fn<(url: string) => Promise<unknown>>();
+const apiPostMock = vi.fn<(url: string, body?: unknown) => Promise<unknown>>();
 vi.mock("@/lib/api", () => ({
 api: {
 get: (url: string) => apiGetMock(url),
+post: (url: string, body?: unknown) => apiPostMock(url, body),
 },
 }));

@@ -16,17 +18,23 @@ vi.mock("@/hooks/useSocketEvent", () => ({
 useSocketEvent: () => {},
 }));

-// Canvas store — peer name resolution.
-vi.mock("@/store/canvas", () => ({
-useCanvasStore: {
-getState: () => ({
-nodes: [
-{ id: "ws-self", data: { name: "Self" } },
-{ id: "ws-peer", data: { name: "Peer Agent" } },
-],
-}),
-},
-}));
+// Canvas store — peer name resolution + ErrorMessage requires selectNode
+// (Zustand hook usage). The mock must support BOTH:
+//   useCanvasStore.getState().nodes      (plain object with getState)
+//   useCanvasStore((s) => s.selectNode)  (Zustand hook with selector)
+vi.mock("@/store/canvas", () => {
+const state = {
+nodes: [
+{ id: "ws-self", data: { name: "Self" } },
+{ id: "ws-peer", data: { name: "Peer Agent" } },
+],
+selectNode: vi.fn(),
+};
+const hook = (selector?: (s: typeof state) => unknown) =>
+selector ? selector(state) : state;
+hook.getState = () => state;
+return { useCanvasStore: hook };
+});
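The dual-mode mock above follows a reusable pattern: a callable that applies a selector (hook usage) and also exposes `getState` (imperative usage). A generic sketch — `makeStoreMock` is a hypothetical helper, not part of the test file:

```typescript
// A callable store mock that satisfies both Zustand access styles:
//   mock((s) => s.field)   — hook-with-selector call
//   mock.getState().field  — imperative read
function makeStoreMock<S extends object>(state: S) {
  function hook(): S;
  function hook<T>(selector: (s: S) => T): T;
  function hook<T>(selector?: (s: S) => T): S | T {
    return selector ? selector(state) : state;
  }
  return Object.assign(hook, { getState: () => state });
}
```

The overloads give callers typed results for both styles, which a single `selector?` signature returning `unknown` would not.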
 // Toaster shim — AgentCommsPanel imports showToast.
 vi.mock("../../Toaster", () => ({
@@ -41,6 +49,8 @@ import { AgentCommsPanel } from "../AgentCommsPanel";
 const scrollSpy = vi.fn<(opts?: ScrollIntoViewOptions | boolean) => void>();
 beforeEach(() => {
 apiGetMock.mockReset();
+apiPostMock.mockReset();
+apiPostMock.mockResolvedValue({});
 scrollSpy.mockReset();
 Element.prototype.scrollIntoView = scrollSpy as unknown as Element["scrollIntoView"];
 });
@@ -49,6 +59,81 @@ afterEach(() => {
 vi.clearAllMocks();
 });

+// Regression test: when a delegation succeeds but the platform persisted
+// status="error" (transport-layer HTTP failure, not agent failure), the
+// canvas had the response text in msg.text but rendered ErrorMessage
+// anyway, burying the real content in an "Underlying error" banner and
+// prompting PMs to restart working agents (issue #159).
+describe("AgentCommsPanel — error rendering guard (issue #159)", () => {
+it("renders NormalMessage when status=error but msg.text is present (successful delegation)", async () => {
+// Simulate a delegation result where status="error" (HTTP transport
+// failed) but response_body.text carries the actual agent response.
+// The correct behaviour: show the content as a normal inbound bubble,
+// NOT an error banner.
+apiGetMock.mockResolvedValueOnce([
+{
+id: "act-1",
+activity_type: "delegation",
+method: "delegate_result",
+source_id: "ws-self",
+target_id: "ws-peer",
+summary: "Delegation completed",
+request_body: null,
+// delegation.go stores response_body as {text: "...", delegation_id: "..."}
+response_body: {
+text: "PR #149: tier-check fails NO REVIEWS (author needs engineers/managers/ceo approval)",
+delegation_id: "delg_01jx8q4n3k",
+},
+status: "error", // transport-layer error, not agent failure
+created_at: "2026-04-25T18:00:00Z",
+},
+]);
+render(<AgentCommsPanel workspaceId="ws-self" />);
+
+// The response text should appear in a normal inbound bubble, NOT in
+// an error banner. Specifically: no "Failed to deliver" or "returned
+// an error" text should appear.
+await waitFor(() => {
+expect(screen.queryByText(/failed to deliver/i)).toBeNull();
+expect(screen.queryByText(/returned an error/i)).toBeNull();
+});
+// The actual content must be visible.
+await waitFor(() =>
+expect(
+screen.getByText(/tier-check fails NO REVIEWS/i),
+).toBeDefined(),
+);
+});
+
+it("renders ErrorMessage when status=error and msg.text is absent (true failure)", async () => {
+// True delivery failure: no response body, no text. The error banner
+// IS appropriate here.
+apiGetMock.mockResolvedValueOnce([
+{
+id: "act-1",
+activity_type: "a2a_send",
+source_id: "ws-self",
+target_id: "ws-peer",
+method: "message/send",
+summary: "A2A send failed",
+request_body: null,
+response_body: null,
+status: "error",
+created_at: "2026-04-25T18:00:00Z",
+},
+]);
+render(<AgentCommsPanel workspaceId="ws-self" />);
+
+// Error banner IS shown for true failures (no content).
+// jsdom doesn't reliably match role="alert" in getByRole, so use
+// getByText instead.
+const errorBanner = await waitFor(() =>
+screen.getByText(/failed to deliver/i),
+);
+expect(errorBanner).toBeDefined();
+});
+});
+
 describe("AgentCommsPanel — initial-state parity with ChatTab my-chat", () => {
 it("shows loading text while history fetch is in flight", () => {
 apiGetMock.mockReturnValueOnce(new Promise(() => { /* never resolves */ }));
@@ -64,6 +64,54 @@ describe("extractRequestText", () => {
 };
 expect(extractRequestText(body)).toBe("");
 });
+
+// Regression: delegation.go stores request_body as {"task": "...", "delegation_id": "..."}.
+// extractRequestText was checking only the A2A params.message.parts path, so
+// outbound delegation messages were rendered as blank bubbles.
+// Fix: check body.task first (delegation format), then fall back to A2A.
+it("extracts text from body.task (delegation format)", () => {
+const body = {
+task: "Deploy the staging environment for this sprint's release",
+delegation_id: "delg_01jx8q4n3k",
+};
+expect(extractRequestText(body)).toBe(
+"Deploy the staging environment for this sprint's release"
+);
+});
+
+it("prefers body.task over A2A params when both present", () => {
+const body = {
+task: "Delegation text wins",
+params: {
+message: {
+parts: [{ kind: "text", text: "A2A text" }],
+},
+},
+};
+// body.task is checked first; delegation wins for delegation activities.
+expect(extractRequestText(body)).toBe("Delegation text wins");
+});
+
+it("falls back to A2A format when body.task is absent", () => {
+const body = {
+params: {
+message: {
+parts: [{ kind: "text", text: "A2A fallback" }],
+},
+},
+};
+expect(extractRequestText(body)).toBe("A2A fallback");
+});
+
+it("returns empty string when body.task is empty string", () => {
+const body = { task: "" };
+expect(extractRequestText(body)).toBe("");
+});
+
+it("returns empty string when body.task is not a string", () => {
+const body = { task: 42 };
+expect(extractRequestText(body)).toBe("");
+});
 });

 describe("extractResponseText", () => {
@@ -161,6 +209,43 @@ describe("extractResponseText", () => {
 };
 expect(extractResponseText(body)).toBe("Summary\nDetail block one\nDetail block two");
 });
+
+// Regression: delegation.go stores response_body as
+// {"text": "...", "delegation_id": "..."} — no "result" wrapper.
+// Without body.text handling, extractResponseText returns "" for
+// delegate_result rows, causing the error UI to fire even when the
+// delegation succeeded (issue #159).
+it("extracts from body.text (delegation response_body shape)", () => {
+const body = {
+text: "PR #149: tier-check fails NO REVIEWS (author needs engineers/managers/ceo approval)",
+delegation_id: "delg_01jx8q4n3k",
+};
+expect(extractResponseText(body)).toBe(
+"PR #149: tier-check fails NO REVIEWS (author needs engineers/managers/ceo approval)"
+);
+});
+
+it("prefers body.result over body.text when both present", () => {
+const body = {
+result: { parts: [{ kind: "text", text: "A2A result wins" }] },
+text: "Delegation text",
+};
+// result path is checked first; A2A wins when both present.
+expect(extractResponseText(body)).toBe("A2A result wins");
+});
+
+it("returns empty string when body.text is empty string", () => {
+expect(extractResponseText({ text: "" })).toBe("");
+});
+
+it("extracts from body.response_preview (DELEGATION_COMPLETE WS event shape)", () => {
+const body = {
+response_preview: "PR #149: tier-check fails NO REVIEWS (author needs engineers/managers/ceo approval)",
+};
+expect(extractResponseText(body)).toBe(
+"PR #149: tier-check fails NO REVIEWS (author needs engineers/managers/ceo approval)"
+);
+});
 });

 describe("extractTextsFromParts", () => {
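The precedence these tests pin down can be condensed into one function. A sketch under the shapes shown above — `extractTaskFirst` is an illustrative re-implementation, not the real export, and its `"\n"` join of parts is a sketch choice:

```typescript
// Delegation-first text extraction: body.task (plain string) wins,
// then the A2A JSON-RPC shape params.message.parts[].text.
function extractTaskFirst(body: Record<string, unknown> | null): string {
  if (!body) return "";
  // Delegation shape: {"task": "...", "delegation_id": "..."}
  if (typeof body.task === "string" && body.task) return body.task;
  // A2A fallback: collect text parts.
  const params = body.params as
    | { message?: { parts?: Array<{ kind?: string; text?: string }> } }
    | undefined;
  const parts = params?.message?.parts ?? [];
  return parts
    .filter((p) => p.kind === "text" && typeof p.text === "string")
    .map((p) => p.text as string)
    .join("\n");
}
```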
@@ -114,9 +114,15 @@ function basename(uri: string): string {
 return slash >= 0 ? cleaned.slice(slash + 1) : cleaned || "file";
 }

-/** Extract user message text from an activity log request_body */
+/** Extract user message text from an activity log request_body.
+ *
+ * Delegation activities from delegation.go store the task text directly
+ * at `body.task` as a plain string: {"task": "...", "delegation_id": "..."}.
+ * Check this first before falling back to the A2A JSON-RPC format
+ * (`body.params.message.parts[].text`). */
 export function extractRequestText(body: Record<string, unknown> | null): string {
 if (!body) return "";
+if (typeof body.task === "string" && body.task) return body.task;
 const params = body.params as Record<string, unknown> | undefined;
 const msg = params?.message as Record<string, unknown> | undefined;
 const parts = msg?.parts as Array<Record<string, unknown>> | undefined;
@@ -162,10 +168,10 @@ export function extractResponseText(body: Record<string, unknown>): string {
 if (rootTexts.length > 0) collected.push(rootTexts.join("\n"));

 // Task shape: {result: {artifacts: [{parts: [...]}]}}
-const artifacts = result.artifacts as Array<Record<string, unknown>> | undefined;
+const artifacts = result.artifacts as Array<Record<string, unknown> | undefined>;
 if (artifacts) {
 for (const a of artifacts) {
-const t = extractTextsFromParts(a.parts);
+const t = extractTextsFromParts(a?.parts);
 if (t) collected.push(t);
 }
 }
@@ -173,6 +179,20 @@ export function extractResponseText(body: Record<string, unknown>): string {
 if (collected.length > 0) return collected.join("\n");
 }

+// Delegation results from delegation.go store response_body as
+// {"text": "...", "delegation_id": "..."} — no "result" wrapper.
+// Check this after the body.result path so A2A responses take
+// precedence when both shapes are somehow present.
+// Without this, responseText is always "" for delegate_result rows,
+// causing the error UI to fire even when the delegation succeeded
+// (issue #159).
+if (typeof body.text === "string" && body.text) return body.text;
+// DELEGATION_COMPLETE event (via canvas-events WS handler) stores
+// response_body as {response_preview: "..."}. Handle this too.
+if (typeof body.response_preview === "string" && body.response_preview) {
+return body.response_preview;
+}
+
 // {task: "text"} — request body format, shouldn't be in response but handle it
 if (typeof body.task === "string") return body.task;
 } catch { /* ignore */ }
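Read in isolation, the fallback chain added above is: `body.result` (A2A, handled earlier in the function), then `body.text`, then `body.response_preview`, then `body.task`. A sketch of just the post-`result` portion — `pickResponseText` is an illustrative name, not the real export:

```typescript
// Fallback order once the A2A body.result path (handled upstream in
// the real function) has produced no text. Field names come from the
// diff; the function itself is an illustration.
function pickResponseText(body: Record<string, unknown>): string {
  // delegation.go delegate_result shape: {"text": "...", "delegation_id": "..."}
  if (typeof body.text === "string" && body.text) return body.text;
  // DELEGATION_COMPLETE WS event shape: {"response_preview": "..."}
  if (typeof body.response_preview === "string" && body.response_preview) {
    return body.response_preview;
  }
  // Defensive: request-shaped body that leaked into a response slot.
  if (typeof body.task === "string") return body.task;
  return "";
}
```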
@@ -1,12 +1,16 @@
-import { PLATFORM_URL } from "@/lib/api";
-import { getTenantSlug } from "@/lib/tenant";
+import { PLATFORM_URL, platformAuthHeaders } from "@/lib/api";
 import type { ChatAttachment } from "./types";

 /** Chat attachments are intentionally uploaded via a direct fetch()
  * instead of the `api.post` helper — `api.post` JSON-stringifies the
- * body, which would 500 on a Blob. Mirrors the header plumbing
- * (tenant slug, admin token, credentials) so SaaS + self-hosted
- * callers work the same way. */
+ * body, which would 500 on a Blob. Auth headers (tenant slug, admin
+ * token, credentials) come from `platformAuthHeaders()` — the same
+ * helper `request()` uses, so a missing bearer surfaces as a single
+ * fix site instead of N copies. We deliberately do NOT set
+ * Content-Type so the browser writes the multipart boundary into the
+ * header; setting it manually would yield a multipart body the server
+ * can't parse. See lib/api.ts platformAuthHeaders() for the full
+ * rationale on why this pair must stay matched. */
 export async function uploadChatFiles(
 workspaceId: string,
 files: File[],
@@ -16,18 +20,12 @@ export async function uploadChatFiles(
 const form = new FormData();
 for (const f of files) form.append("files", f, f.name);

-const headers: Record<string, string> = {};
-const slug = getTenantSlug();
-if (slug) headers["X-Molecule-Org-Slug"] = slug;
-const adminToken = process.env.NEXT_PUBLIC_ADMIN_TOKEN;
-if (adminToken) headers["Authorization"] = `Bearer ${adminToken}`;
-
 // Uploads legitimately take a while on cold cache (tar write +
 // docker cp into the container). 60s is comfortable for the 25MB/
 // 50MB caps the server enforces.
 const res = await fetch(`${PLATFORM_URL}/workspaces/${workspaceId}/chat/uploads`, {
 method: "POST",
-headers,
+headers: platformAuthHeaders(),
 body: form,
 credentials: "include",
 signal: AbortSignal.timeout(60_000),
@@ -143,14 +141,8 @@ export async function downloadChatFile(
 return;
 }

-const headers: Record<string, string> = {};
-const slug = getTenantSlug();
-if (slug) headers["X-Molecule-Org-Slug"] = slug;
-const adminToken = process.env.NEXT_PUBLIC_ADMIN_TOKEN;
-if (adminToken) headers["Authorization"] = `Bearer ${adminToken}`;
-
 const res = await fetch(href, {
-headers,
+headers: platformAuthHeaders(),
 credentials: "include",
 signal: AbortSignal.timeout(60_000),
 });
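The Content-Type constraint described in the comment above is easy to violate during refactors, so it can help to centralize the request-init construction. A hedged sketch — `buildUploadInit` and `UploadInit` are hypothetical, not part of uploads.ts:

```typescript
interface UploadInit {
  method: "POST";
  headers: Record<string, string>; // must never contain Content-Type
  body: FormData;
  credentials: "include";
}

// Build the fetch init for a multipart upload. Strips any Content-Type
// from the incoming auth headers: fetch() must derive
// "multipart/form-data; boundary=..." from the FormData body itself,
// or the server cannot parse the parts.
function buildUploadInit(
  form: FormData,
  authHeaders: Record<string, string>,
): UploadInit {
  const headers = { ...authHeaders };
  delete headers["Content-Type"];
  return { method: "POST", headers, body: form, credentials: "include" };
}
```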
@@ -50,7 +50,7 @@ export function TextInput({ label, value, onChange, placeholder, mono }: { label
 const id = `textinput-${label.toLowerCase().replace(/\s+/g, "-")}`;
 return (
 <div>
-<label htmlFor={id} className="text-[10px] text-ink-soft block mb-1">{label}</label>
+<label htmlFor={id} className="text-[10px] text-ink-mid block mb-1">{label}</label>
 <input
 id={id}
 type="text"
@@ -68,7 +68,7 @@ export function NumberInput({ label, value, onChange, min, max }: { label: strin
 const id = `numberinput-${label.toLowerCase().replace(/\s+/g, "-")}`;
 return (
 <div>
-<label htmlFor={id} className="text-[10px] text-ink-soft block mb-1">{label}</label>
+<label htmlFor={id} className="text-[10px] text-ink-mid block mb-1">{label}</label>
 <input
 id={id}
 type="number"
@@ -97,12 +97,12 @@ export function TagList({ label, values, onChange, placeholder }: { label: strin
 const [input, setInput] = useState("");
 return (
 <div>
-<label htmlFor={id} className="text-[10px] text-ink-soft block mb-1">{label}</label>
+<label htmlFor={id} className="text-[10px] text-ink-mid block mb-1">{label}</label>
 <div className="flex flex-wrap gap-1 mb-1">
 {values.map((v, i) => (
 <span key={i} className="inline-flex items-center gap-1 px-1.5 py-0.5 bg-surface-card border border-line rounded text-[10px] text-ink-mid font-mono">
 {v}
-<button type="button" aria-label={`Remove tag ${v}`} onClick={() => onChange(values.filter((_, j) => j !== i))} className="text-ink-soft hover:text-bad">×</button>
+<button type="button" aria-label={`Remove tag ${v}`} onClick={() => onChange(values.filter((_, j) => j !== i))} className="text-ink-mid hover:text-bad">×</button>
 </span>
 ))}
 </div>
@@ -101,9 +101,9 @@ function SecretRow({ label, secretKey, isSet, scope, globalMode, onSave, onDelet
       <div className="min-w-0">
         <div className="text-[10px] text-ink-mid">{label}</div>
         <div className="flex items-center gap-2 mt-0.5">
-          <span className="text-[9px] font-mono text-ink-soft">{secretKey}</span>
+          <span className="text-[9px] font-mono text-ink-mid">{secretKey}</span>
           {isSet && (
-            <span className="text-[9px] font-mono text-ink-soft tracking-widest" title="Value is set (encrypted)">
+            <span className="text-[9px] font-mono text-ink-mid tracking-widest" title="Value is set (encrypted)">
               •••••
             </span>
           )}
@@ -159,7 +159,7 @@ function CustomSecretRow({ secretKey, scope, globalMode, onSave, onDelete }: {
         <span className={`text-[10px] font-mono ${globalMode ? "text-warm" : scope === "global" ? "text-ink-mid" : "text-accent"}`}>
           {secretKey}
         </span>
-        <span className="text-[9px] font-mono text-ink-soft tracking-widest ml-2">•••••</span>
+        <span className="text-[9px] font-mono text-ink-mid tracking-widest ml-2">•••••</span>
       </div>
       <div className="flex items-center gap-2 shrink-0">
         <span className="text-[10px] text-good">Set</span>
@@ -288,7 +288,7 @@ export function SecretsSection({ workspaceId, requiredEnv }: { workspaceId: stri
   return (
     <Section title="Secrets & API Keys" defaultOpen={false}>
       {loading ? (
-        <div className="text-[10px] text-ink-soft">Loading secrets...</div>
+        <div className="text-[10px] text-ink-mid">Loading secrets...</div>
       ) : (
         <div className="space-y-2">
           {error && <div className="px-2 py-1 bg-red-900/30 border border-red-800 rounded text-[10px] text-bad">{error}</div>}
@@ -369,7 +369,7 @@ export function SecretsSection({ workspaceId, requiredEnv }: { workspaceId: stri
         </button>
       )}

-      <div className="text-[9px] text-ink-soft pt-1">
+      <div className="text-[9px] text-ink-mid pt-1">
         Values are encrypted and never exposed to the browser.
         {globalMode
           ? " Global keys are shared across all workspaces. Restart workspaces to apply changes."
130  canvas/src/lib/__tests__/admin-token-pair.test.ts  Normal file
@@ -0,0 +1,130 @@
// @vitest-environment node
import { describe, it, expect, beforeEach, afterEach, vi } from "vitest";

// Tests for the boot-time matched-pair guard added to next.config.ts.
//
// Why this lives in src/lib/__tests__ even though the function is in
// canvas/next.config.ts:
// - next.config.ts runs as ESM-but-also-CJS depending on which
//   consumer loads it (Next.js dev server vs Next.js build); we
//   want the test to be a plain ESM module Vitest already handles.
// - Importing from "../../../next.config" pulls in the rest of the
//   file (loadMonorepoEnv, the default export, etc.) which has
//   side effects on module load (it runs loadMonorepoEnv()
//   immediately). To keep the test hermetic we don't import — we
//   duplicate the function under test.
//
// Sourcing the function from a shared module would be cleaner, but
// next.config.ts is required to be a single self-contained file by
// Next.js's loader on some host configurations. Pin invariant: the
// duplicated function below MUST stay byte-identical to the one in
// next.config.ts. If you change one, change the other and bump this
// comment.

function checkAdminTokenPair(): void {
  const serverSet = !!process.env.ADMIN_TOKEN;
  const clientSet = !!process.env.NEXT_PUBLIC_ADMIN_TOKEN;
  if (serverSet === clientSet) return;
  if (serverSet && !clientSet) {
    // eslint-disable-next-line no-console
    console.error(
      "[next.config] ADMIN_TOKEN is set but NEXT_PUBLIC_ADMIN_TOKEN is not — " +
        "canvas will 401 against workspace-server because the bearer header " +
        "is never attached. Set both to the same value, or unset both.",
    );
  } else {
    // eslint-disable-next-line no-console
    console.error(
      "[next.config] NEXT_PUBLIC_ADMIN_TOKEN is set but ADMIN_TOKEN is not — " +
        "workspace-server will reject the bearer because no AdminAuth gate " +
        "is configured. Set both to the same value, or unset both.",
    );
  }
}

describe("checkAdminTokenPair", () => {
  // Snapshot env so individual tests can stomp on it without leaking.
  // Rebuild from snapshot in afterEach so the next test sees a known
  // baseline regardless of mutation pattern.
  let originalEnv: Record<string, string | undefined>;
  let errorSpy: ReturnType<typeof vi.spyOn>;

  beforeEach(() => {
    originalEnv = {
      ADMIN_TOKEN: process.env.ADMIN_TOKEN,
      NEXT_PUBLIC_ADMIN_TOKEN: process.env.NEXT_PUBLIC_ADMIN_TOKEN,
    };
    delete process.env.ADMIN_TOKEN;
    delete process.env.NEXT_PUBLIC_ADMIN_TOKEN;
    errorSpy = vi.spyOn(console, "error").mockImplementation(() => {});
  });

  afterEach(() => {
    if (originalEnv.ADMIN_TOKEN === undefined) delete process.env.ADMIN_TOKEN;
    else process.env.ADMIN_TOKEN = originalEnv.ADMIN_TOKEN;
    if (originalEnv.NEXT_PUBLIC_ADMIN_TOKEN === undefined) delete process.env.NEXT_PUBLIC_ADMIN_TOKEN;
    else process.env.NEXT_PUBLIC_ADMIN_TOKEN = originalEnv.NEXT_PUBLIC_ADMIN_TOKEN;
    errorSpy.mockRestore();
  });

  it("emits no warning when both are unset", () => {
    checkAdminTokenPair();
    expect(errorSpy).not.toHaveBeenCalled();
  });

  it("emits no warning when both are set (matched pair, the happy path)", () => {
    process.env.ADMIN_TOKEN = "local-dev-admin";
    process.env.NEXT_PUBLIC_ADMIN_TOKEN = "local-dev-admin";
    checkAdminTokenPair();
    expect(errorSpy).not.toHaveBeenCalled();
  });

  it("warns when ADMIN_TOKEN is set but NEXT_PUBLIC_ADMIN_TOKEN is not", () => {
    process.env.ADMIN_TOKEN = "local-dev-admin";
    checkAdminTokenPair();
    expect(errorSpy).toHaveBeenCalledTimes(1);
    // Exact-string assertion — substring would also pass when the
    // function's branch logic is broken (e.g. emits both messages, or
    // emits the wrong one). Pin the exact message that operators will
    // see in their dev console so regressions are visible.
    expect(errorSpy).toHaveBeenCalledWith(
      "[next.config] ADMIN_TOKEN is set but NEXT_PUBLIC_ADMIN_TOKEN is not — " +
        "canvas will 401 against workspace-server because the bearer header " +
        "is never attached. Set both to the same value, or unset both.",
    );
  });

  it("warns when NEXT_PUBLIC_ADMIN_TOKEN is set but ADMIN_TOKEN is not", () => {
    process.env.NEXT_PUBLIC_ADMIN_TOKEN = "local-dev-admin";
    checkAdminTokenPair();
    expect(errorSpy).toHaveBeenCalledTimes(1);
    expect(errorSpy).toHaveBeenCalledWith(
      "[next.config] NEXT_PUBLIC_ADMIN_TOKEN is set but ADMIN_TOKEN is not — " +
        "workspace-server will reject the bearer because no AdminAuth gate " +
        "is configured. Set both to the same value, or unset both.",
    );
  });

  // Empty string in process.env is the JS-side representation of `KEY=`
  // (no value) in a .env file. Treating "" as unset makes the pair
  // invariant symmetric: `KEY=` and `unset KEY` produce the same
  // verdict. Without this branch, an operator who comments out the
  // value but leaves the line would get a false-positive warning.
  it("treats empty string as unset (so KEY= and unset KEY are equivalent)", () => {
    process.env.ADMIN_TOKEN = "";
    process.env.NEXT_PUBLIC_ADMIN_TOKEN = "";
    checkAdminTokenPair();
    expect(errorSpy).not.toHaveBeenCalled();
  });

  it("warns when ADMIN_TOKEN is set and NEXT_PUBLIC_ADMIN_TOKEN is empty string", () => {
    process.env.ADMIN_TOKEN = "local-dev-admin";
    process.env.NEXT_PUBLIC_ADMIN_TOKEN = "";
    checkAdminTokenPair();
    expect(errorSpy).toHaveBeenCalledTimes(1);
    // First branch — server set, client unset.
    expect(errorSpy).toHaveBeenCalledWith(
      expect.stringContaining("ADMIN_TOKEN is set but NEXT_PUBLIC_ADMIN_TOKEN is not"),
    );
  });
});
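The env matrix the guard enforces can be distilled into a pure decision function. This is an illustrative sketch only: the `pairVerdict` name and `Verdict` type are hypothetical, not part of the diff, but the `!!value` coercion is the same one `checkAdminTokenPair` uses, so `""` and `undefined` both count as unset.

```typescript
// Hypothetical distillation of checkAdminTokenPair's decision table.
// "server-only" / "client-only" correspond to the two console.error branches.
type Verdict = "ok" | "server-only" | "client-only";

function pairVerdict(
  serverToken: string | undefined,
  clientToken: string | undefined,
): Verdict {
  const serverSet = !!serverToken; // "" coerces to false, like the guard
  const clientSet = !!clientToken;
  if (serverSet === clientSet) return "ok"; // both set or both unset
  return serverSet ? "server-only" : "client-only";
}

console.log(pairVerdict(undefined, undefined)); // ok
console.log(pairVerdict("tok", "tok"));         // ok
console.log(pairVerdict("", ""));               // ok — KEY= behaves like unset
console.log(pairVerdict("tok", undefined));     // server-only
console.log(pairVerdict(undefined, "tok"));     // client-only
```

The symmetric "both or neither" shape is why the real guard can early-return on `serverSet === clientSet` before choosing which warning to emit.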
97  canvas/src/lib/__tests__/platform-auth-headers.test.ts  Normal file
@@ -0,0 +1,97 @@
// @vitest-environment jsdom
import { describe, it, expect, beforeEach, afterEach, vi } from "vitest";

// Tests for platformAuthHeaders — the shared helper extracted in #178
// to consolidate the bearer-token-attach + tenant-slug-attach pattern
// that was previously duplicated across 7 raw-fetch callsites in the
// canvas (uploads + 5 Attachment* components + the api.ts request()
// function).
//
// What we pin here:
// - Returns a fresh object each call (so callers can mutate without
//   leaking into each other).
// - Empty result on a non-tenant host with no admin token (the
//   localhost / self-hosted shape).
// - Bearer attached when NEXT_PUBLIC_ADMIN_TOKEN is set.
// - X-Molecule-Org-Slug attached when window.location.hostname is a
//   tenant subdomain (<slug>.moleculesai.app).
// - Both attached when both apply (the production SaaS shape).
//
// Why jsdom: getTenantSlug() reads window.location.hostname. A node-only
// environment yields no window, so getTenantSlug returns null
// unconditionally — that wouldn't exercise the slug branch.

import { platformAuthHeaders } from "../api";

describe("platformAuthHeaders", () => {
  let originalAdminToken: string | undefined;

  beforeEach(() => {
    originalAdminToken = process.env.NEXT_PUBLIC_ADMIN_TOKEN;
    delete process.env.NEXT_PUBLIC_ADMIN_TOKEN;
  });

  afterEach(() => {
    if (originalAdminToken === undefined) delete process.env.NEXT_PUBLIC_ADMIN_TOKEN;
    else process.env.NEXT_PUBLIC_ADMIN_TOKEN = originalAdminToken;
    // jsdom resets hostname between tests via the @vitest-environment
    // pragma's per-test isolation. No explicit reset needed.
  });

  it("returns an empty object on a non-tenant host with no admin token", () => {
    // jsdom default hostname is "localhost" — not a tenant slug, so
    // getTenantSlug() returns null and no X-Molecule-Org-Slug is added.
    const headers = platformAuthHeaders();
    expect(headers).toEqual({});
  });

  it("attaches Authorization when NEXT_PUBLIC_ADMIN_TOKEN is set", () => {
    process.env.NEXT_PUBLIC_ADMIN_TOKEN = "local-dev-admin";
    const headers = platformAuthHeaders();
    expect(headers).toEqual({ Authorization: "Bearer local-dev-admin" });
  });

  it("does NOT attach Authorization when NEXT_PUBLIC_ADMIN_TOKEN is empty string", () => {
    // Empty-string env is the JS-side shape of `KEY=` in .env.
    // Treating it as unset matches the matched-pair guard in
    // next.config.ts (admin-token-pair.test.ts) — symmetric semantics.
    process.env.NEXT_PUBLIC_ADMIN_TOKEN = "";
    const headers = platformAuthHeaders();
    expect(headers).toEqual({});
  });

  it("attaches X-Molecule-Org-Slug on a tenant subdomain", () => {
    Object.defineProperty(window, "location", {
      value: { hostname: "reno-stars.moleculesai.app" },
      writable: true,
    });
    const headers = platformAuthHeaders();
    expect(headers).toEqual({ "X-Molecule-Org-Slug": "reno-stars" });
  });

  it("attaches both when both apply (production SaaS shape)", () => {
    Object.defineProperty(window, "location", {
      value: { hostname: "reno-stars.moleculesai.app" },
      writable: true,
    });
    process.env.NEXT_PUBLIC_ADMIN_TOKEN = "tenant-bearer";
    const headers = platformAuthHeaders();
    // Pin exact-equality on the full shape — substring/contains
    // assertions would also pass for an extra-header bug.
    expect(headers).toEqual({
      "X-Molecule-Org-Slug": "reno-stars",
      Authorization: "Bearer tenant-bearer",
    });
  });

  it("returns a fresh object each call (callers can mutate safely)", () => {
    process.env.NEXT_PUBLIC_ADMIN_TOKEN = "tok";
    const a = platformAuthHeaders();
    const b = platformAuthHeaders();
    expect(a).not.toBe(b); // distinct refs
    expect(a).toEqual(b); // same content
    a["Content-Type"] = "application/json";
    // Mutation on `a` does not leak into `b`.
    expect(b["Content-Type"]).toBeUndefined();
  });
});
@@ -21,6 +21,45 @@ export interface RequestOptions {
   timeoutMs?: number;
 }

+/**
+ * Build the platform auth header set used by every authenticated fetch
+ * from the canvas. Returns a fresh object so callers can mutate (e.g.
+ * append `Content-Type` for JSON requests, omit it for FormData).
+ *
+ * SaaS cross-origin shape:
+ * - `X-Molecule-Org-Slug` — derived from `window.location.hostname`
+ *   by `getTenantSlug()`. Control plane uses it for fly-replay
+ *   routing. Empty on localhost / non-tenant hosts — safe to omit.
+ * - `Authorization: Bearer <token>` — `NEXT_PUBLIC_ADMIN_TOKEN` baked
+ *   into the canvas build (see canvas/Dockerfile L8/L11). Required by
+ *   the workspace-server when `ADMIN_TOKEN` is set on the server side
+ *   (Tier-2b AdminAuth gate, wsauth_middleware.go ~L245). Empty when
+ *   no admin token was provisioned — the Tier-1 session-cookie path
+ *   handles that case via `credentials:"include"`.
+ *
+ * Why a shared helper: the two-line "read env, attach bearer; read
+ * slug, attach header" pattern was duplicated across `request()` and
+ * 7 raw-fetch callsites (chat uploads/download + 5 Attachment*
+ * components) before this consolidation. A new poller or raw fetch
+ * that forgets one of the two headers silently 401s against
+ * workspace-server when ADMIN_TOKEN is set — the exact bug shape
+ * called out in #178 / closes the post-#176 self-review gap.
+ *
+ * Callers that want a JSON Content-Type should spread this and add it
+ * themselves; FormData callers should NOT add Content-Type (the
+ * browser sets the multipart boundary). Centralizing the auth pair
+ * but leaving Content-Type up to the caller is the minimum viable
+ * shared shape.
+ */
+export function platformAuthHeaders(): Record<string, string> {
+  const headers: Record<string, string> = {};
+  const slug = getTenantSlug();
+  if (slug) headers["X-Molecule-Org-Slug"] = slug;
+  const adminToken = process.env.NEXT_PUBLIC_ADMIN_TOKEN;
+  if (adminToken) headers["Authorization"] = `Bearer ${adminToken}`;
+  return headers;
+}
+
 async function request<T>(
   method: string,
   path: string,
@@ -28,17 +67,16 @@ async function request<T>(
   retryCount = 0,
   options?: RequestOptions,
 ): Promise<T> {
-  // SaaS cross-origin shape:
-  // - X-Molecule-Org-Slug: derived from window.location.hostname by
-  //   getTenantSlug(). Control plane uses it for fly-replay routing.
-  //   Empty on localhost / non-tenant hosts — safe to omit.
-  // - credentials:"include": sends the session cookie cross-origin.
-  //   Cookie's Domain=.moleculesai.app attribute + cp's CORS allow this.
-  const headers: Record<string, string> = { "Content-Type": "application/json" };
+  // JSON-bodied request — Content-Type is JSON. Auth pair comes from
+  // the shared helper; see its doc comment for the SaaS-shape rationale.
+  const headers: Record<string, string> = {
+    "Content-Type": "application/json",
+    ...platformAuthHeaders(),
+  };
+  // Re-read slug locally for the 401 handler below — `headers` already
+  // has it, but the 401 branch needs the bare value to gate the
+  // session-probe + redirect logic on tenant context.
   const slug = getTenantSlug();
-  if (slug) headers["X-Molecule-Org-Slug"] = slug;
-  const adminToken = process.env.NEXT_PUBLIC_ADMIN_TOKEN;
-  if (adminToken) headers["Authorization"] = `Bearer ${adminToken}`;

   const res = await fetch(`${PLATFORM_URL}${path}`, {
     method,
@@ -835,3 +835,180 @@ describe("handleCanvasEvent – unknown event", () => {
    ).not.toThrow();
  });
});

// ---------------------------------------------------------------------------
// Screen-reader live announcements
// ---------------------------------------------------------------------------

describe("handleCanvasEvent – liveAnnouncement", () => {
  it("announces WORKSPACE_ONLINE with node name", () => {
    const node = makeNode("ws-1", { name: "Alpha" });
    const { get, set, state } = makeStore([node]);

    handleCanvasEvent(
      makeMsg({ event: "WORKSPACE_ONLINE", workspace_id: "ws-1" }),
      get,
      set
    );

    expect(state.liveAnnouncement).toBe("Alpha is now online");
  });

  it("announces WORKSPACE_OFFLINE with node name", () => {
    const node = makeNode("ws-1", { name: "Beta" });
    const { get, set, state } = makeStore([node]);

    handleCanvasEvent(
      makeMsg({ event: "WORKSPACE_OFFLINE", workspace_id: "ws-1" }),
      get,
      set
    );

    expect(state.liveAnnouncement).toBe("Beta is now offline");
  });

  it("announces WORKSPACE_PAUSED with node name", () => {
    const node = makeNode("ws-1", { name: "Gamma" });
    const { get, set, state } = makeStore([node]);

    handleCanvasEvent(
      makeMsg({ event: "WORKSPACE_PAUSED", workspace_id: "ws-1" }),
      get,
      set
    );

    expect(state.liveAnnouncement).toBe("Gamma has been paused");
  });

  it("announces WORKSPACE_DEGRADED with node name", () => {
    const node = makeNode("ws-1", { name: "Delta" });
    const { get, set, state } = makeStore([node]);

    handleCanvasEvent(
      makeMsg({
        event: "WORKSPACE_DEGRADED",
        workspace_id: "ws-1",
        payload: { sample_error: "connection timeout" },
      }),
      get,
      set
    );

    expect(state.liveAnnouncement).toBe("Delta is degraded");
  });

  it("announces WORKSPACE_PROVISIONING for new workspace with payload name", () => {
    const { get, set, state } = makeStore([]);

    handleCanvasEvent(
      makeMsg({
        event: "WORKSPACE_PROVISIONING",
        workspace_id: "ws-new",
        payload: { name: "NewBot" },
      }),
      get,
      set
    );

    expect(state.liveAnnouncement).toBe("NewBot is provisioning");
  });

  it("announces WORKSPACE_PROVISIONING for new workspace with default name", () => {
    const { get, set, state } = makeStore([]);

    handleCanvasEvent(
      makeMsg({
        event: "WORKSPACE_PROVISIONING",
        workspace_id: "ws-new",
        payload: {},
      }),
      get,
      set
    );

    expect(state.liveAnnouncement).toBe("New Workspace is provisioning");
  });

  it("announces WORKSPACE_REMOVED with node name", () => {
    const node = makeNode("ws-1", { name: "Gamma" });
    const { get, set, state } = makeStore([node]);

    handleCanvasEvent(
      makeMsg({ event: "WORKSPACE_REMOVED", workspace_id: "ws-1" }),
      get,
      set
    );

    expect(state.liveAnnouncement).toBe("Gamma was removed");
  });

  it("announces WORKSPACE_PROVISION_FAILED with node name", () => {
    const node = makeNode("ws-1", { name: "Delta" });
    const { get, set, state } = makeStore([node]);

    handleCanvasEvent(
      makeMsg({
        event: "WORKSPACE_PROVISION_FAILED",
        workspace_id: "ws-1",
        payload: { error: "docker pull failed" },
      }),
      get,
      set
    );

    expect(state.liveAnnouncement).toBe("Delta provisioning failed");
  });

  it("does not announce for TASK_UPDATED", () => {
    const node = makeNode("ws-1", { name: "Alpha" });
    const { get, set, state } = makeStore([node]);

    handleCanvasEvent(
      makeMsg({
        event: "TASK_UPDATED",
        workspace_id: "ws-1",
        payload: { current_task: "building release", active_tasks: 1 },
      }),
      get,
      set
    );

    // TASK_UPDATED is noisy (every heartbeat); it should not announce
    expect(state.liveAnnouncement ?? "").toBe("");
  });

  it("does not announce for AGENT_MESSAGE", () => {
    const node = makeNode("ws-1", { name: "Alpha" });
    const { get, set, state } = makeStore([node]);

    handleCanvasEvent(
      makeMsg({
        event: "AGENT_MESSAGE",
        workspace_id: "ws-1",
        payload: { message: "hello from the agent" },
      }),
      get,
      set
    );

    expect(state.liveAnnouncement ?? "").toBe("");
  });

  it("uses payload name for ONLINE when node not found in store", () => {
    const { get, set, state } = makeStore([]);

    handleCanvasEvent(
      makeMsg({
        event: "WORKSPACE_ONLINE",
        workspace_id: "ws-1",
        payload: { name: "FromPayload" },
      }),
      get,
      set
    );

    // ONLINE when node doesn't exist just buffers _pendingOnline;
    // no announcement should be set
    expect(state.liveAnnouncement ?? "").toBe("");
  });
});
@@ -1181,3 +1181,46 @@ describe("batchNest", () => {
    expect(nestPatches).toHaveLength(1);
  });
});

// ---------- moveNode ----------

describe("moveNode", () => {
  beforeEach(() => {
    const mock = global.fetch as ReturnType<typeof vi.fn>;
    mock.mockImplementation(() =>
      Promise.resolve({ ok: true, json: () => Promise.resolve({}) } as Response),
    );
    mock.mockClear();
  });

  it("updates the node's position by the given delta", () => {
    useCanvasStore.getState().hydrate([
      makeWS({ id: "n1", name: "Node 1", x: 100, y: 200 }),
    ]);
    useCanvasStore.getState().selectNode("n1");
    useCanvasStore.getState().moveNode("n1", 10, -50);
    const node = useCanvasStore.getState().nodes.find((n) => n.id === "n1")!;
    expect(node.position).toEqual({ x: 110, y: 150 });
  });

  it("is a no-op when the node does not exist", () => {
    useCanvasStore.getState().hydrate([makeWS({ id: "n1", name: "Node 1", x: 0, y: 0 })]);
    expect(() => useCanvasStore.getState().moveNode("nonexistent", 10, 10)).not.toThrow();
  });

  it("calls savePosition with the new absolute coordinates", async () => {
    useCanvasStore.getState().hydrate([makeWS({ id: "n1", name: "Node 1", x: 100, y: 200 })]);
    useCanvasStore.getState().selectNode("n1");
    const mock = global.fetch as ReturnType<typeof vi.fn>;
    useCanvasStore.getState().moveNode("n1", 10, 20);
    await vi.waitFor(() => {
      expect(mock).toHaveBeenCalledWith(
        expect.stringContaining("/workspaces/n1"),
        expect.objectContaining({
          method: "PATCH",
          body: JSON.stringify({ x: 110, y: 220 }),
        }),
      );
    });
  });
});
@@ -80,6 +80,7 @@ export function handleCanvasEvent(
   switch (msg.event) {
     case "WORKSPACE_ONLINE": {
       const existing = nodes.find((n) => n.id === msg.workspace_id);
+      const nodeName = existing?.data.name ?? (msg.payload.name as string) ?? "Workspace";
       if (!existing) {
         // PROVISIONING event hasn't been applied yet (WS reorder or
         // this tab joined mid-deploy). Buffer so the later PROVISIONING
@@ -105,6 +106,7 @@ export function handleCanvasEvent(
             ? { ...n, data: { ...n.data, status: "online" } }
             : n,
         ),
+        liveAnnouncement: `${nodeName} is now online`,
       });
       // Remove the laser class after its keyframe ends so the edge
       // settles into the app's default solid styling. Fire-and-forget.
@@ -123,28 +125,36 @@ export function handleCanvasEvent(
     }

     case "WORKSPACE_OFFLINE": {
+      const offlineNode = nodes.find((n) => n.id === msg.workspace_id);
+      const offlineName = offlineNode?.data.name ?? "Workspace";
       set({
         nodes: nodes.map((n) =>
           n.id === msg.workspace_id
             ? { ...n, data: { ...n.data, status: "offline" } }
             : n
         ),
+        liveAnnouncement: `${offlineName} is now offline`,
       });
       break;
     }

     case "WORKSPACE_PAUSED": {
+      const pausedNode = nodes.find((n) => n.id === msg.workspace_id);
+      const pausedName = pausedNode?.data.name ?? "Workspace";
       set({
         nodes: nodes.map((n) =>
           n.id === msg.workspace_id
             ? { ...n, data: { ...n.data, status: "paused", currentTask: "" } }
             : n
         ),
+        liveAnnouncement: `${pausedName} has been paused`,
       });
       break;
     }

     case "WORKSPACE_DEGRADED": {
+      const degradedNode = nodes.find((n) => n.id === msg.workspace_id);
+      const degradedName = degradedNode?.data.name ?? "Workspace";
       set({
         nodes: nodes.map((n) =>
           n.id === msg.workspace_id
@@ -160,6 +170,7 @@ export function handleCanvasEvent(
                 }
               : n
         ),
+        liveAnnouncement: `${degradedName} is degraded`,
       });
       break;
     }
@@ -230,6 +241,7 @@ export function handleCanvasEvent(
       // removed per demo feedback. A2A edges (showA2AEdges) still
       // render when enabled — those represent runtime traffic,
       // which nesting doesn't express.
+      const newNodeName = (msg.payload.name as string) ?? "New Workspace";
       set({
         nodes: [
           ...nodes,
@@ -244,7 +256,7 @@ export function handleCanvasEvent(
             ...(parentId ? { parentId } : {}),
             className: "mol-deploy-spawn",
             data: {
-              name: (msg.payload.name as string) ?? "New Workspace",
+              name: newNodeName,
               status: "provisioning",
               tier: (msg.payload.tier as number) ?? 1,
               agentCard: null,
@@ -261,6 +273,7 @@ export function handleCanvasEvent(
             },
           },
         ],
+        liveAnnouncement: `${newNodeName} is provisioning`,
       });

       // Grow the parent to fit the just-landed child. DEBOUNCED
@@ -345,6 +358,7 @@ export function handleCanvasEvent(

     case "WORKSPACE_REMOVED": {
       const removedNode = nodes.find((n) => n.id === msg.workspace_id);
+      const removedName = removedNode?.data.name ?? "Workspace";
       const parentOfRemoved = removedNode?.data.parentId ?? null;
       set({
         nodes: nodes
@@ -363,6 +377,7 @@ export function handleCanvasEvent(
             e.source !== msg.workspace_id && e.target !== msg.workspace_id
         ),
         selectedNodeId: selectedNodeId === msg.workspace_id ? null : selectedNodeId,
+        liveAnnouncement: `${removedName} was removed`,
       });
       break;
     }
@@ -445,6 +460,8 @@ export function handleCanvasEvent(

     case "WORKSPACE_PROVISION_FAILED": {
       const errorMsg = (msg.payload.error as string) ?? "Unknown provisioning error";
+      const failedNode = nodes.find((n) => n.id === msg.workspace_id);
+      const failedName = failedNode?.data.name ?? "Workspace";
       set({
         nodes: nodes.map((n) =>
           n.id === msg.workspace_id
@@ -458,6 +475,7 @@ export function handleCanvasEvent(
               }
             : n
         ),
+        liveAnnouncement: `${failedName} provisioning failed`,
       });
       break;
     }
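The announcement strings added across the cases above follow one template per event type. As a standalone distillation (the `announcementFor` helper and its table are illustrative, not code from the store; the strings themselves are the ones set by `handleCanvasEvent`):

```typescript
// Illustrative table of the per-event liveAnnouncement templates.
const templates: Record<string, (name: string) => string> = {
  WORKSPACE_ONLINE: (n) => `${n} is now online`,
  WORKSPACE_OFFLINE: (n) => `${n} is now offline`,
  WORKSPACE_PAUSED: (n) => `${n} has been paused`,
  WORKSPACE_DEGRADED: (n) => `${n} is degraded`,
  WORKSPACE_PROVISIONING: (n) => `${n} is provisioning`,
  WORKSPACE_REMOVED: (n) => `${n} was removed`,
  WORKSPACE_PROVISION_FAILED: (n) => `${n} provisioning failed`,
};

// Noisy events (TASK_UPDATED fires every heartbeat, AGENT_MESSAGE on
// every chat line) have no template and so produce no announcement.
function announcementFor(event: string, name: string): string {
  return templates[event]?.(name) ?? "";
}

console.log(announcementFor("WORKSPACE_ONLINE", "Alpha")); // Alpha is now online
console.log(announcementFor("TASK_UPDATED", "Alpha"));     // "" (silent)
```

Keeping the set of announced events closed like this is what the "does not announce" tests in the test file pin down.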
@@ -165,6 +165,13 @@ interface CanvasState {
    * this so a drag that pushed a child past the parent edge commits
    * the parent grow on release (commit-on-release pattern). */
   growParentsToFitChildren: () => void;
+  /** Move a selected node by (dx, dy) in canvas space. Used by keyboard
+   * arrow-key shortcuts so keyboard users can reposition nodes without a
+   * mouse. Persists the new position to the backend and skips the
+   * grow-parents pass that onNodesChange runs on every drag tick
+   * (avoids the "edge-chase" flicker that commit-on-release is meant to
+   * prevent). */
+  moveNode: (nodeId: string, dx: number, dy: number) => void;
   /** Re-layout a parent's children to the default 2-column grid. Used
    * by the "Arrange children" context-menu command so users can rescue
    * out-of-bounds children on demand — topology no longer does it
@@ -225,6 +232,11 @@ interface CanvasState {
   /** Whether the A2A topology overlay is visible. Persisted to localStorage. Default: true. */
   showA2AEdges: boolean;
   setShowA2AEdges: (show: boolean) => void;
+  /** Screen-reader announcement text. Set by handleCanvasEvent on significant
+   * status changes; consumed and cleared by the aria-live region in Canvas.tsx
+   * so the same announcement doesn't re-fire on re-render. */
+  liveAnnouncement: string;
+  setLiveAnnouncement: (msg: string) => void;
 }

 export const useCanvasStore = create<CanvasState>((set, get) => ({
@@ -321,6 +333,8 @@ export const useCanvasStore = create<CanvasState>((set, get) => ({
       localStorage.setItem("molecule:show-a2a-edges", String(show));
     }
   },
+  liveAnnouncement: "",
+  setLiveAnnouncement: (msg) => set({ liveAnnouncement: msg }),

   viewport: { x: 0, y: 0, zoom: 1 },

@@ -1025,6 +1039,19 @@ export const useCanvasStore = create<CanvasState>((set, get) => ({
     }
   },

+  moveNode: (nodeId, dx, dy) => {
+    const node = get().nodes.find((n) => n.id === nodeId);
+    if (!node) return;
+    set({
+      nodes: get().nodes.map((n) =>
+        n.id === nodeId
+          ? { ...n, position: { x: n.position.x + dx, y: n.position.y + dy } }
+          : n,
+      ),
+    });
+    void get().savePosition(nodeId, node.position.x + dx, node.position.y + dy);
+  },
+
   savePosition: async (nodeId: string, x: number, y: number) => {
     try {
       await api.patch(`/workspaces/${nodeId}`, { x, y });
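The interface comment says `moveNode` backs keyboard arrow-key shortcuts. A minimal sketch of the key-to-delta mapping such a handler might use before calling `moveNode(selectedId, dx, dy)`; the `arrowDelta` helper and the 10 px step are assumptions for illustration, not taken from the diff:

```typescript
// Hypothetical key-to-delta mapping for a keyboard handler that would
// feed moveNode(selectedId, dx, dy). STEP is an assumed 10 px nudge.
const STEP = 10;

function arrowDelta(key: string): { dx: number; dy: number } | null {
  switch (key) {
    case "ArrowLeft":  return { dx: -STEP, dy: 0 };
    case "ArrowRight": return { dx: STEP, dy: 0 };
    case "ArrowUp":    return { dx: 0, dy: -STEP }; // canvas y grows downward
    case "ArrowDown":  return { dx: 0, dy: STEP };
    default:           return null; // not an arrow key — let the event through
  }
}

console.log(arrowDelta("ArrowUp"));
console.log(arrowDelta("Enter")); // null
```

Returning `null` for non-arrow keys lets the caller skip `preventDefault()` so ordinary typing still reaches inputs on the canvas.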
@@ -7,6 +7,22 @@ export default defineConfig({
  test: {
    environment: 'node',
    exclude: ['e2e/**', 'node_modules/**', '**/dist/**'],
+    // Issue #22 / vitest pool investigation:
+    //
+    // The forks pool spawns one Node.js worker per concurrent slot.
+    // Each jsdom-environment worker bootstraps a full DOM (~30-50 MB resident
+    // set) at cold start. With the default maxWorkers derived from CPU
+    // count, multiple jsdom workers can start simultaneously, exhausting
+    // memory on the 2-CPU Gitea Actions runner and causing pool workers to
+    // fail to respond with "[vitest-pool]: Timeout starting … runner."
+    //
+    // Fix: cap maxWorkers at 1 so only one worker is alive at any time.
+    // Tests still run in parallel within that single worker's process (via
+    // Node's event loop) — this is the same parallelism as the `threads`
+    // pool but without the per-worker jsdom cold-start overhead. 51 test
+    // files that previously took 5070 s with 5 failures now run
+    // sequentially through one worker, eliminating the memory spike.
+    maxWorkers: 1,
    // CI-conditional test timeout (issue #96).
    //
    // Vitest's 5000ms default is too tight for the first test in any
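The truncated comment above introduces a CI-conditional `testTimeout` (issue #96). One common shape for such a setting is a ternary on `process.env.CI`; the concrete values below are assumptions, since the diff only says the 5000 ms default is too tight:

```typescript
// Sketch of a CI-conditional testTimeout for the vitest config above.
// 15 s on CI vs the 5 s local default are assumed values, not taken
// from the actual config (which is cut off in this diff).
const testTimeout = process.env.CI ? 15_000 : 5_000;
```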
@@ -119,7 +119,7 @@ services:

networks:
  default:
-    name: molecule-monorepo-net
+    name: molecule-core-net
    external: true

volumes:

@@ -1,3 +1,7 @@
+# Include infra services (Temporal, Langfuse) so `docker compose up` starts the full stack.
+include:
+  - docker-compose.infra.yml

services:
  # --- Infrastructure ---
  postgres:
@@ -12,7 +16,8 @@ services:
    volumes:
      - pgdata:/var/lib/postgresql/data
    networks:
-      - molecule-monorepo-net
+      - molecule-core-net
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-dev}"]
      interval: 2s
@@ -39,7 +44,7 @@ services:
        psql -h postgres -U "$${POSTGRES_USER}" -d postgres -c "CREATE DATABASE langfuse"
      fi
    networks:
-      - molecule-monorepo-net
+      - molecule-core-net

  redis:
    image: redis:7-alpine
@@ -49,7 +54,8 @@ services:
    volumes:
      - redisdata:/data
    networks:
-      - molecule-monorepo-net
+      - molecule-core-net
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 2s
@@ -66,7 +72,7 @@ services:
    volumes:
      - clickhousedata:/var/lib/clickhouse
    networks:
-      - molecule-monorepo-net
+      - molecule-core-net
    healthcheck:
      test: ["CMD-SHELL", "wget --no-verbose --tries=1 --spider http://127.0.0.1:8123/ping || exit 1"]
      interval: 5s
@@ -95,7 +101,7 @@ services:
    ports:
      - "3001:3000"
    networks:
-      - molecule-monorepo-net
+      - molecule-core-net
    healthcheck:
      test: ["CMD-SHELL", "wget --no-verbose --tries=1 --spider http://localhost:3000/api/public/health || exit 1"]
      interval: 10s
@@ -126,6 +132,10 @@ services:
      REDIS_URL: redis://redis:6379
      PORT: "${PLATFORM_PORT:-8080}"
      PLATFORM_URL: "http://platform:${PLATFORM_PORT:-8080}"
+      # Container network namespace is already isolated; "all interfaces"
+      # inside the container = the bridge interface only. The fail-safe
+      # default (127.0.0.1) would block host-to-container access.
+      BIND_ADDR: "${BIND_ADDR:-0.0.0.0}"
      # Default MOLECULE_ENV=development so the WorkspaceAuth / AdminAuth
      # middleware fail-open path activates when ADMIN_TOKEN is unset —
      # otherwise the canvas (which runs without a bearer in pure local
@@ -195,12 +205,28 @@ services:
      # App private key — read-only bind-mount. The host-side path is
      # gitignored per .gitignore rules (/.secrets/ + *.pem).
      - ./.secrets/github-app.pem:/secrets/github-app.pem:ro
+      # Per-role persona credentials (molecule-core#242 local surface).
+      # Sourced at workspace creation time by org_import.go::loadPersonaEnvFile
+      # when a workspace.yaml carries `role: <name>`. The host-side dir is
+      # populated by the operator-host bootstrap kit (28 dev-tree personas);
+      # /etc/molecule-bootstrap/personas is the in-container path the
+      # platform expects (matches the prod tenant-EC2 path so the same code
+      # works in both modes).
+      #
+      # Read-only mount — workspace-server only reads, never writes here.
+      # If the host dir is empty/missing the platform's loadPersonaEnvFile
+      # silently no-ops per its existing semantics, so this mount is safe
+      # even on a fresh machine that hasn't run the bootstrap kit yet.
+      - ${MOLECULE_PERSONA_ROOT_HOST:-${HOME}/.molecule-ai/personas}:/etc/molecule-bootstrap/personas:ro
    ports:
      - "${PLATFORM_PUBLISH_PORT:-8080}:${PLATFORM_PORT:-8080}"
    networks:
-      - molecule-monorepo-net
+      - molecule-core-net
    restart: unless-stopped
    healthcheck:
-      test: ["CMD-SHELL", "wget --no-verbose --tries=1 --spider http://localhost:${PLATFORM_PORT:-8080}/health || exit 1"]
+      # Plain GET — `--spider` would issue HEAD, which returns 404 because
+      # /health is registered as GET only.
+      test: ["CMD-SHELL", "wget -qO /dev/null --tries=1 http://localhost:${PLATFORM_PORT:-8080}/health || exit 1"]
      interval: 5s
      timeout: 5s
      retries: 10
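The "silently no-ops when empty/missing" contract described in the persona-mount comments can be sketched as follows. The real loader is Go-side (`org_import.go::loadPersonaEnvFile`); this TypeScript analogue, its `<role>.env` filename convention, and the KEY=VALUE format are illustrative assumptions:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Hypothetical analogue of loadPersonaEnvFile: read KEY=VALUE pairs for
// a role's env file, or silently return nothing when the directory or
// file does not exist (the safe no-op the compose comment relies on).
function loadPersonaEnv(rootDir: string, role: string): Record<string, string> {
  const file = path.join(rootDir, `${role}.env`);
  if (!fs.existsSync(file)) return {}; // missing dir/file: silent no-op

  const env: Record<string, string> = {};
  for (const line of fs.readFileSync(file, "utf8").split("\n")) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith("#")) continue; // skip blanks/comments
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue; // ignore malformed lines
    env[trimmed.slice(0, eq)] = trimmed.slice(eq + 1);
  }
  return env;
}
```

Because the missing-directory case returns an empty map rather than throwing, mounting a not-yet-bootstrapped host directory costs nothing, which is exactly why the comment calls the mount safe on a fresh machine.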
@@ -236,9 +262,9 @@ services:
    ports:
      - "${CANVAS_PUBLISH_PORT:-3000}:${CANVAS_PORT:-3000}"
    networks:
-      - molecule-monorepo-net
+      - molecule-core-net
    healthcheck:
-      test: ["CMD-SHELL", "wget --no-verbose --tries=1 --spider http://127.0.0.1:${CANVAS_PORT:-3000} || exit 1"]
+      test: ["CMD-SHELL", "wget -qO /dev/null --tries=1 http://127.0.0.1:${CANVAS_PORT:-3000} || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 10
@@ -269,7 +295,7 @@ services:
      OPENROUTER_API_KEY: ${OPENROUTER_API_KEY:-}
      LITELLM_MASTER_KEY: ${LITELLM_MASTER_KEY:-sk-molecule}
    networks:
-      - molecule-monorepo-net
+      - molecule-core-net
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "wget --no-verbose --tries=1 --spider http://localhost:4000/health || exit 1"]
@@ -294,7 +320,7 @@ services:
    volumes:
      - ollamadata:/root/.ollama
    networks:
-      - molecule-monorepo-net
+      - molecule-core-net
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "ollama list || exit 1"]
@@ -304,8 +330,8 @@ services:
      start_period: 20s

networks:
-  molecule-monorepo-net:
-    name: molecule-monorepo-net
+  molecule-core-net:
+    name: molecule-core-net

volumes:
  pgdata:

@@ -67,7 +67,7 @@ On-demand fits naturally with how agents work — an agent only needs to know ab

This is acceptable for MVP because:
- All workspaces are provisioned by the same platform on trusted infrastructure
-- Docker network isolation (`molecule-monorepo-net`) limits who can reach workspace endpoints
+- Docker network isolation (`molecule-core-net`) limits who can reach workspace endpoints
- The tool is self-hosted — the operator controls the network

**Known gap:** Once workspace A caches workspace B's URL, nothing stops A from calling B directly even after the hierarchy changes and A is no longer supposed to reach B. The cached URL remains valid until the container is restarted or the URL changes.

@@ -124,7 +124,7 @@ Six runtime adapters ship production-ready on `main`: LangGraph, DeepAgents, Cla
| Platform ↔ Redis | TCP | Ephemeral state (liveness TTL), caching, pub/sub |
| Workspace ↔ Workspace | HTTP (A2A JSON-RPC 2.0) | Direct peer-to-peer, **platform not in data path** |
| Workspace → Langfuse | HTTP | Automatic OpenTelemetry tracing |
-| Docker Network | `molecule-monorepo-net` | Internal-only by default, no exposed DB/Redis ports |
+| Docker Network | `molecule-core-net` | Internal-only by default, no exposed DB/Redis ports |

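The table notes that workspace-to-workspace traffic is A2A over JSON-RPC 2.0, with the platform out of the data path. A minimal envelope for such a call looks like the sketch below; the method name and params are illustrative assumptions, not the documented A2A schema:

```typescript
// Minimal JSON-RPC 2.0 request shape for a direct workspace-to-workspace
// call. "tasks/send" and the params payload are assumptions for
// illustration; only the jsonrpc/id/method/params envelope is fixed by
// the JSON-RPC 2.0 spec.
interface JsonRpcRequest<P> {
  jsonrpc: "2.0";
  id: string | number;
  method: string;
  params: P;
}

const req: JsonRpcRequest<{ message: string }> = {
  jsonrpc: "2.0",
  id: 1,
  method: "tasks/send",
  params: { message: "summarize the latest build failure" },
};

// On the wire this is a plain HTTP POST body sent straight to the peer's
// internal URL — the platform never sees it.
const body = JSON.stringify(req);
```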
### Core Components

@@ -465,7 +465,7 @@ Unknown tier values default to T2 for safety. Applied via `provisioner.ApplyTier

### Docker Networking

-- All containers join `molecule-monorepo-net` private network
+- All containers join `molecule-core-net` private network
- Container naming: `ws-{workspace_id[:12]}`
- Ephemeral host port binding: `127.0.0.1:0→8000/tcp`

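The hunk context above states that unknown tier values default to T2 for safety, applied via `provisioner.ApplyTier`. That helper is Go-side; a TypeScript sketch of the defaulting rule (the four-tier list follows the "Tier 1-3, Tier 4 uses host" wording elsewhere in these docs, the rest is assumed):

```typescript
// Sketch of the "unknown tier defaults to T2" rule. The canonical
// implementation is Go-side (provisioner.ApplyTier); this analogue is
// illustrative only.
type Tier = "T1" | "T2" | "T3" | "T4";

function normalizeTier(raw: string): Tier {
  const known: Tier[] = ["T1", "T2", "T3", "T4"];
  // Fail safe: anything unrecognized collapses to T2 rather than erroring.
  return (known as string[]).includes(raw) ? (raw as Tier) : "T2";
}
```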
@@ -19,7 +19,7 @@ The provisioner is the platform component that deploys workspace containers and

## Docker Networking (Tier 1-3, Tier 4 uses host)

-All workspace containers join the `molecule-monorepo-net` Docker network. Containers are named `ws-{id[:12]}` (first 12 chars of workspace UUID). Two exported helpers in `provisioner` package provide the canonical naming:
+All workspace containers join the `molecule-core-net` Docker network. Containers are named `ws-{id[:12]}` (first 12 chars of workspace UUID). Two exported helpers in `provisioner` package provide the canonical naming:

- `provisioner.ContainerName(workspaceID)` → `ws-{id[:12]}`
- `provisioner.InternalURL(workspaceID)` → `http://ws-{id[:12]}:8000`
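The two Go helpers above can be mirrored in TypeScript for illustration — the `ws-` prefix, 12-character truncation, and port 8000 come straight from the naming scheme documented here, while the canonical versions remain `provisioner.ContainerName` / `provisioner.InternalURL`:

```typescript
// TypeScript mirror of the provisioner naming helpers:
// ws-{first 12 chars of workspace UUID}, port 8000 on the Docker network.
function containerName(workspaceID: string): string {
  return `ws-${workspaceID.slice(0, 12)}`;
}

function internalURL(workspaceID: string): string {
  return `http://${containerName(workspaceID)}:8000`;
}
```

Deriving the URL purely from the workspace ID is what makes the fallback synthesis possible when the Redis cache misses: any component that knows the ID can reconstruct the peer's address.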
@@ -38,7 +38,7 @@ This URL is pre-stored in both Postgres and Redis before the agent registers. Wh

**Why not use Docker-internal URLs?** In local dev, the platform runs on the host (not in Docker), so it cannot resolve Docker container hostnames. The ephemeral port mapping lets the A2A proxy reach agents via localhost. In production (platform in Docker), the Docker-internal URL (`http://ws-{id}:8000`) would work directly.

-**Workspace-to-workspace discovery:** When a workspace discovers another workspace (via `X-Workspace-ID` header on `GET /registry/discover/:id`), the platform returns the Docker-internal URL (`http://ws-{first12chars}:8000`) so containers can reach each other directly on `molecule-monorepo-net`. The internal URL is cached in Redis at provision time and also synthesized as a fallback if the cache misses (only for online/degraded workspaces).
+**Workspace-to-workspace discovery:** When a workspace discovers another workspace (via `X-Workspace-ID` header on `GET /registry/discover/:id`), the platform returns the Docker-internal URL (`http://ws-{first12chars}:8000`) so containers can reach each other directly on `molecule-core-net`. The internal URL is cached in Redis at provision time and also synthesized as a fallback if the cache misses (only for online/degraded workspaces).

For external HTTPS access (multi-host mode), Nginx on the host handles TLS termination and proxies to the container.