Merge pull request 'fix(sre): add explicit 15s timeout to gate-check-v3 HTTP calls (closes #603)' (#604) from sre/gate-check-timeout into main
Some checks failed
Block internal-flavored paths / Block forbidden paths (push) Successful in 19s
Lint curl status-code capture / Scan workflows for curl status-capture pollution (push) Successful in 14s
E2E API Smoke Test / detect-changes (push) Successful in 31s
CI / Detect changes (push) Successful in 33s
E2E Staging Canvas (Playwright) / detect-changes (push) Successful in 34s
Secret scan / Scan diff for credential-shaped strings (push) Successful in 12s
Handlers Postgres Integration / detect-changes (push) Successful in 35s
Runtime PR-Built Compatibility / detect-changes (push) Successful in 32s
CI / Platform (Go) (push) Successful in 8s
E2E API Smoke Test / E2E API Smoke Test (push) Successful in 9s
CI / Python Lint & Test (push) Successful in 6s
CI / Canvas (Next.js) (push) Successful in 9s
CI / Shellcheck (E2E scripts) (push) Successful in 6s
Handlers Postgres Integration / Handlers Postgres Integration (push) Successful in 6s
CI / Canvas Deploy Reminder (push) Has been skipped
E2E Staging Canvas (Playwright) / Canvas tabs E2E (push) Successful in 8s
Runtime PR-Built Compatibility / PR-built wheel + import smoke (push) Successful in 7s
CI / all-required (push) Successful in 4s
Sweep stale e2e-* orgs (staging) / Sweep e2e orgs (push) Successful in 3s
Sweep stale Cloudflare Tunnels / Sweep CF tunnels (push) Failing after 4s
status-reaper / reap (push) Successful in 1m26s
Continuous synthetic E2E (staging) / Synthetic E2E against staging (push) Failing after 4m55s

Molecule AI · core-devops 2026-05-11 23:41:31 +00:00
commit 49a4c3a736
2 changed files with 14 additions and 4 deletions

View File

@@ -71,8 +71,12 @@ jobs:
run: |
set -euo pipefail
# Fetch all open PRs and run gate-check on each
+# socket.setdefaulttimeout(15): defence-in-depth for missing SOP_TIER_CHECK_TOKEN.
+# gate_check.py uses timeout=15 on every urlopen call; this catches the
+# inline Python polling loop too (issue #603).
pr_numbers=$(python3 -c "
-import urllib.request, json, os
+import socket, urllib.request, json, os
+socket.setdefaulttimeout(15)
token = os.environ['GITEA_TOKEN']
req = urllib.request.Request(
'https://git.moleculesai.app/api/v1/repos/${{ github.repository }}/pulls?state=open&limit=100',

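As the added comment notes, `socket.setdefaulttimeout()` is process-wide: every socket created afterwards without an explicit timeout inherits it, including the sockets `urllib.request.urlopen()` opens when no `timeout=` argument is passed. A minimal standalone sketch (illustrative only, not part of the patch):

```python
import socket

# Process-wide default: applies to any socket created afterwards
# without an explicit timeout -- including the sockets that
# urllib.request.urlopen() opens when no timeout= is passed.
socket.setdefaulttimeout(15)

# Newly created sockets inherit the default:
s = socket.socket()
print(s.gettimeout())  # 15.0
s.close()
```

This is why the one-liner covers the inline polling loop without touching each `urlopen` call individually.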
View File

@@ -35,6 +35,12 @@ GITEA_HOST = os.environ.get("GITEA_HOST", "git.moleculesai.app")
GITEA_TOKEN = os.environ.get("GITEA_TOKEN", os.environ.get("GITHUB_TOKEN", ""))
API_BASE = f"https://{GITEA_HOST}/api/v1"
+# Timeout in seconds for all HTTP calls. Defence-in-depth: ensures a missing or
+# invalid SOP_TIER_CHECK_TOKEN causes a fast (~15 s) failure rather than an
+# indefinite hang. The real fix is provisioning the token; this caps worst-case
+# wall-clock on a broken/unreachable Gitea host.
+DEFAULT_TIMEOUT = 15
def api_get(path: str) -> dict | list:
url = f"{API_BASE}{path}"
@@ -46,7 +52,7 @@ def api_get(path: str) -> dict | list:
},
)
try:
-with urllib.request.urlopen(req) as r:
+with urllib.request.urlopen(req, timeout=DEFAULT_TIMEOUT) as r:
return json.loads(r.read())
except urllib.error.HTTPError as e:
body = e.read().decode(errors="replace")
@@ -521,12 +527,12 @@ def run(repo: str, pr_number: int, post_comment: bool = False) -> dict:
comment_id = our_comments[-1]["id"]
url = f"{API_BASE}/repos/{owner}/{name}/issues/comments/{comment_id}"
req = urllib.request.Request(url, data=json.dumps({"body": comment_body}).encode(), headers=headers, method="PATCH")
-with urllib.request.urlopen(req) as r:
+with urllib.request.urlopen(req, timeout=DEFAULT_TIMEOUT) as r:
r.read()
else:
url = f"{API_BASE}/repos/{owner}/{name}/issues/{pr_number}/comments"
req = urllib.request.Request(url, data=json.dumps({"body": comment_body}).encode(), headers=headers, method="POST")
-with urllib.request.urlopen(req) as r:
+with urllib.request.urlopen(req, timeout=DEFAULT_TIMEOUT) as r:
r.read()
except urllib.error.HTTPError as e:
if e.code == 403: