Compare commits — main...docs/autho (5 commits)

| Author | SHA1 | Date |
|---|---|---|
| | 31d1100149 | |
| | 549d956345 | |
| | 4b15398fa6 | |
| | 75e069d1ab | |
| | 0d14857c29 | |
.github/workflows/publish-image.yml (vendored, 28 lines changed)
```diff
@@ -25,33 +25,7 @@ permissions:
   packages: write
 
 jobs:
-  # The `.runtime-version` file is the push-mode cascade signal post-
-  # 2026-05-06: when molecule-core/publish-runtime.yml ships a new
-  # version to PyPI, it does NOT call repository_dispatch (Gitea 1.22.6
-  # has no such endpoint — empirically verified molecule-core#20).
-  # Instead it git-pushes an updated `.runtime-version` to each template,
-  # which trips this workflow's `on: push: branches: [main]` trigger.
-  # This job reads that file and forwards the version to the reusable
-  # build workflow.
-  resolve-version:
-    runs-on: ubuntu-latest
-    timeout-minutes: 2
-    outputs:
-      version: ${{ steps.read.outputs.version }}
-    steps:
-      - uses: actions/checkout@v4
-      - id: read
-        run: |
-          if [ -f .runtime-version ]; then
-            v="$(head -n1 .runtime-version | tr -d '[:space:]')"
-            echo "version=$v" >> "$GITHUB_OUTPUT"
-            echo "resolved runtime version: $v"
-          else
-            echo "no .runtime-version file present — falling through to Dockerfile default"
-          fi
-
   publish:
-    needs: resolve-version
     uses: Molecule-AI/molecule-ci/.github/workflows/publish-template-image.yml@main
     secrets: inherit
     with:
@@ -59,4 +33,4 @@ jobs:
       # version PyPI just published. Forwarded as a docker --build-arg
       # so the cache key changes per-version and pip install resolves
       # freshly. Empty on push/PR — falls back to requirements.txt pin.
-      runtime_version: ${{ github.event.client_payload.runtime_version || inputs.runtime_version || needs.resolve-version.outputs.version || '' }}
+      runtime_version: ${{ github.event.client_payload.runtime_version || inputs.runtime_version || '' }}
```
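
The surviving `runtime_version` line relies on the expression syntax's `||`, which evaluates to the first truthy operand. A rough Python analogue, as a sketch only (the function and argument names are illustrative, not the workflow's real variables):

```python
# Sketch: Python's "or" on strings behaves like the workflow expression's
# "a || b || ''" chain, returning the first non-empty value.
# Illustrative names; not code from the repository.

def resolve_runtime_version(client_payload_version: str,
                            input_version: str) -> str:
    """Return the first non-empty version, or '' so the Dockerfile default wins."""
    return client_payload_version or input_version or ""

# A dispatch payload wins over a manual input:
assert resolve_runtime_version("0.1.130", "0.1.129") == "0.1.130"
# No version anywhere: empty string, pip falls back to the requirements.txt pin.
assert resolve_runtime_version("", "") == ""
```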
.github/workflows/secret-scan.yml (vendored, new file, 22 lines)
```diff
@@ -0,0 +1,22 @@
+name: Secret scan
+
+# Calls the canonical reusable workflow in molecule-core. Defense
+# against the #2090-class leak (a hosted-agent commit slipping a
+# credential-shaped string into a PR). Pattern set lives in
+# molecule-core so we do not maintain a parallel copy here.
+#
+# Pinned to @staging because that is the active default branch on the
+# upstream repo (main lags behind via the staging-promotion workflow).
+# Updates ride along automatically as the upstream regex set evolves.
+
+on:
+  pull_request:
+    types: [opened, synchronize, reopened]
+  push:
+    branches: [main, staging, master]
+  merge_group:
+    types: [checks_requested]
+
+jobs:
+  secret-scan:
+    uses: Molecule-AI/molecule-core/.github/workflows/secret-scan.yml@staging
```
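
A minimal sketch of what a scan like this checks for. The real pattern set lives in molecule-core and is certainly broader; the two regexes below are illustrative stand-ins, not the upstream rules:

```python
# Sketch: flag credential-shaped substrings in text. Stand-in patterns only;
# the canonical rules live in molecule-core's reusable workflow.
import re

PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),    # AWS access key id shape
    re.compile(r"ghp_[A-Za-z0-9]{36}"), # GitHub classic PAT shape
]

def find_leaks(text: str) -> list[str]:
    """Return every credential-shaped substring found in `text`."""
    return [m.group(0) for p in PATTERNS for m in p.finditer(text)]

# AWS's official documentation example key, not a live credential:
assert find_leaks("aws_key = 'AKIAIOSFODNN7EXAMPLE'") == ["AKIAIOSFODNN7EXAMPLE"]
assert find_leaks("nothing secret here") == []
```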
```diff
@@ -1 +0,0 @@
-0.1.129
```
CLAUDE.md (new file, 108 lines)
@@ -0,0 +1,108 @@

# CLAUDE.md — molecule-ai-workspace-template-crewai

Workspace template that runs CrewAI inside the Molecule AI workspace runtime.
This is a **single-process container** booted by `molecule-runtime` (PyPI:
`molecule-ai-workspace-runtime`); it speaks A2A to peer workspaces and exposes
canvas chat through the platform.

## What this template does

- Boots a CrewAI `Agent` + `Task` + `Crew` per inbound message.
- Wires platform tools (delegation, memory, sandbox, approval, MCP bridges)
  into the Crew's tool list so the agent can call them like any CrewAI tool.
- Speaks the A2A protocol to other workspaces — replies and peer delegations
  flow over the platform's A2A transport, not direct HTTP.

## Files

- `adapter.py` — the CrewAI adapter (only Python you usually edit).
- `config.yaml` — runtime config: model, env requirements, model picker entries.
- `system-prompt.md` — default backstory; overridden by canvas Config tab post-deploy.
- `Dockerfile` — base image; `ENTRYPOINT ["molecule-runtime"]` boots the runtime
  which discovers `Adapter` via `ADAPTER_MODULE=adapter`.
- `requirements.txt` — `molecule-ai-workspace-runtime` + `crewai>=0.100.0`.

## BaseAdapter integration point

`adapter.py` defines `CrewAIAdapter(BaseAdapter)` from
`molecule_runtime.adapters.base`. Two lifecycle hooks matter:

- `setup(config: AdapterConfig)` — imports `crewai`, calls
  `self._common_setup(config)` (inherited; loads skills, builds LangChain tool
  list, resolves the system prompt), then bridges each LangChain tool to a
  CrewAI `@tool` via `_langchain_to_crewai()`.
- `create_executor(config) -> AgentExecutor` — returns a `CrewAIA2AExecutor`
  that the runtime hands to the A2A server. **The executor is the contract**;
  do not call CrewAI from `setup()`.

The module exports `Adapter = CrewAIAdapter` so `molecule-runtime` finds it.

## CrewAI specifics

`CrewAIA2AExecutor.execute(context, event_queue)` does the per-message work:

1. `extract_message_text(context)` — pull the user/peer message off A2A.
2. `set_current_task(heartbeat, brief_task(...))` — heartbeat surfaces the
   current task to the platform UI; cleared in `finally`.
3. Build a fresh `Agent(role, goal, backstory, llm, tools)`,
   `Task(description, expected_output, agent)`, and `Crew([agent], [task])`.
4. `await asyncio.to_thread(crew.kickoff)` — CrewAI is sync; off-load to a
   thread so the event loop stays responsive for heartbeats + A2A.
5. `event_queue.enqueue_event(new_text_message(reply))` — A2A reply.

`role` is the first 100 chars of the resolved system prompt; `goal` is fixed.
History from the A2A context is folded into `task_desc` via `build_task_text`.
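
The folding step can be sketched as follows. The real `build_task_text` lives in `molecule_runtime.adapters.shared_runtime`; its exact signature is an assumption here, and this stand-in only illustrates the shape (prior turns prepended to the new message):

```python
# Sketch of history folding. Hypothetical signature; the real helper is
# molecule_runtime.adapters.shared_runtime.build_task_text.

def build_task_text_sketch(history: list[tuple[str, str]], message: str) -> str:
    """Fold (speaker, text) turns plus the new message into one task description."""
    lines = [f"{speaker}: {text}" for speaker, text in history]
    lines.append(f"user: {message}")
    return "\n".join(lines)

desc = build_task_text_sketch([("user", "hi"), ("agent", "hello")], "status?")
assert desc == "user: hi\nagent: hello\nuser: status?"
```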

Model strings follow Molecule's `provider:model` form (e.g.
`openai:gpt-4.1-mini`). `openai:` is rewritten to `openai/` for CrewAI's
LiteLLM model router. Other providers pass through unchanged.
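
The rewrite described above fits in a few lines; only the function name is hypothetical:

```python
# Sketch of the provider-prefix rewrite: "openai:" maps to "openai/" for
# LiteLLM, everything else passes through unchanged. Hypothetical name.

def to_litellm_model(model: str) -> str:
    if model.startswith("openai:"):
        return "openai/" + model[len("openai:"):]
    return model

assert to_litellm_model("openai:gpt-4.1-mini") == "openai/gpt-4.1-mini"
assert to_litellm_model("anthropic:claude-3-5-sonnet") == "anthropic:claude-3-5-sonnet"
```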
|
||||
|
||||
## Tool wrapping (Molecule MCP -> CrewAI)
|
||||
|
||||
`_langchain_to_crewai(lc_tool)` wraps each LangChain `BaseTool` (the platform
|
||||
tool surface — `delegate_task`, `send_message_to_user`, memory, sandbox, etc.)
|
||||
as a sync CrewAI `@tool`. CrewAI's decorator reads `__doc__` at decoration
|
||||
time, so `wrapper.__doc__` is set from the LangChain `description` **before**
|
||||
applying `crewai_tool(...)`. The wrapper bridges sync->async via
|
||||
`asyncio.get_event_loop().run_until_complete(lc_tool.ainvoke(kwargs))`.
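
The decoration-order constraint can be demonstrated with a stand-in decorator that reads `__doc__` at decoration time, as CrewAI's `@tool` is described to do. Nothing here imports CrewAI or LangChain; `fake_tool` and `ainvoke` are stand-ins:

```python
# Sketch of the decoration-order gotcha. fake_tool mimics one behavior of
# CrewAI's @tool: it reads __doc__ at decoration time.
import asyncio

def fake_tool(fn):
    if not fn.__doc__:
        raise ValueError("tool needs a docstring")
    fn.tool_description = fn.__doc__
    return fn

async def ainvoke(kwargs):  # stand-in for lc_tool.ainvoke
    return f"ran with {kwargs}"

def wrap(description: str):
    def wrapper(**kwargs):
        # sync->async bridge; the adapter itself uses
        # asyncio.get_event_loop().run_until_complete(lc_tool.ainvoke(kwargs))
        return asyncio.run(ainvoke(kwargs))
    wrapper.__doc__ = description  # MUST happen before decoration
    return fake_tool(wrapper)

t = wrap("delegate a task to a peer workspace")
assert t.tool_description == "delegate a task to a peer workspace"
assert t(x=1) == "ran with {'x': 1}"
```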

## Common gotchas

- **Sync/async boundary.** `crew.kickoff()` is blocking; always wrap it in
  `asyncio.to_thread`. The tool wrapper uses `run_until_complete`, which works
  because CrewAI invokes tools from worker threads, not the main loop.
- **CrewAI tool docstrings are mandatory.** Setting `__doc__` after the
  `@tool` decorator runs is too late — keep the order in `_langchain_to_crewai`.
- **Model provider routing.** Only the `openai:` prefix rewrite exists today;
  Anthropic / Bedrock / others rely on LiteLLM defaults from the raw string.
- **Per-message Crew construction.** A new `Agent`/`Crew` is built on every call
  — there is no in-memory CrewAI state across messages. State lives in the
  history extracted from the A2A context, not inside CrewAI.
- **Errors are swallowed into the reply.** The `try/except` in `execute()` returns
  `f"CrewAI error: {e}"` to the user instead of raising; check container logs
  for stack traces.
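
The first gotcha can be sketched with stdlib asyncio alone; `kickoff` below is a stand-in for the blocking `crew.kickoff()`:

```python
# Sketch of the sync/async boundary: to_thread keeps the event loop free for
# heartbeats and A2A while the blocking call runs in a worker thread.
import asyncio
import time

def kickoff() -> str:    # stand-in for the blocking CrewAI call
    time.sleep(0.1)      # simulate blocking work
    return "crew result"

async def main() -> str:
    ticks = 0

    async def heartbeat():
        nonlocal ticks
        while True:
            ticks += 1   # in the adapter this surfaces the current task
            await asyncio.sleep(0.02)

    hb = asyncio.create_task(heartbeat())
    reply = await asyncio.to_thread(kickoff)  # loop stays responsive here
    hb.cancel()
    assert ticks > 0     # heartbeat ran while kickoff blocked
    return reply

assert asyncio.run(main()) == "crew result"
```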

## Conventions

- New Molecule platform tools land in `molecule-core` / the runtime — they will
  show up automatically once `_common_setup` returns them in `langchain_tools`.
  Do not register tools by hand here.
- Skills load via `_common_setup`, driven by `config.yaml` `skills:` (none in
  this template by default).
- `/workspace` is the agent's read/write scratch dir; `/configs/config.yaml`
  is the live config mount.
- Logs: stdout/stderr from this process is captured by the platform; use the
  module logger (`logger = logging.getLogger(__name__)`).

## What NOT to do

- Don't break the `BaseAdapter` contract (`setup` + `create_executor`) — the
  runtime discovers and drives it, and the publish-runtime smoke test boots
  this adapter from a freshly built wheel.
- Don't bypass A2A for inter-workspace calls. Use the `delegate_task` tool;
  direct HTTP between workspaces skips auth, audit, and the platform's
  A2A queue.
- Don't import CrewAI at module top level — `setup()` does the import inside
  a `try/except ImportError` so missing deps fail with a clear message.
- Don't edit code paths assumed by the runtime smoke test without updating
  the corresponding template-publish workflow in `.github/workflows/`.
```diff
@@ -90,14 +90,14 @@ class CrewAIA2AExecutor(AgentExecutor):
         self._heartbeat = heartbeat
 
     async def execute(self, context, event_queue):
-        from a2a.utils import new_agent_text_message
+        from a2a.helpers import new_text_message
         from molecule_runtime.adapters.shared_runtime import extract_history, build_task_text, brief_task, set_current_task
 
         from molecule_runtime.adapters.shared_runtime import extract_message_text
         user_message = extract_message_text(context)
 
         if not user_message:
-            await event_queue.enqueue_event(new_agent_text_message("No message provided"))
+            await event_queue.enqueue_event(new_text_message("No message provided"))
             return
 
         await set_current_task(self._heartbeat, brief_task(user_message))
@@ -138,7 +138,7 @@ class CrewAIA2AExecutor(AgentExecutor):
         finally:
             await set_current_task(self._heartbeat, "")
 
-        await event_queue.enqueue_event(new_agent_text_message(reply))
+        await event_queue.enqueue_event(new_text_message(reply))
 
     async def cancel(self, context, event_queue):  # pragma: no cover
         pass
```
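
The control flow implied by this diff and by the "Common gotchas" section (error folded into the reply text, heartbeat task cleared in `finally`) can be sketched with stdlib asyncio; every name below is a stand-in for the real `adapter.py` code:

```python
# Sketch of the execute() shape: try/except folds errors into the reply,
# finally always clears the surfaced task. Stand-in names throughout.
import asyncio

async def set_current_task(task: str) -> None:  # stand-in heartbeat hook
    pass

async def execute(message: str) -> str:
    await set_current_task(message[:60])
    try:
        # stand-in for asyncio.to_thread(crew.kickoff)
        reply = await asyncio.to_thread(lambda: f"echo: {message}")
    except Exception as e:
        # swallowed: the user sees text, container logs get the trace
        reply = f"CrewAI error: {e}"
    finally:
        await set_current_task("")  # always clear the surfaced task
    return reply

assert asyncio.run(execute("hi")) == "echo: hi"
```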