Add Rich token streaming: server SSE + CLI live display + CLI container

Server (agent.py):
- _stream_queues: per-session asyncio.Queue for token chunks
- _push_stream_chunk() / _end_stream() helpers
- Medium tier: astream() with <think> block filtering (real token streaming)
- Light tier: full reply pushed as single chunk then [DONE]
- Complex tier: full reply pushed after agent completes then [DONE]
- GET /stream/{session_id} SSE endpoint (data: <chunk>\n\n, data: [DONE]\n\n)
- medium_model promoted to module-level global for astream() access

CLI (cli.py):
- stream_reply(): reads /stream/ SSE, renders tokens live with Rich Live (transient)
- Final reply rendered as Markdown after stream completes
- os.getlogin() replaced with os.getenv("USER") for container compatibility
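A sketch of the client side, assuming httpx as the HTTP client (the actual cli.py may use a different one); the SSE line parser is kept separate so it has no third-party dependencies:

```python
from typing import Iterable, Iterator

def iter_sse_chunks(lines: Iterable[str]) -> Iterator[str]:
    """Yield data payloads from SSE lines, stopping at the [DONE] sentinel."""
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines between events
        data = line[len("data: "):]
        if data == "[DONE]":
            return
        yield data

def stream_reply(session_id: str, base_url: str = "http://localhost:8000") -> str:
    # httpx and rich imported lazily so the parser above stays dependency-free
    import httpx
    from rich.console import Console
    from rich.live import Live
    from rich.markdown import Markdown
    from rich.text import Text

    reply = ""
    with httpx.stream("GET", f"{base_url}/stream/{session_id}", timeout=None) as resp:
        # transient=True clears the live token view once the stream ends
        with Live(transient=True) as live:
            for chunk in iter_sse_chunks(resp.iter_lines()):
                reply += chunk
                live.update(Text(reply))
    Console().print(Markdown(reply))  # final render as Markdown
    return reply
```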

Dockerfile.cli + docker-compose cli service (profiles: tools):
- Run: docker compose --profile tools run --rm -it cli
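A plausible shape for the compose service; apart from `Dockerfile.cli` and `profiles: tools`, which the commit names, the field values here are assumptions:

```yaml
services:
  cli:
    build:
      context: .
      dockerfile: Dockerfile.cli
    profiles: ["tools"]   # excluded from a plain `docker compose up`
    stdin_open: true      # interactive Rich CLI needs stdin and a TTY
    tty: true
    environment:
      - USER=${USER}      # os.getenv("USER") replaces os.getlogin() in-container
```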

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
commit b04e8a0925
parent edc9a96f7a
Author: Alvis
Date: 2026-03-12 17:26:52 +0000
6 changed files with 151 additions and 38 deletions


@@ -13,17 +13,17 @@ curl -s -X POST http://localhost:8000/message \
   -d '{"text": "/think what is the best recipe for an apple pie?", "session_id": "use-case-apple-pie", "channel": "cli", "user_id": "claude"}'
 ```
-**2. Wait for the reply** via SSE (complex tier can take up to 5 minutes):
+**2. Wait for the streaming reply** (complex tier can take up to 5 minutes):
 ```bash
-curl -s -N --max-time 300 "http://localhost:8000/reply/use-case-apple-pie"
+curl -s -N --max-time 300 "http://localhost:8000/stream/use-case-apple-pie"
 ```
 **3. Confirm tier and tool usage in agent logs:**
 ```bash
 docker compose -f /home/alvis/adolf/docker-compose.yml logs deepagents \
-  --since=600s --no-log-prefix | grep -E "tier=complex|web_search|fetch_url|crawl4ai"
+  --since=600s | grep -E "tier=complex|web_search|fetch_url|crawl4ai"
 ```
## Evaluate (use your judgment)