# Omo (oh-my-opencode)
AI coding agent container. Runs oh-my-opencode with local LLM inference via Bifrost.
## Location

`agap_git/omo/`
## Setup
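The setup steps are not spelled out here; a minimal sketch, assuming the standard Docker Compose workflow and that the service is named `omo` (the service name is an assumption based on this README):

```shell
# Build and start the container in the background
# (assumes a compose file lives in agap_git/omo/ and defines an "omo" service)
cd agap_git/omo/
docker compose up -d --build omo
```

The container must be able to reach the `bifrost` gateway, so start it after the Bifrost/Ollama stack is up.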
## Usage
Exec into the container and run the agent against a project:
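A sketch of that workflow, assuming the running container is named `omo` and the project directory name is hypothetical:

```shell
# Open a shell inside the running agent container
docker exec -it omo bash

# Inside the container: change into a project under the /workspace mount
# ("my-project" is a placeholder for any directory from /home/alvis)
cd /workspace/my-project

# Launch the opencode agent against the current project
opencode
```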
## Configuration

- `opencode.json` — mounted at `/root/.config/opencode/opencode.json`
- Provider: Bifrost (`http://bifrost:8080/v1`), an OpenAI-compatible gateway to local Ollama models
- Default model: `ollama/qwen3:8b` (GPU)
- Available models: `qwen3:8b`, `qwen3:4b`, `qwen2.5:1.5b`, `gemma3:4b`
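Because Bifrost exposes an OpenAI-compatible API, the gateway can be sanity-checked from inside the container with plain `curl` (endpoint paths follow the OpenAI convention; the model name comes from the list above):

```shell
# List the models the gateway currently exposes
curl -s http://bifrost:8080/v1/models

# Send a minimal chat completion to one of the local models
curl -s http://bifrost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen3:8b", "messages": [{"role": "user", "content": "hello"}]}'
```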
## Network

Connected to the `adolf_default` network to reach the `bifrost` container.
## Volumes

| Host path | Container path | Purpose |
|---|---|---|
| `/home/alvis` | `/workspace` | Home directory for coding projects |
| `./opencode.json` | `/root/.config/opencode/opencode.json` | Provider config (read-only) |
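The mounts and network above correspond to a Compose stanza along these lines (a sketch only; the service name and the `external` flag on the network are assumptions drawn from this README, not a copy of the actual compose file):

```yaml
services:
  omo:
    volumes:
      - /home/alvis:/workspace
      - ./opencode.json:/root/.config/opencode/opencode.json:ro  # read-only provider config
    networks:
      - adolf_default

networks:
  adolf_default:
    external: true  # created by the bifrost stack, not by this compose file
```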