From 3b4f475516119f11747960197423b138fb3a0ed8 Mon Sep 17 00:00:00 2001
From: alvis
Date: Mon, 16 Mar 2026 07:20:51 +0000
Subject: [PATCH] Add Omo (oh-my-opencode) service

---
 Hello.md |  1 +
 Home.md  |  1 +
 Omo.md   | 42 ++++++++++++++++++++++++++++++++++++++++++
 3 files changed, 44 insertions(+)
 create mode 100644 Omo.md

diff --git a/Hello.md b/Hello.md
index 9472fcd..589e078 100644
--- a/Hello.md
+++ b/Hello.md
@@ -17,6 +17,7 @@ This repository contains Docker Compose files, configuration templates, and depl
 | Home Assistant | Home automation |
 | 3X-UI | VPN / proxy |
 | Adolf | Persistent AI assistant via Telegram (GPU inference, long-term memory) |
+| Omo | AI coding agent (oh-my-opencode) with local LLM via Bifrost |
 | Vaultwarden | Self-hosted password manager (Bitwarden-compatible), stores all Agap credentials |
 | Seafile | File sync, sharing, and document editing (OnlyOffice + WebDAV) |
 | Copyparty | File sharing on Juris remote server (`share.alogins.net:3999`) |
diff --git a/Home.md b/Home.md
index 48705c0..daacaf3 100644
--- a/Home.md
+++ b/Home.md
@@ -17,6 +17,7 @@
 - [[Zabbix]] — Monitoring (Zabbix 7.4, PostgreSQL, Apache)
 - [[Juris]] — Remote server (83.99.190.32)
 - [[Adolf]] — Persistent AI assistant (Telegram, GPU, memory)
+- [[Omo]] — AI coding agent (oh-my-opencode, local LLM via Bifrost)
 - [[Vaultwarden]] — Password manager (Bitwarden-compatible)
 - [[Seafile]] — File sync and document editing
diff --git a/Omo.md b/Omo.md
new file mode 100644
index 0000000..c71ca4b
--- /dev/null
+++ b/Omo.md
@@ -0,0 +1,42 @@
+# Omo (oh-my-opencode)
+
+AI coding agent container. Runs [oh-my-opencode](https://github.com/code-yeongyu/oh-my-opencode) with local LLM inference via Bifrost.
+
+## Location
+
+`agap_git/omo/`
+
+## Setup
+
+```bash
+cd ~/agap_git/omo
+docker compose up -d
+```
+
+## Usage
+
+Exec into the container and run the agent against a project:
+
+```bash
+docker exec -it omo sh
+cd /workspace/my-project
+oh-my-opencode run "your task here"
+```
+
+## Configuration
+
+- **opencode.json** — mounted at `/root/.config/opencode/opencode.json`
+- **Provider:** Bifrost (`http://bifrost:8080/v1`) — OpenAI-compatible gateway to local Ollama models
+- **Default model:** `ollama/qwen3:8b` (GPU)
+- **Available models:** qwen3:8b, qwen3:4b, qwen2.5:1.5b, gemma3:4b
+
+## Network
+
+Connected to `adolf_default` network to reach the `bifrost` container.
+
+## Volumes
+
+| Host path | Container path | Purpose |
+|-----------|---------------|---------|
+| `/home/alvis` | `/workspace` | Home directory for coding projects |
+| `./opencode.json` | `/root/.config/opencode/opencode.json` | Provider config (read-only) |
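
The new Omo.md references `agap_git/omo/` and `docker compose up -d` but the compose file itself is not part of this patch. A minimal sketch of what `agap_git/omo/docker-compose.yml` might look like, inferred from the Network and Volumes sections above; the image name is a placeholder, not the actual one:

```yaml
# Hypothetical sketch of agap_git/omo/docker-compose.yml.
# Volumes and network are taken from the Omo.md page; the image is a placeholder.
services:
  omo:
    container_name: omo
    image: oh-my-opencode:local   # placeholder; the real image is not in this patch
    stdin_open: true              # keep the container alive for `docker exec`
    tty: true
    volumes:
      - /home/alvis:/workspace
      - ./opencode.json:/root/.config/opencode/opencode.json:ro
    networks:
      - adolf_default

networks:
  adolf_default:
    external: true   # created by the Adolf stack; Omo only joins it
```

Marking `adolf_default` as `external` matters: the network belongs to the Adolf stack, so Omo must join it rather than create it.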
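
Since Bifrost exposes an OpenAI-compatible API at `http://bifrost:8080/v1`, any OpenAI-style client can target it directly. A small illustrative Python sketch of the request shape; the endpoint path and payload layout follow the OpenAI chat-completions convention, and only the base URL and model name come from the page above:

```python
import json

# Bifrost gateway base URL as configured in opencode.json.
BASE_URL = "http://bifrost:8080/v1"

def chat_request(prompt: str, model: str = "ollama/qwen3:8b") -> tuple[str, str]:
    """Build the (url, json_body) pair for an OpenAI-compatible
    chat completion call against the Bifrost gateway."""
    url = f"{BASE_URL}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)

url, payload = chat_request("summarise this repo")
print(url)  # the endpoint an OpenAI-style client would POST to
```

This only builds the request; actually sending it requires being on the `adolf_default` network where the `bifrost` hostname resolves.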