Add Omo (oh-my-opencode) service
Hello.md (+1)
@@ -17,6 +17,7 @@ This repository contains Docker Compose files, configuration templates, and depl
 | Home Assistant | Home automation |
 | 3X-UI | VPN / proxy |
 | Adolf | Persistent AI assistant via Telegram (GPU inference, long-term memory) |
+| Omo | AI coding agent (oh-my-opencode) with local LLM via Bifrost |
 | Vaultwarden | Self-hosted password manager (Bitwarden-compatible), stores all Agap credentials |
 | Seafile | File sync, sharing, and document editing (OnlyOffice + WebDAV) |
 | Copyparty | File sharing on Juris remote server (`share.alogins.net:3999`) |
Home.md (+1)
@@ -17,6 +17,7 @@
 - [[Zabbix]] — Monitoring (Zabbix 7.4, PostgreSQL, Apache)
 - [[Juris]] — Remote server (83.99.190.32)
 - [[Adolf]] — Persistent AI assistant (Telegram, GPU, memory)
+- [[Omo]] — AI coding agent (oh-my-opencode, local LLM via Bifrost)
 - [[Vaultwarden]] — Password manager (Bitwarden-compatible)
 - [[Seafile]] — File sync and document editing
Omo.md (new file, +42)
@@ -0,0 +1,42 @@
# Omo (oh-my-opencode)

AI coding agent container. Runs [oh-my-opencode](https://github.com/code-yeongyu/oh-my-openagent) with local LLM inference via Bifrost.

## Location

`agap_git/omo/`

## Setup

```bash
cd ~/agap_git/omo
docker compose up -d
```
## Usage

Exec into the container and run the agent against a project:

```bash
docker exec -it omo sh
cd /workspace/my-project
oh-my-opencode run "your task here"
```
## Configuration

- **opencode.json** — mounted at `/root/.config/opencode/opencode.json`
- **Provider:** Bifrost (`http://bifrost:8080/v1`) — OpenAI-compatible gateway to local Ollama models
- **Default model:** `ollama/qwen3:8b` (GPU)
- **Available models:** qwen3:8b, qwen3:4b, qwen2.5:1.5b, gemma3:4b
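For orientation, a hypothetical sketch of what such an `opencode.json` could look like. The key names, the `$schema` URL, and the `@ai-sdk/openai-compatible` adapter are assumptions about opencode's custom-provider config, not taken from this repo; only the base URL and model names come from this page:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "ollama/qwen3:8b",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://bifrost:8080/v1"
      },
      "models": {
        "qwen3:8b": {},
        "qwen3:4b": {},
        "qwen2.5:1.5b": {},
        "gemma3:4b": {}
      }
    }
  }
}
```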
## Network

Connected to `adolf_default` network to reach the `bifrost` container.
## Volumes
|
||||||
|
|
||||||
|
| Host path | Container path | Purpose |
|
||||||
|
|-----------|---------------|---------|
|
||||||
|
| `/home/alvis` | `/workspace` | Home directory for coding projects |
|
||||||
|
| `./opencode.json` | `/root/.config/opencode/opencode.json` | Provider config (read-only) |
|
||||||
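Putting the Network and Volumes sections together, the compose file in `agap_git/omo/` presumably looks roughly like this sketch. The image name is a placeholder (the actual image is not shown here); only the container name, mounts, and external network are taken from this page:

```yaml
services:
  omo:
    image: oh-my-opencode:latest   # placeholder; actual image not documented here
    container_name: omo
    volumes:
      - /home/alvis:/workspace
      - ./opencode.json:/root/.config/opencode/opencode.json:ro
    networks:
      - adolf_default

networks:
  adolf_default:
    external: true   # created by the Adolf stack
```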