alvis edited this page 2026-03-16 07:20:51 +00:00

Omo (oh-my-opencode)

AI coding agent container. Runs oh-my-opencode with local LLM inference via Bifrost.

Location

agap_git/omo/

Setup

cd ~/agap_git/omo
docker compose up -d
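After bringing the stack up, it can be worth confirming the container actually started. A minimal sketch, assuming the compose service is named omo (matching the container name used below):

```shell
# Check that the omo service is running (service name "omo" is an assumption)
cd ~/agap_git/omo
docker compose ps omo

# Follow startup logs if the service is not healthy
docker compose logs -f omo
```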

Usage

Exec into the container and run the agent against a project:

docker exec -it omo sh
cd /workspace/my-project
oh-my-opencode run "your task here"
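The interactive steps above can also be collapsed into a single one-shot command, which is convenient for scripting. A sketch, assuming the container is named omo and the project sits under the /workspace mount:

```shell
# Run one task non-interactively inside the omo container
docker exec omo sh -c 'cd /workspace/my-project && oh-my-opencode run "your task here"'
```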

Configuration

  • opencode.json — mounted at /root/.config/opencode/opencode.json
  • Provider: Bifrost (http://bifrost:8080/v1) — OpenAI-compatible gateway to local Ollama models
  • Default model: ollama/qwen3:8b (GPU)
  • Available models: qwen3:8b, qwen3:4b, qwen2.5:1.5b, gemma3:4b
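Because Bifrost exposes an OpenAI-compatible API, the gateway can be sanity-checked from inside the container. A sketch: the /v1/models route is assumed from the OpenAI API convention, and wget is assumed to be present in the container image:

```shell
# From inside the omo container, list the models reachable through Bifrost
docker exec omo sh -c 'wget -qO- http://bifrost:8080/v1/models'
```

If this returns a JSON model list, the network path to Bifrost and the provider configuration are both working.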

Network

The container is attached to the adolf_default Docker network so it can reach the bifrost container by its service hostname.

Volumes

  • /home/alvis → /workspace — Home directory for coding projects
  • ./opencode.json → /root/.config/opencode/opencode.json — Provider config (read-only)