# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Overview

This repository manages Docker Compose configurations for the **Agap** self-hosted home server. It is not a software project — it is infrastructure-as-config for several independent services.

## Services

| Directory | Service | Port | Notes |
|-----------|---------|------|-------|
| `immich-app/` | Immich (photo management) | 2283 | Main compose via root `docker-compose.yml` |
| `gitea/` | Gitea (git hosting) + Postgres | 3000, 222 | Standalone compose |
| `openai/` | Open WebUI + Ollama (AI chat) | 3125 | Requires NVIDIA GPU |

## Common Commands

All services use Docker Compose. From each service directory:

```bash
# Start a service
docker compose up -d

# Restart
docker compose restart

# View logs
docker compose logs -f

# Pull latest images
docker compose pull
```

The root `docker-compose.yml` is an alias that includes `immich-app/docker-compose.yml`.

### Immich-specific

Environment variables are in the root `.env` file (referenced by `immich-app/docker-compose.yml` as `../.env`):

- `UPLOAD_LOCATION`, `THUMB_LOCATION`, `ENCODED_VIDEO_LOCATION` — media storage paths
- `DB_DATA_LOCATION` — Postgres data directory

**Restore from backup** (see `immich-app/restore_example.sh`):

```bash
docker compose down -v   # destroys all data
docker compose create
docker start immich_postgres
sleep 10
gunzip --stdout "/home/alvis/dump.sql.gz" | \
  sed "s/SELECT pg_catalog.set_config('search_path', '', false);/SELECT pg_catalog.set_config('search_path', 'public, pg_catalog', true);/g" | \
  docker exec -i immich_postgres psql --dbname=postgres --username=postgres
docker compose up -d
```

## GPU / NVIDIA Setup

Before running GPU-dependent services (Open WebUI/Ollama, Immich ML with CUDA):

1. Run `sudo ./nvidia-docker-install.sh` — installs Docker + NVIDIA Container Toolkit
2. Run `./install-cuda.sh` — installs CUDA toolkit (toolkit only, not driver)

## Storage Layout

| Path | Purpose |
|------|---------|
| `/mnt/media/upload` | Immich uploaded originals |
| `/mnt/ssd1/media/thumbs` | Immich thumbnails |
| `/mnt/ssd1/media/encoded-video` | Immich transcoded video |
| `/mnt/ssd1/media/postgres` | Immich Postgres data |
| `/mnt/misc/gitea` | Gitea data |

## Gitea Integration

When changes are made to infrastructure (services, config, setup), update the relevant Gitea wiki pages at `http://localhost:3000/alvis/AgapHost/wiki`.

### Gitea Instance Details

- **URL**: `http://localhost:3000`
- **Repo**: `alvis/AgapHost`
- **Wiki**: `http://localhost:3000/alvis/AgapHost/wiki`
- **API token**: Read from `$GITEA_TOKEN` environment variable — never hardcode it

### Wiki Pages Reference

| Page | Topic |
|------|-------|
| Hello | Overview of Agap — services, stack |
| Home | Index — links to all pages |
| Network | Netplan bridge setup, Caddy reverse proxy |
| Storage | LVM setup and commands |
| Home-Assistant | KVM-based Home Assistant setup |
| 3X-UI | VPN proxy panel |
| Gitea | Git hosting Docker service |

### Read Wiki Pages (API)

```bash
# List all pages
curl -s -H "Authorization: token $GITEA_TOKEN" \
  http://localhost:3000/api/v1/repos/alvis/AgapHost/wiki/pages

# Read a page (content is base64 in the content_base64 field)
curl -s -H "Authorization: token $GITEA_TOKEN" \
  http://localhost:3000/api/v1/repos/alvis/AgapHost/wiki/page/Home
```

### Edit Wiki Pages (Git)

The Gitea REST API does not expose wiki write endpoints.
Use git directly:

```bash
# Clone the wiki repo
git clone http://alvis:$GITEA_TOKEN@localhost:3000/alvis/AgapHost.wiki.git /tmp/AgapHost.wiki

# Edit files, then commit and push
cd /tmp/AgapHost.wiki
git config user.email "allogn@gmail.com"
git config user.name "alvis"
git add <files>
git commit -m "<message>"
git push http://alvis:$GITEA_TOKEN@localhost:3000/alvis/AgapHost.wiki.git main
```

### Wiki Style Guidelines

- Use a minimalistic style: clean headings, code blocks, brief descriptions
- Remove outdated or redundant content when updating
- Create a new page if a topic doesn't exist yet
- Wiki files are Markdown, named `<Page-Name>.md`

## Home Assistant API

- **Instance**: `https://haos.alogins.net`
- **Token**: Read from `$HA_TOKEN` environment variable — never hardcode it
- **Base URL**: `https://haos.alogins.net/api/`
- **Auth header**: `Authorization: Bearer $HA_TOKEN`

### Common Endpoints

```bash
# Health check
curl -s -H "Authorization: Bearer $HA_TOKEN" \
  https://haos.alogins.net/api/

# Get all entity states
curl -s -H "Authorization: Bearer $HA_TOKEN" \
  https://haos.alogins.net/api/states

# Get a specific entity
curl -s -H "Authorization: Bearer $HA_TOKEN" \
  https://haos.alogins.net/api/states/<entity_id>

# Call a service (e.g., turn on a light)
curl -s -X POST \
  -H "Authorization: Bearer $HA_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"entity_id":"light.example"}' \
  https://haos.alogins.net/api/services/<domain>/<service>
```

**Note**: Status 401 means the token is invalid or expired.

## HA → Zabbix Alerting

Home Assistant automations push alerts to Zabbix via the `history.push` API (Zabbix 7.4 trapper items). No middleware is needed.

### Architecture

```
[HA sensor ON] → [HA automation] → [rest_command: HTTP POST] → [Zabbix history.push] → [trapper item] → [trigger] → [Telegram]
```

### Water Leak Sensors

3x HOBEIAN ZG-222Z moisture sensors → Disaster-level Zabbix alert with room name.
| HA Entity | Room |
|-----------|------|
| `binary_sensor.hobeian_zg_222z` | Kitchen |
| `binary_sensor.hobeian_zg_222z_2` | Bathroom |
| `binary_sensor.hobeian_zg_222z_3` | Laundry |

**Zabbix side** (host "HA Agap", hostid 10780):

- Trapper item: `water.leak` (text type) — receives room name or "ok"
- Trigger: `last(/HA Agap/water.leak)<>"ok"` — Disaster (severity 5), manual close
- Trigger name uses `{ITEM.LASTVALUE}` to show the room in the notification

**HA side** (`configuration.yaml`):

- `rest_command.zabbix_water_leak` — POST to Zabbix `history.push`, accepts a `{{ room }}` template variable
- `rest_command.zabbix_water_leak_clear` — pushes "ok" to clear
- Automation "Water Leak Alert" — any sensor ON → sends room name to Zabbix
- Automation "Water Leak Clear" — all sensors OFF → sends "ok"

### Adding a New HA → Zabbix Alert

1. **Zabbix**: Create a trapper item (type 2) on "HA Agap" via the `item.create` API. Create a trigger via `trigger.create`.
2. **HA config**: Add a `rest_command` entry in `configuration.yaml` with a `history.push` payload. Restart HA.
3. **HA automation**: Create via `POST /api/config/automation/config/<automation_id>` with a trigger on sensor state and an action calling the rest_command.
4. **Test**: Call the `rest_command` via the HA API and verify the Zabbix problem appears.
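As a concrete sketch of what the `rest_command` sends, the push in step 2 can be reproduced from the shell. The host and item key come from the water-leak setup above; the exact `history.push` parameter shape should be verified against the docs for your Zabbix version:

```shell
# Simulate the HA rest_command: push a room name to the water.leak
# trapper item via the Zabbix JSON-RPC history.push method.
ROOM="Kitchen"   # value a real automation would template in via {{ room }}
PAYLOAD=$(cat <<EOF
{"jsonrpc":"2.0","method":"history.push","params":{"host":"HA Agap","key":"water.leak","value":"$ROOM"},"id":1}
EOF
)
echo "$PAYLOAD"
curl -s -X POST http://localhost:81/api_jsonrpc.php \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $ZABBIX_TOKEN" \
  -d "$PAYLOAD" || echo "Zabbix not reachable (expected when run off the server)"
```

Pushing `"ok"` as the value with the same payload clears the problem, matching what `rest_command.zabbix_water_leak_clear` does.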
## Zabbix API

- **Instance**: `http://localhost:81` (local), `https://zb.alogins.net` (external)
- **Endpoint**: `http://localhost:81/api_jsonrpc.php`
- **Token**: Read from `$ZABBIX_TOKEN` environment variable — never hardcode it
- **Auth header**: `Authorization: Bearer $ZABBIX_TOKEN`

### Common Requests

```bash
# Check API version
curl -s -X POST http://localhost:81/api_jsonrpc.php \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $ZABBIX_TOKEN" \
  -d '{"jsonrpc":"2.0","method":"apiinfo.version","params":{},"id":1}'

# Get all hosts
curl -s -X POST http://localhost:81/api_jsonrpc.php \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $ZABBIX_TOKEN" \
  -d '{"jsonrpc":"2.0","method":"host.get","params":{"output":"extend"},"id":1}'

# Get problems/issues
curl -s -X POST http://localhost:81/api_jsonrpc.php \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $ZABBIX_TOKEN" \
  -d '{"jsonrpc":"2.0","method":"problem.get","params":{"output":"extend"},"id":1}'
```
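The same JSON-RPC pattern covers the write side used when adding a new HA → Zabbix alert (trapper item + trigger). Below is a minimal sketch, reusing the water-leak values documented above (hostid 10780, key `water.leak`); `type: 2` means Zabbix trapper and `value_type: 4` means text, but field names should be double-checked against the `item.create`/`trigger.create` docs for your Zabbix version:

```shell
# Create a trapper item, then a Disaster trigger on it, on host "HA Agap".
ITEM=$(cat <<'EOF'
{"jsonrpc":"2.0","method":"item.create","params":{"hostid":"10780","name":"Water leak","key_":"water.leak","type":2,"value_type":4},"id":1}
EOF
)
TRIGGER=$(cat <<'EOF'
{"jsonrpc":"2.0","method":"trigger.create","params":{"description":"Water leak: {ITEM.LASTVALUE}","expression":"last(/HA Agap/water.leak)<>\"ok\"","priority":5,"manual_close":1},"id":2}
EOF
)
for PAYLOAD in "$ITEM" "$TRIGGER"; do
  echo "$PAYLOAD"
  curl -s -X POST http://localhost:81/api_jsonrpc.php \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $ZABBIX_TOKEN" \
    -d "$PAYLOAD" || echo "Zabbix not reachable (expected when run off the server)"
done
```

The trigger expression and `{ITEM.LASTVALUE}` usage mirror the existing water-leak trigger, so a new alert mostly means swapping in a new key and description.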