rlm-wiki wraps Fleet-RLM with a Daytona-backed markdown wiki. Use it when research inputs are scattered across URLs, files, PDFs, and transcripts and you want the agent to build and maintain a durable knowledge base.

## Documentation Index
Fetch the complete documentation index at: https://docs.qredence.ai/llms.txt
Use this file to discover all available pages before exploring further.
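As a sketch of the discovery step, the helper below extracts page titles and URLs from an index in llms.txt style. The exact layout of the real index is not specified here, so this assumes the common convention of markdown links, one per line; the sample text is illustrative, not actual content from docs.qredence.ai.

```python
import re


def parse_llms_index(text: str) -> list[tuple[str, str]]:
    """Extract (title, url) pairs from markdown-style links in an llms.txt index."""
    return re.findall(r"\[([^\]]+)\]\((https?://[^)]+)\)", text)


# Illustrative sample only; fetch the real index from the URL above.
sample = """# Docs
- [Getting Started](https://docs.qredence.ai/start)
- [Wiki Skill](https://docs.qredence.ai/skills/rlm-wiki)
"""

for title, url in parse_llms_index(sample):
    print(f"{title}: {url}")
```

Fetching the live file first (with any HTTP client) and feeding its text through the same parser gives the agent a page list to plan its exploration.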
## Operations
| Operation | Implementation | Description |
|---|---|---|
| reset | scripts/bootstrap_daytona_volume.py | Inspect or reset a Daytona volume for wiki bootstrap |
| init | Skill-guided | Create SCHEMA.md, index.md, log.md in wiki root |
| ingest | Skill-guided | Capture sources under raw/, then propose-then-update |
| query | Skill-guided | Answer questions from compiled wiki knowledge |
| lint | Skill-guided | Report structural / semantic issues (read-only) |
The reset operation has a standalone script; the other operations are executed by the LLM following the skill workflow at skills/rlm-wiki/workflow.md.
## Prerequisites
### MCP servers
| Server | Purpose |
|---|---|
| fleet-rlm | DSPy / RLM orchestration (local) |
| context7 | Documentation fetching (remote) |
| neon | PostgreSQL metadata (remote) |
| daytona | Sandbox volume management (local) |
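A registration for these four servers in an MCP client config might look like the sketch below. The `command`, `args`, and `url` values are placeholders, not verified endpoints or package names; local servers are launched as subprocesses, remote ones reached over HTTP, per the table above.

```json
{
  "mcpServers": {
    "fleet-rlm": { "command": "uvx", "args": ["fleet-rlm-mcp"] },
    "daytona": { "command": "uvx", "args": ["daytona-mcp"] },
    "context7": { "url": "https://example.com/context7/mcp" },
    "neon": { "url": "https://example.com/neon/mcp" }
  }
}
```

Consult each server's own documentation for its actual launch command or endpoint before copying this.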
## Usage
See skills/rlm-wiki/workflow.md for the full operating checklist.