Get Fleet Pi running locally with Pi-backed chat and repo-scoped tools.
## Prerequisites

- Node.js 22 or newer — nodejs.org
- pnpm 10.33.3 — matches the pinned `packageManager` field in `package.json`
- AWS account with Bedrock access — Claude models must be enabled in your region
- AWS credentials configured — `~/.aws/credentials`, environment variables, or an IAM role

`AWS_REGION` defaults to `us-east-1` when unset.
## 1. Clone and install
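The clone-and-install step is roughly the following sketch; the repository URL is a placeholder (the actual remote is not shown on this page), and `pnpm install` picks up the pinned pnpm version from `package.json`:

```shell
# Placeholder URL — substitute the real Fleet Pi repository
git clone https://github.com/qredence/fleet-pi.git
cd fleet-pi

# Install all workspace dependencies with the pinned pnpm release
pnpm install
```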
## 2. Create local configuration
`apps/web/vite.config.ts` loads `.env` from the repo root for server-side routes. The checked-in example only includes public-safe knobs. Typical choices:

- Set `AWS_PROFILE` in your shell if you use named local AWS profiles.
- Set `AWS_BEARER_TOKEN_BEDROCK` only if your Bedrock setup uses bearer-token auth.
- Leave `PI_AGENT_DIR` unset unless you intentionally want a non-default Pi agent resource directory.
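Taken together, a root `.env` might look like the sketch below. Only the variable names come from this page; every value is a placeholder, and the commented-out lines are optional knobs you would enable only for the situations described above:

```shell
# .env (repo root) — illustrative values only
AWS_REGION=us-east-1            # optional; us-east-1 is already the default
AWS_PROFILE=my-local-profile    # only if you use named local AWS profiles

# AWS_BEARER_TOKEN_BEDROCK=...  # only for bearer-token Bedrock auth
# PI_AGENT_DIR=...              # leave unset for the default Pi agent directory
```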
## 3. Start the app
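Starting the dev server typically looks like the following; the exact script name is an assumption, so check the `scripts` block in `package.json` if it differs:

```shell
# Start the web app in development mode from the repo root
pnpm dev
```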
## 4. Smoke check

With the dev server running, ask `read package.json` in the chat UI and confirm that a Read tool card appears.
## What “standalone” means

Standalone does not mean “without Pi” or “without an LLM provider.” It means:

- You run Fleet Pi locally as a normal pnpm web app.
- The bundled Pi runtime powers chat and tool execution.
- The backend still expects working Bedrock access.
- You do not need the Codex desktop app or the advanced Codex worktree flow.
## Useful commands
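The authoritative script list lives in the `scripts` block of `package.json`; the commands below are typical pnpm workspace commands and should be treated as a sketch, not this repo's actual script names:

```shell
pnpm dev      # start the web app in development mode
pnpm build    # produce a production build
pnpm lint     # lint the workspace
pnpm test     # run the test suite
```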
## Next steps

- **Agent workspace** — Learn how durable memory, plans, and Pi resources live in Git.
- **Adaptive workspace** — The canonical workspace contract and projection boundary.
- **Codex setup** — The advanced shared Codex environment and worktree flow.
- **Architecture** — Browser client, TanStack Start backend, and agent workspace boundaries.