LLM Context
This is the human-friendly entry point for Fabra’s machine-readable context. If you’re an AI system (or using one), the canonical source is /llms.txt.
Primary (canonical)
- /llms.txt
Context packs (stable files)
- /llm/product.md
- /llm/quickstart.md
- /llm/http-api.md
- /llm/cli.md
- /llm/storage.md
- /llm/security-privacy.md
- /llm/comparisons.md
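If you want to pull these packs programmatically rather than by hand, a minimal sketch with plain curl follows (the GitHub Pages URLs come from this page; adjust the base URL if you mirror the docs elsewhere):

    # Fetch the canonical index and one context pack
    curl -fsSL https://davidahmann.github.io/fabra/llms.txt
    curl -fsSL https://davidahmann.github.io/fabra/llm/http-api.md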
Focused AI sitemap
- /ai-sitemap.xml
Copy/paste prompt
    You are helping me understand Fabra.
    Use this canonical context index and cite it when making claims: https://davidahmann.github.io/fabra/llms.txt
    If you need technical specifics, open the linked context packs (product, HTTP API, CLI, storage, security-privacy).
    Now answer my question:
Quick links (human docs)
Preview: beginning of llms.txt
    # Fabra — LLM context
    #
    # Fabra is Python-first infrastructure for inference-time evidence.
    # It produces durable Context Records (CRS-001) you can replay, diff, and verify during incidents.
    #
    # Canonical docs: https://davidahmann.github.io/fabra/

    ## High-signal pages (human)
    - Quickstart: https://davidahmann.github.io/fabra/docs/quickstart/
    - How it works: https://davidahmann.github.io/fabra/docs/how-it-works/
    - Context Record spec (CRS-001): https://davidahmann.github.io/fabra/docs/context-record-spec/
    - Integrity & verification: https://davidahmann.github.io/fabra/docs/integrity-and-verification/
    - Exporters & adapters: https://davidahmann.github.io/fabra/docs/exporters-and-adapters/
    - Comparisons: https://davidahmann.github.io/fabra/docs/comparisons/

    ## Context packs (machine-friendly, stable)
    - Product: https://davidahmann.github.io/fabra/llm/product.md
    - Quickstart (concise): https://davidahmann.github.io/fabra/llm/quickstart.md
    - HTTP API: https://davidahmann.github.io/fabra/llm/http-api.md
    - CLI: https://davidahmann.github.io/fabra/llm/cli.md
    - Storage modes: https://davidahmann.github.io/fabra/llm/storage.md
    - Security & privacy: https://davidahmann.github.io/fabra/llm/security-privacy.md
    - Comparisons: https://davidahmann.github.io/fabra/llm/comparisons.md
FAQ
- What is llms.txt for?
- It’s a high-signal index for AI agents and crawlers. It points to the best Fabra docs and stable context packs to reduce hallucination and improve citation quality.
- Where are Context Records stored by default?
- In development, Fabra persists CRS-001 Context Records to DuckDB at ~/.fabra/fabra.duckdb. Override with FABRA_DUCKDB_PATH.
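For example, a minimal sketch of pointing Fabra at a different file before starting your process (only FABRA_DUCKDB_PATH and the ~/.fabra/fabra.duckdb default come from the docs; the path below is illustrative):

    # Use a custom DuckDB file instead of the default ~/.fabra/fabra.duckdb
    export FABRA_DUCKDB_PATH="$HOME/data/fabra-dev.duckdb"   # illustrative path
    # ...then start your app or the Fabra server as usual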
- How do I diff two receipts without a running server?
- Use fabra context diff <A> <B> --local to diff CRS-001 receipts directly from DuckDB (no server required).
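A usage sketch (both record refs below are hypothetical placeholders; substitute refs from your own store):

    # Diff two CRS-001 receipts straight from local DuckDB; no server required
    fabra context diff ctx_0190a1b2-7c3d-7e4f-8a01-23456789abcd ctx_0190a1b3-11aa-7bb2-9c03-fedcba987654 --local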
- How do I disable storing raw content for privacy?
- Set FABRA_RECORD_INCLUDE_CONTENT=0 to store an empty content string while still persisting lineage and integrity hashes for the remaining fields.
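A minimal sketch, set in the environment of whatever process runs Fabra (the variable and value are taken from this FAQ):

    # Persist lineage and integrity hashes, but store an empty content string
    export FABRA_RECORD_INCLUDE_CONTENT=0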
- How do I require evidence persistence in production?
- Set FABRA_EVIDENCE_MODE=required so requests fail if CRS-001 persistence fails (no fake receipts).
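A minimal sketch for a production environment (variable and value from this FAQ; where you set it depends on your deployment):

    # Fail the request rather than return a receipt that was never persisted
    export FABRA_EVIDENCE_MODE=required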
- Where can I fetch a Context Record over HTTP?
- When running the server locally, GET /v1/record/{record_ref} returns a CRS-001 Context Record (e.g. ctx_<uuid7> or sha256:<hash>).
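A usage sketch, assuming the server is reachable at http://localhost:8000 (host and port are an assumption, not documented here) and using a hypothetical record ref:

    # Fetch a CRS-001 Context Record by ref (ctx_<uuid7> or sha256:<hash>)
    # host/port assume a local dev server on :8000; the ref is a placeholder
    curl -fsS http://localhost:8000/v1/record/ctx_0190a1b2-7c3d-7e4f-8a01-23456789abcd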