# Gnosis MCP

Serve your PostgreSQL docs to AI agents over MCP.

## Quick Start

```bash
pip install gnosis-mcp
export GNOSIS_MCP_DATABASE_URL="postgresql://user:pass@localhost:5432/mydb"
gnosis-mcp init-db
gnosis-mcp ingest ./docs/
gnosis-mcp serve
```

## What It Does
- 6 MCP tools — search, get, related, upsert, delete, update metadata
- 3 resources — document listing, content retrieval, category browsing
- Hybrid search — keyword (tsvector) + semantic (pgvector cosine) with RRF scoring
- Markdown ingestion — chunks by H2, extracts frontmatter, content hashing for fast re-ingestion
- Embedding support — OpenAI, Ollama, or any compatible API (zero new deps)
- Multi-table queries — serve docs from multiple tables via UNION ALL
- Webhooks — get notified on doc changes
- 2 dependencies — `mcp` and `asyncpg`, nothing else
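As a sketch of the ingestion step above: chunks split at H2 headings, each carrying a SHA-256 content hash so unchanged chunks can be skipped on re-ingestion. The function name and chunk shape here are illustrative, not the package's actual internals:

```python
import hashlib

def chunk_by_h2(markdown: str) -> list[dict]:
    """Split a markdown document at H2 ('## ') headings and hash each chunk.

    A chunk whose hash matches the stored hash can be skipped on
    re-ingestion, so only changed sections need to be re-embedded.
    """
    chunks = []
    current_title, current_lines = "preamble", []

    def flush():
        body = "\n".join(current_lines).strip()
        if body:
            chunks.append({
                "title": current_title,
                "content": body,
                "hash": hashlib.sha256(body.encode()).hexdigest(),
            })

    for line in markdown.splitlines():
        if line.startswith("## "):
            flush()
            current_title, current_lines = line[3:].strip(), []
        else:
            current_lines.append(line)
    flush()
    return chunks
```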
## Works With

Claude Code, Cursor, Windsurf, VS Code, Cline, and any MCP-compatible client.
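As an example, most of these clients accept a JSON server entry along these lines (the exact config file location and top-level key vary by client; the server name `gnosis` here is arbitrary):

```json
{
  "mcpServers": {
    "gnosis": {
      "command": "gnosis-mcp",
      "args": ["serve"],
      "env": {
        "GNOSIS_MCP_DATABASE_URL": "postgresql://user:pass@localhost:5432/mydb"
      }
    }
  }
}
```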
## Embeddings
Three tiers of embedding integration, all with zero new dependencies:
- Pre-computed — pass embeddings directly via MCP tools
- Backfill — `gnosis-mcp embed` fills NULL embeddings via OpenAI, Ollama, or a custom API
- Hybrid search — automatically combines keyword + semantic scoring when embeddings exist
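The hybrid combination can be sketched as Reciprocal Rank Fusion over the two ranked lists. Whether Gnosis MCP uses exactly this formulation and constant (k = 60 is the common convention) is an assumption; this is the standard RRF technique, not the package's verified internals:

```python
def rrf_merge(keyword_ranked: list[str], semantic_ranked: list[str],
              k: int = 60) -> list[str]:
    """Merge two ranked result lists with Reciprocal Rank Fusion (RRF).

    Each document scores sum(1 / (k + rank)) over the lists it appears in,
    so items ranked highly by either keyword or semantic search float up.
    """
    scores: dict[str, float] = {}
    for ranked in (keyword_ranked, semantic_ranked):
        for rank, doc_id in enumerate(ranked, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)
```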