How KBForge Works

From documentation to live AI support agent in four steps. No ML expertise required. No infrastructure to manage.

1

Upload Your Documentation

Provide your knowledge base content in any of these formats. KBForge structures, chunks, and indexes everything automatically using FAISS vector search.

📝

Markdown Files

Best quality. Drop in a folder of .md files with frontmatter headers.

📄

PDF Documents

Manuals, textbooks, guides. We extract chapters and split into articles.

🌐

Wiki Exports

Confluence, GitBook, ReadTheDocs. Export and we handle the rest.
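To make the "structures, chunks, and indexes" step concrete, here is a hypothetical sketch of how a markdown file might be split into retrievable chunks before indexing. KBForge's actual chunking rules are internal; the heading-based strategy and the `max_chars` cap below are illustrative assumptions.

```python
import re

def chunk_markdown(text, max_chars=800):
    """Split a markdown document on headings, then cap chunk size.

    Illustrative only -- KBForge's real chunker may use different rules.
    """
    # Split at each heading so every chunk keeps its own topic.
    sections = re.split(r"(?m)^(?=#{1,3} )", text)
    chunks = []
    for section in sections:
        section = section.strip()
        if not section:
            continue
        # Long sections are further split on blank-line boundaries.
        while len(section) > max_chars:
            cut = section.rfind("\n\n", 0, max_chars)
            if cut == -1:
                cut = max_chars
            chunks.append(section[:cut].strip())
            section = section[cut:].strip()
        if section:
            chunks.append(section)
    return chunks

doc = "# Install\nRun the installer.\n\n# Configure\nEdit config.yml.\n"
print(chunk_markdown(doc))
```

Each chunk is then embedded and stored in a FAISS vector index, which is what the agent searches at query time.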

2

Configure Your Brand

Your AI agent wears your brand — not ours. Customize every visible element so customers see a seamless extension of your product.

Company name & logo
Custom domain (support.yourco.com)
Hero text & taglines
Quick-topic buttons
Custom system prompt
Footer text & download filenames
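The branding options above can be thought of as one per-tenant config object. A hypothetical sketch follows; the field names are illustrative only and KBForge's real schema may differ.

```python
# Hypothetical per-tenant branding config -- field names are assumptions,
# chosen to mirror the checklist above, not KBForge's actual schema.
branding = {
    "company_name": "YourCo",
    "logo_url": "https://yourco.com/logo.svg",
    "custom_domain": "support.yourco.com",
    "hero_text": "How can we help you today?",
    "quick_topics": ["Getting Started", "Billing", "API Errors"],
    "system_prompt": "You are YourCo's support assistant. Answer only "
                     "from the provided documentation.",
    "footer_text": "Powered by YourCo documentation",
    "download_filename_prefix": "yourco-support",
}
print(branding["custom_domain"])
```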
3

Choose Your AI Model

Pick the LLM that fits your budget and quality needs. All models use the same tiered retrieval pipeline — only the synthesis engine changes.

⚡ Grok-Mini

xAI — Default

Fastest and cheapest. Great for most documentation use cases.

~$0.01/query

🧠 Claude

Anthropic

Best reasoning and nuance. Ideal for complex technical domains.

~$0.03–$0.05/query

🤖 GPT-4o

OpenAI

Most popular. Strong all-around performer with large context window.

~$0.03–$0.05/query
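The "same pipeline, swappable synthesis engine" idea can be sketched as follows. The model names and rough per-query costs mirror the tiers above; the dispatch mechanism and function names are assumptions for illustration.

```python
# Sketch: retrieval is identical for every tier; only the LLM used
# for synthesis changes. Stub functions stand in for the live system.
MODELS = {
    "grok-mini": {"vendor": "xAI", "est_cost_usd": 0.01},
    "claude": {"vendor": "Anthropic", "est_cost_usd": 0.04},
    "gpt-4o": {"vendor": "OpenAI", "est_cost_usd": 0.04},
}

def answer(query, retrieve, synthesize, model="grok-mini"):
    """Run the shared tiered retrieval, then hand off to the chosen LLM."""
    chunks = retrieve(query)                 # tiered FAISS search (shared)
    return synthesize(model, query, chunks)  # synthesis engine (per tier)

# Stub engines so the flow runs without calling a live API.
fake_retrieve = lambda q: ["relevant doc chunk"]
fake_synthesize = lambda m, q, c: f"[{m}] answer built from {len(c)} chunk(s)"

print(answer("How do I reset my password?", fake_retrieve, fake_synthesize))
```

Switching tiers means changing only the `model` argument; nothing about retrieval or the knowledge base changes.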

4

Go Live

Your AI support agent is deployed to a dedicated container with auto-SSL, user authentication, admin analytics, and feedback capture. Share the link with your customers and start reducing support tickets.

$ curl https://support.yourcompany.com/health

{"status": "ok", "tenant": "yourcompany", "model": "grok-3-mini-latest", "kbs_loaded": 1}

🔒
Auto-SSL
👥
User Auth
📊
Analytics
💬
Feedback

Under the Hood

┌──────────────────────────────────────────────────────────┐
│ Customer visits: support.yourcompany.com                 │
└─────────────────────┬────────────────────────────────────┘
                      │
              ┌───────▼────────┐
              │  Caddy Proxy   │ ← Auto-SSL, HTTP/3
              │   (shared)     │
              └───────┬────────┘
                      │
         ┌────────────▼────────────────┐
         │      Docker Container       │
         │  ┌──────────────────────┐   │
         │  │  FastAPI + Uvicorn   │   │ ← Your branded agent
         │  │  ┌────────────────┐  │   │
         │  │  │ Tiered Search  │  │   │ ← FAISS vector retrieval
         │  │  │ Answer Builder │  │   │ ← LLM synthesis
         │  │  │ User Auth      │  │   │ ← SQLite per-tenant
         │  │  │ Analytics      │  │   │ ← Usage tracking
         │  │  │ Feedback       │  │   │ ← Screenshot capture
         │  │  └────────────────┘  │   │
         │  └──────────────────────┘   │
         │                             │
         │  📂 /data/kb/       (RO)    │ ← Your documentation
         │  📂 /data/vectors/  (RO)    │ ← FAISS indexes
         │  📂 /data/users.db  (RW)    │ ← User database
         │  📂 /data/feedback/ (RW)    │ ← Feedback logs
         └─────────────┬───────────────┘
                       │
         ┌─────────────▼───────────────┐
         │   LLM API (your choice)     │
         │   Grok / Claude / GPT-4o    │
         └─────────────────────────────┘

Ready to get started?

Free 30-day pilot. We'll help you load your knowledge base.