Privacy-First Command Palette

Search Everything.
Locally.

One command palette to search your entire knowledge stack in parallel. Notes, documents, wikis, the web, and your local LLM — all from a single hotkey. Zero cloud dependency.

7 Sources
3 Platforms
0 Cloud


Everything From One Palette

Not just search. Prism is a full command interface for your local infrastructure.

Omnilocal Search

Query 7 sources in parallel with a single keystroke. Qdrant vector cache returns semantic matches instantly while live sources stream results.
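
The fan-out described above can be sketched in a few lines of TypeScript. This is a hypothetical illustration, not Prism's actual code: each source is an async adapter, all are queried concurrently, and a slow or offline source fails in isolation instead of blocking the rest.

```typescript
// Hypothetical sketch of parallel multi-source search. Each source
// adapter is an async function; Promise.allSettled isolates failures.
type SearchResult = { source: string; title: string };
type SourceAdapter = (query: string) => Promise<SearchResult[]>;

async function searchAll(
  query: string,
  sources: Record<string, SourceAdapter>,
): Promise<SearchResult[]> {
  const settled = await Promise.allSettled(
    Object.values(sources).map((search) => search(query)),
  );
  // A rejected source contributes nothing; the others still stream back.
  return settled.flatMap((r) => (r.status === "fulfilled" ? r.value : []));
}
```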

Command Palette

Type > to trigger n8n workflows with parameter dialogs. Run automations without leaving the keyboard.

Local LLM Chat

Streaming conversations with your local LLM. Conversation branching lets you explore multiple lines of thought from any message.
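
One way to model branching (a sketch, not Prism's internal schema): messages form a tree via a `parentId` link, and a branch is just the path from any message back to the root.

```typescript
// Hypothetical data model for conversation branching: a tree of
// messages, where forking from any message creates a sibling subtree.
interface Message {
  id: string;
  parentId: string | null;
  role: "user" | "assistant";
  text: string;
}

// Walk parent links upward to reconstruct the branch ending at `leafId`.
function branchTo(messages: Map<string, Message>, leafId: string): Message[] {
  const path: Message[] = [];
  for (let id: string | null = leafId; id !== null; ) {
    const msg = messages.get(id);
    if (!msg) break;
    path.unshift(msg);
    id = msg.parentId;
  }
  return path;
}
```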

Screenshot OCR

Capture any region of your screen, extract text with OCR, and feed it directly to your local AI for analysis or summarization.
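
The capture-to-AI flow is a three-stage pipeline. The sketch below is hypothetical and dependency-injected, so each stage (screen capture, OCR engine, local model) is just an async function; the prompt wording is an assumption.

```typescript
// Hypothetical capture → OCR → LLM pipeline. Each stage is injected,
// so the flow runs without a real screen, OCR engine, or model.
type Region = { x: number; y: number; w: number; h: number };

interface OcrPipeline {
  capture: (region: Region) => Promise<Uint8Array>; // screenshot bytes
  ocr: (image: Uint8Array) => Promise<string>;      // extracted text
  llm: (prompt: string) => Promise<string>;         // local model call
}

async function analyzeRegion(p: OcrPipeline, region: Region): Promise<string> {
  const image = await p.capture(region);
  const text = await p.ocr(image);
  return p.llm(`Summarize the following screen text:\n\n${text}`);
}
```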

Smart Prompts

Type / to browse categorized prompt templates. Coding, writing, analysis — all a keystroke away.
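
The two prefixes above, plus plain search, suggest a simple input router. A minimal sketch (the action names are hypothetical, only the `>` and `/` prefixes come from the feature list):

```typescript
// Hypothetical palette input router: ">" runs a workflow, "/" opens a
// prompt template, anything else is a plain search.
type PaletteAction =
  | { kind: "workflow"; name: string }
  | { kind: "prompt"; name: string }
  | { kind: "search"; query: string };

function routeInput(input: string): PaletteAction {
  if (input.startsWith(">")) return { kind: "workflow", name: input.slice(1).trim() };
  if (input.startsWith("/")) return { kind: "prompt", name: input.slice(1).trim() };
  return { kind: "search", query: input.trim() };
}
```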

Global Hotkey

Summon Prism from anywhere with Ctrl+Alt+Space (Windows/Linux) or Cmd+Option+Space (macOS). Instant access, always.

Search History & Pinning

Every search is logged in local SQLite. Pin important results for quick access. Your search history is yours alone — never uploaded.
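
The history-plus-pinning behavior can be sketched as follows. This is an in-memory stand-in, assuming the SQLite table holds roughly these fields; the pinned-first, then most-recent ordering is an assumption about how a palette would rank entries.

```typescript
// Hypothetical shape of the local history store. In Prism this would
// be backed by SQLite; sketched here in memory for illustration.
interface HistoryEntry {
  query: string;
  at: number;      // unix ms timestamp
  pinned: boolean;
}

class SearchHistory {
  private entries: HistoryEntry[] = [];

  log(query: string): void {
    this.entries.push({ query, at: Date.now(), pinned: false });
  }

  togglePin(query: string): void {
    for (const e of this.entries) {
      if (e.query === query) e.pinned = !e.pinned;
    }
  }

  // Pinned entries first, then most recent.
  list(): HistoryEntry[] {
    return [...this.entries].sort(
      (a, b) => Number(b.pinned) - Number(a.pinned) || b.at - a.at,
    );
  }
}
```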

Webhook Automation

Fire webhooks on search, chat, or OCR events. Connect Prism to n8n, Zapier, or any HTTP endpoint to trigger downstream workflows.
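
A webhook event like those above is a single JSON POST. The payload shape below is an assumption for illustration; only the three event names come from the feature description, and no authentication is assumed.

```typescript
// Hypothetical webhook payload builder; event names mirror the triggers
// listed above (search, chat, OCR).
type PrismEvent = "search" | "chat" | "ocr";

interface WebhookPayload {
  event: PrismEvent;
  timestamp: string; // ISO-8601
  data: Record<string, unknown>;
}

function buildPayload(event: PrismEvent, data: Record<string, unknown>): WebhookPayload {
  return { event, timestamp: new Date().toISOString(), data };
}

// Dispatch is one plain HTTP POST to the configured endpoint.
async function fireWebhook(url: string, payload: WebhookPayload): Promise<void> {
  await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
}
```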


Your Searches. Your Data. Your Hardware.

Prism was built with a single conviction: your knowledge should never leave your network. Every search, every chat, every OCR capture stays on your local disk. No telemetry. No analytics. No cloud calls.

  • No telemetry or usage analytics — ever
  • No cloud API calls — all AI runs on your hardware
  • No tracking pixels, no fingerprinting, no cookies
  • Works fully offline — airplane mode ready
  • All data stored in local SQLite — portable and inspectable
  • Open source Tauri 2 + Rust backend — audit the code yourself
100% Local · Zero cloud dependency

  • Search queries: Local
  • LLM inference: Local
  • Cloud uploads: None
  • Telemetry: None

Connects to Your Self-Hosted Stack

Prism speaks to the services you already run. Every integration points at your own infrastructure — no third-party SaaS required.

Qdrant

High-performance vector database for semantic search. Caches embeddings from all sources for instant recall of previous results.

Self-Hosted
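
The semantic-cache idea behind this integration can be shown in miniature: compare a query embedding against cached embeddings by cosine similarity and return a hit above a threshold. This is an illustrative sketch only; Qdrant performs this comparison server-side at scale, and the 0.9 threshold is an arbitrary assumption.

```typescript
// Hypothetical in-memory version of a semantic cache lookup.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

interface CachedResult { embedding: number[]; payload: string }

function semanticLookup(
  query: number[],
  cache: CachedResult[],
  threshold = 0.9, // arbitrary cutoff for "close enough to reuse"
): string | null {
  let best: CachedResult | null = null;
  let bestScore = threshold;
  for (const item of cache) {
    const score = cosine(query, item.embedding);
    if (score >= bestScore) { best = item; bestScore = score; }
  }
  return best ? best.payload : null;
}
```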

Paperless-ngx

Document management with OCR. Search scanned receipts, invoices, contracts, and any PDF in your archive.

Self-Hosted

BookStack

Self-hosted wiki platform. Search across shelves, books, chapters, and pages of your team documentation.

Self-Hosted

Obsidian

Markdown-based knowledge base. Search notes, daily journals, and follow backlinks through your personal knowledge graph.

Local Vault

SearXNG / Brave

Privacy-respecting web search. Get real web results without Google tracking. Self-host SearXNG or use the Brave Search API.

Self-Hosted / API

LM Studio

Run open-weight LLMs locally. Streaming chat, completions, and embeddings on your own GPU. Llama, Mistral, Phi, and more.

Local

AnythingLLM

RAG-powered knowledge base. Upload documents and get AI answers grounded in your actual files. Fully self-hosted.

Self-Hosted

n8n Workflows

Trigger any n8n workflow from the command palette. Pass parameters, receive results, automate your entire homelab from one hotkey.

Self-Hosted

Available on Every Desktop

Built on Tauri 2 for native performance. Lightweight, fast, and cross-platform.

macOS

DMG · Apple Silicon + Intel
Download for macOS

Windows

MSI · x64
Download for Windows

Linux

AppImage · DEB · x64
Download for Linux

30-day free trial. No credit card required. No account needed.

Tauri 2 Rust React 19 TypeScript SQLite Qdrant Tailwind CSS

Take Back Your Search.

Stop sending your queries to the cloud. Search your entire local knowledge stack from one command palette. Privacy by design.