Perplexica is a privacy-focused AI answer engine, similar to Perplexity, that you can self-host on your own hardware for private, source-cited web research. It combines live internet search results with AI models, letting you run local LLMs via Ollama or connect to providers such as OpenAI, Claude, Gemini, and Groq. Powered by SearxNG, it aggregates results from multiple search engines while keeping your identity and queries private. Perplexica offers multiple search modes (Speed, Balanced, and Quality) so you can trade off latency against depth depending on the task, and it rounds out the experience with widgets and rich search types such as image, video, and domain-limited queries. With local search history, file uploads, and a straightforward Docker-based setup, it's built for everyday research without sacrificing control or privacy.
## Features
- Supports local and cloud LLMs (Ollama, OpenAI, Claude, Gemini, Groq, and more) with mix-and-match model selection.
- Multiple search modes (Speed, Balanced, Quality) for quick answers or deeper research.
- Privacy-preserving web search via SearxNG with optional slim mode for using your own SearxNG instance.
- Source selection across web, discussions, and academic content, plus domain-restricted searching.
- Built-in widgets for quick lookups like weather, calculations, and stock prices when relevant.
- File uploads (PDFs, text, images) with locally saved search history to revisit results anytime.
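The Docker-based setup described above pairs the app with a SearxNG container so queries never leave your host until SearxNG fans them out to upstream engines. The compose file below is a minimal illustrative sketch of that architecture, not Perplexica's published configuration: the `perplexica/app` image name, the ports, the `SEARXNG_API_URL` variable, and the volume path are all assumptions; consult the repository's own compose file for the real values.

```yaml
# Hypothetical docker-compose sketch of a Perplexica-style stack:
# one SearxNG container for private metasearch, one app container.
# Image names, ports, env vars, and paths are placeholders (assumptions).
services:
  searxng:
    image: searxng/searxng:latest        # public SearxNG image
    ports:
      - "4000:8080"                      # expose SearxNG on host port 4000 (assumed)
    volumes:
      - ./searxng:/etc/searxng           # persist SearxNG settings (assumed path)

  perplexica:
    image: perplexica/app:latest         # placeholder image name (assumption)
    environment:
      # the app reaches SearxNG over the internal compose network (assumed variable)
      - SEARXNG_API_URL=http://searxng:8080
    ports:
      - "3000:3000"                      # web UI (assumed port)
    depends_on:
      - searxng
```

With a file along these lines, `docker compose up -d` would bring up both services on a shared network; running your own SearxNG instance instead (as the slim mode allows) would simply mean pointing the app's SearxNG URL at it and dropping the bundled container.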