Ollama Copilot
Proxy that allows you to use Ollama as a code-completion assistant, in the style of GitHub Copilot
...It acts as an intermediary server that exposes Ollama (or other model providers) through a Copilot-compatible interface, letting developers use local or self-hosted models for inline code completion. The project supports multiple providers, including Ollama, DeepSeek, and Mistral, so users can choose between local and remote inference depending on their needs. It integrates with editors such as Neovim, VS Code, Zed, and Emacs by redirecting Copilot traffic through a configurable proxy layer. Parameters such as context size, token-prediction limits, and prompt templates can be customized, giving developers granular control over how completions are generated. ...