# mcp-client-for-ollama
A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop confirmations, thinking mode, model parameter configuration, MCP prompts, custom system prompts, and saved preferences. Built for developers working with local LLMs.
<p align="center"> <i>A simple yet powerful Python client for interacting with Model Context Protocol (MCP) servers using Ollama, allowing local LLMs to use tools.</i> </p>
---
[Python](https://www.python.org/downloads/) · [PyPI: ollmcp](https://pypi.org/project/ollmcp/) · [PyPI: mcp-client-for-ollama](https://pypi.org/project/mcp-client-for-ollama/) · [CI](https://github.com/jonigl/mcp-client-for-ollama/actions/workflows/ci.yml)
<p align="center"> <a href="https://asciinema.org/a/jxc6N8oKZAWrzH8aK867zhXdO" target="_blank">🎥 Watch this demo as an Asciinema recording</a> </p>
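The links above point to the project's PyPI packages; a minimal install-and-run sketch, assuming the `ollmcp` package exposes a CLI entry point of the same name (as its PyPI listing suggests) and that a local Ollama instance is already running:

```shell
# Install the TUI client from PyPI
pip install ollmcp

# Launch the interactive client; by default it talks to a local Ollama server
ollmcp
```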