delorenj/just-prompt
just-prompt is a Model Context Protocol (MCP) server that provides a unified interface to various Large Language Model (LLM) providers, including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. It's like having a universal remote for all your AI models - pretty neat, right?
Based on the fantastic original work by [@disler](https://github.com/disler) - huge thanks for creating such an awesome tool! 🙏
The following MCP tools are available in the server:
- **prompt**: Send a prompt to one or more models
  - `text`: The prompt text
  - `models_prefixed_by_provider` (optional): List of models with provider prefixes. If not provided, uses default models.
- **prompt_from_file**: Send a prompt read from a file
  - `file`: Path to the file containing the prompt
  - `models_prefixed_by_provider` (optional): List of models with provider prefixes. If not provided, uses default models.
- **prompt_from_file_to_file**: Send a prompt read from a file and save each response to a file
  - `file`: Path to the file containing the prompt
  - `models_prefixed_by_provider` (optional): List of models with provider prefixes. If not provided, uses default models.
  - `output_dir` (default: `.`): Directory to save the response markdown files to
- **ceo_and_board**: Send a prompt to a "board" of models, then have a "CEO" model synthesize their responses
  - `file`: Path to the file containing the prompt
  - `models_prefixed_by_provider` (optional): List of models with provider prefixes to act as board members. If not provided, uses default models.
  - `output_dir` (default: `.`): Directory to save the response markdown files to
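As a minimal sketch of how these tools are addressed (the exact call shape depends on your MCP client; the model IDs below are illustrative assumptions, not defaults shipped with the server), a `prompt` call carries arguments like these, with each model given a `provider:model` prefix so the server knows where to route it:

```python
# Sketch of the arguments an MCP client would pass to the `prompt` tool.
# Model names follow the "provider:model" convention described above;
# the specific model IDs are assumptions for illustration only.
prompt_args = {
    "text": "Summarize the main trade-offs between REST and gRPC.",
    "models_prefixed_by_provider": [
        "openai:gpt-4o",               # routed to OpenAI
        "anthropic:claude-3-5-haiku",  # routed to Anthropic
        "ollama:llama3.1",             # routed to a local Ollama instance
    ],
}

def split_provider(model_ref: str) -> tuple[str, str]:
    """Split a 'provider:model' reference into its two parts."""
    provider, _, model = model_ref.partition(":")
    return provider, model

providers = [split_provider(m)[0] for m in prompt_args["models_prefixed_by_provider"]]
print(providers)  # ['openai', 'anthropic', 'ollama']
```

Omitting `models_prefixed_by_provider` falls back to the server's configured default models, so most calls only need the prompt itself.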