# Lykhoyda/ask-llm

MCP server for AI-to-AI collaboration: bridge Claude with Gemini, Codex, and other LLMs for code review, second opinions, and plan debates.
Platform-specific configuration:

```json
{
  "mcpServers": {
    "ask-llm": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
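The same server entry works in other MCP clients that read the standard `mcpServers` format. A minimal sketch for Cursor (the `.cursor/mcp.json` location is an assumption based on Cursor's MCP documentation; the entry shape is the one shown above):

```json
{
  "mcpServers": {
    "ask-llm": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}
```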
<div align="center">
MCP server that connects any AI client to Google Gemini CLI
</div>
An MCP server for AI-to-AI collaboration via the Gemini CLI. Available on npm: `ask-gemini-mcp`. Works with Claude Code, Claude Desktop, Cursor, Warp, Copilot, and 40+ other MCP clients. Leverage Gemini's massive 1M+ token context window for large file and codebase analysis while your primary AI handles interaction and code editing.
Listed on [Glama](https://glama.ai/mcp/servers/@Lykhoyda/ask-llm).
Install with the Claude Code CLI:

```shell
# Project scope (available in the current project only)
claude mcp add gemini-cli -- npx -y ask-gemini-mcp

# User scope (available across all projects)
claude mcp add --scope user gemini-cli -- npx -y ask-gemini-mcp
```

Add to your config file (`~/Library/Application Support/Claude/cl
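Whichever config file your client reads, the entry follows the same `mcpServers` shape; a minimal sketch for Claude Desktop (assuming it accepts the standard format, with the server named `gemini-cli` to match the CLI commands above):

```json
{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}
```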