maximilianromer/Onion-Search-MCP
Tools for LLMs to anonymously search and browse the web
Platform-specific configuration:
```json
{
  "mcpServers": {
    "Onion-Search-MCP": {
      "command": "npx",
      "args": [
        "-y",
        "Onion-Search-MCP"
      ]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
Absolutely anonymous knowledge retrieval for your LLM: the world's first Model Context Protocol (MCP) server that exposes tools to search the web and fetch pages anonymously through Tor. Search results come from DuckDuckGo, and page content is retrieved through an actual Tor Browser instance, preserving the universal fingerprint that makes Tor users indistinguishable from one another.
Millions of people use local LLMs through apps like LM Studio or Ollama. Running models locally offers an extremely private and low-cost way to access information, explore ideas, and automate computation.
https://github.com/user-attachments/assets/1e07d780-8f9e-474e-a7f0-7581c4d64fcb
Search tools are becoming ubiquitous in LLM chat interfaces—ChatGPT, Gemini, and others use them automatically in the background for most queries. Meanwhile, the web search integration tools commonly recommended for local LLMs route requests through off-the-shelf installations of Google Chrome, Firefox, or Brave, which (despite their marketing) leave distinctive browser fingerprints that can be used to track, surveil, rate-limit, or geo-restrict you.
This tool takes a different approach: every request flows through the Tor network and the Tor Browser, which anonymizes your routing and makes your traffic indistinguishable from that of millions of other Tor users.
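For a sense of what application-level Tor routing looks like (this is an illustrative sketch, not how this server works internally—the server drives a full Tor Browser instance, which is what preserves the shared fingerprint), traffic can be sent through Tor's local SOCKS5 proxy. The port, the `requests` library, and the check URL below are all assumptions for the example:

```python
# Minimal sketch: route an HTTP request through Tor's local SOCKS5 proxy.
# Assumptions: a Tor daemon listening on 127.0.0.1:9050 (the default),
# plus the third-party `requests` and PySocks packages installed.
TOR_PROXY = "socks5h://127.0.0.1:9050"  # "socks5h" resolves DNS inside Tor too

def tor_proxies(proxy_url: str = TOR_PROXY) -> dict:
    """Build a proxies mapping suitable for requests.get(..., proxies=...)."""
    return {"http": proxy_url, "https": proxy_url}

if __name__ == "__main__":
    import requests  # imported lazily so the helper itself is stdlib-only
    # check.torproject.org reports whether the request actually arrived via Tor
    resp = requests.get("https://check.torproject.org/api/ip",
                        proxies=tor_proxies(), timeout=30)
    print(resp.json())
```

Note that proxying alone only hides your IP; your browser fingerprint still leaks through, which is why this project goes through Tor Browser itself rather than a bare SOCKS proxy.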
> ⚡ Install quickly by giving this prompt to Claude Code, Codex, or any other coding agent of your choice.
Download Python 3.11+. Check whether you already have it installed by running `python3 --version` in a terminal.
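The version check can also be scripted so it fails loudly when the interpreter is too old (a small sketch, assuming `python3` is on your `PATH`):

```shell
# Check that the installed Python meets the 3.11+ requirement.
if python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 11) else 1)'; then
  echo "Python OK: $(python3 --version)"
else
  echo "Python 3.11+ required, found: $(python3 --version)"
fi
```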