antl3x/ToolRAG
Unlimited LLM tools, zero context penalties: ToolRAG serves exactly the LLM tools your user query demands.
Platform-specific configuration:

```json
{
  "mcpServers": {
    "ToolRAG": {
      "command": "npx",
      "args": [
        "-y",
        "ToolRAG"
      ]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
<div align="center"><strong>ToolRAG</strong></div> <div align="center">Unlimited LLM tools, zero context constraints<br />Context-aware tool retrieval for large language models.</div>
ToolRAG provides a seamless solution for using an unlimited number of function definitions with Large Language Models (LLMs), without worrying about context window limitations, costs, or performance degradation.
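The core idea behind this kind of context-aware retrieval can be sketched in a few lines: score each tool description against the user query and hand the model only the top-k matches, so the prompt carries a handful of relevant tools instead of the full catalog. The sketch below is illustrative, not ToolRAG's actual implementation — the tool names, the `retrieveTools` helper, and the token-overlap scoring are all assumptions for demonstration (a real system would use embeddings).

```typescript
// Minimal sketch of RAG-over-tools: rank a tool catalog against the user
// query and keep only the best matches. The catalog and scoring function
// here are hypothetical stand-ins, not ToolRAG's API.

type Tool = { name: string; description: string };

const catalog: Tool[] = [
  { name: "get_weather", description: "Get the current weather forecast for a city" },
  { name: "send_email", description: "Send an email message to a recipient" },
  { name: "create_invoice", description: "Create a billing invoice for a customer" },
];

// Lowercase word tokens, used for a crude relevance score.
function tokens(s: string): Set<string> {
  return new Set(s.toLowerCase().match(/[a-z]+/g) ?? []);
}

// Score = number of query tokens that also appear in the tool description.
function score(query: string, tool: Tool): number {
  const q = tokens(query);
  const d = tokens(tool.description);
  let overlap = 0;
  for (const t of q) if (d.has(t)) overlap++;
  return overlap;
}

// Return the k highest-scoring tools for this query.
function retrieveTools(query: string, k: number): Tool[] {
  return [...catalog]
    .map((tool) => ({ tool, s: score(query, tool) }))
    .sort((a, b) => b.s - a.s)
    .slice(0, k)
    .map((x) => x.tool);
}

console.log(retrieveTools("what is the weather in Paris", 1).map((t) => t.name));
```

Only the selected definitions are then passed to the model's tools parameter, which is what keeps context usage flat no matter how large the catalog grows.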
```shell
npm install @antl3x/toolrag
# or
yarn add @antl3x/toolrag
# or
pnpm add @antl3x/toolrag
```

```typescript
import { ToolRAG } from "@antl3x/toolrag";
import OpenAI from "openai";

// Initialize ToolRAG with MCP servers
const toolRag = await ToolRAG.init({
  mcpServers: {
    // ...
  },
});
```