cyberchitta/llm-context.py
Share code with LLMs via Model Context Protocol or clipboard. Rule-based customization enables easy switching between different tasks (like code review and documentation). Includes smart code outlining.
Platform-specific configuration:

```json
{
  "mcpServers": {
    "llm-context.py": {
      "command": "npx",
      "args": [
        "-y",
        "llm-context.py"
      ]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
[Apache-2.0 License](https://opensource.org/licenses/Apache-2.0) · [PyPI](https://pypi.org/project/llm-context/) · [Downloads](https://pepy.tech/project/llm-context)
Smart context management for LLM development workflows. Share relevant project files instantly through intelligent selection and rule-based filtering.
Getting the right context into LLM conversations is friction-heavy.
llm-context provides focused, task-specific project context through composable rules.
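Because rules are named and composable, switching tasks amounts to passing a different rule name to the same commands. As an illustrative sketch (the rule names `prm-review` and `prm-docs` below are hypothetical; only the commands themselves appear in this project's usage):

```shell
# Code-review task: context scoped by a review-oriented rule
lc-context prm-review

# Documentation task: same project, different rule, different file selection
lc-context prm-docs
```

Each invocation copies a context tailored to that task, so no manual re-picking of files is needed between tasks.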
For humans using chat interfaces:

```shell
lc-select    # Smart file selection
lc-context   # Copy formatted context to clipboard
# Paste and work - the AI can access additional files via MCP
```

For AI agents with CLI access:
```shell
lc-preview tmp-prm-auth   # Validate that the rule selects the right files
lc-context tmp-prm-auth   # Get focused context for a sub-agent
```

For AI agents in chat (MCP tools):
- `lc_outlines` - Generate excerpted context from the current rule
- `lc_preview` - Validate rule effectiveness before use
- `lc_missing` - Fetch specific files/implementations on demand

> Note: This project was developed in collaboration with several Claude Sonnets (3.5, 3.6, 3.7, 4.0) and Groks (3, 4), using LLM Context itself to share code during development. All code is heavily human-curated by @restlessronin.
```shell
uv tool install "llm-context>=0.6.0"

# One-time setup
cd your-project
lc-init

# Daily usage
lc-select
lc-context
# Paste into your LLM chat
```