PleasePrompto/notebooklm-mcp
MCP server for NotebookLM - Let your AI agents (Claude Code, Codex) research documentation directly with grounded, citation-backed answers from Gemini. Persistent auth, library management, cross-client sharing. Zero hallucinations, just your knowledge base.
Platform-specific configuration:

```json
{
  "mcpServers": {
    "notebooklm-mcp": {
      "command": "npx",
      "args": ["-y", "notebooklm-mcp"]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
<div align="center">
Let your CLI agents (Claude, Cursor, Codex...) chat directly with NotebookLM for zero-hallucination answers based on your own notebooks
Installation • Quick Start • Why NotebookLM • Examples • Claude Code Skill • Documentation
</div>
---
When you tell Claude Code or Cursor to "search through my local documentation", here's what happens:

```
Your Task → Local Agent asks NotebookLM → Gemini synthesizes answer → Agent writes correct code
```

Let your local agents chat directly with **NotebookLM**, Google's zero-hallucination knowledge base powered by Gemini 2.5, which provides intelligent, synthesized answers from your docs.

The real advantage: no more manual copy-paste between NotebookLM and your editor. Your agent asks NotebookLM directly and gets the answers straight back in your editor.