libralm-mcp-server

Name: libralm-mcp-server
Version: 0.1.2
Home page: None
Summary: LibraLM MCP Server - A Model Context Protocol server for searching and retrieving book information
Upload time: 2025-08-03 08:54:14
Maintainer: None
Docs URL: None
Author: None
Requires Python: >=3.10
License: MIT
Keywords: books, libralm, library, mcp, search, server
Requirements: mcp, fastmcp, pydantic, requests
# LibraLM MCP Server

[![smithery badge](https://smithery.ai/badge/@libralm-ai/libralm_mcp_server)](https://smithery.ai/server/@libralm-ai/libralm_mcp_server)

Access 50+ book summaries and chapter breakdowns directly in Claude Desktop through the Model Context Protocol (MCP).

<img width="1000" alt="LibraLM Demo" src="https://github.com/user-attachments/assets/demo-placeholder">

## Overview

LibraLM MCP Server brings a library of AI-generated book summaries to your Claude Desktop conversations. Search for books, read comprehensive summaries, explore chapter-by-chapter breakdowns, and get instant access to key insights from business, self-help, and educational books.

## Features

- 📚 **500+ Book Summaries** - Access a growing library of professionally summarized books
- 🔍 **Smart Search** - Find books by title, author, or ISBN
- 📖 **Chapter Breakdowns** - Get detailed summaries of individual chapters
- 📋 **Table of Contents** - View complete book structure with chapter descriptions
- 🎯 **Key Insights** - Extract main themes, frameworks, and actionable takeaways
- 🔐 **Secure API** - Protected access with API key authentication

## Installation

### Installing via Smithery

To install libralm_mcp_server for Claude Desktop automatically via [Smithery](https://smithery.ai/server/@libralm-ai/libralm_mcp_server):

```bash
npx -y @smithery/cli install @libralm-ai/libralm_mcp_server --client claude
```

### Prerequisites

- Claude Desktop installed
- Python 3.10 or higher
- LibraLM API key (get one at [libralm.com](https://libralm.com))

### Quick Install

1. **Clone the repository**:
```bash
git clone https://github.com/libralm-ai/libralm_mcp_server.git
cd libralm_mcp_server
```

2. **Install dependencies**:
```bash
pip install -r requirements.txt
```

3. **Get your API key**:
   - Visit [libralm.com](https://libralm.com)
   - Sign in with Google or GitHub
   - Copy your API key from the dashboard

4. **Configure Claude Desktop**:

Add to your Claude Desktop configuration file:

**macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json`  
**Windows**: `%APPDATA%\Claude\claude_desktop_config.json`

```json
{
    "mcpServers": {
      "libralm": {
        "command": "uvx",
        "args": ["--from", "libralm-mcp-server", "libralm-mcp-server"],
        "env": {
            "LIBRALM_API_KEY": "your_api_key_here"
        }
      }
    }
  }
```

5. **Restart Claude Desktop**
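
To sanity-check the setup outside Claude Desktop, you can launch the server the same way the config above does and list its tools. The snippet below is a minimal sketch using the official `mcp` Python SDK; it assumes `uvx` is on your `PATH` and that `your_api_key_here` is replaced with your real key.

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server exactly as Claude Desktop would (see the config above).
server = StdioServerParameters(
    command="uvx",
    args=["--from", "libralm-mcp-server", "libralm-mcp-server"],
    env={**os.environ, "LIBRALM_API_KEY": "your_api_key_here"},
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

asyncio.run(main())
```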

## Available Tools

### 🔍 `search_books`
Search for books by title, author, or ISBN.
```
Search for "Atomic Habits"
Find books by James Clear
Look up ISBN 0735211299
```

### 📖 `get_book_info`
Get detailed information about a specific book.
```
Get details for book ID 0735211299
Show me information about this book
```

### 📝 `get_book_summary`
Get the comprehensive AI-generated summary of a book.
```
Summarize "Atomic Habits"
Give me the main points of this book
```

### 📋 `get_table_of_contents`
View the complete chapter list with descriptions.
```
Show me the chapters in "Atomic Habits"
What topics does this book cover?
```

### 📄 `get_chapter_summary`
Get a detailed summary of a specific chapter.
```
Summarize chapter 3 of "Atomic Habits"
What's in the first chapter?
```
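
These tools can also be driven from your own scripts. The example below is a hedged sketch that calls `search_books` over stdio with the `mcp` Python SDK; the argument name `query` is an assumption, so check the schemas returned by `list_tools` for the real parameter names.

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",
    args=["--from", "libralm-mcp-server", "libralm-mcp-server"],
    env={**os.environ},  # expects LIBRALM_API_KEY to already be exported
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # NOTE: "query" is an assumed parameter name; inspect the tool
            # schemas from session.list_tools() for the actual arguments.
            result = await session.call_tool("search_books", {"query": "Atomic Habits"})
            for item in result.content:
                print(getattr(item, "text", item))  # text results arrive as TextContent

asyncio.run(main())
```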

## Example Usage

Here are some example prompts you can use with Claude:

- "Search LibraLM for books about habits"
- "What books do you have on leadership?"
- "Give me a summary of 'Outlive'"
- "Show me chapter 5 of 'The Wealth Ladder'"
- "Find books by Adam Grant"

## Configuration

### Environment Variables

- `LIBRALM_API_KEY` (required): Your LibraLM API key
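
The package depends on `fastmcp`, and the server reads this variable from its environment at startup. The fragment below is only an illustrative sketch of that pattern (the tool body is a stub), not the package's actual source:

```python
import os

from fastmcp import FastMCP

mcp = FastMCP("LibraLM")

@mcp.tool()
def search_books(query: str) -> str:
    """Illustrative stub: a real tool would query the LibraLM API with the key."""
    api_key = os.environ.get("LIBRALM_API_KEY")
    if not api_key:
        raise RuntimeError("LIBRALM_API_KEY is not set")
    return f"(stub) would search LibraLM for: {query}"

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```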



## API Limits

- Free tier: 50 API calls per month
- Pro tier: Unlimited API calls
- Rate limiting: 10 requests per minute
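
If you script against the API in a loop, it helps to pace calls client-side so you stay under the 10-requests-per-minute limit. A minimal sketch (not part of the package); call `wait_for_slot()` before each request:

```python
import time
from collections import deque

MAX_CALLS = 10       # documented per-minute limit
WINDOW_SECONDS = 60

_recent: deque[float] = deque()

def wait_for_slot() -> None:
    """Block until another call can be made without exceeding the rate limit."""
    now = time.monotonic()
    # Drop call timestamps that have aged out of the 60-second window.
    while _recent and now - _recent[0] >= WINDOW_SECONDS:
        _recent.popleft()
    if len(_recent) >= MAX_CALLS:
        time.sleep(WINDOW_SECONDS - (now - _recent[0]))
        _recent.popleft()
    _recent.append(time.monotonic())
```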

## Troubleshooting

### "Invalid API key" error
- Verify your API key is correct in the configuration
- Check that you've copied the entire key including the prefix

### "Resource not found" error
- Ensure you're using a valid book ID
- The book may not be in the library yet

### No books showing up
- Check your internet connection
- Verify the API endpoint is accessible
- Ensure your API key has not exceeded its usage limit

## Contributing

We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for details.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Support

- 📧 Email: support@libralm.com
- 🐛 Issues: [GitHub Issues](https://github.com/libralm-ai/libralm_mcp_server/issues)
<!-- - 💬 Discord: [Join our community](https://discord.gg/libralm) -->

## Related Projects

- [LibraLM Web](https://libralm.com) - Web dashboard and API key management
<!-- - [LibraLM API Docs](https://docs.libralm.com) - Full API documentation -->
- [MCP Specification](https://modelcontextprotocol.io) - Learn more about MCP

---

Built with ❤️ by the LibraLM team

            
