| Field | Value |
| --- | --- |
| Name | serper-toolkit |
| Version | 0.0.8 |
| Summary | A high-performance, asynchronous MCP server for Serper Google Search, featuring connection pooling, request retries, and intelligent input parsing. |
| Upload time | 2025-10-10 16:31:01 |
| Author | None |
| Author email | "Joey.Kot" <joey.kot.x@gmail.com> |
| Maintainer | None |
| Home page | None |
| Docs URL | None |
| License | None |
| Requires Python | >=3.12 |
| Keywords | serper, mcp, server, google, search |
| Requirements | No requirements were recorded. |
| Download | https://files.pythonhosted.org/packages/c0/32/64e4b1f2df5a9b5a58d5c13365516dd46430fd1801fb6bf03c6ebded2995/serper_toolkit-0.0.8.tar.gz |
        
        
            
            # Serper MCP Toolkit
A high-performance, asynchronous MCP server that provides comprehensive Google search and web content scraping capabilities through the Serper API (a few rarely used endpoints are not covered).
This project is built on `httpx`, utilizing asynchronous clients and connection pool management to offer LLMs a stable and efficient external information retrieval tool.
## Key Features
- **Asynchronous Architecture**: Fully based on `asyncio` and `httpx`, ensuring high throughput and non-blocking I/O operations.
- **HTTP Connection Pool**: Manages and reuses TCP connections through a global `httpx.AsyncClient` instance, significantly improving performance under high concurrency.
- **Concurrency Control**: Built-in global and per-API endpoint concurrency semaphores effectively manage API request rates to prevent exceeding rate limits.
- **Automatic Retry Mechanism**: Integrated request retry functionality with exponential backoff strategy automatically handles temporary network fluctuations or server errors, enhancing service stability.
- **Intelligent Country Code Parsing**: Includes a comprehensive country name dictionary supporting inputs in Chinese, English, ISO Alpha-2/3, and other formats, with automatic normalization.
- **Flexible Environment Variable Configuration**: Supports fine-tuned service configuration via environment variables.
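Taken together, these features amount to a shared `httpx.AsyncClient` with a connection pool, a concurrency semaphore, and exponential-backoff retries around each Serper request. The following is a minimal sketch of that pattern, not the package's actual code: the helper name `post_with_retry`, the defaults, and the error handling are illustrative, and the `/search` payload fields follow Serper's public API.

```python
import asyncio
import os

import httpx

# Shared client: one connection pool reused by every request (limits are illustrative).
client = httpx.AsyncClient(
    base_url="https://google.serper.dev",
    headers={"X-API-KEY": os.environ.get("SERPER_API_KEY", "")},
    limits=httpx.Limits(max_connections=200, max_keepalive_connections=20),
    timeout=30.0,
)

# Global concurrency cap, analogous to SERPER_MAX_CONCURRENT_REQUESTS.
global_semaphore = asyncio.Semaphore(200)


async def post_with_retry(path: str, payload: dict, retries: int = 3, base_delay: float = 0.5) -> dict:
    """POST with exponential backoff; hypothetical helper, not the package's API."""
    async with global_semaphore:
        for attempt in range(retries + 1):
            try:
                response = await client.post(path, json=payload)
                response.raise_for_status()
                return response.json()
            except (httpx.TransportError, httpx.HTTPStatusError):
                if attempt == retries:
                    raise
                # Exponential backoff: 0.5s, 1s, 2s, ...
                await asyncio.sleep(base_delay * (2 ** attempt))


async def main() -> None:
    result = await post_with_retry("/search", {"q": "httpx connection pooling", "gl": "us", "num": 5})
    print(result.get("organic", [])[:1])


if __name__ == "__main__":
    asyncio.run(main())
```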
## Available Tools
This service provides the following tools:
| Tool Name                | Description                                  |
| ------------------------ | -------------------------------------------- |
| `serper-general-search`  | Performs general Google web searches.        |
| `serper-image-search`    | Performs Google image searches.               |
| `serper-video-search`    | Performs Google video searches.               |
| `serper-place-search`    | Performs Google place searches.               |
| `serper-maps_search`     | Performs Google Maps searches.                |
| `serper-news-search`     | Performs Google news searches.                |
| `serper-lens-search`     | Performs Google Lens reverse image searches via image URL. |
| `serper-scholar-search`  | Performs Google Scholar searches.             |
| `serper-shopping-search` | Performs Google Shopping searches.            |
| `serper-patents-search`  | Performs Google Patents searches.             |
| `serper-scrape`          | Scrapes and returns the content of a specified URL. |
## Installation Guide
It is recommended to install using `pip` or `uv`.
```bash
# Using pip
pip install serper-toolkit
# Or using uv
uv pip install serper-toolkit
```
## Quick Start
### Set Environment Variables
Create a `.env` file in the project root directory and enter your Serper API key:
```bash
SERPER_API_KEY="your-serper-api-key-here"
```
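The server is expected to read `SERPER_API_KEY` from the environment. If you want to load the `.env` file yourself, for example in a test script, a minimal sketch using `python-dotenv` might look like this; whether the package itself uses `python-dotenv` is an assumption here:

```python
import os

from dotenv import load_dotenv  # assumes `pip install python-dotenv`

load_dotenv()  # reads .env from the current working directory into os.environ
api_key = os.environ["SERPER_API_KEY"]
print(f"Loaded a Serper key ending in ...{api_key[-4:]}")
```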
### Configure MCP Client
Add one of the following server configurations to your MCP client's configuration file. The first entry runs the installed package with `python3`; the second launches it via `uvx`:
```json
{
  "mcpServers": {
    "serper": {
      "command": "python3",
      "args": ["-m", "serper-toolkit"],
      "env": {
        "SERPER_API_KEY": "<Your Serper API key>"
      }
    }
  }
}
```
```json
{
  "mcpServers": {
    "serper-toolkit": {
      "command": "uvx",
      "args": ["serper-toolkit"],
      "env": {
        "SERPER_API_KEY": "<Your Serper API key>"
      }
    }
  }
}
```
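For a quick smoke test outside an MCP host, a stdio client can launch the server and call one of the tools listed above. This sketch assumes the official `mcp` Python SDK (`pip install mcp`) and reuses the `uvx` launch command from the second configuration; it is illustrative and not part of this package:

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server over stdio, passing the API key through the environment.
    server = StdioServerParameters(
        command="uvx",
        args=["serper-toolkit"],
        env={"SERPER_API_KEY": os.environ["SERPER_API_KEY"]},
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            result = await session.call_tool(
                "serper-general-search",
                arguments={"search_key_words": "model context protocol", "search_num": 3},
            )
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```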
### Environment Variables
- `SERPER_MAX_CONNECTIONS`: Maximum number of HTTP client connections (default: 200).
- `SERPER_KEEPALIVE`: Maximum number of keep-alive HTTP client connections (default: 20).
- `SERPER_HTTP2`: Enable HTTP/2 (default: "0", set to "1" to enable).
- `SERPER_MAX_CONCURRENT_REQUESTS`: Global maximum concurrent requests (default: 200).
- `SERPER_RETRY_COUNT`: Maximum retry attempts for failed requests (default: 3).
- `SERPER_RETRY_BASE_DELAY`: Base delay time for retries in seconds (default: 0.5).
- `SERPER_ENDPOINT_CONCURRENCY`: Set concurrency per endpoint (JSON format), e.g. `{"search": 10, "scrape": 2}`.
- `SERPER_ENDPOINT_RETRYABLE`: Set retry allowance per endpoint (JSON format), e.g. `{"scrape": false}`.
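The last two variables map endpoint names to a concurrency limit and a retry flag. A minimal sketch of how such JSON values could be parsed into per-endpoint semaphores (the helper name and defaults are assumptions, not the package's internals):

```python
import asyncio
import json
import os


def load_endpoint_settings() -> tuple[dict[str, asyncio.Semaphore], dict[str, bool]]:
    """Parse the per-endpoint JSON env vars into semaphores and retry flags."""
    concurrency = json.loads(os.environ.get("SERPER_ENDPOINT_CONCURRENCY", "{}"))
    retryable = json.loads(os.environ.get("SERPER_ENDPOINT_RETRYABLE", "{}"))
    semaphores = {endpoint: asyncio.Semaphore(limit) for endpoint, limit in concurrency.items()}
    return semaphores, retryable


# Example: SERPER_ENDPOINT_CONCURRENCY='{"search": 10, "scrape": 2}'
#          SERPER_ENDPOINT_RETRYABLE='{"scrape": false}'
semaphores, retry_flags = load_endpoint_settings()
print(semaphores, retry_flags)
```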
## Tool Parameters and Usage Examples
### serper-general-search: Perform general web search
Parameters:
- `search_key_words` (str, required): Keywords to search.
- `search_country` (str, optional): Specify the country/region for search results. Supports Chinese names (e.g., "中国"), English names (e.g., "United States"), or ISO codes (e.g., "US"). Default is "US".
- `search_num` (int, optional): Number of results to return, range 1-100. Default is 10.
- `search_time` (str, optional): Filter results by time range. Available values: "hour", "day", "week", "month", "year".
Example:
```Python
result_json = serper_general_search(
    search_key_words="AI advancements 2024",
    search_country="United States",
    search_num=5,
    search_time="month"
)
```
### serper-lens-search: Perform reverse image search via image URL
Parameters:
- `image_url` (str, required): Public URL of the image to search.
- `search_country` (str, optional): Specify the country/region for search results. Default is "US".
Example:
```Python
result_json = serper_lens_search(
    image_url="https://www.google.com/images/branding/googlelogo/1x/googlelogo_color_272x92dp.png",
    search_country="JP"
)
```
### serper-scrape: Scrape webpage content
Parameters:
- `url` (str, required): URL of the target webpage.
- `include_markdown` (bool, optional): Whether to include Markdown-formatted content in the returned results. Default is False.
Example:
```Python
result_json = serper_scrape(
    url="https://www.example.com",
    include_markdown=True
)
```
## License Agreement
This project is licensed under the MIT License.
            
         