is-bot

- Name: is-bot
- Version: 0.3.2
- Home page: https://github.com/romis2012/is-bot
- Summary: Python package to detect bots/crawlers/spiders via user-agent
- Upload time: 2025-04-07 14:10:18
- Author: Roman Snegirev
- License: Apache 2
- Keywords: python, bots, crawlers, web-crawlers, user-agent, user-agent-parser
## is-bot

[![CI](https://github.com/romis2012/is-bot/actions/workflows/ci.yml/badge.svg)](https://github.com/romis2012/is-bot/actions/workflows/ci.yml)
[![Coverage Status](https://codecov.io/gh/romis2012/is-bot/branch/master/graph/badge.svg)](https://codecov.io/gh/romis2012/is-bot)
[![PyPI version](https://badge.fury.io/py/is-bot.svg)](https://pypi.python.org/pypi/is-bot)

Python package to detect bots/crawlers/spiders via user-agent string.
This is a port of the [isbot](https://github.com/omrilotan/isbot) JavaScript module.


## Requirements
- Python >= 3.7
- regex >= 2022.8.17

## Installation
```shell
pip install is-bot
```

## Usage

### Simple usage

```python
from is_bot import Bots

bots = Bots()

ua = 'Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Googlebot/2.1; +http://www.google.com/bot.html) Chrome/104.0.5112.79 Safari/537.36'
assert bots.is_bot(ua)

ua = 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/104.0.0.0 Safari/537.36'
assert not bots.is_bot(ua)
```
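Detection of this kind boils down to matching the user-agent string against a list of regular-expression rules. As an illustrative sketch only (a handful of hypothetical markers, not the package's actual rule set), a tiny detector built on the stdlib `re` module might look like:

```python
import re

# Illustrative marker list; the real package ships a much larger,
# curated set of rules ported from the isbot JavaScript module.
BOT_MARKERS = [r"bot", r"crawler", r"spider"]
BOT_RE = re.compile("|".join(BOT_MARKERS), re.IGNORECASE)

def looks_like_bot(user_agent: str) -> bool:
    """Return True if the user-agent matches any known bot marker."""
    return BOT_RE.search(user_agent) is not None
```

Note that `is-bot` itself depends on the third-party `regex` module rather than stdlib `re`, presumably because some of its rules use features (such as variable-length lookbehinds) that `re` does not support.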

### Add/remove parsing rules

```python
from is_bot import Bots

bots = Bots()

# Exclude Chrome-Lighthouse from the default bot list
ua = 'Mozilla/5.0 (Linux; Android 7.0; Moto G (4)) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4695.0 Mobile Safari/537.36 Chrome-Lighthouse'
assert bots.is_bot(ua)
bots.exclude(['chrome-lighthouse'])
assert not bots.is_bot(ua)

# Add a non-bot browser to the default bot list
ua = 'SomeAwesomeBrowser/10.0 (Linux; Android 7.0)'
assert not bots.is_bot(ua)
bots.extend(['SomeAwesomeBrowser'])
assert bots.is_bot(ua)
```
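The `extend`/`exclude` idea can be sketched as a mutable pattern list with one combined regex that is rebuilt whenever the list changes. This is a minimal illustration of the mechanism, not the `Bots` class's actual implementation:

```python
import re

class TinyBots:
    """Sketch: mutable rule list + one combined case-insensitive regex,
    recompiled after every extend/exclude. Patterns are hypothetical."""

    def __init__(self, patterns=("bot", "crawler", "spider")):
        self._patterns = list(patterns)
        self._rebuild()

    def _rebuild(self):
        # One alternation over all active rules.
        self._re = re.compile("|".join(self._patterns), re.IGNORECASE)

    def extend(self, patterns):
        self._patterns.extend(patterns)
        self._rebuild()

    def exclude(self, patterns):
        self._patterns = [p for p in self._patterns if p not in patterns]
        self._rebuild()

    def is_bot(self, user_agent):
        return self._re.search(user_agent) is not None
```

Recompiling once per mutation keeps per-call matching cheap, since the alternation is compiled ahead of time rather than on every `is_bot` call.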

### Get additional parsing information

```python
from is_bot import Bots

bots = Bots()

ua = 'Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0 SearchRobot/1.0'

# show the bot rule that matches the user-agent string
print(bots.find(ua))
#> Search

# list all patterns that match the user-agent string
print(bots.matches(ua))
#> ['(?<! (ya|yandex))search', '(?<! cu)bot']
```
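A `matches`-style result can be reproduced for illustration by testing each rule individually instead of through one combined expression (hypothetical pattern list, not the package's real rules, which use refinements such as the negative lookbehind `(?<! (ya|yandex))search` to avoid false positives):

```python
import re
from typing import List

# Hypothetical rules for illustration only.
PATTERNS = ["search", "bot", "crawler"]

def matches(user_agent: str) -> List[str]:
    """Return every pattern that matches the user-agent string."""
    return [p for p in PATTERNS
            if re.search(p, user_agent, re.IGNORECASE)]
```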


            
