scrapse

Name: scrapse
Version: 0.3.9
Home page: https://gitlab.di.unito.it/ngupp/ngupp-scrapse
Summary: Package to download and manage judgments
Upload time: 2023-04-14 12:28:50
Author/Maintainer: zaharia laurentiu jr marius
Requires Python: >=3.8,<4.0
Keywords: scraping, judgments
Requirements: none recorded
            # ScrapSE

## Package description

ScrapSE downloads and manages judgments.

Currently supported platform: LEGGI D'ITALIA PA.

### Install scrapse
```
pip install scrapse
```
The package creates the `scrapse` folder in `/Users/your_username`, where it saves all judgments in the appropriate subfolders.
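As a rough sketch, the default storage root described above can be derived with `pathlib` (this assumes the folder is simply named `scrapse` under the user's home directory; the package's internal path logic may differ):

```python
from pathlib import Path

# Assumed default ScrapSE storage root: a "scrapse" folder
# directly under the user's home directory.
scrapse_root = Path.home() / "scrapse"
print(scrapse_root)  # e.g. /Users/your_username/scrapse
```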

### How to use

#### Saving cookies - important!
```
scrapse leggitalia save-cookie 'your_cookies'
```
This command saves the session cookies passed as `your_cookies` to a dedicated file.

#### Show filter values
```
scrapse leggitalia show-filters
```
This command shows the possible values that can be assigned to the judgment search filters.

#### Download the judgments
Make sure you have **saved** the platform's cookies before downloading the judgments!
```
scrapse leggitalia scrap-judgments -l torino -s 'Sez. lavoro, Sez. V'
```
This command creates a folder in `/Users/your_username/scrapse/leggitalia` named `sez.lavoro&sez.v_torino` containing the judgments.

#### Dump judgments to json format
```
scrapse leggitalia dump-judgments -d 'folder_path'
```
This command creates the JSON files and saves them in the `/Users/your_username/scrapse/leggitalia/judgments_dump` folder.
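Once the dump has run, the resulting files can be read back with the standard library. The helper below is hypothetical (the function name and the flat `*.json` layout in the dump folder are assumptions based on the path documented above):

```python
import json
from pathlib import Path

def load_dumped_judgments(dump_dir):
    """Load every JSON dump file from the given folder.

    Assumes one judgment per *.json file, as produced by
    `scrapse leggitalia dump-judgments` (an assumption).
    """
    judgments = []
    for path in sorted(Path(dump_dir).glob("*.json")):
        with open(path, encoding="utf-8") as fh:
            judgments.append(json.load(fh))
    return judgments
```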

#### For more help
For more information on each command, run:
```
command-name --help
```

            
