data-prep-connector

Name: data-prep-connector
Version: 0.2.2
Summary: Scalable and Compliant Web Crawler
Author: Hiroya Matsubara <hmtbr@jp.ibm.com>
License: Apache-2.0
Requires Python: >=3.10
Keywords: data, data acquisition, crawler, web crawler, llm, generative, ai, fine-tuning, llmapps
Upload time: 2024-10-23 14:52:16

# DPK Connector

DPK Connector is a scalable and compliant web crawler for acquiring data for LLM development. It is built on [Scrapy](https://scrapy.org/).
For more details, read [the documentation](doc/overview.md).

## Virtual Environment

The project uses `pyproject.toml` and a Makefile for operations.
For development, first create the virtual environment:
```shell
make venv
```
and then either activate it:
```shell
source venv/bin/activate
```
or set up your IDE to use the `venv` directory when developing in this project.
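
A quick sanity check, assuming `make venv` installs the project into `./venv` (the `dpk_connector` module name is also an assumption based on the project name):
```shell
# Sanity check: confirm the venv interpreter is active and the package imports.
# "dpk_connector" is an assumed module name; adjust to the actual package layout.
which python    # should print .../venv/bin/python once activated
python -c "import dpk_connector; print(dpk_connector.__file__)"
```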

## Library Artifact Build and Publish

To test, build, and publish the library:
```shell
make test build publish
```

To bump the version number, edit the Makefile to change `VERSION` and rerun the above. This requires committing both the `Makefile` and the automatically updated `pyproject.toml` file.
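
For example, a bump from 0.2.2 to 0.2.3 might look like the following sketch (the exact form of the `VERSION` assignment in the Makefile is an assumption):
```shell
# Hypothetical version-bump workflow; the sed pattern assumes a simple
# "VERSION=0.2.2"-style assignment and GNU sed's in-place flag.
sed -i 's/^VERSION.*=.*/VERSION=0.2.3/' Makefile
make test build publish
git commit -m "Bump version to 0.2.3" Makefile pyproject.toml
```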

## How to use

See [the overview](doc/overview.md).
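
The overview documents the supported API. As a rough, unverified sketch of what a minimal crawl might look like (the `dpk_connector` module, its `crawl`/`shutdown` entry points, and their signatures are assumptions not confirmed by this README):
```shell
# Hypothetical usage sketch: install from PyPI, then run a minimal crawl.
pip install data-prep-connector
python - <<'EOF'
# The imports and call signatures below are assumptions; see doc/overview.md.
from dpk_connector import crawl, shutdown

def on_downloaded(url: str, body: bytes, headers: dict) -> None:
    # Assumed per-page callback: invoked once per fetched page.
    print(f"downloaded {url} ({len(body)} bytes)")

crawl(["https://example.com"], on_downloaded, user_agent="dpk-example-bot")
shutdown()  # assumed to stop the underlying Scrapy machinery
EOF
```
Because the crawler is built on Scrapy, a run like this presumably blocks until the crawl finishes or is stopped.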

            
