th2-grpc-crawler-data-processor 0.3.1

Summary: th2_grpc_crawler_data_processor
Home page: https://github.com/th2-net/th2-grpc-crawler-data-processor
Author: TH2-devs
License: Apache License 2.0
Requires Python: >=3.7
Upload time: 2022-05-13 15:15:53
# th2 gRPC crawler data processor library (0.3.1)

This project contains the gRPC interface to implement if you want to create your own crawler data processor.

The crawler data processor works with the [crawler](https://github.com/th2-net/th2-crawler).

## How to transform template
1. Create a directory with the same name as the project name (use underscores instead of dashes) under the `src/main/proto` directory (remove other files and directories if they exist).
2. Place your custom `.proto` files in the created directory. Pay attention to the `package` specifier and `import` statements (see the sketch after this list).
3. Edit the `release_version` and `vcs_url` properties in the `gradle.properties` file.
4. Edit the `rootProject.name` variable in the `settings.gradle` file. This will be the name of the Java package.
5. Edit the `package_info.json` file to specify the name and version of the Python package (create the file if it is absent); an example follows below.
6. Edit the parameters of the `setup` function invocation in `setup.py`, such as `author`, `author_email`, and `url`. Do not edit the others.
7. Update the `README.md` file to describe the new project.
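
For step 2, a minimal sketch of what a custom `.proto` file could look like. The package name, messages, and service here are hypothetical, and it assumes the common th2 types are importable from `th2_grpc_common/common.proto` under package `th2`:

```
syntax = "proto3";

// The package should match the directory created under src/main/proto,
// e.g. src/main/proto/th2_grpc_my_processor (hypothetical name).
package th2_grpc_my_processor;

// Import paths are resolved relative to the proto root directories.
import "th2_grpc_common/common.proto";

// Hypothetical request/response pair for illustration only.
message ProcessRequest {
    repeated th2.MessageID message_ids = 1;
}

message ProcessResponse {
    string status = 1;
}

service MyDataProcessor {
    // RPC method names use PascalCase (see the 0.2.0 changelog entry below).
    rpc Process (ProcessRequest) returns (ProcessResponse);
}
```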

Note that the name of the directory created under `src/main/proto` is used as the Python package name.
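
For step 5, `package_info.json` is a small JSON file holding the Python package coordinates. A sketch for this project, assuming the key names used by th2 template repositories:

```
{
  "package_name": "th2-grpc-crawler-data-processor",
  "package_version": "0.3.1"
}
```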

## How to maintain project
1. Make your changes.
2. Bump the version of the Java package in the `gradle.properties` file.
3. Bump the version of the Python package in the `package_info.json` file.
4. Commit everything.

## How to run project

### Java
If you wish to manually create and publish the package for Java, run this command:
```
gradle --no-daemon clean build publish artifactoryPublish \
       -Pbintray_user=${BINTRAY_USER} \
       -Pbintray_key=${BINTRAY_KEY}
```
`BINTRAY_USER` and `BINTRAY_KEY` are parameters for publishing.

### Python
If you wish to manually create and publish the package for Python:
1. Generate services with Gradle:
    ```
       gradle --no-daemon clean generateProto
    ```
    You can find the generated files under the following path: `src/gen/main/services/python`
2. Generate code from `.proto` files and publish everything:
    ```
    pip install -r requirements.txt
    python setup.py generate
    python setup.py sdist
    twine upload --repository-url ${PYPI_REPOSITORY_URL} --username ${PYPI_USER} --password ${PYPI_PASSWORD} dist/*
    ```
    `PYPI_REPOSITORY_URL`, `PYPI_USER` and `PYPI_PASSWORD` are parameters for publishing.

## Changes:

### 0.3.1

+ Update `th2-grpc-data-provider` Python dependency from `0.1.4` to `0.1.6` 

### 0.3.0

+ Update `th2-bom` from `3.0.0` to `3.1.0` 
+ Update `th2-grpc-common` from `3.1.2` to `3.8.0`
+ Add stubs for Python

### 0.2.0 (Breaking changes)

#### Breaking:

+ Use a list of `MessageID` instead of a mapping between session and `MessageID`.
  Users now have to specify a `MessageID` for both directions in the response if they need to set a checkpoint.
  The list should contain a single `MessageID` for each `alias + direction` pair.
  If more than one is found, the last one (according to their sequences) is taken (see the sketch after this list).
+ The RPC methods were renamed according to the Protobuf naming convention (PascalCase).
+ The event and message IDs were removed from the response to the connect method because this functionality requires additional improvements on the Crawler's side.
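
A minimal sketch (not this library's code) of the checkpoint rule described above, assuming each `MessageID` carries a session alias, a direction, and a sequence:

```
# Hypothetical illustration of the 0.2.0 checkpoint rule: keep a single
# MessageID per (alias, direction) pair, preferring the highest sequence.
from typing import Dict, List, NamedTuple, Tuple

class MessageID(NamedTuple):  # simplified stand-in for the proto message
    alias: str
    direction: str
    sequence: int

def checkpoint_ids(ids: List[MessageID]) -> List[MessageID]:
    latest: Dict[Tuple[str, str], MessageID] = {}
    for mid in ids:
        key = (mid.alias, mid.direction)
        # "If more than one is found, the last one (by sequence) is taken."
        if key not in latest or mid.sequence > latest[key].sequence:
            latest[key] = mid
    return list(latest.values())

# Two IDs share the (alias, FIRST) pair: the higher sequence wins.
ids = [MessageID("fix", "FIRST", 10), MessageID("fix", "FIRST", 12),
       MessageID("fix", "SECOND", 7)]
assert checkpoint_ids(ids) == [MessageID("fix", "FIRST", 12),
                               MessageID("fix", "SECOND", 7)]
```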

#### Added:

+ A new method that the crawler invokes each time a new interval is started.
            
