atr-dan

Name: atr-dan
Version: 0.2.0rc11
Summary: Teklia DAN
Upload time: 2024-09-13 13:01:36
Requires Python: >=3.10
Keywords: python, HTR, OCR, NER, machine learning, pytorch
# DAN: a Segmentation-free Document Attention Network for Handwritten Document Recognition

[![Python >= 3.10](https://img.shields.io/badge/Python-%3E%3D3.10-blue.svg)](https://www.python.org/downloads/release/python-3100/)

For more details about this package, make sure to see the documentation available at <https://atr.pages.teklia.com/dan/>.

This is an open-source project, licensed using [the CeCILL-C license](https://cecill.info/index.en.html).

## Inference

To apply DAN to an image, one first needs to add a few imports and load the image. Note that the image should be in RGB.

```python
import cv2
from dan.ocr.predict.inference import DAN

image = cv2.cvtColor(cv2.imread(IMAGE_PATH), cv2.COLOR_BGR2RGB)
```
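
Note that `cv2.imread` returns `None` instead of raising when the file cannot be read, which makes the subsequent `cv2.cvtColor` call fail with a cryptic error. A minimal guard, keeping `IMAGE_PATH` as a placeholder for the actual path:

```python
import cv2

raw = cv2.imread(IMAGE_PATH)
if raw is None:
    # cv2.imread signals a missing or unreadable file by returning None
    raise FileNotFoundError(f"Could not read image: {IMAGE_PATH}")
image = cv2.cvtColor(raw, cv2.COLOR_BGR2RGB)  # OpenCV loads BGR, DAN expects RGB
```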

Then one can initialize and load the trained model with the parameters used during training. The directory passed as a parameter should contain:

- a `model.pt` file,
- a `charset.pkl` file,
- a `parameters.yml` file corresponding to the `inference_parameters.yml` file generated during training.

```python
from pathlib import Path

model_path = Path("models")

model = DAN("cpu")
model.load(model_path, mode="eval")
```
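
Before loading, it can be useful to check that the directory actually contains the three expected files; a minimal sketch (the check is not part of the package, the filenames come from the list above):

```python
from pathlib import Path

model_path = Path("models")
expected = ("model.pt", "charset.pkl", "parameters.yml")
missing = [name for name in expected if not (model_path / name).is_file()]
if missing:
    raise FileNotFoundError(f"Missing model files in {model_path}: {missing}")
```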

To run the inference on a GPU, one can replace `cpu` with the name of the GPU device (a sketch of this variant is given after the prediction example below). Finally, one can run the prediction:

```python
from pathlib import Path
from dan.utils import parse_charset_pattern

# Load image
image_path = Path("images/page.jpg")
_, image = model.preprocess(str(image_path))

input_tensor = image.unsqueeze(0)
input_tensor = input_tensor.to("cpu")
input_sizes = [image.shape[1:]]

# Predict
text, confidence_scores = model.predict(
    input_tensor,
    input_sizes,
    char_separators=parse_charset_pattern(model.charset),
    confidences=True,
)
```
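
Running the same pipeline on a GPU only changes the device string; a minimal sketch building on the snippets above, assuming a CUDA device is available and that any device name accepted by PyTorch (e.g. `cuda` or `cuda:0`) can be used in place of `cpu`:

```python
device = "cuda"  # assumption: any PyTorch device string is accepted here

model = DAN(device)
model.load(model_path, mode="eval")

input_tensor = input_tensor.to(device)
text, confidence_scores = model.predict(
    input_tensor,
    input_sizes,
    char_separators=parse_charset_pattern(model.charset),
    confidences=True,
)
```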

## Training

This package provides three subcommands. To get more information about any subcommand, use the `--help` option.

### Get started

See the [dedicated page](https://atr.pages.teklia.com/dan/get_started/training/) on the official DAN documentation.

### Data extraction from Arkindex

See the [dedicated page](https://atr.pages.teklia.com/dan/usage/datasets/extract/) on the official DAN documentation.

### Model training

See the [dedicated page](https://atr.pages.teklia.com/dan/usage/train/) on the official DAN documentation.

### Model prediction

See the [dedicated page](https://atr.pages.teklia.com/dan/usage/predict/) on the official DAN documentation.

            
