OpenNMT-tf

* Name: OpenNMT-tf
* Version: 2.32.0
* Summary: Neural machine translation and sequence learning using TensorFlow
* Home page: https://opennmt.net
* Author: OpenNMT
* License: MIT
* Requires Python: >=3.7
* Keywords: tensorflow, opennmt, nmt, neural machine translation
* Upload time: 2023-08-04 08:38:13
[![CI](https://github.com/OpenNMT/OpenNMT-tf/workflows/CI/badge.svg)](https://github.com/OpenNMT/OpenNMT-tf/actions?query=workflow%3ACI) [![codecov](https://codecov.io/gh/OpenNMT/OpenNMT-tf/branch/master/graph/badge.svg)](https://codecov.io/gh/OpenNMT/OpenNMT-tf) [![PyPI version](https://badge.fury.io/py/OpenNMT-tf.svg)](https://badge.fury.io/py/OpenNMT-tf) [![Documentation](https://img.shields.io/badge/docs-latest-blue.svg)](https://opennmt.net/OpenNMT-tf/) [![Gitter](https://badges.gitter.im/OpenNMT/OpenNMT-tf.svg)](https://gitter.im/OpenNMT/OpenNMT-tf?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge) [![Forum](https://img.shields.io/discourse/status?server=https%3A%2F%2Fforum.opennmt.net%2F)](https://forum.opennmt.net/)

# OpenNMT-tf

OpenNMT-tf is a general-purpose sequence learning toolkit using TensorFlow 2. While neural machine translation is the main target task, it has been designed to more generally support:

* sequence to sequence mapping
* sequence tagging
* sequence classification
* language modeling

The project is production-oriented and comes with [backward compatibility guarantees](https://github.com/OpenNMT/OpenNMT-tf/blob/master/CHANGELOG.md).

## Key features

### Modular model architecture

Models are described with code to allow training custom architectures and overriding default behavior. For example, the following instance defines a sequence to sequence model with 2 concatenated input features, a self-attentional encoder, and an attentional RNN decoder sharing its input and output embeddings:

```python
import opennmt
import tensorflow_addons as tfa  # provides the attention mechanism used below

opennmt.models.SequenceToSequence(
    source_inputter=opennmt.inputters.ParallelInputter(
        [
            opennmt.inputters.WordEmbedder(embedding_size=256),
            opennmt.inputters.WordEmbedder(embedding_size=256),
        ],
        reducer=opennmt.layers.ConcatReducer(axis=-1),
    ),
    target_inputter=opennmt.inputters.WordEmbedder(embedding_size=512),
    encoder=opennmt.encoders.SelfAttentionEncoder(num_layers=6),
    decoder=opennmt.decoders.AttentionalRNNDecoder(
        num_layers=4,
        num_units=512,
        attention_mechanism_class=tfa.seq2seq.LuongAttention,
    ),
    share_embeddings=opennmt.models.EmbeddingsSharingLevel.TARGET,
)
```
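
Because the source inputter is a `ParallelInputter`, each training example is read from multiple aligned files: in the data configuration, `train_features_file` becomes a list with one file per input feature (e.g. words and their accompanying tags).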

The [`opennmt`](https://opennmt.net/OpenNMT-tf/package/overview.html) package exposes other building blocks that can be used to design:

* [multiple input features](https://opennmt.net/OpenNMT-tf/package/opennmt.inputters.ParallelInputter.html)
* [mixed embedding representation](https://opennmt.net/OpenNMT-tf/package/opennmt.inputters.MixedInputter.html)
* [multi-source context](https://opennmt.net/OpenNMT-tf/package/opennmt.inputters.ParallelInputter.html)
* [cascaded](https://opennmt.net/OpenNMT-tf/package/opennmt.encoders.SequentialEncoder.html) or [multi-column](https://opennmt.net/OpenNMT-tf/package/opennmt.encoders.ParallelEncoder.html) encoder
* [hybrid sequence to sequence models](https://opennmt.net/OpenNMT-tf/package/opennmt.models.SequenceToSequence.html)

Standard models such as the Transformer are defined in a [model catalog](https://github.com/OpenNMT/OpenNMT-tf/blob/master/opennmt/models/catalog.py) and can be used without additional configuration.
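
For instance, a minimal sketch that instantiates the catalog's base Transformer definition (the same model used in the Library section below):

```python
import opennmt

# TransformerBase is the standard base Transformer (6 layers, 512 units)
# from the model catalog; no further architecture configuration is needed.
model = opennmt.models.TransformerBase()
```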

*Find more information about model configuration in the [documentation](https://opennmt.net/OpenNMT-tf/model.html).*

### Full TensorFlow 2 integration

OpenNMT-tf is fully integrated into the TensorFlow 2 ecosystem:

* Reusable layers extending [`tf.keras.layers.Layer`](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Layer)
* Multi-GPU training with [`tf.distribute`](https://www.tensorflow.org/api_docs/python/tf/distribute) and distributed training with [Horovod](https://github.com/horovod/horovod)
* Mixed precision training with [`tf.keras.mixed_precision`](https://www.tensorflow.org/guide/mixed_precision)
* Visualization with [TensorBoard](https://www.tensorflow.org/tensorboard)
* `tf.function` graph tracing that can be [exported to a SavedModel](https://opennmt.net/OpenNMT-tf/serving.html) and served with [TensorFlow Serving](https://github.com/OpenNMT/OpenNMT-tf/tree/master/examples/serving/tensorflow_serving) or [Python](https://github.com/OpenNMT/OpenNMT-tf/tree/master/examples/serving/python)
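
A minimal sketch of reloading such an export in Python (the path is hypothetical; exported models expose a standard serving signature):

```python
import tensorflow as tf

# Load a SavedModel previously exported by OpenNMT-tf (hypothetical path).
imported = tf.saved_model.load("export/1")

# The default serving signature can be called like any TensorFlow function.
translate_fn = imported.signatures["serving_default"]
```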

### Compatibility with CTranslate2

[CTranslate2](https://github.com/OpenNMT/CTranslate2) is an optimized inference engine for OpenNMT models featuring fast CPU and GPU execution, model quantization, parallel translations, dynamic memory usage, interactive decoding, and more! OpenNMT-tf can [automatically export](https://opennmt.net/OpenNMT-tf/serving.html#ctranslate2) models to be used in CTranslate2.
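
A minimal sketch of the CTranslate2 side, assuming a model directory already exported by OpenNMT-tf (the path and tokens are hypothetical; result objects per recent CTranslate2 versions):

```python
import ctranslate2

# Load the exported model and translate a batch of pre-tokenized sentences.
translator = ctranslate2.Translator("ende_ctranslate2")
results = translator.translate_batch([["▁Hello", "▁world", "!"]], beam_size=4)

# Each result holds the n-best hypotheses as token lists.
print(results[0].hypotheses[0])
```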

### Dynamic data pipeline

OpenNMT-tf does not require compiling the data before training. Instead, it reads text files directly and preprocesses the data as needed during training. This allows [on-the-fly tokenization](https://opennmt.net/OpenNMT-tf/tokenization.html) and data augmentation by injecting random noise.
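
A minimal configuration sketch, assuming hypothetical file paths and a tokenizer description in `tok.yml` as covered by the tokenization documentation:

```python
config = {
    "data": {
        "source_vocabulary": "src-vocab.txt",
        "target_vocabulary": "tgt-vocab.txt",
        "train_features_file": "train.en",
        "train_labels_file": "train.de",
        # Tokenization is applied when the files are read, not ahead of time.
        "source_tokenization": "tok.yml",
        "target_tokenization": "tok.yml",
    }
}
```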

### Model fine-tuning

OpenNMT-tf supports model fine-tuning workflows:

* Model weights can be transferred to new word vocabularies, e.g. to inject domain terminology before fine-tuning on in-domain data (see the sketch after this list)
* [Contrastive learning](https://ai.google/research/pubs/pub48253/) to reduce word omission errors
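
A minimal sketch of the vocabulary update step, assuming `runner` is an `opennmt.Runner` wrapping an existing checkpoint (as in the Library section below) and hypothetical vocabulary paths:

```python
# Remap the model weights to the new vocabularies: rows of known words are
# kept, rows of new words are freshly initialized.
new_model_dir = runner.update_vocab(
    "finetune-checkpoints",
    src_vocab="new-src-vocab.txt",
    tgt_vocab="new-tgt-vocab.txt",
)
```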

### Source-target alignment

Sequence to sequence models can be trained with [guided alignment](https://arxiv.org/abs/1607.01628), and alignment information is returned as part of the translation API.
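
A minimal configuration sketch, assuming a hypothetical alignment file in the "Pharaoh" format produced by tools such as fast_align:

```python
config = {
    "data": {
        # One alignment line per training pair, e.g. "0-0 1-2 2-1".
        "train_alignments": "train.align",
    },
    "params": {
        "guided_alignment_type": "ce",
        "guided_alignment_weight": 1,
    },
}
```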

---

OpenNMT-tf also implements most of the techniques commonly used to train and evaluate sequence models, such as:

* automatic evaluation during training
* multiple decoding strategies: greedy search, beam search, random sampling
* N-best rescoring
* gradient accumulation
* scheduled sampling
* checkpoint averaging (see the example after this list)
* ... and more!
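
For instance, checkpoint averaging is exposed through the `average_checkpoints` run type; a sketch with hypothetical paths (option names per the checkpoints documentation):

```
onmt-main --config config.yml --auto_config average_checkpoints \
    --output_dir avg-checkpoints --max_count 5
```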

*See the [documentation](https://opennmt.net/OpenNMT-tf/) to learn how to use these features.*

## Usage

OpenNMT-tf requires:

* Python 3.7 or above
* TensorFlow 2.6, 2.7, 2.8, 2.9, 2.10, 2.11, 2.12, or 2.13

We recommend installing it with `pip`:

```bash
pip install --upgrade pip
pip install OpenNMT-tf
```

*See the [documentation](https://opennmt.net/OpenNMT-tf/installation.html) for more information.*

### Command line

OpenNMT-tf comes with several command line utilities to prepare data, train, and evaluate models.

For all tasks involving model execution, OpenNMT-tf uses a single entry point: `onmt-main`. A typical OpenNMT-tf run consists of 3 elements:

* the **model** type
* the **parameters** described in a YAML file
* the **run** type such as `train`, `eval`, `infer`, `export`, `score`, `average_checkpoints`, or `update_vocab`

that are passed to the main script:

```
onmt-main --model_type <model> --config <config_file.yml> --auto_config <run_type> <run_options>
```
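
For example, the following command (mirroring the quickstart documentation) trains a Transformer model with automatic configuration and periodic evaluation:

```
onmt-main --model_type Transformer --config data.yml --auto_config train --with_eval
```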

*For more information and examples on how to use OpenNMT-tf, please visit [our documentation](https://opennmt.net/OpenNMT-tf).*

### Library

OpenNMT-tf also exposes [well-defined and stable APIs](https://opennmt.net/OpenNMT-tf/package/overview.html), from high-level training utilities to low-level model layers and dataset transformations.

For example, the `Runner` class can be used to train and evaluate models in a few lines of code:

```python
import opennmt

config = {
    "model_dir": "/data/wmt-ende/checkpoints/",
    "data": {
        "source_vocabulary": "/data/wmt-ende/joint-vocab.txt",
        "target_vocabulary": "/data/wmt-ende/joint-vocab.txt",
        "train_features_file": "/data/wmt-ende/train.en",
        "train_labels_file": "/data/wmt-ende/train.de",
        "eval_features_file": "/data/wmt-ende/valid.en",
        "eval_labels_file": "/data/wmt-ende/valid.de",
    }
}

model = opennmt.models.TransformerBase()
runner = opennmt.Runner(model, config, auto_config=True)
runner.train(num_devices=2, with_eval=True)
```
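
Once training is done, the same runner can translate new files; a minimal sketch with a hypothetical test file:

```python
runner.infer("/data/wmt-ende/test.en", predictions_file="/data/wmt-ende/test.de.out")
```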

Here is another example using OpenNMT-tf to run efficient beam search with a self-attentional decoder (random tensors stand in for the encoder outputs and target embeddings):

```python
import tensorflow as tf

import opennmt

decoder = opennmt.decoders.SelfAttentionDecoder(num_layers=6, vocab_size=32000)

# Placeholder inputs for illustration: in practice, `memory` comes from an
# encoder and `target_embedding` from the target inputter.
memory = tf.random.uniform([8, 20, 512])
memory_sequence_length = tf.fill([8], 20)
target_embedding = tf.random.uniform([32000, 512])

initial_state = decoder.initial_state(
    memory=memory, memory_sequence_length=memory_sequence_length
)

batch_size = tf.shape(memory)[0]
start_ids = tf.fill([batch_size], opennmt.START_OF_SENTENCE_ID)

decoding_result = decoder.dynamic_decode(
    target_embedding,
    start_ids=start_ids,
    initial_state=initial_state,
    decoding_strategy=opennmt.utils.BeamSearch(4),
)
```

More examples using OpenNMT-tf as a library can be found online:

* The directory [examples/library](https://github.com/OpenNMT/OpenNMT-tf/tree/master/examples/library) contains additional examples that use OpenNMT-tf as a library
* [nmt-wizard-docker](https://github.com/OpenNMT/nmt-wizard-docker) uses the high-level `opennmt.Runner` API to wrap OpenNMT-tf with a custom interface for training, translating, and serving

*For a complete overview of the APIs, see the [package documentation](https://opennmt.net/OpenNMT-tf/package/overview.html).*

## Additional resources

* [Documentation](https://opennmt.net/OpenNMT-tf)
* [Forum](https://forum.opennmt.net)
* [Gitter](https://gitter.im/OpenNMT/OpenNMT-tf)



            
