neon-transformers


- Name: neon-transformers
- Version: 0.2.0
- Home page: https://github.com/NeonGeckoCom/neon-transformers
- Upload time: 2022-08-18 23:06:47
- Author: Neongecko
- License: BSD-3-Clause
- Keywords: ovos neon plugin
- Requirements: no requirements were recorded
- Travis-CI: no Travis
- Coveralls test coverage: no coveralls
# Neon Transformers

## About

### Utterance Transformers

An utterance transformer takes `utterances` and a `context` as input,
then returns the (possibly modified) `utterances` and a new `context`.

`context` is simply a Python dictionary; it can contain anything.

`utterances` is a list of transcription candidates for a single utterance, not a list of unrelated documents!

A transformer might change the utterances or simply return them unmodified along with a new `context`.

e.g.
- the translator transformer detects the language and translates as necessary; it returns modified utterances
- the NER transformer returns the utterances unmodified, and the context contains the extracted entities
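
For illustration, here is a minimal sketch of a transformer that follows this contract. The class itself is hypothetical; only the `transform(utterances, context)` shape comes from the description above.

```python
from typing import List, Tuple


class LowercaseTransformer:
    """Hypothetical transformer: lowercases utterances and notes it in context."""

    def transform(self, utterances: List[str],
                  context: dict = None) -> Tuple[List[str], dict]:
        context = dict(context or {})
        lowered = [u.lower() for u in utterances]
        # record something useful for later transformers / skills
        context["lowercased"] = lowered != utterances
        # return the (possibly modified) utterances and the new context
        return lowered, context
```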

Transformers may also depend on other transformers:

```python
from neon_utterance_KeyBERT_plugin import KeyBERTExtractor
from neon_utterance_wn_entailment_plugin import WordNetEntailments

# depends on keywords being tagged by a prev transformer
entail = WordNetEntailments()

kbert = KeyBERTExtractor()  # or RAKE or YAKE ...

utts = ["The man was snoring very loudly"]
_, context = kbert.transform(utts)
_, context = entail.transform(utts, context)
print(context)
# {'entailments': ['exhale', 'inhale', 'sleep']}
```

#### mycroft integration

Usage with mycroft-core is limited to skills; it is useful for fallback and common_qa skills.

You can import individual transformers directly in skills, as in the sketch below.
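
A hedged sketch of a fallback skill that runs a transformer in-process, using the KeyBERT plugin from the example above. The `keywords` context key it reads is an assumption, not a documented field; inspect the context your plugin version actually returns.

```python
from mycroft import FallbackSkill
from neon_utterance_KeyBERT_plugin import KeyBERTExtractor


class KeywordFallbackSkill(FallbackSkill):
    def __init__(self):
        super().__init__("KeywordFallbackSkill")
        # run the transformer directly, no transformer service required
        self.kbert = KeyBERTExtractor()

    def initialize(self):
        # medium priority so other fallbacks can still take precedence
        self.register_fallback(self.handle_fallback, 50)

    def handle_fallback(self, message):
        utterance = message.data.get("utterance", "")
        _, context = self.kbert.transform([utterance])
        # "keywords" is an assumed context key
        keywords = context.get("keywords")
        if keywords:
            self.speak("I picked up these keywords: " +
                       ", ".join(str(k) for k in keywords))
            return True
        return False


def create_skill():
    return KeywordFallbackSkill()
```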

#### neon integration

neon-core integrates the neon_transformers service into the NLP pipeline transparently:

- neon_transformers are integrated into mycroft after STT but before intent parsing
- all enabled transformer plugins (mycroft.conf) are loaded
- each plugin has a priority that the developer sets and the user can override (mycroft.conf)
- utterances are passed to each transformer sequentially (see the sketch after this list)
  - utterances are replaced with the text returned by a transformer
  - if utterances are transformed, the next transformer receives the transformed utterances
  - the context returned by each transformer is merged into the accumulated context
- the transformed utterances are passed to the intent stage
- the merged context is available in message.context for skills during intent handling
  - skills can add transformers to their requirements.txt
  - for compatibility with vanilla mycroft-core, skills should treat message.context as optional data
  - if a certain transformer is absolutely necessary, load it directly when message.context is missing data
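
A simplified, illustrative version of that loop is sketched below; this is not neon-core's actual implementation, just the sequential transform-and-merge behaviour described above, assuming `transformers` are already-instantiated plugin objects sorted by priority.

```python
def apply_transformers(transformers, utterances, context=None):
    """Run transformer plugins in priority order over a set of utterances."""
    context = dict(context or {})
    for transformer in transformers:
        new_utts, new_context = transformer.transform(utterances, context)
        # utterances are replaced with whatever the transformer returned
        if new_utts:
            utterances = new_utts
        # the returned context is merged into the accumulated context
        context.update(new_context or {})
    # utterances continue on to intent parsing;
    # the merged context ends up in message.context
    return utterances, context
```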

#### ovos-core integration

WIP - not available

### Audio Transformers

TODO


            
