# Lexikanon: A HyFI-based library for Tokenizers
[![pypi-image]][pypi-url]
[![version-image]][release-url]
[![release-date-image]][release-url]
[![license-image]][license-url]
[![DOI][zenodo-image]][zenodo-url]
[![codecov][codecov-image]][codecov-url]
[![jupyter-book-image]][docs-url]
A HyFI-based library for the creation, training, and utilization of tokenizers.
- Documentation: [https://lexikanon.entelecheia.ai][docs-url]
- GitHub: [https://github.com/entelecheia/lexikanon][repo-url]
- PyPI: [https://pypi.org/project/lexikanon][pypi-url]
Lexikanon is a high-performance Python library for creating, training, and using tokenizers, a fundamental component of natural language processing (NLP) and artificial intelligence (AI) systems. Its name combines the Greek λέξη ("word") and κάνων ("maker"), reflecting its purpose: helping users build robust tokenizers tailored to different languages and tasks. Built on the [Hydra Fast Interface (HyFI)](https://hyfi.entelecheia.ai) framework, Lexikanon plugs seamlessly into any HyFI-based project, yet it also works as a standalone library.
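Lexikanon requires Python 3.9 through 3.12 and is installed from PyPI with `pip install lexikanon`, which also pulls in HyFI as its core dependency. The snippet below is only a minimal installation check; it does not assume any Lexikanon-specific API (see the documentation for the actual tokenizer interfaces).

```python
# Minimal installation check -- does not rely on any Lexikanon-specific API.
# Assumes `pip install lexikanon` has been run under Python >= 3.9, < 3.13.
from importlib.metadata import version

print(version("lexikanon"))  # e.g. "0.6.5"
print(version("hyfi"))       # HyFI, the framework Lexikanon builds on
```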
## Citation
```tex
@software{lee_2023_8248118,
  author    = {Young Joon Lee},
  title     = {Lexikanon: A HyFI-based library for Tokenizers},
  month     = aug,
  year      = 2023,
  publisher = {Zenodo},
  version   = {v0.6.2},
  doi       = {10.5281/zenodo.8248117},
  url       = {https://doi.org/10.5281/zenodo.8248117}
}
```
```tex
@software{lee_2023_hyfi,
  author    = {Young Joon Lee},
  title     = {Lexikanon: A HyFI-based library for Tokenizers},
  year      = 2023,
  publisher = {GitHub},
  url       = {https://github.com/entelecheia/lexikanon}
}
```
## Changelog
See the [CHANGELOG] for more information.
## Contributing
Contributions are welcome! Please see the [contributing guidelines] for more information.
## License
This project is released under the [MIT License][license-url].
<!-- Links: -->
[zenodo-image]: https://zenodo.org/badge/DOI/10.5281/zenodo.8248117.svg
[zenodo-url]: https://doi.org/10.5281/zenodo.8248117
[codecov-image]: https://codecov.io/gh/entelecheia/lexikanon/branch/main/graph/badge.svg?token=KGST5XVW3F
[codecov-url]: https://codecov.io/gh/entelecheia/lexikanon
[pypi-image]: https://img.shields.io/pypi/v/lexikanon
[license-image]: https://img.shields.io/github/license/entelecheia/lexikanon
[license-url]: https://github.com/entelecheia/lexikanon/blob/main/LICENSE
[version-image]: https://img.shields.io/github/v/release/entelecheia/lexikanon?sort=semver
[release-date-image]: https://img.shields.io/github/release-date/entelecheia/lexikanon
[release-url]: https://github.com/entelecheia/lexikanon/releases
[jupyter-book-image]: https://jupyterbook.org/en/stable/_images/badge.svg
[repo-url]: https://github.com/entelecheia/lexikanon
[pypi-url]: https://pypi.org/project/lexikanon
[docs-url]: https://lexikanon.entelecheia.ai
[changelog]: https://github.com/entelecheia/lexikanon/blob/main/CHANGELOG.md
[contributing guidelines]: https://github.com/entelecheia/lexikanon/blob/main/CONTRIBUTING.md
<!-- Links: -->