alyahmor

Name: alyahmor
Version: 0.2
Summary: Alyahmor Arabic Morphological Generator for Python
Home page: http://github.com/linuxscout/alyahmor/
Author: Taha Zerrouki
License: GPL
Upload time: 2023-01-23 16:55:59
# Alyahmor اليحمور
Arabic inflectional morphology generator

![Alyahmor](doc/alyahmor.png)

## Description

Alyahmor produces a word form from a (prefix, lemma, suffix) triple.
It provides several functions:
- Generate word forms from a given word and affixes
- Generate all word forms by adding verbal or nominal affixes according to the word type
- Generate all affix combinations for verbs or nouns, which can be used in morphological analysis


#### Developers:
 Taha Zerrouki: http://tahadz.com
    taha dot zerrouki at gmail dot com

Features |   value
---------|---------------------------------------------------------------------------------
Authors  | [Authors.md](https://github.com/linuxscout/alyahmor/master/AUTHORS.md)
Release  | 0.2 
License  |[GPL](https://github.com/linuxscout/alyahmor/master/LICENSE)
Tracker  |[linuxscout/alyahmor/Issues](https://github.com/linuxscout/alyahmor/issues)
Accounts  |[@Twitter](https://twitter.com/linuxscout) 
<!-- Website  |[https://pypi.python.org/pypi/alyahmor](https://pypi.python.org/pypi/alyahmor)-->
<!--Doc  |[package Documentaion](http://pythonhosted.org/alyahmor/)
Source  |[Github](http://github.com/linuxscout/alyahmor)-->
<!--Download  |[sourceforge](http://alyahmor.sourceforge.net)-->
<!-- Feedbacks  |[Comments](http://tahadz.com/alyahmor/contact) -->




## Citation
If you cite this library in academic work, you can use the following citation:
```
T. Zerrouki, Alyahmor, Arabic morphological generator library for Python, https://pypi.python.org/pypi/alyahmor/, 2019
```
or in bibtex format
```bibtex
@misc{zerrouki2019alyahmor,
  title={alyahmor, Arabic morphological generator library for Python},
  author={Zerrouki, Taha},
  url={https://pypi.python.org/pypi/alyahmor},
  year={2019}
}
```
## Applications
* Text Stemming
* Morphology analysis 
* Text Classification and categorization
* Spellchecking


## Features مزايا

- Arabic word light stemming
- Generate word forms from a given word and affixes
- Generate all word forms by adding verbal or nominal affixes according to the word type
- Generate all affix combinations for verbs or nouns, which can be used in morphological analysis
- Generate stopword forms





## Installation
```
pip install alyahmor
```
### Requirements
``` 
pip install -r requirements.txt 
```
 - LibQutrub: Qutrub verb conjugation library: http://pypi.python.org/pypi/LibQutrub
 - PyArabic: Arabic language tools library: http://pypi.python.org/pypi/pyarabic
 - Arramooz-pysqlite: Arabic dictionary


## Origin of the name (أصل التسمية)

**Al-Yahmur (اليَحْمُور)** was al-Hasan ibn al-Ma'ali al-Baqillani, Abu Ali, the grammarian of al-Hilla, the master of Arabic of his time in Baghdad, and a student of Abu al-Baqa' al-'Ukbari (d. 637 AH).

He copied out in his own hand a great deal of literature, linguistics, and the other sciences. He showed high ambition and great diligence in gathering knowledge despite his advanced age and failing eyesight, and he was known for his vast memory, honesty, trustworthiness, modesty, and noble character.

Late in his life he adopted the Shafi'i school of law, **and the leadership of grammar passed to him.** He was born in 568 AH and died in 637 AH.

[More about Al-Yahmur (in Arabic)](doc/alyahmor.md)

## Usage




## Example 




### Generate word forms

Alyahmor joins a word with its affixes, applying the appropriate orthographic corrections. For example:

بال+كتاب+ين => بالكتابين
ب+أبناء+ه => بأبنائه
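As a rough illustration of this kind of correction (a toy sketch, not Alyahmor's actual implementation), joining can be modeled as concatenation plus a spelling rule, e.g. a word-final bare hamza (ء) moving onto a yeh seat (ئ) when a suffix follows:

```python
# Toy sketch (NOT Alyahmor's real algorithm): join affixes to a lemma
# with one orthographic correction -- a final bare hamza (U+0621)
# becomes hamza-on-yeh (U+0626) when an enclitic is attached,
# so that ب + أبناء + ه yields بأبنائه.
def join_affixes(proclitic: str, lemma: str, enclitic: str) -> str:
    stem = lemma
    if enclitic and stem.endswith("\u0621"):   # bare hamza ء
        stem = stem[:-1] + "\u0626"            # hamza on yeh ئ
    return proclitic + stem + enclitic

print(join_affixes("بال", "كتاب", "ين"))  # بالكتابين (no correction needed)
print(join_affixes("ب", "أبناء", "ه"))    # بأبنائه (hamza seat changes)
```

The real library applies many more such rules; this only shows the general shape of the operation.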

#### Nouns
To generate all forms of the word كتاب as a noun, use:
``` python
>>> import alyahmor.genelex
>>> generator = alyahmor.genelex.genelex()
>>> word = u"كِتِاب"
>>> noun_forms = generator.generate_forms( word, word_type="noun")
>>> noun_forms
[u'آلْكِتَاب', u'آلْكِتَابا', u'آلْكِتَابات', u'آلْكِتَابان', u'آلْكِتَابة', u'آلْكِتَابتان', u'آلْكِتَابتين', u'آلْكِتَابون', u'آلْكِتَابي', u'آلْكِتَابيات'
....]

```
#### Verbs
To generate all forms of the word استعمل as a verb, use:
``` python
>>> import alyahmor.genelex
>>> generator = alyahmor.genelex.genelex()
>>> word = u"استعمل"
>>> verb_forms = generator.generate_forms( word, word_type="verb")
>>> verb_forms
[u'أَأَسْتَعْمِلَ', u'أَأَسْتَعْمِلَكَ', u'أَأَسْتَعْمِلَكُمَا', u'أَأَسْتَعْمِلَكُمْ', u'أَأَسْتَعْمِلَكُنَّ', u'أَأَسْتَعْمِلَنَا', u'أَأَسْتَعْمِلَنِي', u'أَأَسْتَعْمِلَنَّ', u'أَأَسْتَعْمِلَنَّكَ', u'أَأَسْتَعْمِلَنَّكُمَا', 

....]
```

### Stop words
To generate all forms of the word إلى as a stopword, use:
``` python
>>> import alyahmor.genelex
>>> generator = alyahmor.genelex.genelex()
>>> word = "إلى"
>>> stop_forms = generator.generate_forms( word, word_type="stopword")
>>> stop_forms
['أَإِلَى', 'أَإِلَييّ', 'أَإِلَيْكَ', 'أَإِلَيْكُمَا', 'أَإِلَيْكُمْ', 'أَإِلَيْكُنَّ', 'أَإِلَيْكِ', 'أَإِلَيْنَا',
....]
```
#### Generate non-vocalized forms
To generate all forms of the word كتاب as a noun, without vocalization, use:
``` python
>>> import alyahmor.genelex
>>> generator = alyahmor.genelex.genelex()
>>> word = u"كِتِاب"
>>> noun_forms = generator.generate_forms( word, word_type="noun", vocalized=False)
>>> noun_forms
[u'آلكتاب', u'آلكتابا', u'آلكتابات', u'آلكتابان', u'آلكتابة', u'آلكتابتان', u'آلكتابتين', u'آلكتابون', u'آلكتابي', u'آلكتابيات',
....]

```
#### Generate a dictionary of vocalized forms indexed by unvocalized form
To generate all forms of the word كتاب as a noun, returned as a dict that groups the vocalized forms under their unvocalized spelling, use:
``` python
>>> import alyahmor.genelex
>>> generator = alyahmor.genelex.genelex()
>>> word = u"كِتِاب"
>>> noun_forms = generator.generate_forms( word, word_type="noun", indexed=True)
>>> noun_forms
{u'أككتابة': [u'أكَكِتَِابَةِ', u'أكَكِتَِابَةٍ'],
 u'أوككتابة': [u'أَوَكَكِتَِابَةِ', u'أَوَكَكِتَِابَةٍ'],
 u'وكتابياتهم': [u'وَكِتَِابياتهِمْ', u'وَكِتَِابِيَاتُهُمْ', u'وَكِتَِابِيَاتِهِمْ', u'وَكِتَِابِيَاتُهِمْ', u'وَكِتَِابياتهُمْ'],
 u'وكتابياتهن': [u'وَكِتَِابياتهِنَّ', u'وَكِتَِابياتهُنَّ', u'وَكِتَِابِيَاتِهِنَّ', u'وَكِتَِابِيَاتُهِنَّ', u'وَكِتَِابِيَاتُهُنَّ'],
 u'وللكتابات': [u'وَلِلْكِتَِابَاتِ', u'وَلِلْكِتَِابات'],
 u'أبكتابتكن': [u'أَبِكِتَِابَتِكُنَّ'],
 u'أبكتابتكم': [u'أَبِكِتَِابَتِكُمْ'],
 u'أكتابياتهن': [u'أَكِتَِابياتهِنَّ', u'أَكِتَِابِيَاتِهِنَّ', u'أَكِتَِابياتهُنَّ', u'أَكِتَِابِيَاتُهُنَّ', u'أَكِتَِابِيَاتُهِنَّ'],
 u'فكتاباتهم': [u'فَكِتَِاباتهِمْ', u'فَكِتَِابَاتُهُمْ', u'فَكِتَِابَاتُهِمْ', u'فَكِتَِاباتهُمْ', u'فَكِتَِابَاتِهِمْ'],
 u'بكتابياتكن': [u'بِكِتَِابِيَاتِكُنَّ', u'بِكِتَِابياتكُنَّ'],
....
}

```
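Conceptually (a minimal sketch of the assumed grouping behavior, not Alyahmor's internals), this indexing amounts to stripping the Arabic diacritics from each vocalized form and using the result as the dictionary key:

```python
# Toy sketch (assumed behavior, not the library's code): group vocalized
# forms under their unvocalized spelling by removing Arabic harakat.
from collections import defaultdict

# Tanwin, short vowels, shadda, and sukun: U+064B .. U+0652
HARAKAT = {chr(c) for c in range(0x064B, 0x0653)}

def strip_tashkeel(word: str) -> str:
    """Remove the diacritic marks, leaving the bare letters."""
    return "".join(ch for ch in word if ch not in HARAKAT)

def index_by_unvocalized(forms):
    index = defaultdict(list)
    for form in forms:
        index[strip_tashkeel(form)].append(form)
    return dict(index)

print(index_by_unvocalized(["كِتَابٌ", "كِتَابِ"]))
# {'كتاب': ['كِتَابٌ', 'كِتَابِ']}
```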

### Generate detailed forms
A detailed form contains:
* the vocalized word form, for example: "كِتَابَاتُنَا"
* the semi-vocalized form: the word without its case mark (دون علامة الإعراب), for example: "كِتَابَاتنَا"
* the segmented form: the affix parts and the word, as proclitic-prefix-word-suffix-enclitic, for example: و--كتاب-ات-نا
* tags, for example: عطف:جمع مؤنث سالم:ضمير متصل (conjunction : sound feminine plural : attached pronoun)

``` python
>>> import alyahmor.genelex
>>> generator = alyahmor.genelex.genelex()
>>> word = u"كِتِاب"
>>> noun_forms = generator.generate_forms( word, word_type="noun", indexed=True, details=True)
>>> noun_forms
  [{'vocolized': 'استعمل', 'semi-vocalized': 'استعمل', 'segmented': '-استعمل--', 'tags': '::'}, 
  {'vocolized': 'استعملي', 'semi-vocalized': 'استعملي', 'segmented': '-استعمل--ي', 'tags': ':مضاف:'},
  {'vocolized': 'استعملِي', 'semi-vocalized': 'استعملِي', 'segmented': '-استعمل--ي', 'tags': ':مضاف:'},
  {'vocolized': 'استعملكِ', 'semi-vocalized': 'استعملكِ', 'segmented': '-استعمل--ك', 'tags': ':مضاف:'}, 
  {'vocolized': 'استعملكَ', 'semi-vocalized': 'استعملكَ', 'segmented': '-استعمل--ك', 'tags': ':مضاف:'},
   {'vocolized': 'استعملكِ', 'semi-vocalized': 'استعملكِ', 'segmented': '-استعمل--ك', 'tags': ':مضاف:'}, 
   {'vocolized': 'استعملكُمُ', 'semi-vocalized': 'استعملكُمُ', 'segmented': '-استعمل--كم', 'tags': ':مضاف:'}, 
   ....]
```
### Generate affix lists
Alyahmor generates affix lists for verbs and nouns:
```python
>>> verb_affix =generator.generate_affix_list(word_type="verb", vocalized=True)
>>> verb_affix
[u'أَفَسَت-يننِي', u'أَ-ونَا', u'ي-ونكَ', u'فَلَ-تاكَ', u'وَلََن-هُنَّ', u'أَت-وننَا', u'وَ-اكُنَّ', u'ن-ننَا', u'وَت-وهَا', u'أَي-نهُمَا', ....]

>>> noun_affix =generator.generate_affix_list(word_type="noun", vocalized=True)
>>> noun_affix
[u'أكَ-ياتكَ', u'فَ-ِيَاتِكُمَا', u'أكَ-ياتكِ', u'أَوَكَ-ِينَا', u'أَلِ-ِيِّهِنَّ', u'أَفَ-َكُمَا', u'أَفَ-ِيَّتِهِمْ', u'أَفَكَ-ياتهُمْ', u'فَبِ-ِيِّكُمْ', u'وَلِ-ِيَّتِهَا', ....]

```

To generate unvocalized affixes:
```python
>>> noun_affix =generator.generate_affix_list(word_type="noun", vocalized=False)
>>> noun_affix
[u'-', u'-ا', u'-ات', u'-اتك', u'-اتكم', u'-اتكما', u'-اتكن', u'-اتنا', u'-اته', u'-اتها', ...]

```

### Generate word forms by affixes
Alyahmor generates word forms for given affixes.

* The affixes parameter is a list of four elements:
  * proclitic
  * prefix
  * suffix
  * enclitic

```python
>>> import alyahmor.genelex
>>> generator = alyahmor.genelex.genelex()
>>> word = u"كِتِاب"
>>> generator.generate_by_affixes( word, word_type="noun", affixes = [u"بال", u"", u"ين", u""])
['بِالْكِتَِابين']
>>> generator.generate_by_affixes( word, word_type="noun", affixes = [u"وك", u"", u"ِ", u""])
['وَكَكِتَِابِ']
>>> generator.generate_by_affixes( word, word_type="noun", affixes = [u"و", u"", u"", u""])
['وَكِتَِاب']

```
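The four-slot affix list above can be pictured as wrapping the word in order (a toy sketch under that assumption, ignoring the orthographic corrections the library also applies):

```python
# Toy sketch (assumption, not the library's code): a 4-element affix list
# [proclitic, prefix, suffix, enclitic] wraps the word in the order
# proclitic + prefix + WORD + suffix + enclitic.
def apply_affixes(word: str, affixes) -> str:
    proclitic, prefix, suffix, enclitic = affixes
    return proclitic + prefix + word + suffix + enclitic

print(apply_affixes("كتاب", ["بال", "", "ين", ""]))  # بالكتابين
```

Alyahmor's `generate_by_affixes` additionally vocalizes the result and adjusts the spelling where letters meet, which plain concatenation does not do.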

### Files

file/directory | description
---------------|--------------------------
tests/samples/dataset.csv | A list of verified affixes
