anyks-lm

- Name: anyks-lm
- Version: 3.5.0
- Home page: https://github.com/anyks/alm
- Summary: Smart language model
- Upload time: 2023-01-05 00:01:56
- Author: Yuriy Lobarev
- Requires Python: >=3.6
- Keywords: nlp, lm, alm, language-model
            [![ANYKS Smart language model](https://raw.githubusercontent.com/anyks/alm/master/site/img/banner.jpg)](https://anyks.com)

# ANYKS Language Model (ALM)

## Project goals and features

There are many toolkits capable of creating language models ([KenLM](https://github.com/kpu/kenlm), [SRILM](https://github.com/BitMindLab/SRILM), [IRSTLM](https://github.com/irstlm-team/irstlm)), and each of them has its reason to exist. But our language model creation toolkit has the following goals and features:

- **UTF-8 support**: Full UTF-8 support without third-party dependencies.
- **Support of many data formats**: ARPA, Vocab, Map Sequence, N-grams, binary ALM dictionary.
- **Smoothing algorithms**: Kneser-Ney, Modified Kneser-Ney, Witten-Bell, Additive, Good-Turing, Absolute discounting.
- **Normalisation and preprocessing of corpora**: Converting a corpus to lowercase, smart tokenization, ability to create black and white lists of n-grams.
- **ARPA modification**: Replacing n-grams and their frequencies, adding new n-grams with frequencies, removing n-grams.
- **Pruning**: N-gram removal based on specified criteria.
- **Removal of low-probability n-grams**: Removal of n-grams whose backoff probability is higher than their main probability.
- **ARPA recovery**: Recovery of damaged n-grams in ARPA with subsequent recalculation of their backoff probabilities.
- **Support of additional word features**: Extraction of word features (numbers, Roman numerals, ranges of numbers, numeric abbreviations, and any other custom attributes) using scripts written in Python3.
- **Text preprocessing**: Unlike all other language model toolkits, ALM can extract correct context from files with unnormalized texts.
- **Unknown word token accounting**: Accounting for the 〈unk〉 token as a full n-gram.
- **Redefinition of 〈unk〉 token**: Ability to redefine an attribute of an unknown token.
- **N-grams preprocessing**: Ability to pre-process n-grams before adding them to ARPA using custom Python3 scripts.
- **Binary container for Language Models**: The binary container supports compression, encryption and embedding of copyright information.
- **Convenient visualization of the Language model assembly process**: ALM implements several types of visualizations: textual, graphic, process indicator, and logging to files or console.
- **Collection of all n-grams**: Unlike other language model toolkits, ALM is guaranteed to extract all possible n-grams from the corpus, regardless of their length (except for Modified Kneser-Ney); you can also force all n-grams to be taken into account even if they occurred only once.

## Requirements

- [Zlib](http://www.zlib.net)
- [OpenSSL](https://www.openssl.org)
- [Python3](https://www.python.org/download/releases/3.0)
- [NLohmann::json](https://github.com/nlohmann/json)
- [BigInteger](http://mattmccutchen.net/bigint)

## Install PyBind11

```bash
$ python3 -m pip install pybind11
```

## Description of Methods

### Methods:
- **idw** - Word ID retrieval method
- **idt** - Token ID retrieval method
- **ids** - Sequence ID retrieval method

### Example:
```python
>>> import alm
>>>
>>> alm.idw("hello")
313191024
>>>
>>> alm.idw("<s>")
1
>>>
>>> alm.idw("</s>")
22
>>>
>>> alm.idw("<unk>")
3
>>>
>>> alm.idt("1424")
2
>>>
>>> alm.idt("hello")
0
>>>
>>> alm.idw("Living")
13268942501
>>>
>>> alm.idw("in")
2047
>>>
>>> alm.idw("the")
83201
>>>
>>> alm.idw("USA")
72549
>>>
>>> alm.ids([13268942501, 2047, 83201, 72549])
16314074810955466382
```

### Description
| Name      | Description                                                                                                                                           |
|-----------|-------------------------------------------------------------------------------------------------------------------------------------------------------|
|〈s〉       | Sentence beginning token                                                                                                                              |
|〈/s〉      | Sentence end token                                                                                                                                    |
|〈url〉     | URL-address token                                                                                                                                     |
|〈num〉     | Number (arabic or roman) token                                                                                                                        |
|〈unk〉     | Unknown word token                                                                                                                                    |
|〈time〉    | Time token (15:44:56)                                                                                                                                 |
|〈score〉   | Score count token (4:3 ¦ 01:04)                                                                                                                       |
|〈fract〉   | Fraction token (5/20 ¦ 192/864)                                                                                                                       |
|〈date〉    | Date token (18.07.2004 ¦ 07/18/2004)                                                                                                                  |
|〈abbr〉    | Abbreviation token (1-й ¦ 2-е ¦ 20-я ¦ p.s ¦ p.s.)                                                                                                    |
|〈dimen〉   | Dimensions token (200x300 ¦ 1920x1080)                                                                                                                |
|〈range〉   | Range of numbers token (1-2 ¦ 100-200 ¦ 300-400)                                                                                                      |
|〈aprox〉   | Approximate number token (~93 ¦ ~95.86 ¦ 10~20)                                                                                                       |
|〈anum〉    | Pseudo-number token (combination of numbers and other symbols) (T34 ¦ 895-M-86 ¦ 39km)                                                                |
|〈pcards〉  | Symbols of the play cards (♠ ¦ ♣ ¦ ♥ ¦ ♦ )                                                                                                            |
|〈punct〉   | Punctuation token (. ¦ , ¦ ? ¦ ! ¦ : ¦ ; ¦ … ¦ ¡ ¦ ¿)                                                                                                 |
|〈route〉   | Direction symbols (arrows) (← ¦ ↑ ¦ ↓ ¦ ↔ ¦ ↵ ¦ ⇐ ¦ ⇑ ¦ ⇒ ¦ ⇓ ¦ ⇔ ¦ ◄ ¦ ▲ ¦ ► ¦ ▼)                                                                    |
|〈greek〉   | Symbols of the Greek alphabet (Α ¦ Β ¦ Γ ¦ Δ ¦ Ε ¦ Ζ ¦ Η ¦ Θ ¦ Ι ¦ Κ ¦ Λ ¦ Μ ¦ Ν ¦ Ξ ¦ Ο ¦ Π ¦ Ρ ¦ Σ ¦ Τ ¦ Υ ¦ Φ ¦ Χ ¦ Ψ ¦ Ω)                         |
|〈isolat〉  | Isolation/quotation token (( ¦ ) ¦ [ ¦ ] ¦ { ¦ } ¦ " ¦ « ¦ » ¦ „ ¦ “ ¦ ` ¦ ⌈ ¦ ⌉ ¦ ⌊ ¦ ⌋ ¦ ‹ ¦ › ¦ ‚ ¦ ’ ¦ ′ ¦ ‛ ¦ ″ ¦ ‘ ¦ ” ¦ ‟ ¦ ' ¦〈 ¦ 〉)         |
|〈specl〉   | Special character token (_ ¦ @ ¦ # ¦ № ¦ © ¦ ® ¦ & ¦ § ¦ æ ¦ ø ¦ Þ ¦ – ¦ ‾ ¦ ‑ ¦ — ¦ ¯ ¦ ¶ ¦ ˆ ¦ ˜ ¦ † ¦ ‡ ¦ • ¦ ‰ ¦ ⁄ ¦ ℑ ¦ ℘ ¦ ℜ ¦ ℵ ¦ ◊ ¦ \ )     |
|〈currency〉| Symbols of world currencies ($ ¦ € ¦ ₽ ¦ ¢ ¦ £ ¦ ₤ ¦ ¤ ¦ ¥ ¦ ℳ ¦ ₣ ¦ ₴ ¦ ₸ ¦ ₹ ¦ ₩ ¦ ₦ ¦ ₭ ¦ ₪ ¦ ৳ ¦ ƒ ¦ ₨ ¦ ฿ ¦ ₫ ¦ ៛ ¦ ₮ ¦ ₱ ¦ ﷼ ¦ ₡ ¦ ₲ ¦ ؋ ¦ ₵ ¦ ₺ ¦ ₼ ¦ ₾ ¦ ₠ ¦ ₧ ¦ ₯ ¦ ₢ ¦ ₳ ¦ ₥ ¦ ₰ ¦ ₿ ¦ ұ) |
|〈math〉    | Mathematical operation token (+ ¦ - ¦ = ¦ / ¦ * ¦ ^ ¦ × ¦ ÷ ¦ − ¦ ∕ ¦ ∖ ¦ ∗ ¦ √ ¦ ∝ ¦ ∞ ¦ ∠ ¦ ± ¦ ¹ ¦ ² ¦ ³ ¦ ½ ¦ ⅓ ¦ ¼ ¦ ¾ ¦ % ¦ ~ ¦ · ¦ ⋅ ¦ ° ¦ º ¦ ¬ ¦ ƒ ¦ ∀ ¦ ∂ ¦ ∃ ¦ ∅ ¦ ∇ ¦ ∈ ¦ ∉ ¦ ∋ ¦ ∏ ¦ ∑ ¦ ∧ ¦ ∨ ¦ ∩ ¦ ∪ ¦ ∫ ¦ ∴ ¦ ∼ ¦ ≅ ¦ ≈ ¦ ≠ ¦ ≡ ¦ ≤ ¦ ≥ ¦ ª ¦ ⊂ ¦ ⊃ ¦ ⊄ ¦ ⊆ ¦ ⊇ ¦ ⊕ ¦ ⊗ ¦ ⊥ ¦ ¨) |
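
For illustration, here is a minimal sketch of querying these token types with the `token` method described below. The expected outputs in the comments are assumptions derived from the table above; actual results may depend on the configured alphabet and options.

```python
import alm

# Default Latin alphabet, as in the examples in this document.
alm.setAlphabet("abcdefghijklmnopqrstuvwxyz")

print(alm.token("15:44:56"))    # expected: <time>
print(alm.token("18.07.2004"))  # expected: <date>
print(alm.token("100-200"))     # expected: <range>
print(alm.token("T34"))         # expected: <anum>
```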

---

### Methods:
- **setZone** - Method for setting a user domain zone

### Example:
```python
>>> import alm
>>>
>>> alm.setZone("com")
>>> alm.setZone("ru")
>>> alm.setZone("org")
>>> alm.setZone("net")
```

---

### Methods:
- **clear** - Method to clear all data
- **setAlphabet** - Method to set the alphabet
- **getAlphabet** - Method to get the alphabet

### Example:
```python
>>> import alm
>>>
>>> alm.getAlphabet()
'abcdefghijklmnopqrstuvwxyz'
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.getAlphabet()
'abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя'
>>>
>>> alm.clear()
>>>
>>> alm.getAlphabet()
'abcdefghijklmnopqrstuvwxyz'
```

---

### Methods:
- **setUnknown** - Method to set the unknown word token
- **getUnknown** - Method to get the unknown word token

### Example:
```python
>>> import alm
>>>
>>> alm.setUnknown("word")
>>>
>>> alm.getUnknown()
'word'
```

---

### Methods:
- **info** - Method for printing language model (binary container) information
- **init** - Language model initialization method; signature: [smoothing = wittenBell, modified = False, prepares = False, mod = 0.0] (see the sketch after the smoothing table below)
- **token** - Method for determining the token type of a word
- **addText** - Method for adding text for estimation
- **collectCorpus** - Method for assembling the text corpus data for training ALM
- **pruneVocab** - Dictionary pruning method
- **buildArpa** - Method for building the ARPA data
- **writeALM** - Method for writing data from an ARPA file to a binary container
- **writeWords** - Method for writing the collected words to a file
- **writeVocab** - Method for writing dictionary data to a file
- **writeNgrams** - Method for writing data to an N-grams file
- **writeMap** - Method for writing the sequence map to a file
- **writeSuffix** - Method for writing data to a suffix file for digital abbreviations
- **writeAbbrs** - Method for writing data to an abbreviation file
- **getSuffixes** - Method for extracting the list of suffixes of digital abbreviations
- **writeArpa** - Method for writing data to an ARPA file
- **setSize** - Method for setting the N-gram size
- **setLocale** - Method for setting the locale (default: en_US.UTF-8)
- **pruneArpa** - Language model pruning method
- **addWord** - Method for adding a word to the dictionary
- **setThreads** - Method for setting the number of threads used in work (0 - all available threads)
- **setSubstitutes** - Method for setting letter substitutions to correct words typed with mixed alphabets
- **addAbbr** - Method for adding an abbreviation
- **setAbbrs** - Method for setting the list of abbreviations
- **getAbbrs** - Method for extracting the list of abbreviations
- **addGoodword** - Method for adding a good word (whitelist)
- **addBadword** - Method for adding a bad word (blacklist)
- **readArpa** - Method for reading a language model from an ARPA file
- **readVocab** - Method for reading the dictionary (vocab) file
- **setAdCw** - Method for setting dictionary characteristics (cw - count of all words in the dataset, ad - count of all documents in the dataset)

#### Description
| Smoothing       |
|-----------------|
| wittenBell      |
| addSmooth       |
| goodTuring      |
| constDiscount   |
| naturalDiscount |
| kneserNey       |
| modKneserNey    |
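
For reference, here is a minimal sketch of selecting one of these smoothing algorithms with `init`; the calls mirror the examples in this document, except that passing the discount for `constDiscount` through the `mod` argument is an assumption.

```python
import alm

alm.setAlphabet("abcdefghijklmnopqrstuvwxyz")

# Choose exactly one smoothing algorithm before estimation.
alm.init(alm.smoothing_t.wittenBell)                          # Witten-Bell (default)
# alm.init(alm.smoothing_t.addSmooth, False, False, 0.5)      # Additive smoothing, delta = 0.5 via mod
# alm.init(alm.smoothing_t.modKneserNey, True, True)          # Modified Kneser-Ney
# alm.init(alm.smoothing_t.constDiscount, False, False, 0.3)  # Absolute discounting (mod as discount is an assumption)
```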

### Example:
```python
>>> import alm
>>>
>>> alm.info("./lm.alm")


* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

* Name: Test Language Model

* Encryption: AES128

* Alphabet: абвгдеёжзийклмнопрстуфхцчшщъыьэюяabcdefghijklmnopqrstuvwxyz

* Build date: 09/18/2020 21:52:00

* N-gram size: 3

* Words: 9373

* N-grams: 25021

* Author: Some name

* Contacts: site: https://example.com, e-mail: info@example.com

* Copyright ©: Your company LLC

* License type: MIT

* License text:
... License text ...

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

>>> 
```

### Example:
```python
>>> import alm
>>> import json
>>> 
>>> alm.setSize(3)
>>> alm.setThreads(0)
>>> alm.setLocale("en_US.UTF-8")
>>> alm.setAlphabet("абвгдеёжзийклмнопрстуфхцчшщъыьэюяabcdefghijklmnopqrstuvwxyz")
>>> alm.setSubstitutes({'p':'р','c':'с','o':'о','t':'т','k':'к','e':'е','a':'а','h':'н','x':'х','b':'в','m':'м'})
>>> 
>>> alm.setOption(alm.options_t.allowUnk)
>>> alm.setOption(alm.options_t.resetUnk)
>>> alm.setOption(alm.options_t.mixDicts)
>>> alm.setOption(alm.options_t.tokenWords)
>>> alm.setOption(alm.options_t.interpolate)
>>> 
>>> alm.init(alm.smoothing_t.modKneserNey, True, True)
>>> 
>>> p = alm.getParams()
>>> p.algorithm
4
>>> p.mod
0.0
>>> p.prepares
True
>>> p.modified
True
>>> alm.idw("Сбербанк")
13236490857
>>> alm.idw("Совкомбанк")
22287680895
>>> 
>>> alm.token("Сбербанк")
'<unk>'
>>> alm.token("совкомбанк")
'<unk>'
>>> 
>>> alm.setAbbrs({13236490857, 22287680895})
>>> 
>>> alm.addAbbr("США")
>>> alm.addAbbr("Сбер")
>>> 
>>> alm.token("Сбербанк")
'<abbr>'
>>> alm.token("совкомбанк")
'<abbr>'
>>> 
>>> alm.token("сша")
'<abbr>'
>>> alm.token("СБЕР")
'<abbr>'
>>> 
>>> alm.getAbbrs()
{13236490857, 189243, 22287680895, 26938511}
>>> 
>>> alm.addGoodword("T-34")
>>> alm.addGoodword("АН-25")
>>> 
>>> alm.addBadword("ийти")
>>> alm.addBadword("циган")
>>> alm.addBadword("апичатка")
>>> 
>>> alm.addWord("министерство")
>>> alm.addWord("возмездие", 0, 1)
>>> alm.addWord("возражение", alm.idw("возражение"), 2)
>>> 
>>> def status(text, status):
...     print(text, status)
... 
>>> def statusWriteALM(status):
...     print("Write ALM", status)
... 
>>> def statusWriteArpa(status):
...     print("Write ARPA", status)
... 
>>> def statusBuildArpa(status):
...     print("Build ARPA", status)
... 
>>> def statusPrune(status):
...     print("Prune data", status)
... 
>>> def statusWords(status):
...     print("Write words", status)
... 
>>> def statusVocab(status):
...     print("Write vocab", status)
... 
>>> def statusNgram(status):
...     print("Write ngram", status)
... 
>>> def statusMap(status):
...     print("Write map", status)
... 
>>> def statusSuffix(status):
...     print("Write suffix", status)
... 
>>> def statusAbbreviation(status):
...     print("Write abbreviation", status)
... 
>>> alm.addText("The future is now", 0)
>>> 
>>> alm.collectCorpus("./correct.txt", status)
Read text corpora 0
Read text corpora 1
Read text corpora 2
Read text corpora 3
Read text corpora 4
Read text corpora 5
Read text corpora 6
...
>>> alm.pruneVocab(-15.0, 0, 0, statusPrune)
Prune data 0
Prune data 1
Prune data 2
Prune data 3
Prune data 4
Prune data 5
Prune data 6
...
>>> alm.pruneArpa(0.015, 3, statusPrune)
Prune data 0
Prune data 1
Prune data 2
Prune data 3
Prune data 4
Prune data 5
Prune data 6
...
>>> meta = {
...     "aes": 128,
...     "name": "Test Language Model",
...     "author": "Some name",
...     "lictype": "MIT",
...     "password": "password",
...     "copyright": "You company LLC",
...     "lictext": "... License text ...",
...     "contacts": "site: https://example.com, e-mail: info@example.com"
... }
>>> 
>>> alm.writeALM("./lm.alm", json.dumps(meta), statusWriteALM)
Write ALM 0
Write ALM 0
Write ALM 0
Write ALM 0
Write ALM 0
Write ALM 0
...
>>> alm.writeWords("./words.txt", statusWords)
Write words 0
Write words 1
Write words 2
Write words 3
Write words 4
Write words 5
Write words 6
...
>>> alm.writeVocab("./lm.vocab", statusVocab)
Write vocab 0
Write vocab 1
Write vocab 2
Write vocab 3
Write vocab 4
Write vocab 5
Write vocab 6
...
>>> alm.writeNgrams("./lm.ngram", statusNgram)
Write ngram 0
Write ngram 1
Write ngram 2
Write ngram 3
Write ngram 4
Write ngram 5
Write ngram 6
...
>>> alm.writeMap("./lm.map", statusMap, "|")
Write map 0
Write map 1
Write map 2
Write map 3
Write map 4
Write map 5
Write map 6
...
>>> alm.writeSuffix("./suffix.txt", statusSuffix)
Write suffix 10
Write suffix 20
Write suffix 30
Write suffix 40
Write suffix 50
Write suffix 60
...
>>> alm.writeAbbrs("./words.abbr", statusAbbreviation)
Write abbreviation 25
Write abbreviation 50
Write abbreviation 75
Write abbreviation 100
...
>>> alm.getAbbrs()
{13236490857, 189243, 22287680895, 26938511}
>>> 
>>> alm.getSuffixes()
{2633, 1662978425, 14279182218, 3468, 47, 28876661395, 29095464659, 2968, 57, 30}
>>> 
>>> alm.buildArpa(statusBuildArpa)
Build ARPA 0
Build ARPA 1
Build ARPA 2
Build ARPA 3
Build ARPA 4
Build ARPA 5
Build ARPA 6
...
>>> alm.writeArpa("./lm.arpa", statusWriteArpa)
Write ARPA 0
Write ARPA 1
Write ARPA 2
Write ARPA 3
Write ARPA 4
Write ARPA 5
Write ARPA 6
...
```

---

### Methods:
- **setOption** - Method for setting a library option
- **unsetOption** - Method for disabling a library option

### Example:
```python
>>> import alm
>>>
>>> alm.unsetOption(alm.options_t.debug)
>>> alm.unsetOption(alm.options_t.mixDicts)
>>> alm.unsetOption(alm.options_t.onlyGood)
>>> alm.unsetOption(alm.options_t.confidence)
...
```

#### Description
| Options     | Description                                                                            |
|-------------|----------------------------------------------------------------------------------------|
| debug       | Debug mode flag                                                                        |
| stress      | Flag allowing stress marks in words                                                    |
| uppers      | Flag allowing correction of letter case                                                |
| onlyGood    | Flag allowing only words from the whitelist to be considered                           |
| mixDicts    | Flag allowing the use of words made up of mixed dictionaries                           |
| allowUnk    | Flag allowing the unknown word token                                                   |
| resetUnk    | Flag to reset the frequency of the unknown word token                                  |
| allGrams    | Flag allowing accounting of all collected n-grams                                      |
| lowerCase   | Flag enabling case-insensitive processing (lowercasing)                                |
| confidence  | Flag for loading the ARPA file without pre-processing the words                        |
| tokenWords  | Flag to take into account, when assembling N-grams, only those tokens that match words |
| interpolate | Flag allowing the use of interpolation during estimation                               |

---

### Methods:
- **readMap** - Method for reading a sequence map from a file

### Example:
```python
>>> import alm
>>>
>>> alm.setLocale("en_US.UTF-8")
>>>
>>> alm.setAlphabet("абвгдеёжзийклмнопрстуфхцчшщъыьэюяabcdefghijklmnopqrstuvwxyz")
>>> alm.setSubstitutes({'p':'р','c':'с','o':'о','t':'т','k':'к','e':'е','a':'а','h':'н','x':'х','b':'в','m':'м'})
>>>
>>> alm.setOption(alm.options_t.allowUnk)
>>> alm.setOption(alm.options_t.resetUnk)
>>> alm.setOption(alm.options_t.mixDicts)
>>> 
>>> def statusMap(text, status):
...     print("Read map", text, status)
... 
>>> def statusBuildArpa(status):
...     print("Build ARPA", status)
... 
>>> def statusPrune(status):
...     print("Prune data", status)
... 
>>> def statusVocab(text, status):
...     print("Read Vocab", text, status)
... 
>>> def statusWriteArpa(status):
...     print("Write ARPA", status)
... 
>>> alm.init(alm.smoothing_t.wittenBell)
>>> 
>>> p = alm.getParams()
>>> p.algorithm
2
>>> alm.readVocab("./lm.vocab", statusVocab)
Read Vocab ./lm.vocab 0
Read Vocab ./lm.vocab 1
Read Vocab ./lm.vocab 2
Read Vocab ./lm.vocab 3
Read Vocab ./lm.vocab 4
Read Vocab ./lm.vocab 5
Read Vocab ./lm.vocab 6
...
>>> alm.readMap("./lm1.map", statusMap, "|")
Read map ./lm1.map 0
Read map ./lm1.map 1
Read map ./lm1.map 2
Read map ./lm1.map 3
Read map ./lm1.map 4
Read map ./lm1.map 5
Read map ./lm1.map 6
...
>>> alm.readMap("./lm2.map", statusMap, "|")
Read map ./lm2.map 0
Read map ./lm2.map 1
Read map ./lm2.map 2
Read map ./lm2.map 3
Read map ./lm2.map 4
Read map ./lm2.map 5
Read map ./lm2.map 6
...
>>> alm.pruneVocab(-15.0, 0, 0, statusPrune)
Prune data 0
Prune data 1
Prune data 2
Prune data 3
Prune data 4
Prune data 5
Prune data 6
...
>>> alm.buildArpa(statusBuildArpa)
Build ARPA 0
Build ARPA 1
Build ARPA 2
Build ARPA 3
Build ARPA 4
Build ARPA 5
Build ARPA 6
...
>>> alm.writeArpa("./lm.arpa", statusWriteArpa)
Write ARPA 0
Write ARPA 1
Write ARPA 2
Write ARPA 3
Write ARPA 4
Write ARPA 5
Write ARPA 6
...
>>> def getWords(word, idw, oc, dc, count):
...     print(word, idw, oc, dc, count)
...     return True
... 
>>> alm.words(getWords)
а 25 244 12 9373
б 26 11 6 9373
в 27 757 12 9373
ж 32 12 7 9373
и 34 823 12 9373
к 36 102 12 9373
о 40 63 12 9373
п 41 1 1 9373
р 42 1 1 9373
с 43 290 12 9373
у 45 113 12 9373
Х 47 1 1 9373
я 57 299 12 9373
D 61 1 1 9373
I 66 1 1 9373
да 2179 32 10 9373
за 2183 92 12 9373
на 2189 435 12 9373
па 2191 1 1 9373
та 2194 4 4 9373
об 2276 20 10 9373
...
>>> alm.getStatistic()
(13, 38124)
>>> alm.setAdCw(44381, 20)
>>> alm.getStatistic()
(20, 44381)
```

### Example:
```python
>>> import alm
>>>
>>> alm.setAlphabet("абвгдеёжзийклмнопрстуфхцчшщъыьэюяabcdefghijklmnopqrstuvwxyz")
>>> alm.setSubstitutes({'p':'р','c':'с','o':'о','t':'т','k':'к','e':'е','a':'а','h':'н','x':'х','b':'в','m':'м'})
>>>
>>> alm.setOption(alm.options_t.allowUnk)
>>> alm.setOption(alm.options_t.resetUnk)
>>> alm.setOption(alm.options_t.mixDicts)
>>> 
>>> def statusBuildArpa(status):
...     print("Build ARPA", status)
... 
>>> def statusPrune(status):
...     print("Prune data", status)
... 
>>> def statusNgram(text, status):
...     print("Read Ngram", text, status)
... 
>>> def statusWriteArpa(status):
...     print("Write ARPA", status)
... 
>>> alm.init(alm.smoothing_t.addSmooth, False, False, 0.5)
>>> 
>>> p = alm.getParams()
>>> p.algorithm
0
>>> p.mod
0.5
>>> p.prepares
False
>>> p.modified
False
>>> 
>>> alm.readNgram("./lm.ngram", statusNgram)
Read Ngram ./lm.ngram 0
Read Ngram ./lm.ngram 1
Read Ngram ./lm.ngram 2
Read Ngram ./lm.ngram 3
Read Ngram ./lm.ngram 4
Read Ngram ./lm.ngram 5
Read Ngram ./lm.ngram 6
...
>>> alm.pruneVocab(-15.0, 0, 0, statusPrune)
Prune data 0
Prune data 1
Prune data 2
Prune data 3
Prune data 4
Prune data 5
Prune data 6
...
>>> alm.buildArpa(statusBuildArpa)
Build ARPA 0
Build ARPA 1
Build ARPA 2
Build ARPA 3
Build ARPA 4
Build ARPA 5
Build ARPA 6
...
>>> alm.writeArpa("./lm.arpa", statusWriteArpa)
Write ARPA 0
Write ARPA 1
Write ARPA 2
Write ARPA 3
Write ARPA 4
Write ARPA 5
Write ARPA 6
...
```

---

### Methods:
- **modify** - ARPA modification method
- **sweep** - ARPA Low Frequency N-gram Removal Method
- **repair** - Method of repair of previously calculated ARPA

### Example:
```python
>>> import alm
>>>
>>> alm.setAlphabet("абвгдеёжзийклмнопрстуфхцчшщъыьэюяabcdefghijklmnopqrstuvwxyz")
>>>
>>> alm.setOption(alm.options_t.confidence)
>>> 
>>> def statusSweep(text, status):
...     print("Sweep n-grams", text, status)
... 
>>> def statusWriteArpa(status):
...     print("Write ARPA", status)
... 
>>> alm.init()
>>> 
>>> alm.sweep("./lm.arpa", statusSweep)
Sweep n-grams Read ARPA file 0
Sweep n-grams Read ARPA file 1
Sweep n-grams Read ARPA file 2
Sweep n-grams Read ARPA file 3
Sweep n-grams Read ARPA file 4
Sweep n-grams Read ARPA file 5
Sweep n-grams Read ARPA file 6
...
Sweep n-grams Sweep N-grams 0
Sweep n-grams Sweep N-grams 1
Sweep n-grams Sweep N-grams 2
Sweep n-grams Sweep N-grams 3
Sweep n-grams Sweep N-grams 4
Sweep n-grams Sweep N-grams 5
Sweep n-grams Sweep N-grams 6
...
>>> alm.writeArpa("./lm.arpa", statusWriteArpa)
Write ARPA 0
Write ARPA 1
Write ARPA 2
Write ARPA 3
Write ARPA 4
Write ARPA 5
Write ARPA 6
...
>>> alm.clear()
>>> 
>>> alm.setAlphabet("абвгдеёжзийклмнопрстуфхцчшщъыьэюяabcdefghijklmnopqrstuvwxyz")
>>> 
>>> def statusRepair(text, status):
...     print("Repair n-grams", text, status)
... 
>>> def statusWriteArpa(status):
...     print("Write ARPA", status)
... 
>>> alm.init()
>>> 
>>> alm.repair("./lm.arpa", statusRepair)
Repair n-grams Read ARPA file 0
Repair n-grams Read ARPA file 1
Repair n-grams Read ARPA file 2
Repair n-grams Read ARPA file 3
Repair n-grams Read ARPA file 4
Repair n-grams Read ARPA file 5
Repair n-grams Read ARPA file 6
...
Repair n-grams Repair ARPA data 0
Repair n-grams Repair ARPA data 1
Repair n-grams Repair ARPA data 2
Repair n-grams Repair ARPA data 3
Repair n-grams Repair ARPA data 4
Repair n-grams Repair ARPA data 5
Repair n-grams Repair ARPA data 6
...
>>> alm.writeArpa("./lm.arpa", statusWriteArpa)
Write ARPA 0
Write ARPA 1
Write ARPA 2
Write ARPA 3
Write ARPA 4
Write ARPA 5
Write ARPA 6
...
>>> alm.clear()
>>> 
>>> alm.setAlphabet("абвгдеёжзийклмнопрстуфхцчшщъыьэюяabcdefghijklmnopqrstuvwxyz")
>>> 
>>> def statusModify(text, status):
...     print("Modify ARPA data", text, status)
... 
>>> def statusWriteArpa(status):
...     print("Write ARPA", status)
... 
>>> alm.init()
>>> 
>>> alm.modify("./lm.arpa", "./remove.txt", alm.modify_t.remove, statusModify)
Modify ARPA data Read ARPA file 0
Modify ARPA data Read ARPA file 1
Modify ARPA data Read ARPA file 2
Modify ARPA data Read ARPA file 3
Modify ARPA data Read ARPA file 4
Modify ARPA data Read ARPA file 5
Modify ARPA data Read ARPA file 6
...
Modify ARPA data Modify ARPA data 3
Modify ARPA data Modify ARPA data 10
Modify ARPA data Modify ARPA data 15
Modify ARPA data Modify ARPA data 18
Modify ARPA data Modify ARPA data 24
Modify ARPA data Modify ARPA data 30
...
>>> alm.writeArpa("./lm.arpa", statusWriteArpa)
Write ARPA 0
Write ARPA 1
Write ARPA 2
Write ARPA 3
Write ARPA 4
Write ARPA 5
Write ARPA 6
...
```

### Modification flags
| Name    | Description                                             |
|---------|---------------------------------------------------------|
| emplace | Flag of adding n-gram into existing ARPA file           |
| remove  | Flag of removing n-gram from existing ARPA file         |
| change  | Flag of changing n-gram frequency in existing ARPA file |
| replace | Flag of replacing n-gram in existing ARPA file          |

### File of adding n-gram into existing ARPA file
```
-3.002006	США
-1.365296	границ США
-0.988534	у границ США
-1.759398	замуж за
-0.092796	собираюсь замуж за
-0.474876	и тоже
-19.18453	можно и тоже
...
```

| N-gram frequency      | Separator   | N-gram       |
|-----------------------|-------------|--------------|
| -0.988534             | \t          | у границ США |

### File of changing n-gram frequency in existing ARPA file
```
-0.6588787	получайте удовольствие </s>
-0.6588787	только в одном
-0.6588787	работа связана с
-0.6588787	мужчины и женщины
-0.6588787	говоря про то
-0.6588787	потому что я
-0.6588787	потому что это
-0.6588787	работу потому что
-0.6588787	пейзажи за окном
-0.6588787	статусы для одноклассников
-0.6588787	вообще не хочу
...
```

| N-gram frequency      | Separator   | N-gram            |
|-----------------------|-------------|-------------------|
| -0.6588787            | \t          | мужчины и женщины |

### File of replacing n-gram in existing ARPA file
```
коем случае нельзя	там да тут
но тем не	да ты что
неожиданный у	ожидаемый к
в СМИ	в ФСБ
Шах	Мат
...
```

| Existing N-gram       | Separator   | New N-gram        |
|-----------------------|-------------|-------------------|
| но тем не             | \t          | да ты что         |

### File of removing n-gram from existing ARPA file
```
ну то есть
ну очень большой
бы было если
мы с ней
ты смеешься над
два года назад
над тем что
или еще что-то
как я понял
как ни удивительно
как вы знаете
так и не
все-таки права
все-таки болят
все-таки сдохло
все-таки встала
все-таки решился
уже
мне
мое
все
...
```
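
To tie these formats together, here is a minimal sketch that writes a removal list and applies it with `modify`, mirroring the `modify_t.remove` example above; the file name `remove.txt` is illustrative.

```python
import alm

# Write one n-gram per line, as in the removal file format above.
with open("./remove.txt", "w", encoding="utf-8") as f:
    f.write("ну то есть\n")
    f.write("два года назад\n")

def statusModify(text, status):
    print("Modify ARPA data", text, status)

def statusWriteArpa(status):
    print("Write ARPA", status)

alm.setAlphabet("абвгдеёжзийклмнопрстуфхцчшщъыьэюяabcdefghijklmnopqrstuvwxyz")
alm.init()

# Remove the listed n-grams from the existing ARPA file and write the result back.
alm.modify("./lm.arpa", "./remove.txt", alm.modify_t.remove, statusModify)
alm.writeArpa("./lm.arpa", statusWriteArpa)
```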

---

### Methods:
- **mix** - Method for interpolating multiple ARPA files [backward = True, forward = False]
- **mix** - Method for interpolating multiple ARPA files with the Bayesian or log-linear algorithm [Bayes: length > 0, Loglinear: length == 0]

### Example:
```python
>>> import alm
>>> 
>>> alm.setAlphabet("абвгдеёжзийклмнопрстуфхцчшщъыьэюяabcdefghijklmnopqrstuvwxyz")
>>> 
>>> alm.setOption(alm.options_t.confidence)
>>> 
>>> def statusMix(text, status):
...     print("Mix ARPA data", text, status)
... 
>>> def statusWriteArpa(status):
...     print("Write ARPA", status)
... 
>>> alm.init()
>>> 
>>> alm.mix(["./lm1.arpa", "./lm2.arpa"], [0.02, 0.05], True, statusMix)
Mix ARPA data ./lm1.arpa 0
Mix ARPA data ./lm1.arpa 1
Mix ARPA data ./lm1.arpa 2
Mix ARPA data ./lm1.arpa 3
Mix ARPA data ./lm1.arpa 4
Mix ARPA data ./lm1.arpa 5
Mix ARPA data ./lm1.arpa 6
...
Mix ARPA data  0
Mix ARPA data  1
Mix ARPA data  2
Mix ARPA data  3
Mix ARPA data  4
Mix ARPA data  5
Mix ARPA data  6
>>> alm.writeArpa("./lm.arpa", statusWriteArpa)
Write ARPA 0
Write ARPA 1
Write ARPA 2
Write ARPA 3
Write ARPA 4
Write ARPA 5
Write ARPA 6
...
>>> alm.clear()
>>> 
>>> alm.setAlphabet("абвгдеёжзийклмнопрстуфхцчшщъыьэюяabcdefghijklmnopqrstuvwxyz")
>>> 
>>> def statusMix(text, status):
...     print("Mix ARPA data", text, status)
... 
>>> def statusWriteArpa(status):
...     print("Write ARPA", status)
... 
>>> alm.init()
>>> 
>>> alm.mix(["./lm1.arpa", "./lm2.arpa"], [0.02, 0.05], 0, 0.032, statusMix)
Mix ARPA data ./lm1.arpa 0
Mix ARPA data ./lm1.arpa 1
Mix ARPA data ./lm1.arpa 2
Mix ARPA data ./lm1.arpa 3
Mix ARPA data ./lm1.arpa 4
Mix ARPA data ./lm1.arpa 5
Mix ARPA data ./lm1.arpa 6
...
Mix ARPA data  0
Mix ARPA data  1
Mix ARPA data  2
Mix ARPA data  3
Mix ARPA data  4
Mix ARPA data  5
Mix ARPA data  6
>>> alm.writeArpa("./lm.arpa", statusWriteArpa)
Write ARPA 0
Write ARPA 1
Write ARPA 2
Write ARPA 3
Write ARPA 4
Write ARPA 5
Write ARPA 6
...
```

---

### Methods:
- **size** - Method for obtaining the N-gram size

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.size()
3
```

---

### Methods:
- **damerauLevenshtein** - Method for determining the Damerau-Levenshtein distance between phrases
- **distanceLevenshtein** - Method for determining the Levenshtein distance between phrases
- **mulctLevenshtein** - Method for determining the weighted (mulct) Levenshtein distance between phrases
- **tanimoto** - Method for determining the Jaccard index (Tanimoto coefficient)
- **needlemanWunsch** - Word alignment (stretching) method

### Example:
```python
>>> import alm
>>> alm.damerauLevenshtein("привет", "приветик")
2
>>> 
>>> alm.damerauLevenshtein("приевтик", "приветик")
1
>>> 
>>> alm.distanceLevenshtein("приевтик", "приветик")
2
>>> 
>>> alm.tanimoto("привет", "приветик")
0.7142857142857143
>>> 
>>> alm.tanimoto("привеитк", "приветик")
0.4
>>> 
>>> alm.needlemanWunsch("привеитк", "приветик")
4
>>> 
>>> alm.needlemanWunsch("привет", "приветик")
2
>>> 
>>> alm.damerauLevenshtein("acre", "car")
2
>>> alm.distanceLevenshtein("acre", "car")
3
>>> 
>>> alm.damerauLevenshtein("anteater", "theatre")
4
>>> alm.distanceLevenshtein("anteater", "theatre")
5
>>> 
>>> alm.damerauLevenshtein("banana", "nanny")
3
>>> alm.distanceLevenshtein("banana", "nanny")
3
>>> 
>>> alm.damerauLevenshtein("cat", "crate")
2
>>> alm.distanceLevenshtein("cat", "crate")
2
>>>
>>> alm.mulctLevenshtein("привет", "приветик")
4
>>>
>>> alm.mulctLevenshtein("приевтик", "приветик")
1
>>>
>>> alm.mulctLevenshtein("acre", "car")
3
>>>
>>> alm.mulctLevenshtein("anteater", "theatre")
5
>>>
>>> alm.mulctLevenshtein("banana", "nanny")
4
>>>
>>> alm.mulctLevenshtein("cat", "crate")
4
```

---

### Methods:
- **textToJson** - Method to convert text to JSON
- **isAllowApostrophe** - Apostrophe permission check method
- **switchAllowApostrophe** - Method for permitting or denying an apostrophe as part of a word

### Example:
```python
>>> import alm
>>>
>>> def callbackFn(text):
...     print(text)
... 
>>> alm.isAllowApostrophe()
False
>>> alm.switchAllowApostrophe()
>>>
>>> alm.isAllowApostrophe()
True
>>> alm.textToJson("«On nous dit qu'aujourd'hui c'est le cas, encore faudra-t-il l'évaluer» l'astronomie", callbackFn)
[["«","On","nous","dit","qu'aujourd'hui","c'est","le","cas",",","encore","faudra-t-il","l'évaluer","»","l'astronomie"]]
```

---

### Methods:
- **jsonToText** - Method to convert JSON to text

### Example:
```python
>>> import alm
>>>
>>> def callbackFn(text):
...     print(text)
... 
>>> alm.jsonToText('[["«","On","nous","dit","qu\'aujourd\'hui","c\'est","le","cas",",","encore","faudra-t-il","l\'évaluer","»","l\'astronomie"]]', callbackFn)
«On nous dit qu'aujourd'hui c'est le cas, encore faudra-t-il l'évaluer» l'astronomie
```

---

### Methods:
- **restore** - Method for restoring text from context

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.uppers)
>>>
>>> alm.restore(["«","On","nous","dit","qu\'aujourd\'hui","c\'est","le","cas",",","encore","faudra-t-il","l\'évaluer","»","l\'astronomie"])
"«On nous dit qu'aujourd'hui c'est le cas, encore faudra-t-il l'évaluer» l'astronomie"
```

---

### Methods:
- **allowStress** - Method to allow the use of stress marks in words
- **disallowStress** - Method to disallow the use of stress marks in words

### Example:
```python
>>> import alm
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> def callbackFn(text):
...     print(text)
... 
>>> alm.textToJson('«Бе́лая стрела́» — согласно распространённой в 1990-е годы в России городской легенде, якобы специально организованная и подготовленная законспирированная правительственная спецслужба, сотрудники которой — бывшие и действовавшие милиционеры и спецназовцы, имеющие право на физическую ликвидацию особо опасных уголовных авторитетов и лидеров орудовавших в России ОПГ, относительно которых не представляется возможным привлечения их к уголовной ответственности законными методами[1][2][3]. Несмотря на отсутствие официальных доказательств существования организации и многочисленные опровержения со стороны силовых структур и служб безопасности[4], в российском обществе легенду считают основанной на подлинных фактах громких убийств криминальных авторитетов, совершённых в 1990-е годы, и не исключают существование реальной спецслужбы[5].', callbackFn)
[["«","Белая","стрела","»","—","согласно","распространённой","в","1990-е","годы","в","России","городской","легенде",",","якобы","специально","организованная","и","подготовленная","законспирированная","правительственная","спецслужба",",","сотрудники","которой","—","бывшие","и","действовавшие","милиционеры","и","спецназовцы",",","имеющие","право","на","физическую","ликвидацию","особо","опасных","уголовных","авторитетов","и","лидеров","орудовавших","в","России","ОПГ",",","относительно","которых","не","представляется","возможным","привлечения","их","к","уголовной","ответственности","законными","методами","[","1","]","[","2","]","[","3","]","."],["Несмотря","на","отсутствие","официальных","доказательств","существования","организации","и","многочисленные","опровержения","со","стороны","силовых","структур","и","служб","безопасности","[","4","]",",","в","российском","обществе","легенду","считают","основанной","на","подлинных","фактах","громких","убийств","криминальных","авторитетов",",","совершённых","в","1990-е","годы",",","и","не","исключают","существование","реальной","спецслужбы","[","5","]","."]]
>>>
>>> alm.jsonToText('[["«","Белая","стрела","»","—","согласно","распространённой","в","1990-е","годы","в","России","городской","легенде",",","якобы","специально","организованная","и","подготовленная","законспирированная","правительственная","спецслужба",",","сотрудники","которой","—","бывшие","и","действовавшие","милиционеры","и","спецназовцы",",","имеющие","право","на","физическую","ликвидацию","особо","опасных","уголовных","авторитетов","и","лидеров","орудовавших","в","России","ОПГ",",","относительно","которых","не","представляется","возможным","привлечения","их","к","уголовной","ответственности","законными","методами","[","1","]","[","2","]","[","3","]","."],["Несмотря","на","отсутствие","официальных","доказательств","существования","организации","и","многочисленные","опровержения","со","стороны","силовых","структур","и","служб","безопасности","[","4","]",",","в","российском","обществе","легенду","считают","основанной","на","подлинных","фактах","громких","убийств","криминальных","авторитетов",",","совершённых","в","1990-е","годы",",","и","не","исключают","существование","реальной","спецслужбы","[","5","]","."]]', callbackFn)
«Белая стрела» — согласно распространённой в 1990-е годы в России городской легенде, якобы специально организованная и подготовленная законспирированная правительственная спецслужба, сотрудники которой — бывшие и действовавшие милиционеры и спецназовцы, имеющие право на физическую ликвидацию особо опасных уголовных авторитетов и лидеров орудовавших в России ОПГ, относительно которых не представляется возможным привлечения их к уголовной ответственности законными методами [1] [2] [3].
Несмотря на отсутствие официальных доказательств существования организации и многочисленные опровержения со стороны силовых структур и служб безопасности [4], в российском обществе легенду считают основанной на подлинных фактах громких убийств криминальных авторитетов, совершённых в 1990-е годы, и не исключают существование реальной спецслужбы [5].
>>>
>>> alm.allowStress()
>>> alm.textToJson('«Бе́лая стрела́» — согласно распространённой в 1990-е годы в России городской легенде, якобы специально организованная и подготовленная законспирированная правительственная спецслужба, сотрудники которой — бывшие и действовавшие милиционеры и спецназовцы, имеющие право на физическую ликвидацию особо опасных уголовных авторитетов и лидеров орудовавших в России ОПГ, относительно которых не представляется возможным привлечения их к уголовной ответственности законными методами[1][2][3]. Несмотря на отсутствие официальных доказательств существования организации и многочисленные опровержения со стороны силовых структур и служб безопасности[4], в российском обществе легенду считают основанной на подлинных фактах громких убийств криминальных авторитетов, совершённых в 1990-е годы, и не исключают существование реальной спецслужбы[5].', callbackFn)
[["«","Бе́лая","стрела́","»","—","согласно","распространённой","в","1990-е","годы","в","России","городской","легенде",",","якобы","специально","организованная","и","подготовленная","законспирированная","правительственная","спецслужба",",","сотрудники","которой","—","бывшие","и","действовавшие","милиционеры","и","спецназовцы",",","имеющие","право","на","физическую","ликвидацию","особо","опасных","уголовных","авторитетов","и","лидеров","орудовавших","в","России","ОПГ",",","относительно","которых","не","представляется","возможным","привлечения","их","к","уголовной","ответственности","законными","методами","[","1","]","[","2","]","[","3","]","."],["Несмотря","на","отсутствие","официальных","доказательств","существования","организации","и","многочисленные","опровержения","со","стороны","силовых","структур","и","служб","безопасности","[","4","]",",","в","российском","обществе","легенду","считают","основанной","на","подлинных","фактах","громких","убийств","криминальных","авторитетов",",","совершённых","в","1990-е","годы",",","и","не","исключают","существование","реальной","спецслужбы","[","5","]","."]]
>>>
>>> alm.jsonToText('[["«","Бе́лая","стрела́","»","—","согласно","распространённой","в","1990-е","годы","в","России","городской","легенде",",","якобы","специально","организованная","и","подготовленная","законспирированная","правительственная","спецслужба",",","сотрудники","которой","—","бывшие","и","действовавшие","милиционеры","и","спецназовцы",",","имеющие","право","на","физическую","ликвидацию","особо","опасных","уголовных","авторитетов","и","лидеров","орудовавших","в","России","ОПГ",",","относительно","которых","не","представляется","возможным","привлечения","их","к","уголовной","ответственности","законными","методами","[","1","]","[","2","]","[","3","]","."],["Несмотря","на","отсутствие","официальных","доказательств","существования","организации","и","многочисленные","опровержения","со","стороны","силовых","структур","и","служб","безопасности","[","4","]",",","в","российском","обществе","легенду","считают","основанной","на","подлинных","фактах","громких","убийств","криминальных","авторитетов",",","совершённых","в","1990-е","годы",",","и","не","исключают","существование","реальной","спецслужбы","[","5","]","."]]', callbackFn)
«Бе́лая стрела́» — согласно распространённой в 1990-е годы в России городской легенде, якобы специально организованная и подготовленная законспирированная правительственная спецслужба, сотрудники которой — бывшие и действовавшие милиционеры и спецназовцы, имеющие право на физическую ликвидацию особо опасных уголовных авторитетов и лидеров орудовавших в России ОПГ, относительно которых не представляется возможным привлечения их к уголовной ответственности законными методами [1] [2] [3].
Несмотря на отсутствие официальных доказательств существования организации и многочисленные опровержения со стороны силовых структур и служб безопасности [4], в российском обществе легенду считают основанной на подлинных фактах громких убийств криминальных авторитетов, совершённых в 1990-е годы, и не исключают существование реальной спецслужбы [5].
>>>
>>> alm.disallowStress()
>>> alm.textToJson('«Бе́лая стрела́» — согласно распространённой в 1990-е годы в России городской легенде, якобы специально организованная и подготовленная законспирированная правительственная спецслужба, сотрудники которой — бывшие и действовавшие милиционеры и спецназовцы, имеющие право на физическую ликвидацию особо опасных уголовных авторитетов и лидеров орудовавших в России ОПГ, относительно которых не представляется возможным привлечения их к уголовной ответственности законными методами[1][2][3]. Несмотря на отсутствие официальных доказательств существования организации и многочисленные опровержения со стороны силовых структур и служб безопасности[4], в российском обществе легенду считают основанной на подлинных фактах громких убийств криминальных авторитетов, совершённых в 1990-е годы, и не исключают существование реальной спецслужбы[5].', callbackFn)
[["«","Белая","стрела","»","—","согласно","распространённой","в","1990-е","годы","в","России","городской","легенде",",","якобы","специально","организованная","и","подготовленная","законспирированная","правительственная","спецслужба",",","сотрудники","которой","—","бывшие","и","действовавшие","милиционеры","и","спецназовцы",",","имеющие","право","на","физическую","ликвидацию","особо","опасных","уголовных","авторитетов","и","лидеров","орудовавших","в","России","ОПГ",",","относительно","которых","не","представляется","возможным","привлечения","их","к","уголовной","ответственности","законными","методами","[","1","]","[","2","]","[","3","]","."],["Несмотря","на","отсутствие","официальных","доказательств","существования","организации","и","многочисленные","опровержения","со","стороны","силовых","структур","и","служб","безопасности","[","4","]",",","в","российском","обществе","легенду","считают","основанной","на","подлинных","фактах","громких","убийств","криминальных","авторитетов",",","совершённых","в","1990-е","годы",",","и","не","исключают","существование","реальной","спецслужбы","[","5","]","."]]
>>>
>>> alm.jsonToText('[["«","Белая","стрела","»","—","согласно","распространённой","в","1990-е","годы","в","России","городской","легенде",",","якобы","специально","организованная","и","подготовленная","законспирированная","правительственная","спецслужба",",","сотрудники","которой","—","бывшие","и","действовавшие","милиционеры","и","спецназовцы",",","имеющие","право","на","физическую","ликвидацию","особо","опасных","уголовных","авторитетов","и","лидеров","орудовавших","в","России","ОПГ",",","относительно","которых","не","представляется","возможным","привлечения","их","к","уголовной","ответственности","законными","методами","[","1","]","[","2","]","[","3","]","."],["Несмотря","на","отсутствие","официальных","доказательств","существования","организации","и","многочисленные","опровержения","со","стороны","силовых","структур","и","служб","безопасности","[","4","]",",","в","российском","обществе","легенду","считают","основанной","на","подлинных","фактах","громких","убийств","криминальных","авторитетов",",","совершённых","в","1990-е","годы",",","и","не","исключают","существование","реальной","спецслужбы","[","5","]","."]]', callbackFn)
«Белая стрела» — согласно распространённой в 1990-е годы в России городской легенде, якобы специально организованная и подготовленная законспирированная правительственная спецслужба, сотрудники которой — бывшие и действовавшие милиционеры и спецназовцы, имеющие право на физическую ликвидацию особо опасных уголовных авторитетов и лидеров орудовавших в России ОПГ, относительно которых не представляется возможным привлечения их к уголовной ответственности законными методами [1] [2] [3].
Несмотря на отсутствие официальных доказательств существования организации и многочисленные опровержения со стороны силовых структур и служб безопасности [4], в российском обществе легенду считают основанной на подлинных фактах громких убийств криминальных авторитетов, совершённых в 1990-е годы, и не исключают существование реальной спецслужбы [5].
```

---

### Methods:
- **addBadword** - Method for adding a bad word
- **setBadwords** - Method for setting the blacklist of words
- **getBadwords** - Method for getting the blacklisted words

### Example:
```python
>>> import alm
>>>
>>> alm.setBadwords(["hello", "world", "test"])
>>>
>>> alm.getBadwords()
{1554834897, 2156498622, 28307030}
>>>
>>> alm.addBadword("test2")
>>>
>>> alm.getBadwords()
{5170183734, 1554834897, 2156498622, 28307030}
```

### Example:
```python
>>> import alm
>>>
>>> alm.setBadwords({24227504, 1219922507, 1794085167})
>>>
>>> alm.getBadwords()
{24227504, 1219922507, 1794085167}
>>>
>>> alm.clear(alm.clear_t.badwords)
>>>
>>> alm.getBadwords()
set()
```

---

### Methods:
- **addGoodword** - Method for adding a good word
- **setGoodwords** - Method for setting the whitelist of words
- **getGoodwords** - Method for getting the whitelisted words

### Example:
```python
>>> import alm
>>>
>>> alm.setGoodwords(["hello", "world", "test"])
>>>
>>> alm.getGoodwords()
{1554834897, 2156498622, 28307030}
>>>
>>> alm.addGoodword("test2")
>>>
>>> alm.getGoodwords()
{5170183734, 1554834897, 2156498622, 28307030}
>>>
>>> alm.clear(alm.clear_t.goodwords)
>>>
>>> alm.getGoodwords()
set()
```

### Example:
```python
>>> import alm
>>>
>>> alm.setGoodwords({24227504, 1219922507, 1794085167})
>>>
>>> alm.getGoodwords()
{24227504, 1219922507, 1794085167}
```

---

### Methods:
- **setUserToken** - Method for adding a user token
- **getUserTokens** - Method for retrieving the list of user tokens
- **getUserTokenId** - Method for obtaining a user token identifier
- **getUserTokenWord** - Method for obtaining a user token by its identifier

### Example:
```python
>>> import alm
>>>
>>> alm.setUserToken("usa")
>>>
>>> alm.setUserToken("russia")
>>>
>>> alm.getUserTokenId("usa")
5759809081
>>>
>>> alm.getUserTokenId("russia")
9910674734
>>>
>>> alm.getUserTokens()
['usa', 'russia']
>>>
>>> alm.getUserTokenWord(5759809081)
'usa'
>>>
>>> alm.getUserTokenWord(9910674734)
'russia'
>>>
>>> alm.clear(alm.clear_t.utokens)
>>>
>>> alm.getUserTokens()
[]
```

---

### Methods:
- **findNgram** - Method for searching n-grams in a text
- **word** - Method for extracting a word by its identifier

### Example:
```python
>>> import alm
>>> 
>>> def callbackFn(text):
...     print(text)
... 
>>> alm.setOption(alm.options_t.confidence)
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>> alm.readArpa('./lm.arpa')
>>> 
>>> alm.idw("привет")
2487910648
>>> alm.word(2487910648)
'привет'
>>> 
>>> alm.findNgram("Особое место занимает чудотворная икона Лобзание Христа Иудою", callbackFn)
<s> Особое
Особое место
место занимает
занимает чудотворная
чудотворная икона
икона Лобзание
Лобзание Христа
Христа Иудою
Иудою </s>


>>>
```

---

### Methods:
- **setUserTokenMethod** - Method for setting a custom token processing function

### Example:
```python
>>> import alm
>>>
>>> def fn(token, word):
...     if token and (token == "<usa>"):
...         if word and (word.lower() == "usa"):
...             return True
...     elif token and (token == "<russia>"):
...         if word and (word.lower() == "russia"):
...             return True
...     return False
... 
>>> alm.setUserToken("usa")
>>>
>>> alm.setUserToken("russia")
>>>
>>> alm.setUserTokenMethod("usa", fn)
>>>
>>> alm.setUserTokenMethod("russia", fn)
>>>
>>> alm.idw("usa")
5759809081
>>>
>>> alm.idw("russia")
9910674734
>>>
>>> alm.getUserTokenWord(5759809081)
'usa'
>>>
>>> alm.getUserTokenWord(9910674734)
'russia'
```

---

### Methods:
- **setAlmV2** - Method for setting the language model type to ALMv2
- **unsetAlmV2** - Method for unsetting the ALMv2 language model type
- **readALM** - Method for reading data from a binary container
- **setWordPreprocessingMethod** - Method for setting the word preprocessing function

### Example:
```python
>>> import alm
>>> 
>>> alm.setAlmV2()
>>> 
>>> def run(word, context):
...     if word == "возле": word = "около"
...     return word
... 
>>> alm.setOption(alm.options_t.debug)
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.setWordPreprocessingMethod(run)
>>>
>>> a = alm.perplexity("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
info: <s> Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор <punct> <punct> <punct> </s>

info: p( неожиданно | <s> ) 	= [2gram] 0.00038931 [ -3.40969900 ] / 0.99999991
info: p( из | неожиданно ...) 	= [2gram] 0.10110741 [ -0.99521700 ] / 0.99999979
info: p( подворотни | из ...) 	= [2gram] 0.00711798 [ -2.14764300 ] / 1.00000027
info: p( в | подворотни ...) 	= [2gram] 0.51077661 [ -0.29176900 ] / 1.00000021
info: p( олега | в ...) 	= [2gram] 0.00082936 [ -3.08125500 ] / 0.99999974
info: p( ударил | олега ...) 	= [2gram] 0.25002820 [ -0.60201100 ] / 0.99999978
info: p( яркий | ударил ...) 	= [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( прожектор | яркий ...) 	= [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( патрульный | прожектор ...) 	= [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( трактор | патрульный ...) 	= [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( <punct> | трактор ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999973
info: p( <punct> | <punct> ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( </s> | <punct> ...) 	= [1gram] 0.05693430 [ -1.24462600 ] / 0.99999993

info: 1 sentences, 13 words, 0 OOVs
info: 3 zeroprobs, logprob= -12.97624000 ppl= 8.45034200 ppl1= 9.95800426

info: <s> С лязгом выкатился и остановился около мальчика <punct> <punct> <punct> <punct> </s>

info: p( с | <s> ) 	= [2gram] 0.00642448 [ -2.19216200 ] / 0.99999991
info: p( лязгом | с ...) 	= [2gram] 0.00195917 [ -2.70792700 ] / 0.99999999
info: p( выкатился | лязгом ...) 	= [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( и | выкатился ...) 	= [2gram] 0.51169951 [ -0.29098500 ] / 1.00000024
info: p( остановился | и ...) 	= [2gram] 0.00143382 [ -2.84350600 ] / 0.99999975
info: p( около | остановился ...) 	= [1gram] 0.00011358 [ -3.94468000 ] / 1.00000003
info: p( мальчика | около ...) 	= [1gram] 0.00003932 [ -4.40541100 ] / 1.00000016
info: p( <punct> | мальчика ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999990
info: p( <punct> | <punct> ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( </s> | <punct> ...) 	= [1gram] 0.05693430 [ -1.24462600 ] / 0.99999993

info: 1 sentences, 11 words, 0 OOVs
info: 4 zeroprobs, logprob= -17.93030200 ppl= 31.20267541 ppl1= 42.66064865
>>> print(a.logprob)
-30.906542
```

### Example:
```python
>>> import alm
>>> 
>>> alm.setAlmV2()
>>> 
>>> alm.setOption(alm.options_t.debug)
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> def statusAlm(status):
...     print("Read ALM", status)
... 
>>> alm.readALM("./lm.alm", "password", 128, statusAlm)
Read ALM 0
Read ALM 1
Read ALM 2
Read ALM 3
Read ALM 4
Read ALM 5
Read ALM 6
...
>>>
>>> a = alm.perplexity("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
info: <s> Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор <punct> <punct> <punct> </s>

info: p( неожиданно | <s> ) 	= [2gram] 0.00038931 [ -3.40969900 ] / 0.99999991
info: p( из | неожиданно ...) 	= [2gram] 0.10110741 [ -0.99521700 ] / 0.99999979
info: p( подворотни | из ...) 	= [2gram] 0.00711798 [ -2.14764300 ] / 1.00000027
info: p( в | подворотни ...) 	= [2gram] 0.51077661 [ -0.29176900 ] / 1.00000021
info: p( олега | в ...) 	= [2gram] 0.00082936 [ -3.08125500 ] / 0.99999974
info: p( ударил | олега ...) 	= [2gram] 0.25002820 [ -0.60201100 ] / 0.99999978
info: p( яркий | ударил ...) 	= [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( прожектор | яркий ...) 	= [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( патрульный | прожектор ...) 	= [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( трактор | патрульный ...) 	= [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( <punct> | трактор ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999973
info: p( <punct> | <punct> ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( </s> | <punct> ...) 	= [1gram] 0.05693430 [ -1.24462600 ] / 0.99999993

info: 1 sentences, 13 words, 0 OOVs
info: 3 zeroprobs, logprob= -12.97624000 ppl= 8.45034200 ppl1= 9.95800426

info: <s> С лязгом выкатился и остановился около мальчика <punct> <punct> <punct> <punct> </s>

info: p( с | <s> ) 	= [2gram] 0.00642448 [ -2.19216200 ] / 0.99999991
info: p( лязгом | с ...) 	= [2gram] 0.00195917 [ -2.70792700 ] / 0.99999999
info: p( выкатился | лязгом ...) 	= [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( и | выкатился ...) 	= [2gram] 0.51169951 [ -0.29098500 ] / 1.00000024
info: p( остановился | и ...) 	= [2gram] 0.00143382 [ -2.84350600 ] / 0.99999975
info: p( около | остановился ...) 	= [1gram] 0.00011358 [ -3.94468000 ] / 1.00000003
info: p( мальчика | около ...) 	= [1gram] 0.00003932 [ -4.40541100 ] / 1.00000016
info: p( <punct> | мальчика ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999990
info: p( <punct> | <punct> ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...) 	= [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( </s> | <punct> ...) 	= [1gram] 0.05693430 [ -1.24462600 ] / 0.99999993

info: 1 sentences, 11 words, 0 OOVs
info: 4 zeroprobs, logprob= -17.93030200 ppl= 31.20267541 ppl1= 42.66064865
>>> print(a.logprob)
-30.906542
```

---

### Methods:
- **setLogfile** - Method for setting the file for log output
- **setOOvFile** - Method for setting the file for saving OOV words

### Example:
```python
>>> import alm
>>>
>>> alm.setLogfile("./log.txt")
>>>
>>> alm.setOOvFile("./oov.txt")
```

---

### Methods:
- **perplexity** - Perplexity calculation
- **pplConcatenate** - Method for combining perplexity results
- **pplByFiles** - Method for calculating perplexity from a file or group of files

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> a = alm.perplexity("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
>>>
>>> print(a.logprob)
-30.906542
>>>
>>> print(a.oovs)
0
>>>
>>> print(a.words)
24
>>>
>>> print(a.sentences)
2
>>>
>>> print(a.zeroprobs)
7
>>>
>>> print(a.ppl)
17.229063831108224
>>>
>>> print(a.ppl1)
19.398698060810077
>>>
>>> b = alm.pplByFiles("./text.txt")
>>>
>>> c = alm.pplConcatenate(a, b)
>>>
>>> print(c.ppl)
7.384123548831112
```

### Description
| Name      | Description                                                         |
|-----------|---------------------------------------------------------------------|
| ppl       | Perplexity value without considering the beginning of the sentence  |
| ppl1      | Perplexity value taking into account the beginning of the sentence  |
| oovs      | Count of OOV words                                                  |
| words     | Count of words in the sentence                                      |
| logprob   | Frequency (log probability) of the word sequence                    |
| sentences | Count of sentences                                                  |
| zeroprobs | Count of zero probabilities                                         |

---

### Methods:
- **tokenization** - Method for breaking text into tokens

### Example:
```python
>>> import alm
>>>
>>> def tokensFn(word, context, reset, stop):
...     print(word, " => ", context)
...     return True
...
>>> alm.switchAllowApostrophe()
>>>
>>> alm.tokenization("«On nous dit qu'aujourd'hui c'est le cas, encore faudra-t-il l'évaluer» l'astronomie", tokensFn)
«  =>  []
On  =>  ['«']
nous  =>  ['«', 'On']
dit  =>  ['«', 'On', 'nous']
qu'aujourd'hui  =>  ['«', 'On', 'nous', 'dit']
c'est  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui"]
le  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est"]
cas  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le']
,  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas']
encore  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',']
faudra-t-il  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore']
l  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l']
'  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l']
évaluer  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l', "'"]
»  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l', "'", 'évaluer']
l  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l', "'", 'évaluer', '»']
'  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l', "'", 'évaluer', '»', 'l']
astronomie  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l', "'", 'évaluer', '»', 'l', "'"]
```
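
The callback receives each token together with the context collected so far; returning `True` lets tokenization continue. A minimal sketch that gathers the tokens into a flat list instead of printing them:

```python
tokens = []

def collectFn(word, context, reset, stop):
    tokens.append(word)  # keep the token, ignore the running context
    return True          # continue tokenization

alm.tokenization("Hello World today!", collectFn)
print(tokens)
```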

---

### Methods:
- **setTokenizerFn** - Method for setting an external tokenizer function

### Example:
```python
>>> import alm
>>>
>>> def tokenizerFn(text, callback):
...     word = ""
...     context = []
...     for letter in text:
...         if letter == " " and len(word) > 0:
...             if not callback(word, context, False, False): return
...             context.append(word)
...             word = ""
...         elif letter == "." or letter == "!" or letter == "?":
...             if not callback(word, context, True, False): return
...             word = ""
...             context = []
...         else:
...             word += letter
...     if len(word) > 0:
...         if not callback(word, context, False, True): return
...
>>> def tokensFn(word, context, reset, stop):
...     print(word, " => ", context)
...     return True
...
>>> alm.setTokenizerFn(tokenizerFn)
>>>
>>> alm.tokenization("Hello World today!", tokensFn)
Hello  =>  []
World  =>  ['Hello']
today  =>  ['Hello', 'World']
```
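
In the external tokenizer above, the `reset` flag marks a sentence boundary and `stop` marks the end of the input. A sketch that uses these flags to group the tokens back into sentences:

```python
sentences = []
current = []

def groupFn(word, context, reset, stop):
    if word:
        current.append(word)
    if reset or stop:                    # sentence boundary or end of input
        if current:
            sentences.append(list(current))
        current.clear()
    return True

alm.tokenization("Hello World today! Goodbye World.", groupFn)
print(sentences)  # e.g. [['Hello', 'World', 'today'], ['Goodbye', 'World']]
```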

---

### Methods:
- **sentences** - Sentence generation method
- **sentencesToFile** - Method for generating a specified number of sentences and writing them to a file

### Example:
```python
>>> import alm
>>>
>>> def sentencesFn(text):
...     print("Sentences:", text)
...     return True
...
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.sentences(sentencesFn)
Sentences: <s> В общем </s>
Sentences: <s> С лязгом выкатился и остановился возле мальчика </s>
Sentences: <s> У меня нет </s>
Sentences: <s> Я вообще не хочу </s>
Sentences: <s> Да и в общем </s>
Sentences: <s> Не могу </s>
Sentences: <s> Ну в общем </s>
Sentences: <s> Так что я вообще не хочу </s>
Sentences: <s> Потому что я вообще не хочу </s>
Sentences: <s> Продолжение следует </s>
Sentences: <s> Неожиданно из подворотни в олега ударил яркий прожектор патрульный трактор </s>
>>>
>>> alm.sentencesToFile(5, "./result.txt")
```
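
The generation callback receives each sentence as plain text; by analogy with the other callbacks, returning `True` keeps generation going. A sketch that collects the generated sentences into a list:

```python
generated = []

def collectFn(text):
    generated.append(text)  # store the generated sentence
    return True             # continue generation

alm.sentences(collectFn)
print(len(generated), "sentences generated")
```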

---

### Methods:
- **fixUppers** - Method for correcting letter case in text
- **fixUppersByFiles** - Method for correcting letter case in a text file

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.fixUppers("неожиданно из подворотни в олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
'Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор??? С лязгом выкатился и остановился возле мальчика....'
>>>
>>> alm.fixUppersByFiles("./corpus", "./result.txt", "txt")
```

---

### Methods:
- **checkHypLat** - Method for checking a word for hyphens and Latin characters

### Example:
```python
>>> import alm
>>>
>>> alm.checkHypLat("Hello-World")
(True, True)
>>>
>>> alm.checkHypLat("Hello")
(False, True)
>>>
>>> alm.checkHypLat("Привет")
(False, False)
>>>
>>> alm.checkHypLat("так-как")
(True, False)
```
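
Judging by the four calls above, the returned pair reads as `(contains a hyphen, contains Latin characters)`, so it can be unpacked directly:

```python
has_hyphen, has_latin = alm.checkHypLat("Hello-World")
if has_hyphen and has_latin:
    print("hyphenated Latin word")
```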

---

### Methods:
- **getUppers** - Method for extracting the letter case of each word
- **countLetter** - Method for counting occurrences of a specific letter in a word

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.idw("Living")
10493385932
>>>
>>> alm.idw("in")
3301
>>>
>>> alm.idw("the")
217280
>>>
>>> alm.idw("USA")
188643
>>>
>>> alm.getUppers([10493385932, 3301, 217280, 188643])
[1, 0, 0, 7]
>>> 
>>> alm.countLetter("hello-world", "-")
1
>>>
>>> alm.countLetter("hello-world", "l")
3
```
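
The values returned by `getUppers` look like bitmasks over letter positions: `1` for `Living` marks only the first letter as uppercase, while `7` (binary `111`) for `USA` marks all three. A sketch, under that assumption, that restores the case of a lowercased word from its mask:

```python
def apply_uppers(word, mask):
    # Uppercase the i-th letter when bit i of the mask is set
    return "".join(
        letter.upper() if (mask >> i) & 1 else letter
        for i, letter in enumerate(word)
    )

print(apply_uppers("living", 1))  # Living
print(apply_uppers("usa", 7))     # USA
```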

---

### Methods:
- **urls** - Method for extracting the positions of URL addresses in a string

### Example:
```python
>>> import alm
>>>
>>> alm.urls("This website: example.com was designed with ...")
{14: 25}
>>>
>>> alm.urls("This website: https://a.b.c.example.net?id=52#test-1 was designed with ...")
{14: 52}
>>>
>>> alm.urls("This website: https://a.b.c.example.net?id=52#test-1 and 127.0.0.1 was designed with ...")
{14: 52, 57: 66}
```
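
The returned dictionary maps the start position of each detected address to its end position; in the examples above the end index points just past the last character, so the matches can be sliced out directly (a sketch under that assumption):

```python
text = "This website: https://a.b.c.example.net?id=52#test-1 and 127.0.0.1 was designed with ..."

for start, end in alm.urls(text).items():
    print(text[start:end])  # prints each detected URL or IP address
```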

---

### Methods:
- **roman2Arabic** - Method for translating Roman numerals to Arabic

### Example:
```python
>>> import alm
>>>
>>> alm.roman2Arabic("XVI")
16
```

---

### Methods:
- **rest** - Method for detecting and correcting words written in mixed alphabets
- **setSubstitutes** - Method for setting the letter substitutions used to correct words from mixed alphabets
- **getSubstitutes** - Method for retrieving the letter substitutions used to correct words from mixed alphabets

### Example:
```python
>>> import alm
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.setSubstitutes({'p':'р','c':'с','o':'о','t':'т','k':'к','e':'е','a':'а','h':'н','x':'х','b':'в','m':'м'})
>>>
>>> alm.getSubstitutes()
{'a': 'а', 'b': 'в', 'c': 'с', 'e': 'е', 'h': 'н', 'k': 'к', 'm': 'м', 'o': 'о', 'p': 'р', 't': 'т', 'x': 'х'}
>>>
>>> str = "ПPИBETИК"
>>>
>>> str.lower()
'пpиbetик'
>>>
>>> alm.rest(str)
'приветик'
```
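
For intuition, the substitution table is just a map from Latin look-alikes to Cyrillic letters; the replacement step can be approximated in plain Python with `str.translate` (a rough sketch only, `rest` itself also handles detection of words that actually need correction):

```python
subs = {'p': 'р', 'c': 'с', 'o': 'о', 't': 'т', 'k': 'к',
        'e': 'е', 'a': 'а', 'h': 'н', 'x': 'х', 'b': 'в', 'm': 'м'}
table = str.maketrans(subs)

print("ПPИBETИК".lower().translate(table))  # приветик
```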

---

### Methods:
- **setTokensDisable** - Method for setting the list of forbidden tokens
- **setTokensUnknown** - Method for setting the list of tokens cast to 〈unk〉
- **setTokenDisable** - Method for setting the list of unidentifiable tokens
- **setTokenUnknown** - Method for setting the list of tokens to be identified as 〈unk〉
- **getTokensDisable** - Method for retrieving the list of forbidden tokens
- **getTokensUnknown** - Method for retrieving the list of tokens cast to 〈unk〉
- **setAllTokenDisable** - Method for marking all tokens as unidentifiable
- **setAllTokenUnknown** - Method for casting all tokens to 〈unk〉

### Example:
```python
>>> import alm
>>>
>>> alm.idw("<date>")
6
>>>
>>> alm.idw("<time>")
7
>>>
>>> alm.idw("<abbr>")
5
>>>
>>> alm.idw("<math>")
9
>>>
>>> alm.setTokenDisable("date|time|abbr|math")
>>>
>>> alm.getTokensDisable()
{9, 5, 6, 7}
>>>
>>> alm.setTokensDisable({6, 7, 5, 9})
>>>
>>> alm.setTokenUnknown("date|time|abbr|math")
>>>
>>> alm.getTokensUnknown()
{9, 5, 6, 7}
>>>
>>> alm.setTokensUnknown({6, 7, 5, 9})
>>>
>>> alm.setAllTokenDisable()
>>>
>>> alm.getTokensDisable()
{2, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 23}
>>>
>>> alm.setAllTokenUnknown()
>>>
>>> alm.getTokensUnknown()
{2, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 23}
```

---

### Methods:
- **countAlphabet** - Method for obtaining the number of letters in the alphabet

### Example:
```python
>>> import alm
>>>
>>> alm.getAlphabet()
'abcdefghijklmnopqrstuvwxyz'
>>>
>>> alm.countAlphabet()
26
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.countAlphabet()
59
```

---

### Methods:
- **countBigrams** - Method for counting the bigrams in a sequence
- **countTrigrams** - Method for counting the trigrams in a sequence
- **countGrams** - Method for counting the N-grams of the language model's order (lm size)

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.countBigrams("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
12
>>>
>>> alm.countTrigrams("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
10
>>>
>>> alm.size()
3
>>>
>>> alm.countGrams("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
10
>>>
>>> alm.idw("неожиданно")
3263936167
>>>
>>> alm.idw("из")
5134
>>>
>>> alm.idw("подворотни")
12535356101
>>>
>>> alm.idw("в")
53
>>>
>>> alm.idw("Олега")
2824508300
>>>
>>> alm.idw("ударил")
24816796913
>>>
>>> alm.countBigrams([3263936167, 5134, 12535356101, 53, 2824508300, 24816796913])
5
>>>
>>> alm.countTrigrams([3263936167, 5134, 12535356101, 53, 2824508300, 24816796913])
4
>>>
>>> alm.countGrams([3263936167, 5134, 12535356101, 53, 2824508300, 24816796913])
4
```

---

### Methods:
- **arabic2Roman** - Method for converting an Arabic number to a Roman numeral

### Example:
```python
>>> import alm
>>>
>>> alm.arabic2Roman(23)
'XXIII'
>>>
>>> alm.arabic2Roman("33")
'XXXIII'
```

---

### Methods:
- **setThreads** - Method for setting the number of threads (0 - all threads)

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.setThreads(3)
>>>
>>> a = alm.pplByFiles("./text.txt")
>>>
>>> print(a.logprob)
-48201.29481399994
```

---

### Methods:
- **fti** - Method for removing the fractional part of a number (converting a floating-point value to an integer by shifting the decimal point)

### Example:
```python
>>> import alm
>>>
>>> alm.fti(5892.4892)
5892489200000
>>>
>>> alm.fti(5892.4892, 4)
58924892
```
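
Judging by the output, `fti` shifts the decimal point by the given number of places (apparently nine by default) and returns an integer; a rough pure-Python equivalent for illustration:

```python
def fti_sketch(value, places=9):
    # Shift the decimal point and round to the nearest integer
    return int(round(value * 10 ** places))

print(fti_sketch(5892.4892))     # 5892489200000
print(fti_sketch(5892.4892, 4))  # 58924892
```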

---

### Methods:
- **context** - Method for assembling a text context from a sequence of word identifiers

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.idw("неожиданно")
3263936167
>>>
>>> alm.idw("из")
5134
>>>
>>> alm.idw("подворотни")
12535356101
>>>
>>> alm.idw("в")
53
>>>
>>> alm.idw("Олега")
2824508300
>>>
>>> alm.idw("ударил")
24816796913
>>>
>>> alm.context([3263936167, 5134, 12535356101, 53, 2824508300, 24816796913])
'Неожиданно из подворотни в Олега ударил'
```

---

### Methods:
- **isAbbr** - Method for checking whether a word is a registered abbreviation
- **isSuffix** - Method for checking whether a word is a registered suffix of a numeric abbreviation
- **isToken** - Method for checking if an identifier matches a token
- **isIdWord** - Method for checking if an identifier matches a word

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.addAbbr("США")
>>>
>>> alm.isAbbr("сша")
True
>>>
>>> alm.addSuffix("1-я")
>>>
>>> alm.isSuffix("1-я")
True
>>>
>>> alm.isToken(alm.idw("США"))
True
>>>
>>> alm.isToken(alm.idw("1-я"))
True
>>>
>>> alm.isToken(alm.idw("125"))
True
>>>
>>> alm.isToken(alm.idw("<s>"))
True
>>>
>>> alm.isToken(alm.idw("Hello"))
False
>>>
>>> alm.isIdWord(alm.idw("https://anyks.com"))
True
>>>
>>> alm.isIdWord(alm.idw("Hello"))
True
>>>
>>> alm.isIdWord(alm.idw("-"))
False
```

---

### Methods:
- **findByFiles** - Method for searching N-grams in a text file

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.debug)
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.findByFiles("./text.txt", "./result.txt")
info: <s> Кукай
сари кукай
сари японские
японские каллиграфы
каллиграфы я
я постоянно
постоянно навещал
навещал их
их тайно
тайно от
от людей
людей </s>


info: <s> Неожиданно из
Неожиданно из подворотни
из подворотни в
подворотни в Олега
в Олега ударил
Олега ударил яркий
ударил яркий прожектор
яркий прожектор патрульный
прожектор патрульный трактор
патрульный трактор

<s> С лязгом
С лязгом выкатился
лязгом выкатился и
выкатился и остановился
и остановился возле
остановился возле мальчика
возле мальчика
```

---

### Methods:
- **checkSequence** - Method for checking the existence of a sequence
- **existSequence** - Method for checking the existence of a sequence, excluding non-word tokens
- **checkByFiles** - Method for checking if a sequence exists in a text file

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.debug)
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.addAbbr("США")
>>>
>>> alm.isAbbr("сша")
True
>>>
>>> alm.checkSequence("Неожиданно из подворотни в олега ударил")
True
>>>
>>> alm.checkSequence("Сегодня сыграл и в Олега ударил яркий прожектор патрульный трактор с корпоративным сектором")
True
>>>
>>> alm.checkSequence("Сегодня сыграл и в Олега ударил яркий прожектор патрульный трактор с корпоративным сектором", True)
True
>>>
>>> alm.checkSequence("в Олега ударил яркий")
True
>>>
>>> alm.checkSequence("в Олега ударил яркий", True)
True
>>>
>>> alm.checkSequence("от госсекретаря США")
True
>>>
>>> alm.checkSequence("от госсекретаря США", True)
True
>>>
>>> alm.checkSequence("Неожиданно из подворотни в олега ударил", 2)
True
>>>
>>> alm.checkSequence(["Неожиданно","из","подворотни","в","олега","ударил"], 2)
True
>>>
>>> alm.existSequence("<s> Сегодня сыграл и в, Олега ударил яркий прожектор, патрульный трактор - с корпоративным сектором </s>", 2)
(True, 0)
>>>
>>> alm.existSequence(["<s>","Сегодня","сыграл","и","в",",","Олега","ударил","яркий","прожектор",",","патрульный","трактор","-","с","корпоративным","сектором","</s>"], 2)
(True, 2)
>>>
>>> alm.idw("от")
6086
>>>
>>> alm.idw("госсекретаря")
51273912082
>>>
>>> alm.idw("США")
5
>>>
>>> alm.checkSequence([6086, 51273912082, 5])
True
>>>
>>> alm.checkSequence([6086, 51273912082, 5], True)
True
>>>
>>> alm.checkSequence(["от", "госсекретаря", "США"])
True
>>>
>>> alm.checkSequence(["от", "госсекретаря", "США"], True)
True
>>>
>>> alm.checkByFiles("./text.txt", "./result.txt")
info: 1999 | YES | Какой-то период времени мы вообще не общались

info: 2000 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор.С лязгом выкатился и остановился возле мальчика.

info: 2001 | YES | Так как эти яйца жалко есть а хочется все больше любоваться их можно покрыть лаком даже прозрачным лаком для ногтей

info: 2002 | NO | кукай <unk> <unk> сари кукай <unk> <unk> сари японские каллиграфы я постоянно навещал их тайно от людей

info: 2003 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???С лязгом выкатился и остановился возле мальчика....

info: 2004 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор?С лязгом выкатился и остановился возле мальчика.

info: 2005 | YES | Сегодня яичницей никто не завтракал как впрочем и вчера на ближайшем к нам рынке мы ели фруктовый салат со свежевыжатым соком как в старые добрые времена в Бразилии

info: 2006 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор!С лязгом выкатился и остановился возле мальчика.

info: 2007 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор.с лязгом выкатился и остановился возле мальчика.

All texts: 2007
Exists texts: 1359
Not exists texts: 648
>>>
>>> alm.checkByFiles("./corpus", "./result.txt", False, "txt")
info: 1999 | YES | Какой-то период времени мы вообще не общались

info: 2000 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор.С лязгом выкатился и остановился возле мальчика.

info: 2001 | YES | Так как эти яйца жалко есть а хочется все больше любоваться их можно покрыть лаком даже прозрачным лаком для ногтей

info: 2002 | NO | кукай <unk> <unk> сари кукай <unk> <unk> сари японские каллиграфы я постоянно навещал их тайно от людей

info: 2003 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???С лязгом выкатился и остановился возле мальчика....

info: 2004 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор?С лязгом выкатился и остановился возле мальчика.

info: 2005 | YES | Сегодня яичницей никто не завтракал как впрочем и вчера на ближайшем к нам рынке мы ели фруктовый салат со свежевыжатым соком как в старые добрые времена в Бразилии

info: 2006 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор!С лязгом выкатился и остановился возле мальчика.

info: 2007 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор.с лязгом выкатился и остановился возле мальчика.

All texts: 2007
Exists texts: 1359
Not exists texts: 648
>>>
>>> alm.checkByFiles("./corpus", "./result.txt", True, "txt")
info: 2000 | NO | Так как эти яйца жалко есть а хочется все больше любоваться их можно покрыть лаком даже прозрачным лаком для ногтей

info: 2001 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор.С лязгом выкатился и остановился возле мальчика.

info: 2002 | NO | Сегодня яичницей никто не завтракал как впрочем и вчера на ближайшем к нам рынке мы ели фруктовый салат со свежевыжатым соком как в старые добрые времена в Бразилии

info: 2003 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор!С лязгом выкатился и остановился возле мальчика.

info: 2004 | NO | кукай <unk> <unk> сари кукай <unk> <unk> сари японские каллиграфы я постоянно навещал их тайно от людей

info: 2005 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор?С лязгом выкатился и остановился возле мальчика.

info: 2006 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???С лязгом выкатился и остановился возле мальчика....

info: 2007 | NO | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор.с лязгом выкатился и остановился возле мальчика.

All texts: 2007
Exists texts: 0
Not exists texts: 2007
```

---

### Methods:
- **check** - String check method
- **match** - String matching method
- **addAbbr** - Method for adding an abbreviation
- **addSuffix** - Method for adding a numeric abbreviation suffix
- **setSuffixes** - Method for setting numeric abbreviation suffixes
- **readSuffix** - Method for reading suffixes and abbreviations from a file

### Example:
```python
>>> import alm
>>> 
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>> alm.setSubstitutes({'p':'р','c':'с','o':'о','t':'т','k':'к','e':'е','a':'а','h':'н','x':'х','b':'в','m':'м'})
>>> 
>>> alm.check("Дом-2", alm.check_t.home2)
True
>>> 
>>> alm.check("Дом2", alm.check_t.home2)
False
>>> 
>>> alm.check("Дом-2", alm.check_t.latian)
False
>>> 
>>> alm.check("Hello", alm.check_t.latian)
True
>>> 
>>> alm.check("прiвет", alm.check_t.latian)
True
>>> 
>>> alm.check("Дом-2", alm.check_t.hyphen)
True
>>> 
>>> alm.check("Дом2", alm.check_t.hyphen)
False
>>> 
>>> alm.check("Д", alm.check_t.letter)
True
>>> 
>>> alm.check("$", alm.check_t.letter)
False
>>> 
>>> alm.check("-", alm.check_t.letter)
False
>>> 
>>> alm.check("просtоквaшино", alm.check_t.similars)
True
>>> 
>>> alm.match("my site http://example.ru, it's true", alm.match_t.url)
True
>>> 
>>> alm.match("по вашему ip адресу 46.40.123.12 проводится проверка", alm.match_t.url)
True
>>> 
>>> alm.match("мой адрес в формате IPv6: http://[2001:0db8:11a3:09d7:1f34:8a2e:07a0:765d]/", alm.match_t.url)
True
>>> 
>>> alm.match("13-я", alm.match_t.abbr)
True
>>> 
alm.match("13-я-й", alm.match_t.abbr)
False
>>> 
alm.match("т.д", alm.match_t.abbr)
True
>>> 
alm.match("т.п.", alm.match_t.abbr)
True
>>> 
>>> alm.match("С.Ш.А.", alm.match_t.abbr)
True
>>> 
>>> alm.addAbbr("сша")
>>> alm.match("США", alm.match_t.abbr)
True
>>> 
>>> alm.addSuffix("15-летия")
>>> alm.match("15-летия", alm.match_t.abbr)
True
>>> 
>>> alm.getSuffixes()
{3139900457}
>>> 
>>> alm.idw("лет")
328041
>>> 
>>> alm.idw("тых")
352214
>>> 
>>> alm.setSuffixes({328041, 352214})
>>> 
>>> alm.getSuffixes()
{328041, 352214}
>>> 
>>> def status(status):
...     print(status)
... 
>>> alm.readSuffix("./suffix.abbr", status)
>>> 
>>> alm.match("15-лет", alm.match_t.abbr)
True
>>> 
>>> alm.match("20-тых", alm.match_t.abbr)
True
>>> 
>>> alm.match("15-летия", alm.match_t.abbr)
False
>>> 
>>> alm.match("Hello", alm.match_t.latian)
True
>>> 
>>> alm.match("прiвет", alm.match_t.latian)
False
>>> 
>>> alm.match("23424", alm.match_t.number)
True
>>> 
>>> alm.match("hello", alm.match_t.number)
False
>>> 
>>> alm.match("23424.55", alm.match_t.number)
False
>>> 
>>> alm.match("23424", alm.match_t.decimal)
False
>>> 
>>> alm.match("23424.55", alm.match_t.decimal)
True
>>> 
>>> alm.match("23424,55", alm.match_t.decimal)
True
>>> 
>>> alm.match("-23424.55", alm.match_t.decimal)
True
>>> 
>>> alm.match("+23424.55", alm.match_t.decimal)
True
>>> 
>>> alm.match("+23424.55", alm.match_t.anumber)
True
>>> 
>>> alm.match("15T-34", alm.match_t.anumber)
True
>>> 
>>> alm.match("hello", alm.match_t.anumber)
False
>>> 
>>> alm.match("hello", alm.match_t.allowed)
True
>>> 
>>> alm.match("évaluer", alm.match_t.allowed)
False
>>> 
>>> alm.match("13", alm.match_t.allowed)
True
>>> 
>>> alm.match("Hello-World", alm.match_t.allowed)
True
>>> 
>>> alm.match("Hello", alm.match_t.math)
False
>>> 
>>> alm.match("+", alm.match_t.math)
True
>>> 
>>> alm.match("=", alm.match_t.math)
True
>>> 
>>> alm.match("Hello", alm.match_t.upper)
True
>>> 
>>> alm.match("hello", alm.match_t.upper)
False
>>> 
>>> alm.match("hellO", alm.match_t.upper)
False
>>> 
>>> alm.match("a", alm.match_t.punct)
False
>>> 
>>> alm.match(",", alm.match_t.punct)
True
>>> 
>>> alm.match(" ", alm.match_t.space)
True
>>> 
>>> alm.match("a", alm.match_t.space)
False
>>> 
>>> alm.match("a", alm.match_t.special)
False
>>> 
>>> alm.match("±", alm.match_t.special)
False
>>> 
>>> alm.match("[", alm.match_t.isolation)
True
>>> 
>>> alm.match("a", alm.match_t.isolation)
False
>>> 
>>> alm.match("a", alm.match_t.greek)
False
>>> 
>>> alm.match("Ψ", alm.match_t.greek)
True
>>> 
>>> alm.match("->", alm.match_t.route)
False
>>> 
>>> alm.match("⇔", alm.match_t.route)
True
>>> 
>>> alm.match("a", alm.match_t.letter)
True
>>> 
>>> alm.match("!", alm.match_t.letter)
False
>>> 
>>> alm.match("!", alm.match_t.pcards)
False
>>> 
>>> alm.match("♣", alm.match_t.pcards)
True
>>> 
>>> alm.match("p", alm.match_t.currency)
False
>>> 
>>> alm.match("$", alm.match_t.currency)
True
>>> 
>>> alm.match("€", alm.match_t.currency)
True
>>> 
>>> alm.match("₽", alm.match_t.currency)
True
>>> 
>>> alm.match("₿", alm.match_t.currency)
True
```
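
As a small combined example, the `match` flags shown above can be used to filter a token list, e.g. dropping pure punctuation and plain numbers (a sketch):

```python
tokens = ["Hello", ",", "world", "13", "!"]

words = [t for t in tokens
         if not alm.match(t, alm.match_t.punct)
         and not alm.match(t, alm.match_t.number)]

print(words)  # ['Hello', 'world']
```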

---

### Methods:
- **delInText** - Method for deleting characters of a given class from text

### Example:
```python
>>> import alm
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.delInText("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор??? с лязгом выкатился и остановился возле мальчика....", alm.wdel_t.punct)
'неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор с лязгом выкатился и остановился возле мальчика'
>>>
>>> alm.delInText("hello-world-hello-world", alm.wdel_t.hyphen)
'helloworldhelloworld'
>>>
>>> alm.delInText("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор??? с лязгом выкатился и остановился возле мальчика....", alm.wdel_t.broken)
'неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор с лязгом выкатился и остановился возле мальчика'
>>>
>>> alm.delInText("«On nous dit qu'aujourd'hui c'est le cas, encore faudra-t-il l'évaluer» l'astronomie", alm.wdel_t.broken)
"On nous dit qu'aujourd'hui c'est le cas encore faudra-t-il l'valuer l'astronomie"
```
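
Note that `wdel_t.broken` appears to remove characters that are missing from the configured alphabet, which is why the `é` of `évaluer` disappears in the last call; keeping such letters would require adding them to the alphabet first, for example:

```python
# Hypothetical alphabet extended with a few accented Latin letters before cleaning
alm.setAlphabet("abcdefghijklmnopqrstuvwxyzéàèùâêîôûçабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
```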

---

### Methods:
- **countsByFiles** - Method for counting the number of n-grams in a text file

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.debug)
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.countsByFiles("./text.txt", "./result.txt", 3)
info: 0 | Сегодня яичницей никто не завтракал как впрочем и вчера на ближайшем к нам рынке мы ели фруктовый салат со свежевыжатым соком как в старые добрые времена в Бразилии

info: 10 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор?С лязгом выкатился и остановился возле мальчика.

info: 10 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор!С лязгом выкатился и остановился возле мальчика.

info: 0 | Так как эти яйца жалко есть а хочется все больше любоваться их можно покрыть лаком даже прозрачным лаком для ногтей

info: 10 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???С лязгом выкатился и остановился возле мальчика....

Counts 3grams: 471
>>>
>>> alm.countsByFiles("./corpus", "./result.txt", 2, "txt")
info: 19 | Так как эти яйца жалко есть а хочется все больше любоваться их можно покрыть лаком даже прозрачным лаком для ногтей

info: 12 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор.с лязгом выкатился и остановился возле мальчика.

info: 12 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор!С лязгом выкатился и остановился возле мальчика.

info: 10 | кукай <unk> <unk> сари кукай <unk> <unk> сари японские каллиграфы я постоянно навещал их тайно от людей

info: 12 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???С лязгом выкатился и остановился возле мальчика....

info: 12 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор?С лязгом выкатился и остановился возле мальчика.

info: 27 | Сегодня яичницей никто не завтракал как впрочем и вчера на ближайшем к нам рынке мы ели фруктовый салат со свежевыжатым соком как в старые добрые времена в Бразилии

Counts 2grams: 20270
```

### Description
| N-gram size | Description         |
|-------------|---------------------|
| 1           | language model size |
| 2           | bigram              |
| 3           | trigram             |

\n>>> alm.init()\n>>> \n>>> alm.mix([\"./lm1.arpa\", \"./lm2.arpa\"], [0.02, 0.05], True, statusMix)\nMix ARPA data ./lm1.arpa 0\nMix ARPA data ./lm1.arpa 1\nMix ARPA data ./lm1.arpa 2\nMix ARPA data ./lm1.arpa 3\nMix ARPA data ./lm1.arpa 4\nMix ARPA data ./lm1.arpa 5\nMix ARPA data ./lm1.arpa 6\n...\nMix ARPA data  0\nMix ARPA data  1\nMix ARPA data  2\nMix ARPA data  3\nMix ARPA data  4\nMix ARPA data  5\nMix ARPA data  6\n>>> alm.writeArpa(\"./lm.arpa\", statusWriteArpa)\nWrite ARPA 0\nWrite ARPA 1\nWrite ARPA 2\nWrite ARPA 3\nWrite ARPA 4\nWrite ARPA 5\nWrite ARPA 6\n...\n>>> alm.clear()\n>>> \n>>> alm.setAlphabet(\"\u0430\u0431\u0432\u0433\u0434\u0435\u0451\u0436\u0437\u0438\u0439\u043a\u043b\u043c\u043d\u043e\u043f\u0440\u0441\u0442\u0443\u0444\u0445\u0446\u0447\u0448\u0449\u044a\u044b\u044c\u044d\u044e\u044fabcdefghijklmnopqrstuvwxyz\")\n>>> \n>>> def statusMix(text, status):\n...     print(\"Mix ARPA data\", text, status)\n... \n>>> def statusWriteArpa(status):\n...     print(\"Write ARPA\", status)\n... \n>>> alm.init()\n>>> \n>>> alm.mix([\"./lm1.arpa\", \"./lm2.arpa\"], [0.02, 0.05], 0, 0.032, statusMix)\nMix ARPA data ./lm1.arpa 0\nMix ARPA data ./lm1.arpa 1\nMix ARPA data ./lm1.arpa 2\nMix ARPA data ./lm1.arpa 3\nMix ARPA data ./lm1.arpa 4\nMix ARPA data ./lm1.arpa 5\nMix ARPA data ./lm1.arpa 6\n...\nMix ARPA data  0\nMix ARPA data  1\nMix ARPA data  2\nMix ARPA data  3\nMix ARPA data  4\nMix ARPA data  5\nMix ARPA data  6\n>>> alm.writeArpa(\"./lm.arpa\", statusWriteArpa)\nWrite ARPA 0\nWrite ARPA 1\nWrite ARPA 2\nWrite ARPA 3\nWrite ARPA 4\nWrite ARPA 5\nWrite ARPA 6\n...\n```\n\n---\n\n### Methods:\n- **size** - Method of obtaining the size of the N-gram\n\n### Example:\n```python\n>>> import alm\n>>>\n>>> alm.setOption(alm.options_t.confidence)\n>>>\n>>> alm.setAlphabet(\"abcdefghijklmnopqrstuvwxyz\u0430\u0431\u0432\u0433\u0434\u0435\u0451\u0436\u0437\u0438\u0439\u043a\u043b\u043c\u043d\u043e\u043f\u0440\u0441\u0442\u0443\u0444\u0445\u0446\u0447\u0448\u0449\u044a\u044b\u044c\u044d\u044e\u044f\")\n>>>\n>>> alm.readArpa('./lm.arpa')\n>>>\n>>> alm.size()\n3\n```\n\n---\n\n### Methods:\n- **damerauLevenshtein** - Determination of the Damerau-Levenshtein distance in phrases\n- **distanceLevenshtein** - Determination of Levenshtein distance in phrases\n- **tanimoto** - Method for determining Jaccard coefficient (quotient - Tanimoto coefficient)\n- **needlemanWunsch** - Word stretching method\n\n### Example:\n```python\n>>> import alm\n>>> alm.damerauLevenshtein(\"\u043f\u0440\u0438\u0432\u0435\u0442\", \"\u043f\u0440\u0438\u0432\u0435\u0442\u0438\u043a\")\n2\n>>> \n>>> alm.damerauLevenshtein(\"\u043f\u0440\u0438\u0435\u0432\u0442\u0438\u043a\", \"\u043f\u0440\u0438\u0432\u0435\u0442\u0438\u043a\")\n1\n>>> \n>>> alm.distanceLevenshtein(\"\u043f\u0440\u0438\u0435\u0432\u0442\u0438\u043a\", \"\u043f\u0440\u0438\u0432\u0435\u0442\u0438\u043a\")\n2\n>>> \n>>> alm.tanimoto(\"\u043f\u0440\u0438\u0432\u0435\u0442\", \"\u043f\u0440\u0438\u0432\u0435\u0442\u0438\u043a\")\n0.7142857142857143\n>>> \n>>> alm.tanimoto(\"\u043f\u0440\u0438\u0432\u0435\u0438\u0442\u043a\", \"\u043f\u0440\u0438\u0432\u0435\u0442\u0438\u043a\")\n0.4\n>>> \n>>> alm.needlemanWunsch(\"\u043f\u0440\u0438\u0432\u0435\u0438\u0442\u043a\", \"\u043f\u0440\u0438\u0432\u0435\u0442\u0438\u043a\")\n4\n>>> \n>>> alm.needlemanWunsch(\"\u043f\u0440\u0438\u0432\u0435\u0442\", \"\u043f\u0440\u0438\u0432\u0435\u0442\u0438\u043a\")\n2\n>>> \n>>> alm.damerauLevenshtein(\"acre\", \"car\")\n2\n>>> alm.distanceLevenshtein(\"acre\", 
\"car\")\n3\n>>> \n>>> alm.damerauLevenshtein(\"anteater\", \"theatre\")\n4\n>>> alm.distanceLevenshtein(\"anteater\", \"theatre\")\n5\n>>> \n>>> alm.damerauLevenshtein(\"banana\", \"nanny\")\n3\n>>> alm.distanceLevenshtein(\"banana\", \"nanny\")\n3\n>>> \n>>> alm.damerauLevenshtein(\"cat\", \"crate\")\n2\n>>> alm.distanceLevenshtein(\"cat\", \"crate\")\n2\n>>>\n>>> alm.mulctLevenshtein(\"\u043f\u0440\u0438\u0432\u0435\u0442\", \"\u043f\u0440\u0438\u0432\u0435\u0442\u0438\u043a\")\n4\n>>>\n>>> alm.mulctLevenshtein(\"\u043f\u0440\u0438\u0435\u0432\u0442\u0438\u043a\", \"\u043f\u0440\u0438\u0432\u0435\u0442\u0438\u043a\")\n1\n>>>\n>>> alm.mulctLevenshtein(\"acre\", \"car\")\n3\n>>>\n>>> alm.mulctLevenshtein(\"anteater\", \"theatre\")\n5\n>>>\n>>> alm.mulctLevenshtein(\"banana\", \"nanny\")\n4\n>>>\n>>> alm.mulctLevenshtein(\"cat\", \"crate\")\n4\n```\n\n---\n\n### Methods:\n- **textToJson** - Method to convert text to JSON\n- **isAllowApostrophe** - Apostrophe permission check method\n- **switchAllowApostrophe** - Method for permitting or denying an apostrophe as part of a word\n\n### Example:\n```python\n>>> import alm\n>>>\n>>> def callbackFn(text):\n...     print(text)\n... \n>>> alm.isAllowApostrophe()\nFalse\n>>> alm.switchAllowApostrophe()\n>>>\n>>> alm.isAllowApostrophe()\nTrue\n>>> alm.textToJson(\"\u00abOn nous dit qu'aujourd'hui c'est le cas, encore faudra-t-il l'\u00e9valuer\u00bb l'astronomie\", callbackFn)\n[[\"\u00ab\",\"On\",\"nous\",\"dit\",\"qu'aujourd'hui\",\"c'est\",\"le\",\"cas\",\",\",\"encore\",\"faudra-t-il\",\"l'\u00e9valuer\",\"\u00bb\",\"l'astronomie\"]]\n```\n\n---\n\n### Methods:\n- **jsonToText** - Method to convert JSON to text\n\n### Example:\n```python\n>>> import alm\n>>>\n>>> def callbackFn(text):\n...     print(text)\n... \n>>> alm.jsonToText('[[\"\u00ab\",\"On\",\"nous\",\"dit\",\"qu\\'aujourd\\'hui\",\"c\\'est\",\"le\",\"cas\",\",\",\"encore\",\"faudra-t-il\",\"l\\'\u00e9valuer\",\"\u00bb\",\"l\\'astronomie\"]]', callbackFn)\n\u00abOn nous dit qu'aujourd'hui c'est le cas, encore faudra-t-il l'\u00e9valuer\u00bb l'astronomie\n```\n\n---\n\n### Methods:\n- **restore** - Method for restore text from context\n\n### Example:\n```python\n>>> import alm\n>>>\n>>> alm.setOption(alm.options_t.uppers)\n>>>\n>>> alm.restore([\"\u00ab\",\"On\",\"nous\",\"dit\",\"qu\\'aujourd\\'hui\",\"c\\'est\",\"le\",\"cas\",\",\",\"encore\",\"faudra-t-il\",\"l\\'\u00e9valuer\",\"\u00bb\",\"l\\'astronomie\"])\n\"\u00abOn nous dit qu'aujourd'hui c'est le cas, encore faudra-t-il l'\u00e9valuer\u00bb l'astronomie\"\n```\n\n---\n\n### Methods:\n- **allowStress** - Method for allow using stress in words\n- **disallowStress** - Method for disallow using stress in words\n\n### Example:\n```python\n>>> import alm\n>>>\n>>> alm.setAlphabet(\"abcdefghijklmnopqrstuvwxyz\u0430\u0431\u0432\u0433\u0434\u0435\u0451\u0436\u0437\u0438\u0439\u043a\u043b\u043c\u043d\u043e\u043f\u0440\u0441\u0442\u0443\u0444\u0445\u0446\u0447\u0448\u0449\u044a\u044b\u044c\u044d\u044e\u044f\")\n>>>\n>>> def callbackFn(text):\n...     print(text)\n... 
\n>>> alm.textToJson('\u00ab\u0411\u0435\u0301\u043b\u0430\u044f \u0441\u0442\u0440\u0435\u043b\u0430\u0301\u00bb \u2014 \u0441\u043e\u0433\u043b\u0430\u0441\u043d\u043e \u0440\u0430\u0441\u043f\u0440\u043e\u0441\u0442\u0440\u0430\u043d\u0451\u043d\u043d\u043e\u0439 \u0432 1990-\u0435 \u0433\u043e\u0434\u044b \u0432 \u0420\u043e\u0441\u0441\u0438\u0438 \u0433\u043e\u0440\u043e\u0434\u0441\u043a\u043e\u0439 \u043b\u0435\u0433\u0435\u043d\u0434\u0435, \u044f\u043a\u043e\u0431\u044b \u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e \u043e\u0440\u0433\u0430\u043d\u0438\u0437\u043e\u0432\u0430\u043d\u043d\u0430\u044f \u0438 \u043f\u043e\u0434\u0433\u043e\u0442\u043e\u0432\u043b\u0435\u043d\u043d\u0430\u044f \u0437\u0430\u043a\u043e\u043d\u0441\u043f\u0438\u0440\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u0430\u044f \u043f\u0440\u0430\u0432\u0438\u0442\u0435\u043b\u044c\u0441\u0442\u0432\u0435\u043d\u043d\u0430\u044f \u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u0430, \u0441\u043e\u0442\u0440\u0443\u0434\u043d\u0438\u043a\u0438 \u043a\u043e\u0442\u043e\u0440\u043e\u0439 \u2014 \u0431\u044b\u0432\u0448\u0438\u0435 \u0438 \u0434\u0435\u0439\u0441\u0442\u0432\u043e\u0432\u0430\u0432\u0448\u0438\u0435 \u043c\u0438\u043b\u0438\u0446\u0438\u043e\u043d\u0435\u0440\u044b \u0438 \u0441\u043f\u0435\u0446\u043d\u0430\u0437\u043e\u0432\u0446\u044b, \u0438\u043c\u0435\u044e\u0449\u0438\u0435 \u043f\u0440\u0430\u0432\u043e \u043d\u0430 \u0444\u0438\u0437\u0438\u0447\u0435\u0441\u043a\u0443\u044e \u043b\u0438\u043a\u0432\u0438\u0434\u0430\u0446\u0438\u044e \u043e\u0441\u043e\u0431\u043e \u043e\u043f\u0430\u0441\u043d\u044b\u0445 \u0443\u0433\u043e\u043b\u043e\u0432\u043d\u044b\u0445 \u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432 \u0438 \u043b\u0438\u0434\u0435\u0440\u043e\u0432 \u043e\u0440\u0443\u0434\u043e\u0432\u0430\u0432\u0448\u0438\u0445 \u0432 \u0420\u043e\u0441\u0441\u0438\u0438 \u041e\u041f\u0413, \u043e\u0442\u043d\u043e\u0441\u0438\u0442\u0435\u043b\u044c\u043d\u043e \u043a\u043e\u0442\u043e\u0440\u044b\u0445 \u043d\u0435 \u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442\u0441\u044f \u0432\u043e\u0437\u043c\u043e\u0436\u043d\u044b\u043c \u043f\u0440\u0438\u0432\u043b\u0435\u0447\u0435\u043d\u0438\u044f \u0438\u0445 \u043a \u0443\u0433\u043e\u043b\u043e\u0432\u043d\u043e\u0439 \u043e\u0442\u0432\u0435\u0442\u0441\u0442\u0432\u0435\u043d\u043d\u043e\u0441\u0442\u0438 \u0437\u0430\u043a\u043e\u043d\u043d\u044b\u043c\u0438 \u043c\u0435\u0442\u043e\u0434\u0430\u043c\u0438[1][2][3]. 
\u041d\u0435\u0441\u043c\u043e\u0442\u0440\u044f \u043d\u0430 \u043e\u0442\u0441\u0443\u0442\u0441\u0442\u0432\u0438\u0435 \u043e\u0444\u0438\u0446\u0438\u0430\u043b\u044c\u043d\u044b\u0445 \u0434\u043e\u043a\u0430\u0437\u0430\u0442\u0435\u043b\u044c\u0441\u0442\u0432 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u044f \u043e\u0440\u0433\u0430\u043d\u0438\u0437\u0430\u0446\u0438\u0438 \u0438 \u043c\u043d\u043e\u0433\u043e\u0447\u0438\u0441\u043b\u0435\u043d\u043d\u044b\u0435 \u043e\u043f\u0440\u043e\u0432\u0435\u0440\u0436\u0435\u043d\u0438\u044f \u0441\u043e \u0441\u0442\u043e\u0440\u043e\u043d\u044b \u0441\u0438\u043b\u043e\u0432\u044b\u0445 \u0441\u0442\u0440\u0443\u043a\u0442\u0443\u0440 \u0438 \u0441\u043b\u0443\u0436\u0431 \u0431\u0435\u0437\u043e\u043f\u0430\u0441\u043d\u043e\u0441\u0442\u0438[4], \u0432 \u0440\u043e\u0441\u0441\u0438\u0439\u0441\u043a\u043e\u043c \u043e\u0431\u0449\u0435\u0441\u0442\u0432\u0435 \u043b\u0435\u0433\u0435\u043d\u0434\u0443 \u0441\u0447\u0438\u0442\u0430\u044e\u0442 \u043e\u0441\u043d\u043e\u0432\u0430\u043d\u043d\u043e\u0439 \u043d\u0430 \u043f\u043e\u0434\u043b\u0438\u043d\u043d\u044b\u0445 \u0444\u0430\u043a\u0442\u0430\u0445 \u0433\u0440\u043e\u043c\u043a\u0438\u0445 \u0443\u0431\u0438\u0439\u0441\u0442\u0432 \u043a\u0440\u0438\u043c\u0438\u043d\u0430\u043b\u044c\u043d\u044b\u0445 \u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432, \u0441\u043e\u0432\u0435\u0440\u0448\u0451\u043d\u043d\u044b\u0445 \u0432 1990-\u0435 \u0433\u043e\u0434\u044b, \u0438 \u043d\u0435 \u0438\u0441\u043a\u043b\u044e\u0447\u0430\u044e\u0442 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u0435 \u0440\u0435\u0430\u043b\u044c\u043d\u043e\u0439 \u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u044b[5].', 
callbackFn)\n[[\"\u00ab\",\"\u0411\u0435\u043b\u0430\u044f\",\"\u0441\u0442\u0440\u0435\u043b\u0430\",\"\u00bb\",\"\u2014\",\"\u0441\u043e\u0433\u043b\u0430\u0441\u043d\u043e\",\"\u0440\u0430\u0441\u043f\u0440\u043e\u0441\u0442\u0440\u0430\u043d\u0451\u043d\u043d\u043e\u0439\",\"\u0432\",\"1990-\u0435\",\"\u0433\u043e\u0434\u044b\",\"\u0432\",\"\u0420\u043e\u0441\u0441\u0438\u0438\",\"\u0433\u043e\u0440\u043e\u0434\u0441\u043a\u043e\u0439\",\"\u043b\u0435\u0433\u0435\u043d\u0434\u0435\",\",\",\"\u044f\u043a\u043e\u0431\u044b\",\"\u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e\",\"\u043e\u0440\u0433\u0430\u043d\u0438\u0437\u043e\u0432\u0430\u043d\u043d\u0430\u044f\",\"\u0438\",\"\u043f\u043e\u0434\u0433\u043e\u0442\u043e\u0432\u043b\u0435\u043d\u043d\u0430\u044f\",\"\u0437\u0430\u043a\u043e\u043d\u0441\u043f\u0438\u0440\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u0430\u044f\",\"\u043f\u0440\u0430\u0432\u0438\u0442\u0435\u043b\u044c\u0441\u0442\u0432\u0435\u043d\u043d\u0430\u044f\",\"\u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u0430\",\",\",\"\u0441\u043e\u0442\u0440\u0443\u0434\u043d\u0438\u043a\u0438\",\"\u043a\u043e\u0442\u043e\u0440\u043e\u0439\",\"\u2014\",\"\u0431\u044b\u0432\u0448\u0438\u0435\",\"\u0438\",\"\u0434\u0435\u0439\u0441\u0442\u0432\u043e\u0432\u0430\u0432\u0448\u0438\u0435\",\"\u043c\u0438\u043b\u0438\u0446\u0438\u043e\u043d\u0435\u0440\u044b\",\"\u0438\",\"\u0441\u043f\u0435\u0446\u043d\u0430\u0437\u043e\u0432\u0446\u044b\",\",\",\"\u0438\u043c\u0435\u044e\u0449\u0438\u0435\",\"\u043f\u0440\u0430\u0432\u043e\",\"\u043d\u0430\",\"\u0444\u0438\u0437\u0438\u0447\u0435\u0441\u043a\u0443\u044e\",\"\u043b\u0438\u043a\u0432\u0438\u0434\u0430\u0446\u0438\u044e\",\"\u043e\u0441\u043e\u0431\u043e\",\"\u043e\u043f\u0430\u0441\u043d\u044b\u0445\",\"\u0443\u0433\u043e\u043b\u043e\u0432\u043d\u044b\u0445\",\"\u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432\",\"\u0438\",\"\u043b\u0438\u0434\u0435\u0440\u043e\u0432\",\"\u043e\u0440\u0443\u0434\u043e\u0432\u0430\u0432\u0448\u0438\u0445\",\"\u0432\",\"\u0420\u043e\u0441\u0441\u0438\u0438\",\"\u041e\u041f\u0413\",\",\",\"\u043e\u0442\u043d\u043e\u0441\u0438\u0442\u0435\u043b\u044c\u043d\u043e\",\"\u043a\u043e\u0442\u043e\u0440\u044b\u0445\",\"\u043d\u0435\",\"\u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442\u0441\u044f\",\"\u0432\u043e\u0437\u043c\u043e\u0436\u043d\u044b\u043c\",\"\u043f\u0440\u0438\u0432\u043b\u0435\u0447\u0435\u043d\u0438\u044f\",\"\u0438\u0445\",\"\u043a\",\"\u0443\u0433\u043e\u043b\u043e\u0432\u043d\u043e\u0439\",\"\u043e\u0442\u0432\u0435\u0442\u0441\u0442\u0432\u0435\u043d\u043d\u043e\u0441\u0442\u0438\",\"\u0437\u0430\u043a\u043e\u043d\u043d\u044b\u043c\u0438\",\"\u043c\u0435\u0442\u043e\u0434\u0430\u043c\u0438\",\"[\",\"1\",\"]\",\"[\",\"2\",\"]\",\"[\",\"3\",\"]\",\".\"],[\"\u041d\u0435\u0441\u043c\u043e\u0442\u0440\u044f\",\"\u043d\u0430\",\"\u043e\u0442\u0441\u0443\u0442\u0441\u0442\u0432\u0438\u0435\",\"\u043e\u0444\u0438\u0446\u0438\u0430\u043b\u044c\u043d\u044b\u0445\",\"\u0434\u043e\u043a\u0430\u0437\u0430\u0442\u0435\u043b\u044c\u0441\u0442\u0432\",\"\u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u044f\",\"\u043e\u0440\u0433\u0430\u043d\u0438\u0437\u0430\u0446\u0438\u0438\",\"\u0438\",\"\u043c\u043d\u043e\u0433\u043e\u0447\u0438\u0441\u043b\u0435\u043d\u043d\u044b\u0435\",\"\u043e\u043f\u0440\u043e\u0432\u0435\u0440\u0436\u0435\u043d\u0438\u044f\",\"\u0441\u043e\",\"\u0441\u0442\u043e\u0440\u043e\u043d\u044b\",\"\u0441\
u0438\u043b\u043e\u0432\u044b\u0445\",\"\u0441\u0442\u0440\u0443\u043a\u0442\u0443\u0440\",\"\u0438\",\"\u0441\u043b\u0443\u0436\u0431\",\"\u0431\u0435\u0437\u043e\u043f\u0430\u0441\u043d\u043e\u0441\u0442\u0438\",\"[\",\"4\",\"]\",\",\",\"\u0432\",\"\u0440\u043e\u0441\u0441\u0438\u0439\u0441\u043a\u043e\u043c\",\"\u043e\u0431\u0449\u0435\u0441\u0442\u0432\u0435\",\"\u043b\u0435\u0433\u0435\u043d\u0434\u0443\",\"\u0441\u0447\u0438\u0442\u0430\u044e\u0442\",\"\u043e\u0441\u043d\u043e\u0432\u0430\u043d\u043d\u043e\u0439\",\"\u043d\u0430\",\"\u043f\u043e\u0434\u043b\u0438\u043d\u043d\u044b\u0445\",\"\u0444\u0430\u043a\u0442\u0430\u0445\",\"\u0433\u0440\u043e\u043c\u043a\u0438\u0445\",\"\u0443\u0431\u0438\u0439\u0441\u0442\u0432\",\"\u043a\u0440\u0438\u043c\u0438\u043d\u0430\u043b\u044c\u043d\u044b\u0445\",\"\u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432\",\",\",\"\u0441\u043e\u0432\u0435\u0440\u0448\u0451\u043d\u043d\u044b\u0445\",\"\u0432\",\"1990-\u0435\",\"\u0433\u043e\u0434\u044b\",\",\",\"\u0438\",\"\u043d\u0435\",\"\u0438\u0441\u043a\u043b\u044e\u0447\u0430\u044e\u0442\",\"\u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u0435\",\"\u0440\u0435\u0430\u043b\u044c\u043d\u043e\u0439\",\"\u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u044b\",\"[\",\"5\",\"]\",\".\"]]\n>>>\n>>> alm.jsonToText('[[\"\u00ab\",\"\u0411\u0435\u043b\u0430\u044f\",\"\u0441\u0442\u0440\u0435\u043b\u0430\",\"\u00bb\",\"\u2014\",\"\u0441\u043e\u0433\u043b\u0430\u0441\u043d\u043e\",\"\u0440\u0430\u0441\u043f\u0440\u043e\u0441\u0442\u0440\u0430\u043d\u0451\u043d\u043d\u043e\u0439\",\"\u0432\",\"1990-\u0435\",\"\u0433\u043e\u0434\u044b\",\"\u0432\",\"\u0420\u043e\u0441\u0441\u0438\u0438\",\"\u0433\u043e\u0440\u043e\u0434\u0441\u043a\u043e\u0439\",\"\u043b\u0435\u0433\u0435\u043d\u0434\u0435\",\",\",\"\u044f\u043a\u043e\u0431\u044b\",\"\u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e\",\"\u043e\u0440\u0433\u0430\u043d\u0438\u0437\u043e\u0432\u0430\u043d\u043d\u0430\u044f\",\"\u0438\",\"\u043f\u043e\u0434\u0433\u043e\u0442\u043e\u0432\u043b\u0435\u043d\u043d\u0430\u044f\",\"\u0437\u0430\u043a\u043e\u043d\u0441\u043f\u0438\u0440\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u0430\u044f\",\"\u043f\u0440\u0430\u0432\u0438\u0442\u0435\u043b\u044c\u0441\u0442\u0432\u0435\u043d\u043d\u0430\u044f\",\"\u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u0430\",\",\",\"\u0441\u043e\u0442\u0440\u0443\u0434\u043d\u0438\u043a\u0438\",\"\u043a\u043e\u0442\u043e\u0440\u043e\u0439\",\"\u2014\",\"\u0431\u044b\u0432\u0448\u0438\u0435\",\"\u0438\",\"\u0434\u0435\u0439\u0441\u0442\u0432\u043e\u0432\u0430\u0432\u0448\u0438\u0435\",\"\u043c\u0438\u043b\u0438\u0446\u0438\u043e\u043d\u0435\u0440\u044b\",\"\u0438\",\"\u0441\u043f\u0435\u0446\u043d\u0430\u0437\u043e\u0432\u0446\u044b\",\",\",\"\u0438\u043c\u0435\u044e\u0449\u0438\u0435\",\"\u043f\u0440\u0430\u0432\u043e\",\"\u043d\u0430\",\"\u0444\u0438\u0437\u0438\u0447\u0435\u0441\u043a\u0443\u044e\",\"\u043b\u0438\u043a\u0432\u0438\u0434\u0430\u0446\u0438\u044e\",\"\u043e\u0441\u043e\u0431\u043e\",\"\u043e\u043f\u0430\u0441\u043d\u044b\u0445\",\"\u0443\u0433\u043e\u043b\u043e\u0432\u043d\u044b\u0445\",\"\u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432\",\"\u0438\",\"\u043b\u0438\u0434\u0435\u0440\u043e\u0432\",\"\u043e\u0440\u0443\u0434\u043e\u0432\u0430\u0432\u0448\u0438\u0445\",\"\u0432\",\"\u0420\u043e\u0441\u0441\u0438\u0438\",\"\u041e\u041f\u0413\",\",\",\"\u043e\u0442\u043d\u043e\u0441\u0438\u0442\u0435\u043b\u
044c\u043d\u043e\",\"\u043a\u043e\u0442\u043e\u0440\u044b\u0445\",\"\u043d\u0435\",\"\u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442\u0441\u044f\",\"\u0432\u043e\u0437\u043c\u043e\u0436\u043d\u044b\u043c\",\"\u043f\u0440\u0438\u0432\u043b\u0435\u0447\u0435\u043d\u0438\u044f\",\"\u0438\u0445\",\"\u043a\",\"\u0443\u0433\u043e\u043b\u043e\u0432\u043d\u043e\u0439\",\"\u043e\u0442\u0432\u0435\u0442\u0441\u0442\u0432\u0435\u043d\u043d\u043e\u0441\u0442\u0438\",\"\u0437\u0430\u043a\u043e\u043d\u043d\u044b\u043c\u0438\",\"\u043c\u0435\u0442\u043e\u0434\u0430\u043c\u0438\",\"[\",\"1\",\"]\",\"[\",\"2\",\"]\",\"[\",\"3\",\"]\",\".\"],[\"\u041d\u0435\u0441\u043c\u043e\u0442\u0440\u044f\",\"\u043d\u0430\",\"\u043e\u0442\u0441\u0443\u0442\u0441\u0442\u0432\u0438\u0435\",\"\u043e\u0444\u0438\u0446\u0438\u0430\u043b\u044c\u043d\u044b\u0445\",\"\u0434\u043e\u043a\u0430\u0437\u0430\u0442\u0435\u043b\u044c\u0441\u0442\u0432\",\"\u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u044f\",\"\u043e\u0440\u0433\u0430\u043d\u0438\u0437\u0430\u0446\u0438\u0438\",\"\u0438\",\"\u043c\u043d\u043e\u0433\u043e\u0447\u0438\u0441\u043b\u0435\u043d\u043d\u044b\u0435\",\"\u043e\u043f\u0440\u043e\u0432\u0435\u0440\u0436\u0435\u043d\u0438\u044f\",\"\u0441\u043e\",\"\u0441\u0442\u043e\u0440\u043e\u043d\u044b\",\"\u0441\u0438\u043b\u043e\u0432\u044b\u0445\",\"\u0441\u0442\u0440\u0443\u043a\u0442\u0443\u0440\",\"\u0438\",\"\u0441\u043b\u0443\u0436\u0431\",\"\u0431\u0435\u0437\u043e\u043f\u0430\u0441\u043d\u043e\u0441\u0442\u0438\",\"[\",\"4\",\"]\",\",\",\"\u0432\",\"\u0440\u043e\u0441\u0441\u0438\u0439\u0441\u043a\u043e\u043c\",\"\u043e\u0431\u0449\u0435\u0441\u0442\u0432\u0435\",\"\u043b\u0435\u0433\u0435\u043d\u0434\u0443\",\"\u0441\u0447\u0438\u0442\u0430\u044e\u0442\",\"\u043e\u0441\u043d\u043e\u0432\u0430\u043d\u043d\u043e\u0439\",\"\u043d\u0430\",\"\u043f\u043e\u0434\u043b\u0438\u043d\u043d\u044b\u0445\",\"\u0444\u0430\u043a\u0442\u0430\u0445\",\"\u0433\u0440\u043e\u043c\u043a\u0438\u0445\",\"\u0443\u0431\u0438\u0439\u0441\u0442\u0432\",\"\u043a\u0440\u0438\u043c\u0438\u043d\u0430\u043b\u044c\u043d\u044b\u0445\",\"\u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432\",\",\",\"\u0441\u043e\u0432\u0435\u0440\u0448\u0451\u043d\u043d\u044b\u0445\",\"\u0432\",\"1990-\u0435\",\"\u0433\u043e\u0434\u044b\",\",\",\"\u0438\",\"\u043d\u0435\",\"\u0438\u0441\u043a\u043b\u044e\u0447\u0430\u044e\u0442\",\"\u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u0435\",\"\u0440\u0435\u0430\u043b\u044c\u043d\u043e\u0439\",\"\u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u044b\",\"[\",\"5\",\"]\",\".\"]]', callbackFn)\n\u00ab\u0411\u0435\u043b\u0430\u044f \u0441\u0442\u0440\u0435\u043b\u0430\u00bb \u2014 \u0441\u043e\u0433\u043b\u0430\u0441\u043d\u043e \u0440\u0430\u0441\u043f\u0440\u043e\u0441\u0442\u0440\u0430\u043d\u0451\u043d\u043d\u043e\u0439 \u0432 1990-\u0435 \u0433\u043e\u0434\u044b \u0432 \u0420\u043e\u0441\u0441\u0438\u0438 \u0433\u043e\u0440\u043e\u0434\u0441\u043a\u043e\u0439 \u043b\u0435\u0433\u0435\u043d\u0434\u0435, \u044f\u043a\u043e\u0431\u044b \u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e \u043e\u0440\u0433\u0430\u043d\u0438\u0437\u043e\u0432\u0430\u043d\u043d\u0430\u044f \u0438 \u043f\u043e\u0434\u0433\u043e\u0442\u043e\u0432\u043b\u0435\u043d\u043d\u0430\u044f \u0437\u0430\u043a\u043e\u043d\u0441\u043f\u0438\u0440\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u0430\u044f 
\u043f\u0440\u0430\u0432\u0438\u0442\u0435\u043b\u044c\u0441\u0442\u0432\u0435\u043d\u043d\u0430\u044f \u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u0430, \u0441\u043e\u0442\u0440\u0443\u0434\u043d\u0438\u043a\u0438 \u043a\u043e\u0442\u043e\u0440\u043e\u0439 \u2014 \u0431\u044b\u0432\u0448\u0438\u0435 \u0438 \u0434\u0435\u0439\u0441\u0442\u0432\u043e\u0432\u0430\u0432\u0448\u0438\u0435 \u043c\u0438\u043b\u0438\u0446\u0438\u043e\u043d\u0435\u0440\u044b \u0438 \u0441\u043f\u0435\u0446\u043d\u0430\u0437\u043e\u0432\u0446\u044b, \u0438\u043c\u0435\u044e\u0449\u0438\u0435 \u043f\u0440\u0430\u0432\u043e \u043d\u0430 \u0444\u0438\u0437\u0438\u0447\u0435\u0441\u043a\u0443\u044e \u043b\u0438\u043a\u0432\u0438\u0434\u0430\u0446\u0438\u044e \u043e\u0441\u043e\u0431\u043e \u043e\u043f\u0430\u0441\u043d\u044b\u0445 \u0443\u0433\u043e\u043b\u043e\u0432\u043d\u044b\u0445 \u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432 \u0438 \u043b\u0438\u0434\u0435\u0440\u043e\u0432 \u043e\u0440\u0443\u0434\u043e\u0432\u0430\u0432\u0448\u0438\u0445 \u0432 \u0420\u043e\u0441\u0441\u0438\u0438 \u041e\u041f\u0413, \u043e\u0442\u043d\u043e\u0441\u0438\u0442\u0435\u043b\u044c\u043d\u043e \u043a\u043e\u0442\u043e\u0440\u044b\u0445 \u043d\u0435 \u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442\u0441\u044f \u0432\u043e\u0437\u043c\u043e\u0436\u043d\u044b\u043c \u043f\u0440\u0438\u0432\u043b\u0435\u0447\u0435\u043d\u0438\u044f \u0438\u0445 \u043a \u0443\u0433\u043e\u043b\u043e\u0432\u043d\u043e\u0439 \u043e\u0442\u0432\u0435\u0442\u0441\u0442\u0432\u0435\u043d\u043d\u043e\u0441\u0442\u0438 \u0437\u0430\u043a\u043e\u043d\u043d\u044b\u043c\u0438 \u043c\u0435\u0442\u043e\u0434\u0430\u043c\u0438 [1] [2] [3].\n\u041d\u0435\u0441\u043c\u043e\u0442\u0440\u044f \u043d\u0430 \u043e\u0442\u0441\u0443\u0442\u0441\u0442\u0432\u0438\u0435 \u043e\u0444\u0438\u0446\u0438\u0430\u043b\u044c\u043d\u044b\u0445 \u0434\u043e\u043a\u0430\u0437\u0430\u0442\u0435\u043b\u044c\u0441\u0442\u0432 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u044f \u043e\u0440\u0433\u0430\u043d\u0438\u0437\u0430\u0446\u0438\u0438 \u0438 \u043c\u043d\u043e\u0433\u043e\u0447\u0438\u0441\u043b\u0435\u043d\u043d\u044b\u0435 \u043e\u043f\u0440\u043e\u0432\u0435\u0440\u0436\u0435\u043d\u0438\u044f \u0441\u043e \u0441\u0442\u043e\u0440\u043e\u043d\u044b \u0441\u0438\u043b\u043e\u0432\u044b\u0445 \u0441\u0442\u0440\u0443\u043a\u0442\u0443\u0440 \u0438 \u0441\u043b\u0443\u0436\u0431 \u0431\u0435\u0437\u043e\u043f\u0430\u0441\u043d\u043e\u0441\u0442\u0438 [4], \u0432 \u0440\u043e\u0441\u0441\u0438\u0439\u0441\u043a\u043e\u043c \u043e\u0431\u0449\u0435\u0441\u0442\u0432\u0435 \u043b\u0435\u0433\u0435\u043d\u0434\u0443 \u0441\u0447\u0438\u0442\u0430\u044e\u0442 \u043e\u0441\u043d\u043e\u0432\u0430\u043d\u043d\u043e\u0439 \u043d\u0430 \u043f\u043e\u0434\u043b\u0438\u043d\u043d\u044b\u0445 \u0444\u0430\u043a\u0442\u0430\u0445 \u0433\u0440\u043e\u043c\u043a\u0438\u0445 \u0443\u0431\u0438\u0439\u0441\u0442\u0432 \u043a\u0440\u0438\u043c\u0438\u043d\u0430\u043b\u044c\u043d\u044b\u0445 \u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432, \u0441\u043e\u0432\u0435\u0440\u0448\u0451\u043d\u043d\u044b\u0445 \u0432 1990-\u0435 \u0433\u043e\u0434\u044b, \u0438 \u043d\u0435 \u0438\u0441\u043a\u043b\u044e\u0447\u0430\u044e\u0442 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u0435 \u0440\u0435\u0430\u043b\u044c\u043d\u043e\u0439 
\u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u044b [5].\n>>>\n>>> alm.allowStress()\n>>> alm.textToJson('\u00ab\u0411\u0435\u0301\u043b\u0430\u044f \u0441\u0442\u0440\u0435\u043b\u0430\u0301\u00bb \u2014 \u0441\u043e\u0433\u043b\u0430\u0441\u043d\u043e \u0440\u0430\u0441\u043f\u0440\u043e\u0441\u0442\u0440\u0430\u043d\u0451\u043d\u043d\u043e\u0439 \u0432 1990-\u0435 \u0433\u043e\u0434\u044b \u0432 \u0420\u043e\u0441\u0441\u0438\u0438 \u0433\u043e\u0440\u043e\u0434\u0441\u043a\u043e\u0439 \u043b\u0435\u0433\u0435\u043d\u0434\u0435, \u044f\u043a\u043e\u0431\u044b \u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e \u043e\u0440\u0433\u0430\u043d\u0438\u0437\u043e\u0432\u0430\u043d\u043d\u0430\u044f \u0438 \u043f\u043e\u0434\u0433\u043e\u0442\u043e\u0432\u043b\u0435\u043d\u043d\u0430\u044f \u0437\u0430\u043a\u043e\u043d\u0441\u043f\u0438\u0440\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u0430\u044f \u043f\u0440\u0430\u0432\u0438\u0442\u0435\u043b\u044c\u0441\u0442\u0432\u0435\u043d\u043d\u0430\u044f \u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u0430, \u0441\u043e\u0442\u0440\u0443\u0434\u043d\u0438\u043a\u0438 \u043a\u043e\u0442\u043e\u0440\u043e\u0439 \u2014 \u0431\u044b\u0432\u0448\u0438\u0435 \u0438 \u0434\u0435\u0439\u0441\u0442\u0432\u043e\u0432\u0430\u0432\u0448\u0438\u0435 \u043c\u0438\u043b\u0438\u0446\u0438\u043e\u043d\u0435\u0440\u044b \u0438 \u0441\u043f\u0435\u0446\u043d\u0430\u0437\u043e\u0432\u0446\u044b, \u0438\u043c\u0435\u044e\u0449\u0438\u0435 \u043f\u0440\u0430\u0432\u043e \u043d\u0430 \u0444\u0438\u0437\u0438\u0447\u0435\u0441\u043a\u0443\u044e \u043b\u0438\u043a\u0432\u0438\u0434\u0430\u0446\u0438\u044e \u043e\u0441\u043e\u0431\u043e \u043e\u043f\u0430\u0441\u043d\u044b\u0445 \u0443\u0433\u043e\u043b\u043e\u0432\u043d\u044b\u0445 \u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432 \u0438 \u043b\u0438\u0434\u0435\u0440\u043e\u0432 \u043e\u0440\u0443\u0434\u043e\u0432\u0430\u0432\u0448\u0438\u0445 \u0432 \u0420\u043e\u0441\u0441\u0438\u0438 \u041e\u041f\u0413, \u043e\u0442\u043d\u043e\u0441\u0438\u0442\u0435\u043b\u044c\u043d\u043e \u043a\u043e\u0442\u043e\u0440\u044b\u0445 \u043d\u0435 \u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442\u0441\u044f \u0432\u043e\u0437\u043c\u043e\u0436\u043d\u044b\u043c \u043f\u0440\u0438\u0432\u043b\u0435\u0447\u0435\u043d\u0438\u044f \u0438\u0445 \u043a \u0443\u0433\u043e\u043b\u043e\u0432\u043d\u043e\u0439 \u043e\u0442\u0432\u0435\u0442\u0441\u0442\u0432\u0435\u043d\u043d\u043e\u0441\u0442\u0438 \u0437\u0430\u043a\u043e\u043d\u043d\u044b\u043c\u0438 \u043c\u0435\u0442\u043e\u0434\u0430\u043c\u0438[1][2][3]. 
\u041d\u0435\u0441\u043c\u043e\u0442\u0440\u044f \u043d\u0430 \u043e\u0442\u0441\u0443\u0442\u0441\u0442\u0432\u0438\u0435 \u043e\u0444\u0438\u0446\u0438\u0430\u043b\u044c\u043d\u044b\u0445 \u0434\u043e\u043a\u0430\u0437\u0430\u0442\u0435\u043b\u044c\u0441\u0442\u0432 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u044f \u043e\u0440\u0433\u0430\u043d\u0438\u0437\u0430\u0446\u0438\u0438 \u0438 \u043c\u043d\u043e\u0433\u043e\u0447\u0438\u0441\u043b\u0435\u043d\u043d\u044b\u0435 \u043e\u043f\u0440\u043e\u0432\u0435\u0440\u0436\u0435\u043d\u0438\u044f \u0441\u043e \u0441\u0442\u043e\u0440\u043e\u043d\u044b \u0441\u0438\u043b\u043e\u0432\u044b\u0445 \u0441\u0442\u0440\u0443\u043a\u0442\u0443\u0440 \u0438 \u0441\u043b\u0443\u0436\u0431 \u0431\u0435\u0437\u043e\u043f\u0430\u0441\u043d\u043e\u0441\u0442\u0438[4], \u0432 \u0440\u043e\u0441\u0441\u0438\u0439\u0441\u043a\u043e\u043c \u043e\u0431\u0449\u0435\u0441\u0442\u0432\u0435 \u043b\u0435\u0433\u0435\u043d\u0434\u0443 \u0441\u0447\u0438\u0442\u0430\u044e\u0442 \u043e\u0441\u043d\u043e\u0432\u0430\u043d\u043d\u043e\u0439 \u043d\u0430 \u043f\u043e\u0434\u043b\u0438\u043d\u043d\u044b\u0445 \u0444\u0430\u043a\u0442\u0430\u0445 \u0433\u0440\u043e\u043c\u043a\u0438\u0445 \u0443\u0431\u0438\u0439\u0441\u0442\u0432 \u043a\u0440\u0438\u043c\u0438\u043d\u0430\u043b\u044c\u043d\u044b\u0445 \u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432, \u0441\u043e\u0432\u0435\u0440\u0448\u0451\u043d\u043d\u044b\u0445 \u0432 1990-\u0435 \u0433\u043e\u0434\u044b, \u0438 \u043d\u0435 \u0438\u0441\u043a\u043b\u044e\u0447\u0430\u044e\u0442 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u0435 \u0440\u0435\u0430\u043b\u044c\u043d\u043e\u0439 \u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u044b[5].', 
callbackFn)\n[[\"\u00ab\",\"\u0411\u0435\u0301\u043b\u0430\u044f\",\"\u0441\u0442\u0440\u0435\u043b\u0430\u0301\",\"\u00bb\",\"\u2014\",\"\u0441\u043e\u0433\u043b\u0430\u0441\u043d\u043e\",\"\u0440\u0430\u0441\u043f\u0440\u043e\u0441\u0442\u0440\u0430\u043d\u0451\u043d\u043d\u043e\u0439\",\"\u0432\",\"1990-\u0435\",\"\u0433\u043e\u0434\u044b\",\"\u0432\",\"\u0420\u043e\u0441\u0441\u0438\u0438\",\"\u0433\u043e\u0440\u043e\u0434\u0441\u043a\u043e\u0439\",\"\u043b\u0435\u0433\u0435\u043d\u0434\u0435\",\",\",\"\u044f\u043a\u043e\u0431\u044b\",\"\u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e\",\"\u043e\u0440\u0433\u0430\u043d\u0438\u0437\u043e\u0432\u0430\u043d\u043d\u0430\u044f\",\"\u0438\",\"\u043f\u043e\u0434\u0433\u043e\u0442\u043e\u0432\u043b\u0435\u043d\u043d\u0430\u044f\",\"\u0437\u0430\u043a\u043e\u043d\u0441\u043f\u0438\u0440\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u0430\u044f\",\"\u043f\u0440\u0430\u0432\u0438\u0442\u0435\u043b\u044c\u0441\u0442\u0432\u0435\u043d\u043d\u0430\u044f\",\"\u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u0430\",\",\",\"\u0441\u043e\u0442\u0440\u0443\u0434\u043d\u0438\u043a\u0438\",\"\u043a\u043e\u0442\u043e\u0440\u043e\u0439\",\"\u2014\",\"\u0431\u044b\u0432\u0448\u0438\u0435\",\"\u0438\",\"\u0434\u0435\u0439\u0441\u0442\u0432\u043e\u0432\u0430\u0432\u0448\u0438\u0435\",\"\u043c\u0438\u043b\u0438\u0446\u0438\u043e\u043d\u0435\u0440\u044b\",\"\u0438\",\"\u0441\u043f\u0435\u0446\u043d\u0430\u0437\u043e\u0432\u0446\u044b\",\",\",\"\u0438\u043c\u0435\u044e\u0449\u0438\u0435\",\"\u043f\u0440\u0430\u0432\u043e\",\"\u043d\u0430\",\"\u0444\u0438\u0437\u0438\u0447\u0435\u0441\u043a\u0443\u044e\",\"\u043b\u0438\u043a\u0432\u0438\u0434\u0430\u0446\u0438\u044e\",\"\u043e\u0441\u043e\u0431\u043e\",\"\u043e\u043f\u0430\u0441\u043d\u044b\u0445\",\"\u0443\u0433\u043e\u043b\u043e\u0432\u043d\u044b\u0445\",\"\u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432\",\"\u0438\",\"\u043b\u0438\u0434\u0435\u0440\u043e\u0432\",\"\u043e\u0440\u0443\u0434\u043e\u0432\u0430\u0432\u0448\u0438\u0445\",\"\u0432\",\"\u0420\u043e\u0441\u0441\u0438\u0438\",\"\u041e\u041f\u0413\",\",\",\"\u043e\u0442\u043d\u043e\u0441\u0438\u0442\u0435\u043b\u044c\u043d\u043e\",\"\u043a\u043e\u0442\u043e\u0440\u044b\u0445\",\"\u043d\u0435\",\"\u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442\u0441\u044f\",\"\u0432\u043e\u0437\u043c\u043e\u0436\u043d\u044b\u043c\",\"\u043f\u0440\u0438\u0432\u043b\u0435\u0447\u0435\u043d\u0438\u044f\",\"\u0438\u0445\",\"\u043a\",\"\u0443\u0433\u043e\u043b\u043e\u0432\u043d\u043e\u0439\",\"\u043e\u0442\u0432\u0435\u0442\u0441\u0442\u0432\u0435\u043d\u043d\u043e\u0441\u0442\u0438\",\"\u0437\u0430\u043a\u043e\u043d\u043d\u044b\u043c\u0438\",\"\u043c\u0435\u0442\u043e\u0434\u0430\u043c\u0438\",\"[\",\"1\",\"]\",\"[\",\"2\",\"]\",\"[\",\"3\",\"]\",\".\"],[\"\u041d\u0435\u0441\u043c\u043e\u0442\u0440\u044f\",\"\u043d\u0430\",\"\u043e\u0442\u0441\u0443\u0442\u0441\u0442\u0432\u0438\u0435\",\"\u043e\u0444\u0438\u0446\u0438\u0430\u043b\u044c\u043d\u044b\u0445\",\"\u0434\u043e\u043a\u0430\u0437\u0430\u0442\u0435\u043b\u044c\u0441\u0442\u0432\",\"\u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u044f\",\"\u043e\u0440\u0433\u0430\u043d\u0438\u0437\u0430\u0446\u0438\u0438\",\"\u0438\",\"\u043c\u043d\u043e\u0433\u043e\u0447\u0438\u0441\u043b\u0435\u043d\u043d\u044b\u0435\",\"\u043e\u043f\u0440\u043e\u0432\u0435\u0440\u0436\u0435\u043d\u0438\u044f\",\"\u0441\u043e\",\"\u0441\u0442\u043e\u0440\u043e\u043d\u044b
\",\"\u0441\u0438\u043b\u043e\u0432\u044b\u0445\",\"\u0441\u0442\u0440\u0443\u043a\u0442\u0443\u0440\",\"\u0438\",\"\u0441\u043b\u0443\u0436\u0431\",\"\u0431\u0435\u0437\u043e\u043f\u0430\u0441\u043d\u043e\u0441\u0442\u0438\",\"[\",\"4\",\"]\",\",\",\"\u0432\",\"\u0440\u043e\u0441\u0441\u0438\u0439\u0441\u043a\u043e\u043c\",\"\u043e\u0431\u0449\u0435\u0441\u0442\u0432\u0435\",\"\u043b\u0435\u0433\u0435\u043d\u0434\u0443\",\"\u0441\u0447\u0438\u0442\u0430\u044e\u0442\",\"\u043e\u0441\u043d\u043e\u0432\u0430\u043d\u043d\u043e\u0439\",\"\u043d\u0430\",\"\u043f\u043e\u0434\u043b\u0438\u043d\u043d\u044b\u0445\",\"\u0444\u0430\u043a\u0442\u0430\u0445\",\"\u0433\u0440\u043e\u043c\u043a\u0438\u0445\",\"\u0443\u0431\u0438\u0439\u0441\u0442\u0432\",\"\u043a\u0440\u0438\u043c\u0438\u043d\u0430\u043b\u044c\u043d\u044b\u0445\",\"\u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432\",\",\",\"\u0441\u043e\u0432\u0435\u0440\u0448\u0451\u043d\u043d\u044b\u0445\",\"\u0432\",\"1990-\u0435\",\"\u0433\u043e\u0434\u044b\",\",\",\"\u0438\",\"\u043d\u0435\",\"\u0438\u0441\u043a\u043b\u044e\u0447\u0430\u044e\u0442\",\"\u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u0435\",\"\u0440\u0435\u0430\u043b\u044c\u043d\u043e\u0439\",\"\u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u044b\",\"[\",\"5\",\"]\",\".\"]]\n>>>\n>>> alm.jsonToText('[[\"\u00ab\",\"\u0411\u0435\u0301\u043b\u0430\u044f\",\"\u0441\u0442\u0440\u0435\u043b\u0430\u0301\",\"\u00bb\",\"\u2014\",\"\u0441\u043e\u0433\u043b\u0430\u0441\u043d\u043e\",\"\u0440\u0430\u0441\u043f\u0440\u043e\u0441\u0442\u0440\u0430\u043d\u0451\u043d\u043d\u043e\u0439\",\"\u0432\",\"1990-\u0435\",\"\u0433\u043e\u0434\u044b\",\"\u0432\",\"\u0420\u043e\u0441\u0441\u0438\u0438\",\"\u0433\u043e\u0440\u043e\u0434\u0441\u043a\u043e\u0439\",\"\u043b\u0435\u0433\u0435\u043d\u0434\u0435\",\",\",\"\u044f\u043a\u043e\u0431\u044b\",\"\u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e\",\"\u043e\u0440\u0433\u0430\u043d\u0438\u0437\u043e\u0432\u0430\u043d\u043d\u0430\u044f\",\"\u0438\",\"\u043f\u043e\u0434\u0433\u043e\u0442\u043e\u0432\u043b\u0435\u043d\u043d\u0430\u044f\",\"\u0437\u0430\u043a\u043e\u043d\u0441\u043f\u0438\u0440\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u0430\u044f\",\"\u043f\u0440\u0430\u0432\u0438\u0442\u0435\u043b\u044c\u0441\u0442\u0432\u0435\u043d\u043d\u0430\u044f\",\"\u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u0430\",\",\",\"\u0441\u043e\u0442\u0440\u0443\u0434\u043d\u0438\u043a\u0438\",\"\u043a\u043e\u0442\u043e\u0440\u043e\u0439\",\"\u2014\",\"\u0431\u044b\u0432\u0448\u0438\u0435\",\"\u0438\",\"\u0434\u0435\u0439\u0441\u0442\u0432\u043e\u0432\u0430\u0432\u0448\u0438\u0435\",\"\u043c\u0438\u043b\u0438\u0446\u0438\u043e\u043d\u0435\u0440\u044b\",\"\u0438\",\"\u0441\u043f\u0435\u0446\u043d\u0430\u0437\u043e\u0432\u0446\u044b\",\",\",\"\u0438\u043c\u0435\u044e\u0449\u0438\u0435\",\"\u043f\u0440\u0430\u0432\u043e\",\"\u043d\u0430\",\"\u0444\u0438\u0437\u0438\u0447\u0435\u0441\u043a\u0443\u044e\",\"\u043b\u0438\u043a\u0432\u0438\u0434\u0430\u0446\u0438\u044e\",\"\u043e\u0441\u043e\u0431\u043e\",\"\u043e\u043f\u0430\u0441\u043d\u044b\u0445\",\"\u0443\u0433\u043e\u043b\u043e\u0432\u043d\u044b\u0445\",\"\u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432\",\"\u0438\",\"\u043b\u0438\u0434\u0435\u0440\u043e\u0432\",\"\u043e\u0440\u0443\u0434\u043e\u0432\u0430\u0432\u0448\u0438\u0445\",\"\u0432\",\"\u0420\u043e\u0441\u0441\u0438\u0438\",\"\u041e\u041f\u0413\",\",\",\"\u043e\u0442\u043d\u043e\u0441\u
0438\u0442\u0435\u043b\u044c\u043d\u043e\",\"\u043a\u043e\u0442\u043e\u0440\u044b\u0445\",\"\u043d\u0435\",\"\u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442\u0441\u044f\",\"\u0432\u043e\u0437\u043c\u043e\u0436\u043d\u044b\u043c\",\"\u043f\u0440\u0438\u0432\u043b\u0435\u0447\u0435\u043d\u0438\u044f\",\"\u0438\u0445\",\"\u043a\",\"\u0443\u0433\u043e\u043b\u043e\u0432\u043d\u043e\u0439\",\"\u043e\u0442\u0432\u0435\u0442\u0441\u0442\u0432\u0435\u043d\u043d\u043e\u0441\u0442\u0438\",\"\u0437\u0430\u043a\u043e\u043d\u043d\u044b\u043c\u0438\",\"\u043c\u0435\u0442\u043e\u0434\u0430\u043c\u0438\",\"[\",\"1\",\"]\",\"[\",\"2\",\"]\",\"[\",\"3\",\"]\",\".\"],[\"\u041d\u0435\u0441\u043c\u043e\u0442\u0440\u044f\",\"\u043d\u0430\",\"\u043e\u0442\u0441\u0443\u0442\u0441\u0442\u0432\u0438\u0435\",\"\u043e\u0444\u0438\u0446\u0438\u0430\u043b\u044c\u043d\u044b\u0445\",\"\u0434\u043e\u043a\u0430\u0437\u0430\u0442\u0435\u043b\u044c\u0441\u0442\u0432\",\"\u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u044f\",\"\u043e\u0440\u0433\u0430\u043d\u0438\u0437\u0430\u0446\u0438\u0438\",\"\u0438\",\"\u043c\u043d\u043e\u0433\u043e\u0447\u0438\u0441\u043b\u0435\u043d\u043d\u044b\u0435\",\"\u043e\u043f\u0440\u043e\u0432\u0435\u0440\u0436\u0435\u043d\u0438\u044f\",\"\u0441\u043e\",\"\u0441\u0442\u043e\u0440\u043e\u043d\u044b\",\"\u0441\u0438\u043b\u043e\u0432\u044b\u0445\",\"\u0441\u0442\u0440\u0443\u043a\u0442\u0443\u0440\",\"\u0438\",\"\u0441\u043b\u0443\u0436\u0431\",\"\u0431\u0435\u0437\u043e\u043f\u0430\u0441\u043d\u043e\u0441\u0442\u0438\",\"[\",\"4\",\"]\",\",\",\"\u0432\",\"\u0440\u043e\u0441\u0441\u0438\u0439\u0441\u043a\u043e\u043c\",\"\u043e\u0431\u0449\u0435\u0441\u0442\u0432\u0435\",\"\u043b\u0435\u0433\u0435\u043d\u0434\u0443\",\"\u0441\u0447\u0438\u0442\u0430\u044e\u0442\",\"\u043e\u0441\u043d\u043e\u0432\u0430\u043d\u043d\u043e\u0439\",\"\u043d\u0430\",\"\u043f\u043e\u0434\u043b\u0438\u043d\u043d\u044b\u0445\",\"\u0444\u0430\u043a\u0442\u0430\u0445\",\"\u0433\u0440\u043e\u043c\u043a\u0438\u0445\",\"\u0443\u0431\u0438\u0439\u0441\u0442\u0432\",\"\u043a\u0440\u0438\u043c\u0438\u043d\u0430\u043b\u044c\u043d\u044b\u0445\",\"\u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432\",\",\",\"\u0441\u043e\u0432\u0435\u0440\u0448\u0451\u043d\u043d\u044b\u0445\",\"\u0432\",\"1990-\u0435\",\"\u0433\u043e\u0434\u044b\",\",\",\"\u0438\",\"\u043d\u0435\",\"\u0438\u0441\u043a\u043b\u044e\u0447\u0430\u044e\u0442\",\"\u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u0435\",\"\u0440\u0435\u0430\u043b\u044c\u043d\u043e\u0439\",\"\u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u044b\",\"[\",\"5\",\"]\",\".\"]]', callbackFn)\n\u00ab\u0411\u0435\u0301\u043b\u0430\u044f \u0441\u0442\u0440\u0435\u043b\u0430\u0301\u00bb \u2014 \u0441\u043e\u0433\u043b\u0430\u0441\u043d\u043e \u0440\u0430\u0441\u043f\u0440\u043e\u0441\u0442\u0440\u0430\u043d\u0451\u043d\u043d\u043e\u0439 \u0432 1990-\u0435 \u0433\u043e\u0434\u044b \u0432 \u0420\u043e\u0441\u0441\u0438\u0438 \u0433\u043e\u0440\u043e\u0434\u0441\u043a\u043e\u0439 \u043b\u0435\u0433\u0435\u043d\u0434\u0435, \u044f\u043a\u043e\u0431\u044b \u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e \u043e\u0440\u0433\u0430\u043d\u0438\u0437\u043e\u0432\u0430\u043d\u043d\u0430\u044f \u0438 \u043f\u043e\u0434\u0433\u043e\u0442\u043e\u0432\u043b\u0435\u043d\u043d\u0430\u044f \u0437\u0430\u043a\u043e\u043d\u0441\u043f\u0438\u0440\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u0430\u044f 
\u043f\u0440\u0430\u0432\u0438\u0442\u0435\u043b\u044c\u0441\u0442\u0432\u0435\u043d\u043d\u0430\u044f \u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u0430, \u0441\u043e\u0442\u0440\u0443\u0434\u043d\u0438\u043a\u0438 \u043a\u043e\u0442\u043e\u0440\u043e\u0439 \u2014 \u0431\u044b\u0432\u0448\u0438\u0435 \u0438 \u0434\u0435\u0439\u0441\u0442\u0432\u043e\u0432\u0430\u0432\u0448\u0438\u0435 \u043c\u0438\u043b\u0438\u0446\u0438\u043e\u043d\u0435\u0440\u044b \u0438 \u0441\u043f\u0435\u0446\u043d\u0430\u0437\u043e\u0432\u0446\u044b, \u0438\u043c\u0435\u044e\u0449\u0438\u0435 \u043f\u0440\u0430\u0432\u043e \u043d\u0430 \u0444\u0438\u0437\u0438\u0447\u0435\u0441\u043a\u0443\u044e \u043b\u0438\u043a\u0432\u0438\u0434\u0430\u0446\u0438\u044e \u043e\u0441\u043e\u0431\u043e \u043e\u043f\u0430\u0441\u043d\u044b\u0445 \u0443\u0433\u043e\u043b\u043e\u0432\u043d\u044b\u0445 \u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432 \u0438 \u043b\u0438\u0434\u0435\u0440\u043e\u0432 \u043e\u0440\u0443\u0434\u043e\u0432\u0430\u0432\u0448\u0438\u0445 \u0432 \u0420\u043e\u0441\u0441\u0438\u0438 \u041e\u041f\u0413, \u043e\u0442\u043d\u043e\u0441\u0438\u0442\u0435\u043b\u044c\u043d\u043e \u043a\u043e\u0442\u043e\u0440\u044b\u0445 \u043d\u0435 \u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442\u0441\u044f \u0432\u043e\u0437\u043c\u043e\u0436\u043d\u044b\u043c \u043f\u0440\u0438\u0432\u043b\u0435\u0447\u0435\u043d\u0438\u044f \u0438\u0445 \u043a \u0443\u0433\u043e\u043b\u043e\u0432\u043d\u043e\u0439 \u043e\u0442\u0432\u0435\u0442\u0441\u0442\u0432\u0435\u043d\u043d\u043e\u0441\u0442\u0438 \u0437\u0430\u043a\u043e\u043d\u043d\u044b\u043c\u0438 \u043c\u0435\u0442\u043e\u0434\u0430\u043c\u0438 [1] [2] [3].\n\u041d\u0435\u0441\u043c\u043e\u0442\u0440\u044f \u043d\u0430 \u043e\u0442\u0441\u0443\u0442\u0441\u0442\u0432\u0438\u0435 \u043e\u0444\u0438\u0446\u0438\u0430\u043b\u044c\u043d\u044b\u0445 \u0434\u043e\u043a\u0430\u0437\u0430\u0442\u0435\u043b\u044c\u0441\u0442\u0432 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u044f \u043e\u0440\u0433\u0430\u043d\u0438\u0437\u0430\u0446\u0438\u0438 \u0438 \u043c\u043d\u043e\u0433\u043e\u0447\u0438\u0441\u043b\u0435\u043d\u043d\u044b\u0435 \u043e\u043f\u0440\u043e\u0432\u0435\u0440\u0436\u0435\u043d\u0438\u044f \u0441\u043e \u0441\u0442\u043e\u0440\u043e\u043d\u044b \u0441\u0438\u043b\u043e\u0432\u044b\u0445 \u0441\u0442\u0440\u0443\u043a\u0442\u0443\u0440 \u0438 \u0441\u043b\u0443\u0436\u0431 \u0431\u0435\u0437\u043e\u043f\u0430\u0441\u043d\u043e\u0441\u0442\u0438 [4], \u0432 \u0440\u043e\u0441\u0441\u0438\u0439\u0441\u043a\u043e\u043c \u043e\u0431\u0449\u0435\u0441\u0442\u0432\u0435 \u043b\u0435\u0433\u0435\u043d\u0434\u0443 \u0441\u0447\u0438\u0442\u0430\u044e\u0442 \u043e\u0441\u043d\u043e\u0432\u0430\u043d\u043d\u043e\u0439 \u043d\u0430 \u043f\u043e\u0434\u043b\u0438\u043d\u043d\u044b\u0445 \u0444\u0430\u043a\u0442\u0430\u0445 \u0433\u0440\u043e\u043c\u043a\u0438\u0445 \u0443\u0431\u0438\u0439\u0441\u0442\u0432 \u043a\u0440\u0438\u043c\u0438\u043d\u0430\u043b\u044c\u043d\u044b\u0445 \u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432, \u0441\u043e\u0432\u0435\u0440\u0448\u0451\u043d\u043d\u044b\u0445 \u0432 1990-\u0435 \u0433\u043e\u0434\u044b, \u0438 \u043d\u0435 \u0438\u0441\u043a\u043b\u044e\u0447\u0430\u044e\u0442 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u0435 \u0440\u0435\u0430\u043b\u044c\u043d\u043e\u0439 
\u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u044b [5].\n>>>\n>>> alm.disallowStress()\n>>> alm.textToJson('\u00ab\u0411\u0435\u0301\u043b\u0430\u044f \u0441\u0442\u0440\u0435\u043b\u0430\u0301\u00bb \u2014 \u0441\u043e\u0433\u043b\u0430\u0441\u043d\u043e \u0440\u0430\u0441\u043f\u0440\u043e\u0441\u0442\u0440\u0430\u043d\u0451\u043d\u043d\u043e\u0439 \u0432 1990-\u0435 \u0433\u043e\u0434\u044b \u0432 \u0420\u043e\u0441\u0441\u0438\u0438 \u0433\u043e\u0440\u043e\u0434\u0441\u043a\u043e\u0439 \u043b\u0435\u0433\u0435\u043d\u0434\u0435, \u044f\u043a\u043e\u0431\u044b \u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e \u043e\u0440\u0433\u0430\u043d\u0438\u0437\u043e\u0432\u0430\u043d\u043d\u0430\u044f \u0438 \u043f\u043e\u0434\u0433\u043e\u0442\u043e\u0432\u043b\u0435\u043d\u043d\u0430\u044f \u0437\u0430\u043a\u043e\u043d\u0441\u043f\u0438\u0440\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u0430\u044f \u043f\u0440\u0430\u0432\u0438\u0442\u0435\u043b\u044c\u0441\u0442\u0432\u0435\u043d\u043d\u0430\u044f \u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u0430, \u0441\u043e\u0442\u0440\u0443\u0434\u043d\u0438\u043a\u0438 \u043a\u043e\u0442\u043e\u0440\u043e\u0439 \u2014 \u0431\u044b\u0432\u0448\u0438\u0435 \u0438 \u0434\u0435\u0439\u0441\u0442\u0432\u043e\u0432\u0430\u0432\u0448\u0438\u0435 \u043c\u0438\u043b\u0438\u0446\u0438\u043e\u043d\u0435\u0440\u044b \u0438 \u0441\u043f\u0435\u0446\u043d\u0430\u0437\u043e\u0432\u0446\u044b, \u0438\u043c\u0435\u044e\u0449\u0438\u0435 \u043f\u0440\u0430\u0432\u043e \u043d\u0430 \u0444\u0438\u0437\u0438\u0447\u0435\u0441\u043a\u0443\u044e \u043b\u0438\u043a\u0432\u0438\u0434\u0430\u0446\u0438\u044e \u043e\u0441\u043e\u0431\u043e \u043e\u043f\u0430\u0441\u043d\u044b\u0445 \u0443\u0433\u043e\u043b\u043e\u0432\u043d\u044b\u0445 \u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432 \u0438 \u043b\u0438\u0434\u0435\u0440\u043e\u0432 \u043e\u0440\u0443\u0434\u043e\u0432\u0430\u0432\u0448\u0438\u0445 \u0432 \u0420\u043e\u0441\u0441\u0438\u0438 \u041e\u041f\u0413, \u043e\u0442\u043d\u043e\u0441\u0438\u0442\u0435\u043b\u044c\u043d\u043e \u043a\u043e\u0442\u043e\u0440\u044b\u0445 \u043d\u0435 \u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442\u0441\u044f \u0432\u043e\u0437\u043c\u043e\u0436\u043d\u044b\u043c \u043f\u0440\u0438\u0432\u043b\u0435\u0447\u0435\u043d\u0438\u044f \u0438\u0445 \u043a \u0443\u0433\u043e\u043b\u043e\u0432\u043d\u043e\u0439 \u043e\u0442\u0432\u0435\u0442\u0441\u0442\u0432\u0435\u043d\u043d\u043e\u0441\u0442\u0438 \u0437\u0430\u043a\u043e\u043d\u043d\u044b\u043c\u0438 \u043c\u0435\u0442\u043e\u0434\u0430\u043c\u0438[1][2][3]. 
\u041d\u0435\u0441\u043c\u043e\u0442\u0440\u044f \u043d\u0430 \u043e\u0442\u0441\u0443\u0442\u0441\u0442\u0432\u0438\u0435 \u043e\u0444\u0438\u0446\u0438\u0430\u043b\u044c\u043d\u044b\u0445 \u0434\u043e\u043a\u0430\u0437\u0430\u0442\u0435\u043b\u044c\u0441\u0442\u0432 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u044f \u043e\u0440\u0433\u0430\u043d\u0438\u0437\u0430\u0446\u0438\u0438 \u0438 \u043c\u043d\u043e\u0433\u043e\u0447\u0438\u0441\u043b\u0435\u043d\u043d\u044b\u0435 \u043e\u043f\u0440\u043e\u0432\u0435\u0440\u0436\u0435\u043d\u0438\u044f \u0441\u043e \u0441\u0442\u043e\u0440\u043e\u043d\u044b \u0441\u0438\u043b\u043e\u0432\u044b\u0445 \u0441\u0442\u0440\u0443\u043a\u0442\u0443\u0440 \u0438 \u0441\u043b\u0443\u0436\u0431 \u0431\u0435\u0437\u043e\u043f\u0430\u0441\u043d\u043e\u0441\u0442\u0438[4], \u0432 \u0440\u043e\u0441\u0441\u0438\u0439\u0441\u043a\u043e\u043c \u043e\u0431\u0449\u0435\u0441\u0442\u0432\u0435 \u043b\u0435\u0433\u0435\u043d\u0434\u0443 \u0441\u0447\u0438\u0442\u0430\u044e\u0442 \u043e\u0441\u043d\u043e\u0432\u0430\u043d\u043d\u043e\u0439 \u043d\u0430 \u043f\u043e\u0434\u043b\u0438\u043d\u043d\u044b\u0445 \u0444\u0430\u043a\u0442\u0430\u0445 \u0433\u0440\u043e\u043c\u043a\u0438\u0445 \u0443\u0431\u0438\u0439\u0441\u0442\u0432 \u043a\u0440\u0438\u043c\u0438\u043d\u0430\u043b\u044c\u043d\u044b\u0445 \u0430\u0432\u0442\u043e\u0440\u0438\u0442\u0435\u0442\u043e\u0432, \u0441\u043e\u0432\u0435\u0440\u0448\u0451\u043d\u043d\u044b\u0445 \u0432 1990-\u0435 \u0433\u043e\u0434\u044b, \u0438 \u043d\u0435 \u0438\u0441\u043a\u043b\u044e\u0447\u0430\u044e\u0442 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u043e\u0432\u0430\u043d\u0438\u0435 \u0440\u0435\u0430\u043b\u044c\u043d\u043e\u0439 \u0441\u043f\u0435\u0446\u0441\u043b\u0443\u0436\u0431\u044b[5].', 
callbackFn)
[["«","Белая","стрела","»","—","согласно","распространённой","в","1990-е","годы","в","России","городской","легенде",",","якобы","специально","организованная","и","подготовленная","законспирированная","правительственная","спецслужба",",","сотрудники","которой","—","бывшие","и","действовавшие","милиционеры","и","спецназовцы",",","имеющие","право","на","физическую","ликвидацию","особо","опасных","уголовных","авторитетов","и","лидеров","орудовавших","в","России","ОПГ",",","относительно","которых","не","представляется","возможным","привлечения","их","к","уголовной","ответственности","законными","методами","[","1","]","[","2","]","[","3","]","."],["Несмотря","на","отсутствие","официальных","доказательств","существования","организации","и","многочисленные","опровержения","со","стороны","силовых","структур","и","служб","безопасности","[","4","]",",","в","российском","обществе","легенду","считают","основанной","на","подлинных","фактах","громких","убийств","криминальных","авторитетов",",","совершённых","в","1990-е","годы",",","и","не","исключают","существование","реальной","спецслужбы","[","5","]","."]]
>>>
>>> alm.jsonToText('[["«","Белая","стрела","»","—","согласно","распространённой","в","1990-е","годы","в","России","городской","легенде",",","якобы","специально","организованная","и","подготовленная","законспирированная","правительственная","спецслужба",",","сотрудники","которой","—","бывшие","и","действовавшие","милиционеры","и","спецназовцы",",","имеющие","право","на","физическую","ликвидацию","особо","опасных","уголовных","авторитетов","и","лидеров","орудовавших","в","России","ОПГ",",","относительно","которых","не","представляется","возможным","привлечения","их","к","уголовной","ответственности","законными","методами","[","1","]","[","2","]","[","3","]","."],["Несмотря","на","отсутствие","официальных","доказательств","существования","организации","и","многочисленные","опровержения","со","стороны","силовых","структур","и","служб","безопасности","[","4","]",",","в","российском","обществе","легенду","считают","основанной","на","подлинных","фактах","громких","убийств","криминальных","авторитетов",",","совершённых","в","1990-е","годы",",","и","не","исключают","существование","реальной","спецслужбы","[","5","]","."]]', callbackFn)
«Белая стрела» — согласно распространённой в 1990-е годы в России городской легенде, якобы специально организованная и подготовленная законспирированная правительственная спецслужба, сотрудники которой — бывшие и действовавшие милиционеры и спецназовцы, имеющие право на физическую ликвидацию особо опасных уголовных авторитетов и лидеров орудовавших в России ОПГ, относительно которых не представляется возможным привлечения их к уголовной ответственности законными методами [1] [2] [3].
Несмотря на отсутствие официальных доказательств существования организации и многочисленные опровержения со стороны силовых структур и служб безопасности [4], в российском обществе легенду считают основанной на подлинных фактах громких убийств криминальных авторитетов, совершённых в 1990-е годы, и не исключают существование реальной спецслужбы [5].
```

---

### Methods:
- **addBadword** - Method for adding a word to the blacklist
- **setBadwords** - Method for setting the list of blacklisted words
- **getBadwords** - Method for retrieving the words in the blacklist

### Example:
```python
>>> import alm
>>>
>>> alm.setBadwords(["hello", "world", "test"])
>>>
>>> alm.getBadwords()
{1554834897, 2156498622, 28307030}
>>>
>>> alm.addBadword("test2")
>>>
>>> alm.getBadwords()
{5170183734, 1554834897, 2156498622, 28307030}
```

### Example:
```python
>>> import alm
>>>
>>> alm.setBadwords({24227504, 1219922507, 1794085167})
>>>
>>> alm.getBadwords()
{24227504, 1219922507, 1794085167}
>>>
>>> alm.clear(alm.clear_t.badwords)
>>>
>>> alm.getBadwords()
{}
```

---

### Methods:
- **addGoodword** - Method for adding a word to the whitelist
- **setGoodwords** - Method for setting the list of whitelisted words
- **getGoodwords** - Method for retrieving the words in the whitelist

### Example:
```python
>>> import alm
>>>
>>> alm.setGoodwords(["hello", "world", "test"])
>>>
>>> alm.getGoodwords()
{1554834897, 2156498622, 28307030}
>>>
>>> alm.addGoodword("test2")
>>>
>>> alm.getGoodwords()
{5170183734, 1554834897, 2156498622, 28307030}
>>>
>>> alm.clear(alm.clear_t.goodwords)
>>>
>>> alm.getGoodwords()
{}
```

### Example:
```python
>>> import alm
>>>
>>> alm.setGoodwords({24227504, 1219922507, 1794085167})
>>>
>>> alm.getGoodwords()
{24227504, 1219922507, 1794085167}
```

---

### Methods:
- **setUserToken** - Method for adding a user token
- **getUserTokens** - Method for retrieving the list of user tokens
- **getUserTokenId** - Method for obtaining a user token identifier
- **getUserTokenWord** - Method for obtaining a user token by its identifier

### Example:
```python
>>> import alm
>>>
>>> alm.setUserToken("usa")
>>>
>>> alm.setUserToken("russia")
>>>
>>> alm.getUserTokenId("usa")
5759809081
>>>
>>> alm.getUserTokenId("russia")
9910674734
>>>
>>> alm.getUserTokens()
['usa', 'russia']
>>>
>>> alm.getUserTokenWord(5759809081)
'usa'
>>>
>>> alm.getUserTokenWord(9910674734)
'russia'
>>>
>>> alm.clear(alm.clear_t.utokens)
>>>
>>> alm.getUserTokens()
[]
```

---

### Methods:
- **findNgram** - Method for searching N-grams in a text
- **word** - Method for extracting a word by its identifier

### Example:
```python
>>> import alm
>>>
>>> def callbackFn(text):
...     print(text)
...
>>> alm.setOption(alm.options_t.confidence)
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.idw("привет")
2487910648
>>> alm.word(2487910648)
'привет'
>>>
>>> alm.findNgram("Особое место занимает чудотворная икона Лобзание Христа Иудою", callbackFn)
<s> Особое
Особое место
место занимает
занимает чудотворная
чудотворная икона
икона Лобзание
Лобзание Христа
Христа Иудою
Иудою </s>


>>>
```

---

### Methods:
- **setUserTokenMethod** - Method for setting a custom token processing function

### Example:
```python
>>> import alm
>>>
>>> def fn(token, word):
...     if token and (token == "<usa>"):
...         if word and (word.lower() == "usa"):
...             return True
...     elif token and (token == "<russia>"):
...         if word and (word.lower() == "russia"):
...             return True
...     return False
...
>>> alm.setUserToken("usa")
>>>
>>> alm.setUserToken("russia")
>>>
>>> alm.setUserTokenMethod("usa", fn)
>>>
>>> alm.setUserTokenMethod("russia", fn)
>>>
>>> alm.idw("usa")
5759809081
>>>
>>> alm.idw("russia")
9910674734
>>>
>>> alm.getUserTokenWord(5759809081)
'usa'
>>>
>>> alm.getUserTokenWord(9910674734)
'russia'
```

---

### Methods:
- **setAlmV2** - Method for enabling the ALMv2 language model type
- **unsetAlmV2** - Method for disabling the ALMv2 language model type
- **readALM** - Method for reading data from a binary container
- **setWordPreprocessingMethod** - Method for setting the word preprocessing function

### Example:
```python
>>> import alm
>>>
>>> alm.setAlmV2()
>>>
>>> def run(word, context):
...     if word == "возле": word = "около"
...     return word
...
>>> alm.setOption(alm.options_t.debug)
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.setWordPreprocessingMethod(run)
>>>
>>> a = alm.perplexity("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
info: <s> Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор <punct> <punct> <punct> </s>

info: p( неожиданно | <s> )   = [2gram] 0.00038931 [ -3.40969900 ] / 0.99999991
info: p( из | неожиданно ...)   = [2gram] 0.10110741 [ -0.99521700 ] / 0.99999979
info: p( подворотни | из ...)   = [2gram] 0.00711798 [ -2.14764300 ] / 1.00000027
info: p( в | подворотни ...)   = [2gram] 0.51077661 [ -0.29176900 ] / 1.00000021
info: p( олега | в ...)   = [2gram] 0.00082936 [ -3.08125500 ] / 0.99999974
info: p( ударил | олега ...)   = [2gram] 0.25002820 [ -0.60201100 ] / 0.99999978
info: p( яркий | ударил ...)   = [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( прожектор | яркий ...)   = [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( патрульный | прожектор ...)   = [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( трактор | патрульный ...)   = [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( <punct> | трактор ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999973
info: p( <punct> | <punct> ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( </s> | <punct> ...)   = [1gram] 0.05693430 [ -1.24462600 ] / 0.99999993

info: 1 sentences, 13 words, 0 OOVs
info: 3 zeroprobs, logprob= -12.97624000 ppl= 8.45034200 ppl1= 9.95800426

info: <s> С лязгом выкатился и остановился около мальчика <punct> <punct> <punct> <punct> </s>

info: p( с | <s> )   = [2gram] 0.00642448 [ -2.19216200 ] / 0.99999991
info: p( лязгом | с ...)   = [2gram] 0.00195917 [ -2.70792700 ] / 0.99999999
info: p( выкатился | лязгом ...)   = [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( и | выкатился ...)   = [2gram] 0.51169951 [ -0.29098500 ] / 1.00000024
info: p( остановился | и ...)   = [2gram] 0.00143382 [ -2.84350600 ] / 0.99999975
info: p( около | остановился ...)   = [1gram] 0.00011358 [ -3.94468000 ] / 1.00000003
info: p( мальчика | около ...)   = [1gram] 0.00003932 [ -4.40541100 ] / 1.00000016
info: p( <punct> | мальчика ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999990
info: p( <punct> | <punct> ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( </s> | <punct> ...)   = [1gram] 0.05693430 [ -1.24462600 ] / 0.99999993

info: 1 sentences, 11 words, 0 OOVs
info: 4 zeroprobs, logprob= -17.93030200 ppl= 31.20267541 ppl1= 42.66064865
>>> print(a.logprob)
-30.906542
```

### Example:
```python
>>> import alm
>>>
>>> alm.setAlmV2()
>>>
>>> alm.setOption(alm.options_t.debug)
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> def statusAlm(status):
...     print("Read ALM", status)
...
>>> alm.readALM("./lm.alm", "password", 128, statusAlm)
Read ALM 0
Read ALM 1
Read ALM 2
Read ALM 3
Read ALM 4
Read ALM 5
Read ALM 6
...
>>>
>>> a = alm.perplexity("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
info: <s> Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор <punct> <punct> <punct> </s>

info: p( неожиданно | <s> )   = [2gram] 0.00038931 [ -3.40969900 ] / 0.99999991
info: p( из | неожиданно ...)   = [2gram] 0.10110741 [ -0.99521700 ] / 0.99999979
info: p( подворотни | из ...)   = [2gram] 0.00711798 [ -2.14764300 ] / 1.00000027
info: p( в | подворотни ...)   = [2gram] 0.51077661 [ -0.29176900 ] / 1.00000021
info: p( олега | в ...)   = [2gram] 0.00082936 [ -3.08125500 ] / 0.99999974
info: p( ударил | олега ...)   = [2gram] 0.25002820 [ -0.60201100 ] / 0.99999978
info: p( яркий | ударил ...)   = [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( прожектор | яркий ...)   = [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( патрульный | прожектор ...)   = [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( трактор | патрульный ...)   = [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( <punct> | трактор ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999973
info: p( <punct> | <punct> ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( </s> | <punct> ...)   = [1gram] 0.05693430 [ -1.24462600 ] / 0.99999993

info: 1 sentences, 13 words, 0 OOVs
info: 3 zeroprobs, logprob= -12.97624000 ppl= 8.45034200 ppl1= 9.95800426

info: <s> С лязгом выкатился и остановился около мальчика <punct> <punct> <punct> <punct> </s>

info: p( с | <s> )   = [2gram] 0.00642448 [ -2.19216200 ] / 0.99999991
info: p( лязгом | с ...)   = [2gram] 0.00195917 [ -2.70792700 ] / 0.99999999
info: p( выкатился | лязгом ...)   = [2gram] 0.50002878 [ -0.30100500 ] / 1.00000034
info: p( и | выкатился ...)   = [2gram] 0.51169951 [ -0.29098500 ] / 1.00000024
info: p( остановился | и ...)   = [2gram] 0.00143382 [ -2.84350600 ] / 0.99999975
info: p( около | остановился ...)   = [1gram] 0.00011358 [ -3.94468000 ] / 1.00000003
info: p( мальчика | около ...)   = [1gram] 0.00003932 [ -4.40541100 ] / 1.00000016
info: p( <punct> | мальчика ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999990
info: p( <punct> | <punct> ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( <punct> | <punct> ...)   = [OOV] 0.00000000 [ -inf ] / 0.99999993
info: p( </s> | <punct> ...)   = [1gram] 0.05693430 [ -1.24462600 ] / 0.99999993

info: 1 sentences, 11 words, 0 OOVs
info: 4 zeroprobs, logprob= -17.93030200 ppl= 31.20267541 ppl1= 42.66064865
>>> print(a.logprob)
-30.906542
```
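
The preprocessing callback receives every word together with its context and must return the (possibly modified) word, so it can carry any custom normalisation, not only the word replacement shown above. A minimal additional sketch, assuming the same `lm.arpa` model as above (the trimming rule here is only an illustration, not an ALM built-in):

```python
>>> import alm
>>>
>>> # another word-preprocessing sketch: normalise each word before lookup
>>> # (stripping hyphens and lower-casing is an arbitrary example rule)
>>> def run(word, context):
...     return word.strip("-").lower()
...
>>> alm.setOption(alm.options_t.confidence)
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>> alm.readArpa('./lm.arpa')
>>> alm.setWordPreprocessingMethod(run)
```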

---

### Methods:
- **setLogfile** - Method for setting the file for log output
- **setOOvFile** - Method for setting the file for saving OOV words

### Example:
```python
>>> import alm
>>>
>>> alm.setLogfile("./log.txt")
>>>
>>> alm.setOOvFile("./oov.txt")
```
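
A typical combination, sketched under the assumption that OOV words encountered while scoring are written to the configured file (paths are placeholders):

```python
>>> import alm
>>>
>>> # sketch: log messages go to log.txt, OOV words are collected in oov.txt
>>> alm.setOption(alm.options_t.confidence)
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>> alm.readArpa('./lm.arpa')
>>> alm.setLogfile("./log.txt")
>>> alm.setOOvFile("./oov.txt")
>>> a = alm.pplByFiles("./text.txt")  # OOVs found here should end up in oov.txt
```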

---

### Methods:
- **perplexity** - Perplexity calculation
- **pplConcatenate** - Method for combining perplexity results
- **pplByFiles** - Method for calculating perplexity over a file or group of files

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> a = alm.perplexity("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
>>>
>>> print(a.logprob)
-30.906542
>>>
>>> print(a.oovs)
0
>>>
>>> print(a.words)
24
>>>
>>> print(a.sentences)
2
>>>
>>> print(a.zeroprobs)
7
>>>
>>> print(a.ppl)
17.229063831108224
>>>
>>> print(a.ppl1)
19.398698060810077
>>>
>>> b = alm.pplByFiles("./text.txt")
>>>
>>> c = alm.pplConcatenate(a, b)
>>>
>>> print(c.ppl)
7.384123548831112
```

### Description
| Name      | Description                                                                 |
|-----------|-----------------------------------------------------------------------------|
| ppl       | Perplexity value without taking the beginning of the sentence into account  |
| ppl1      | Perplexity value taking the beginning of the sentence into account          |
| oovs      | Count of OOV words                                                          |
| words     | Count of words in the sentence                                              |
| logprob   | Log-probability of the word sequence                                        |
| sentences | Count of sentences                                                          |
| zeroprobs | Count of zero-probability words                                             |
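
Continuing the example above, the fields listed in the table can be gathered into a single dictionary; the helper below is not part of ALM and only relies on the attributes shown:

```python
>>> # collect the perplexity result attributes from the table above
>>> def pplToDict(res):
...     names = ("ppl", "ppl1", "oovs", "words", "logprob", "sentences", "zeroprobs")
...     return {name: getattr(res, name) for name in names}
...
>>> pplToDict(a)
{'ppl': 17.229063831108224, 'ppl1': 19.398698060810077, 'oovs': 0, 'words': 24, 'logprob': -30.906542, 'sentences': 2, 'zeroprobs': 7}
```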

---

### Methods:
- **tokenization** - Method for breaking text into tokens

### Example:
```python
>>> import alm
>>>
>>> def tokensFn(word, context, reset, stop):
...     print(word, " => ", context)
...     return True
...
>>> alm.switchAllowApostrophe()
>>>
>>> alm.tokenization("«On nous dit qu'aujourd'hui c'est le cas, encore faudra-t-il l'évaluer» l'astronomie", tokensFn)
«  =>  []
On  =>  ['«']
nous  =>  ['«', 'On']
dit  =>  ['«', 'On', 'nous']
qu'aujourd'hui  =>  ['«', 'On', 'nous', 'dit']
c'est  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui"]
le  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est"]
cas  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le']
,  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas']
encore  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',']
faudra-t-il  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore']
l  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l']
'  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l']
évaluer  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l', "'"]
»  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l', "'", 'évaluer']
l  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l', "'", 'évaluer', '»']
'  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l', "'", 'évaluer', '»', 'l']
astronomie  =>  ['«', 'On', 'nous', 'dit', "qu'aujourd'hui", "c'est", 'le', 'cas', ',', 'encore', 'faudra-t-il', 'l', "'", 'évaluer', '»', 'l', "'"]
```

---

### Methods:
- **setTokenizerFn** - Method for setting an external tokenizer function

### Example:
```python
>>> import alm
>>>
>>> def tokenizerFn(text, callback):
...     word = ""
...     context = []
...     for letter in text:
...         if letter == " " and len(word) > 0:
...             if not callback(word, context, False, False): return
...             context.append(word)
...             word = ""
...         elif letter == "." or letter == "!" or letter == "?":
...             if not callback(word, context, True, False): return
...             word = ""
...             context = []
...         else:
...             word += letter
...     if len(word) > 0:
...         if not callback(word, context, False, True): return
...
>>> def tokensFn(word, context, reset, stop):
...     print(word, " => ", context)
...     return True
...
>>> alm.setTokenizerFn(tokenizerFn)
>>>
>>> alm.tokenization("Hello World today!", tokensFn)
Hello  =>  []
World  =>  ['Hello']
today  =>  ['Hello', 'World']
```

---

### Methods:
- **sentences** - Sentence generation method
- **sentencesToFile** - Method for assembling a specified number of sentences and writing them to a file

### Example:
```python
>>> import alm
>>>
>>> def sentencesFn(text):
...     print("Sentences:", text)
...     return True
...
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.sentences(sentencesFn)
Sentences: <s> В общем </s>
Sentences: <s> С лязгом выкатился и остановился возле мальчика </s>
Sentences: <s> У меня нет </s>
Sentences: <s> Я вообще не хочу </s>
Sentences: <s> Да и в общем </s>
Sentences: <s> Не могу </s>
Sentences: <s> Ну в общем </s>
Sentences: <s> Так что я вообще не хочу </s>
Sentences: <s> Потому что я вообще не хочу </s>
Sentences: <s> Продолжение следует </s>
Sentences: <s> Неожиданно из подворотни в олега ударил яркий прожектор патрульный трактор </s>
>>>
>>> alm.sentencesToFile(5, "./result.txt")
```

---

### Methods:
- **fixUppers** - Method for correcting letter case in a text
- **fixUppersByFiles** - Method for correcting letter case in a text file

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.fixUppers("неожиданно из подворотни в олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
'Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор??? С лязгом выкатился и остановился возле мальчика....'
>>>
>>> alm.fixUppersByFiles("./corpus", "./result.txt", "txt")
```

---

### Methods:
- **checkHypLat** - Method for checking a word for hyphens and Latin characters

### Example:
```python
>>> import alm
>>>
>>> alm.checkHypLat("Hello-World")
(True, True)
>>>
>>> alm.checkHypLat("Hello")
(False, True)
>>>
>>> alm.checkHypLat("Привет")
(False, False)
>>>
>>> alm.checkHypLat("так-как")
(True, False)
```
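
Judging by the examples above, the first flag reports the presence of a hyphen and the second the presence of Latin letters. A rough pure-Python equivalent of that reading (an assumption, not the actual implementation) would be:

```python
>>> # approximate reimplementation of the two flags (illustration only)
>>> def checkHypLatSketch(word):
...     hasHyphen = "-" in word
...     hasLatin = any("a" <= letter <= "z" for letter in word.lower())
...     return (hasHyphen, hasLatin)
...
>>> checkHypLatSketch("Hello-World"), checkHypLatSketch("так-как")
((True, True), (True, False))
```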

---

### Methods:
- **getUppers** - Method for extracting the letter-case mask of each word
- **countLetter** - Method for counting occurrences of a specific letter in a word

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.idw("Living")
10493385932
>>>
>>> alm.idw("in")
3301
>>>
>>> alm.idw("the")
217280
>>>
>>> alm.idw("USA")
188643
>>>
>>> alm.getUppers([10493385932, 3301, 217280, 188643])
[1, 0, 0, 7]
>>>
>>> alm.countLetter("hello-world", "-")
1
>>>
>>> alm.countLetter("hello-world", "l")
3
```
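
The returned values look like bit masks of upper-case positions: `1` for `Living` (only the first letter) and `7 = 0b111` for `USA` (the first three letters). A small sketch for decoding them under that assumption:

```python
>>> # decode a case mask into the positions of upper-case letters (assumed encoding)
>>> def upperPositions(mask):
...     return [i for i in range(mask.bit_length()) if mask & (1 << i)]
...
>>> [upperPositions(mask) for mask in alm.getUppers([10493385932, 3301, 217280, 188643])]
[[0], [], [], [0, 1, 2]]
```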

---

### Methods:
- **urls** - Method for extracting URL address coordinates in a string

### Example:
```python
>>> import alm
>>>
>>> alm.urls("This website: example.com was designed with ...")
{14: 25}
>>>
>>> alm.urls("This website: https://a.b.c.example.net?id=52#test-1 was designed with ...")
{14: 52}
>>>
>>> alm.urls("This website: https://a.b.c.example.net?id=52#test-1 and 127.0.0.1 was designed with ...")
{14: 52, 57: 66}
```
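
The keys appear to be start offsets and the values end offsets (one past the last character), an interpretation inferred from the examples above, so ordinary slicing recovers the detected addresses:

```python
>>> # cut the detected addresses out of the text using the returned coordinates
>>> text = "This website: https://a.b.c.example.net?id=52#test-1 and 127.0.0.1 was designed with ..."
>>> [text[start:end] for start, end in alm.urls(text).items()]
['https://a.b.c.example.net?id=52#test-1', '127.0.0.1']
```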

---

### Methods:
- **roman2Arabic** - Method for translating Roman numerals to Arabic

### Example:
```python
>>> import alm
>>>
>>> alm.roman2Arabic("XVI")
16
```

---

### Methods:
- **rest** - Method for correction and detection of words typed in mixed alphabets
- **setSubstitutes** - Method for setting the letter substitutions used to correct words typed in mixed alphabets
- **getSubstitutes** - Method for extracting the letter substitutions used to correct words typed in mixed alphabets

### Example:
```python
>>> import alm
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.setSubstitutes({'p':'р','c':'с','o':'о','t':'т','k':'к','e':'е','a':'а','h':'н','x':'х','b':'в','m':'м'})
>>>
>>> alm.getSubstitutes()
{'a': 'а', 'b': 'в', 'c': 'с', 'e': 'е', 'h': 'н', 'k': 'к', 'm': 'м', 'o': 'о', 'p': 'р', 't': 'т', 'x': 'х'}
>>>
>>> str = "ПPИBETИК"
>>>
>>> str.lower()
'пpиbetик'
>>>
>>> alm.rest(str)
'приветик'
```
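
Continuing the session above, applying the substitution table letter by letter reproduces the result of `rest` for this particular word. This is only an illustration of the mapping; `rest` itself also decides which alphabet the word actually belongs to:

```python
>>> # illustrate the substitution table on the mixed-alphabet word from above
>>> subs = alm.getSubstitutes()
>>> "".join(subs.get(letter, letter) for letter in str.lower())
'приветик'
```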

---

### Methods:
- **setTokensDisable** - Method for setting the list of forbidden tokens
- **setTokensUnknown** - Method for setting the list of tokens cast to 〈unk〉
- **setTokenDisable** - Method for setting the list of unidentifiable tokens
- **setTokenUnknown** - Method for setting the list of tokens that need to be identified as 〈unk〉
- **getTokensDisable** - Method for retrieving the list of forbidden tokens
- **getTokensUnknown** - Method for extracting the list of tokens reducible to 〈unk〉
- **setAllTokenDisable** - Method for setting all tokens as unidentifiable
- **setAllTokenUnknown** - Method for setting all tokens to be identified as 〈unk〉

### Example:
```python
>>> import alm
>>>
>>> alm.idw("<date>")
6
>>>
>>> alm.idw("<time>")
7
>>>
>>> alm.idw("<abbr>")
5
>>>
>>> alm.idw("<math>")
9
>>>
>>> alm.setTokenDisable("date|time|abbr|math")
>>>
>>> alm.getTokensDisable()
{9, 5, 6, 7}
>>>
>>> alm.setTokensDisable({6, 7, 5, 9})
>>>
>>> alm.setTokenUnknown("date|time|abbr|math")
>>>
>>> alm.getTokensUnknown()
{9, 5, 6, 7}
>>>
>>> alm.setTokensUnknown({6, 7, 5, 9})
>>>
>>> alm.setAllTokenDisable()
>>>
>>> alm.getTokensDisable()
{2, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 23}
>>>
>>> alm.setAllTokenUnknown()
>>>
>>> alm.getTokensUnknown()
{2, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 23}
```

---

### Methods:
- **countAlphabet** - Method for obtaining the number of letters in the alphabet

### Example:
```python
>>> import alm
>>>
>>> alm.getAlphabet()
'abcdefghijklmnopqrstuvwxyz'
>>>
>>> alm.countAlphabet()
26
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.countAlphabet()
59
```

---

### Methods:
- **countBigrams** - Method for counting bigrams in a sequence
- **countTrigrams** - Method for counting trigrams in a sequence
- **countGrams** - Method for counting N-grams of the language model's order

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.countBigrams("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
12
>>>
>>> alm.countTrigrams("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
10
>>>
>>> alm.size()
3
>>>
>>> alm.countGrams("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???с лязгом выкатился и остановился возле мальчика....")
10
>>>
>>> alm.idw("неожиданно")
3263936167
>>>
>>> alm.idw("из")
5134
>>>
>>> alm.idw("подворотни")
12535356101
>>>
>>> alm.idw("в")
53
>>>
>>> alm.idw("Олега")
2824508300
>>>
>>> alm.idw("ударил")
24816796913
>>>
>>> alm.countBigrams([3263936167, 5134, 12535356101, 53, 2824508300, 24816796913])
5
>>>
>>> alm.countTrigrams([3263936167, 5134, 12535356101, 53, 2824508300, 24816796913])
4
>>>
>>> alm.countGrams([3263936167, 5134, 12535356101, 53, 2824508300, 24816796913])
4
```
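
For a clean sequence of word identifiers the counts follow directly from the sequence length (n - 1 bigrams, n - 2 trigrams), which matches the ID-list calls above; the text variants give larger counts, presumably because tokenization also adds sentence boundaries and punctuation tokens:

```python
>>> # expected counts for a clean 6-token sequence
>>> seq = [3263936167, 5134, 12535356101, 53, 2824508300, 24816796913]
>>> len(seq) - 1, len(seq) - 2
(5, 4)
```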

---

### Methods:
- **arabic2Roman** - Method for converting an Arabic number to a Roman numeral

### Example:
```python
>>> import alm
>>>
>>> alm.arabic2Roman(23)
'XXIII'
>>>
>>> alm.arabic2Roman("33")
'XXXIII'
```
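
The two conversions are inverses of each other, so a round trip should return the original numeral (a small sanity-check sketch using the calls documented above):

```python
>>> # round trip between Roman and Arabic representations
>>> alm.arabic2Roman(alm.roman2Arabic("XVI"))
'XVI'
```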

---

### Methods:
- **setThreads** - Method for setting the number of threads (0 = all available threads)

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.setThreads(3)
>>>
>>> a = alm.pplByFiles("./text.txt")
>>>
>>> print(a.logprob)
-48201.29481399994
```

---

### Methods:
- **fti** - Method for removing the fractional part of a number

### Example:
```python
>>> import alm
>>>
>>> alm.fti(5892.4892)
5892489200000
>>>
>>> alm.fti(5892.4892, 4)
58924892
```
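
Judging by the outputs above, `fti` shifts the fractional part into the integer by multiplying by a power of ten, apparently with 9 decimal places by default. A plain-Python sketch of that assumption:

```python
>>> # plain-Python approximation of fti (default precision assumed to be 9)
>>> def ftiSketch(value, precision=9):
...     return int(round(value * 10 ** precision))
...
>>> ftiSketch(5892.4892), ftiSketch(5892.4892, 4)
(5892489200000, 58924892)
```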
\u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\ninfo: 2001 | YES | \u0422\u0430\u043a \u043a\u0430\u043a \u044d\u0442\u0438 \u044f\u0439\u0446\u0430 \u0436\u0430\u043b\u043a\u043e \u0435\u0441\u0442\u044c \u0430 \u0445\u043e\u0447\u0435\u0442\u0441\u044f \u0432\u0441\u0435 \u0431\u043e\u043b\u044c\u0448\u0435 \u043b\u044e\u0431\u043e\u0432\u0430\u0442\u044c\u0441\u044f \u0438\u0445 \u043c\u043e\u0436\u043d\u043e \u043f\u043e\u043a\u0440\u044b\u0442\u044c \u043b\u0430\u043a\u043e\u043c \u0434\u0430\u0436\u0435 \u043f\u0440\u043e\u0437\u0440\u0430\u0447\u043d\u044b\u043c \u043b\u0430\u043a\u043e\u043c \u0434\u043b\u044f \u043d\u043e\u0433\u0442\u0435\u0439\n\ninfo: 2002 | NO | \u043a\u0443\u043a\u0430\u0439 <unk> <unk> \u0441\u0430\u0440\u0438 \u043a\u0443\u043a\u0430\u0439 <unk> <unk> \u0441\u0430\u0440\u0438 \u044f\u043f\u043e\u043d\u0441\u043a\u0438\u0435 \u043a\u0430\u043b\u043b\u0438\u0433\u0440\u0430\u0444\u044b \u044f \u043f\u043e\u0441\u0442\u043e\u044f\u043d\u043d\u043e \u043d\u0430\u0432\u0435\u0449\u0430\u043b \u0438\u0445 \u0442\u0430\u0439\u043d\u043e \u043e\u0442 \u043b\u044e\u0434\u0435\u0439\n\ninfo: 2003 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440???\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430....\n\ninfo: 2004 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440?\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\ninfo: 2005 | YES | \u0421\u0435\u0433\u043e\u0434\u043d\u044f \u044f\u0438\u0447\u043d\u0438\u0446\u0435\u0439 \u043d\u0438\u043a\u0442\u043e \u043d\u0435 \u0437\u0430\u0432\u0442\u0440\u0430\u043a\u0430\u043b \u043a\u0430\u043a \u0432\u043f\u0440\u043e\u0447\u0435\u043c \u0438 \u0432\u0447\u0435\u0440\u0430 \u043d\u0430 \u0431\u043b\u0438\u0436\u0430\u0439\u0448\u0435\u043c \u043a \u043d\u0430\u043c \u0440\u044b\u043d\u043a\u0435 \u043c\u044b \u0435\u043b\u0438 \u0444\u0440\u0443\u043a\u0442\u043e\u0432\u044b\u0439 \u0441\u0430\u043b\u0430\u0442 \u0441\u043e \u0441\u0432\u0435\u0436\u0435\u0432\u044b\u0436\u0430\u0442\u044b\u043c \u0441\u043e\u043a\u043e\u043c \u043a\u0430\u043a \u0432 \u0441\u0442\u0430\u0440\u044b\u0435 \u0434\u043e\u0431\u0440\u044b\u0435 \u0432\u0440\u0435\u043c\u0435\u043d\u0430 \u0432 \u0411\u0440\u0430\u0437\u0438\u043b\u0438\u0438\n\ninfo: 2006 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 
\u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440!\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\ninfo: 2007 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440.\u0441 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\nAll texts: 2007\nExists texts: 1359\nNot exists texts: 648\n>>>\n>>> alm.checkByFiles(\"./corpus\", \"./result.txt\", False, \"txt\")\ninfo: 1999 | YES | \u041a\u0430\u043a\u043e\u0439-\u0442\u043e \u043f\u0435\u0440\u0438\u043e\u0434 \u0432\u0440\u0435\u043c\u0435\u043d\u0438 \u043c\u044b \u0432\u043e\u043e\u0431\u0449\u0435 \u043d\u0435 \u043e\u0431\u0449\u0430\u043b\u0438\u0441\u044c\n\ninfo: 2000 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440.\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\ninfo: 2001 | YES | \u0422\u0430\u043a \u043a\u0430\u043a \u044d\u0442\u0438 \u044f\u0439\u0446\u0430 \u0436\u0430\u043b\u043a\u043e \u0435\u0441\u0442\u044c \u0430 \u0445\u043e\u0447\u0435\u0442\u0441\u044f \u0432\u0441\u0435 \u0431\u043e\u043b\u044c\u0448\u0435 \u043b\u044e\u0431\u043e\u0432\u0430\u0442\u044c\u0441\u044f \u0438\u0445 \u043c\u043e\u0436\u043d\u043e \u043f\u043e\u043a\u0440\u044b\u0442\u044c \u043b\u0430\u043a\u043e\u043c \u0434\u0430\u0436\u0435 \u043f\u0440\u043e\u0437\u0440\u0430\u0447\u043d\u044b\u043c \u043b\u0430\u043a\u043e\u043c \u0434\u043b\u044f \u043d\u043e\u0433\u0442\u0435\u0439\n\ninfo: 2002 | NO | \u043a\u0443\u043a\u0430\u0439 <unk> <unk> \u0441\u0430\u0440\u0438 \u043a\u0443\u043a\u0430\u0439 <unk> <unk> \u0441\u0430\u0440\u0438 \u044f\u043f\u043e\u043d\u0441\u043a\u0438\u0435 \u043a\u0430\u043b\u043b\u0438\u0433\u0440\u0430\u0444\u044b \u044f \u043f\u043e\u0441\u0442\u043e\u044f\u043d\u043d\u043e \u043d\u0430\u0432\u0435\u0449\u0430\u043b \u0438\u0445 \u0442\u0430\u0439\u043d\u043e \u043e\u0442 \u043b\u044e\u0434\u0435\u0439\n\ninfo: 2003 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 
\u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440???\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430....\n\ninfo: 2004 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440?\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\ninfo: 2005 | YES | \u0421\u0435\u0433\u043e\u0434\u043d\u044f \u044f\u0438\u0447\u043d\u0438\u0446\u0435\u0439 \u043d\u0438\u043a\u0442\u043e \u043d\u0435 \u0437\u0430\u0432\u0442\u0440\u0430\u043a\u0430\u043b \u043a\u0430\u043a \u0432\u043f\u0440\u043e\u0447\u0435\u043c \u0438 \u0432\u0447\u0435\u0440\u0430 \u043d\u0430 \u0431\u043b\u0438\u0436\u0430\u0439\u0448\u0435\u043c \u043a \u043d\u0430\u043c \u0440\u044b\u043d\u043a\u0435 \u043c\u044b \u0435\u043b\u0438 \u0444\u0440\u0443\u043a\u0442\u043e\u0432\u044b\u0439 \u0441\u0430\u043b\u0430\u0442 \u0441\u043e \u0441\u0432\u0435\u0436\u0435\u0432\u044b\u0436\u0430\u0442\u044b\u043c \u0441\u043e\u043a\u043e\u043c \u043a\u0430\u043a \u0432 \u0441\u0442\u0430\u0440\u044b\u0435 \u0434\u043e\u0431\u0440\u044b\u0435 \u0432\u0440\u0435\u043c\u0435\u043d\u0430 \u0432 \u0411\u0440\u0430\u0437\u0438\u043b\u0438\u0438\n\ninfo: 2006 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440!\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\ninfo: 2007 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440.\u0441 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\nAll texts: 2007\nExists texts: 1359\nNot exists texts: 648\n>>>\n>>> alm.checkByFiles(\"./corpus\", \"./result.txt\", True, \"txt\")\ninfo: 2000 | NO | 
\u0422\u0430\u043a \u043a\u0430\u043a \u044d\u0442\u0438 \u044f\u0439\u0446\u0430 \u0436\u0430\u043b\u043a\u043e \u0435\u0441\u0442\u044c \u0430 \u0445\u043e\u0447\u0435\u0442\u0441\u044f \u0432\u0441\u0435 \u0431\u043e\u043b\u044c\u0448\u0435 \u043b\u044e\u0431\u043e\u0432\u0430\u0442\u044c\u0441\u044f \u0438\u0445 \u043c\u043e\u0436\u043d\u043e \u043f\u043e\u043a\u0440\u044b\u0442\u044c \u043b\u0430\u043a\u043e\u043c \u0434\u0430\u0436\u0435 \u043f\u0440\u043e\u0437\u0440\u0430\u0447\u043d\u044b\u043c \u043b\u0430\u043a\u043e\u043c \u0434\u043b\u044f \u043d\u043e\u0433\u0442\u0435\u0439\n\ninfo: 2001 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440.\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\ninfo: 2002 | NO | \u0421\u0435\u0433\u043e\u0434\u043d\u044f \u044f\u0438\u0447\u043d\u0438\u0446\u0435\u0439 \u043d\u0438\u043a\u0442\u043e \u043d\u0435 \u0437\u0430\u0432\u0442\u0440\u0430\u043a\u0430\u043b \u043a\u0430\u043a \u0432\u043f\u0440\u043e\u0447\u0435\u043c \u0438 \u0432\u0447\u0435\u0440\u0430 \u043d\u0430 \u0431\u043b\u0438\u0436\u0430\u0439\u0448\u0435\u043c \u043a \u043d\u0430\u043c \u0440\u044b\u043d\u043a\u0435 \u043c\u044b \u0435\u043b\u0438 \u0444\u0440\u0443\u043a\u0442\u043e\u0432\u044b\u0439 \u0441\u0430\u043b\u0430\u0442 \u0441\u043e \u0441\u0432\u0435\u0436\u0435\u0432\u044b\u0436\u0430\u0442\u044b\u043c \u0441\u043e\u043a\u043e\u043c \u043a\u0430\u043a \u0432 \u0441\u0442\u0430\u0440\u044b\u0435 \u0434\u043e\u0431\u0440\u044b\u0435 \u0432\u0440\u0435\u043c\u0435\u043d\u0430 \u0432 \u0411\u0440\u0430\u0437\u0438\u043b\u0438\u0438\n\ninfo: 2003 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440!\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\ninfo: 2004 | NO | \u043a\u0443\u043a\u0430\u0439 <unk> <unk> \u0441\u0430\u0440\u0438 \u043a\u0443\u043a\u0430\u0439 <unk> <unk> \u0441\u0430\u0440\u0438 \u044f\u043f\u043e\u043d\u0441\u043a\u0438\u0435 \u043a\u0430\u043b\u043b\u0438\u0433\u0440\u0430\u0444\u044b \u044f \u043f\u043e\u0441\u0442\u043e\u044f\u043d\u043d\u043e \u043d\u0430\u0432\u0435\u0449\u0430\u043b \u0438\u0445 \u0442\u0430\u0439\u043d\u043e \u043e\u0442 \u043b\u044e\u0434\u0435\u0439\n\ninfo: 2005 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 
\u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440?\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\ninfo: 2006 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440???\u0421 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430....\n\ninfo: 2007 | NO | \u041d\u0435\u043e\u0436\u0438\u0434\u0430\u043d\u043d\u043e \u0438\u0437 \u043f\u043e\u0434\u0432\u043e\u0440\u043e\u0442\u043d\u0438 \u0432 \u041e\u043b\u0435\u0433\u0430 \u0443\u0434\u0430\u0440\u0438\u043b \u044f\u0440\u043a\u0438\u0439 \u043f\u0440\u043e\u0436\u0435\u043a\u0442\u043e\u0440 \u043f\u0430\u0442\u0440\u0443\u043b\u044c\u043d\u044b\u0439 \u0442\u0440\u0430\u043a\u0442\u043e\u0440.\u0441 \u043b\u044f\u0437\u0433\u043e\u043c \u0432\u044b\u043a\u0430\u0442\u0438\u043b\u0441\u044f \u0438 \u043e\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u043b\u0441\u044f \u0432\u043e\u0437\u043b\u0435 \u043c\u0430\u043b\u044c\u0447\u0438\u043a\u0430.\n\nAll texts: 2007\nExists texts: 0\nNot exists texts: 2007\n```\n\n---\n\n### Methods:\n- **check** - String Check Method\n- **match** - String Matching Method\n- **addAbbr** - Method add abbreviation\n- **addSuffix** - Method add number suffix abbreviation\n- **setSuffixes** - Method set number suffix abbreviations\n- **readSuffix** - Method for reading data from a file of suffixes and abbreviations\n\n### Example:\n```python\n>>> import alm\n>>> \n>>> alm.setAlphabet(\"abcdefghijklmnopqrstuvwxyz\u0430\u0431\u0432\u0433\u0434\u0435\u0451\u0436\u0437\u0438\u0439\u043a\u043b\u043c\u043d\u043e\u043f\u0440\u0441\u0442\u0443\u0444\u0445\u0446\u0447\u0448\u0449\u044a\u044b\u044c\u044d\u044e\u044f\")\n>>> alm.setSubstitutes({'p':'\u0440','c':'\u0441','o':'\u043e','t':'\u0442','k':'\u043a','e':'\u0435','a':'\u0430','h':'\u043d','x':'\u0445','b':'\u0432','m':'\u043c'})\n>>> \n>>> alm.check(\"\u0414\u043e\u043c-2\", alm.check_t.home2)\nTrue\n>>> \n>>> alm.check(\"\u0414\u043e\u043c2\", alm.check_t.home2)\nFalse\n>>> \n>>> alm.check(\"\u0414\u043e\u043c-2\", alm.check_t.latian)\nFalse\n>>> \n>>> alm.check(\"Hello\", alm.check_t.latian)\nTrue\n>>> \n>>> alm.check(\"\u043f\u0440i\u0432\u0435\u0442\", alm.check_t.latian)\nTrue\n>>> \n>>> alm.check(\"\u0414\u043e\u043c-2\", alm.check_t.hyphen)\nTrue\n>>> \n>>> alm.check(\"\u0414\u043e\u043c2\", alm.check_t.hyphen)\nFalse\n>>> \n>>> alm.check(\"\u0414\", alm.check_t.letter)\nTrue\n>>> \n>>> alm.check(\"$\", alm.check_t.letter)\nFalse\n>>> \n>>> alm.check(\"-\", alm.check_t.letter)\nFalse\n>>> \n>>> alm.check(\"\u043f\u0440\u043e\u0441t\u043e\u043a\u0432a\u0448\u0438\u043d\u043e\", alm.check_t.similars)\nTrue\n>>> \n>>> alm.match(\"my site http://example.ru, it's true\", alm.match_t.url)\nTrue\n>>> \n>>> 
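
The interactive calls above can also be wrapped in a small batch filter. The snippet below is a minimal sketch, not part of the ALM API: it reuses the same alphabet and the `./lm.arpa` path from the example as placeholders and keeps only the sentences that `checkSequence` accepts.

```python
# A minimal sketch (not part of the ALM API): load the model once, then
# keep only sentences whose word sequences the model recognizes.
import alm

alm.setOption(alm.options_t.confidence)
alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
alm.readArpa('./lm.arpa')  # path is a placeholder taken from the example above

def known_sentences(sentences):
    # checkSequence also accepts a list of words or word IDs, as shown above.
    return [text for text in sentences if alm.checkSequence(text)]

print(known_sentences(["от госсекретаря США", "кукай сари"]))
```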

---

### Methods:
- **check** - Method for checking a string against a given check type
- **match** - Method for checking whether a string matches a given token type
- **addAbbr** - Method for adding an abbreviation
- **addSuffix** - Method for adding a numeric suffix abbreviation
- **setSuffixes** - Method for setting the set of numeric suffix abbreviations
- **readSuffix** - Method for reading suffixes and abbreviations from a file

### Example:
```python
>>> import alm
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>> alm.setSubstitutes({'p':'р','c':'с','o':'о','t':'т','k':'к','e':'е','a':'а','h':'н','x':'х','b':'в','m':'м'})
>>>
>>> alm.check("Дом-2", alm.check_t.home2)
True
>>>
>>> alm.check("Дом2", alm.check_t.home2)
False
>>>
>>> alm.check("Дом-2", alm.check_t.latian)
False
>>>
>>> alm.check("Hello", alm.check_t.latian)
True
>>>
>>> alm.check("прiвет", alm.check_t.latian)
True
>>>
>>> alm.check("Дом-2", alm.check_t.hyphen)
True
>>>
>>> alm.check("Дом2", alm.check_t.hyphen)
False
>>>
>>> alm.check("Д", alm.check_t.letter)
True
>>>
>>> alm.check("$", alm.check_t.letter)
False
>>>
>>> alm.check("-", alm.check_t.letter)
False
>>>
>>> alm.check("просtоквaшино", alm.check_t.similars)
True
>>>
>>> alm.match("my site http://example.ru, it's true", alm.match_t.url)
True
>>>
>>> alm.match("по вашему ip адресу 46.40.123.12 проводится проверка", alm.match_t.url)
True
>>>
>>> alm.match("мой адрес в формате IPv6: http://[2001:0db8:11a3:09d7:1f34:8a2e:07a0:765d]/", alm.match_t.url)
True
>>>
>>> alm.match("13-я", alm.match_t.abbr)
True
>>>
>>> alm.match("13-я-й", alm.match_t.abbr)
False
>>>
>>> alm.match("т.д", alm.match_t.abbr)
True
>>>
>>> alm.match("т.п.", alm.match_t.abbr)
True
>>>
>>> alm.match("С.Ш.А.", alm.match_t.abbr)
True
>>>
>>> alm.addAbbr("сша")
>>> alm.match("США", alm.match_t.abbr)
True
>>>
>>> alm.addSuffix("15-летия")
>>> alm.match("15-летия", alm.match_t.abbr)
True
>>>
>>> alm.getSuffixes()
{3139900457}
>>>
>>> alm.idw("лет")
328041
>>>
>>> alm.idw("тых")
352214
>>>
>>> alm.setSuffixes({328041, 352214})
>>>
>>> alm.getSuffixes()
{328041, 352214}
>>>
>>> def status(status):
...     print(status)
...
>>> alm.readSuffix("./suffix.abbr", status)
>>>
>>> alm.match("15-лет", alm.match_t.abbr)
True
>>>
>>> alm.match("20-тых", alm.match_t.abbr)
True
>>>
>>> alm.match("15-летия", alm.match_t.abbr)
False
>>>
>>> alm.match("Hello", alm.match_t.latian)
True
>>>
>>> alm.match("прiвет", alm.match_t.latian)
False
>>>
>>> alm.match("23424", alm.match_t.number)
True
>>>
>>> alm.match("hello", alm.match_t.number)
False
>>>
>>> alm.match("23424.55", alm.match_t.number)
False
>>>
>>> alm.match("23424", alm.match_t.decimal)
False
>>>
>>> alm.match("23424.55", alm.match_t.decimal)
True
>>>
>>> alm.match("23424,55", alm.match_t.decimal)
True
>>>
>>> alm.match("-23424.55", alm.match_t.decimal)
True
>>>
>>> alm.match("+23424.55", alm.match_t.decimal)
True
>>>
>>> alm.match("+23424.55", alm.match_t.anumber)
True
>>>
>>> alm.match("15T-34", alm.match_t.anumber)
True
>>>
>>> alm.match("hello", alm.match_t.anumber)
False
>>>
>>> alm.match("hello", alm.match_t.allowed)
True
>>>
>>> alm.match("évaluer", alm.match_t.allowed)
False
>>>
>>> alm.match("13", alm.match_t.allowed)
True
>>>
>>> alm.match("Hello-World", alm.match_t.allowed)
True
>>>
>>> alm.match("Hello", alm.match_t.math)
False
>>>
>>> alm.match("+", alm.match_t.math)
True
>>>
>>> alm.match("=", alm.match_t.math)
True
>>>
>>> alm.match("Hello", alm.match_t.upper)
True
>>>
>>> alm.match("hello", alm.match_t.upper)
False
>>>
>>> alm.match("hellO", alm.match_t.upper)
False
>>>
>>> alm.match("a", alm.match_t.punct)
False
>>>
>>> alm.match(",", alm.match_t.punct)
True
>>>
>>> alm.match(" ", alm.match_t.space)
True
>>>
>>> alm.match("a", alm.match_t.space)
False
>>>
>>> alm.match("a", alm.match_t.special)
False
>>>
>>> alm.match("±", alm.match_t.special)
False
>>>
>>> alm.match("[", alm.match_t.isolation)
True
>>>
>>> alm.match("a", alm.match_t.isolation)
False
>>>
>>> alm.match("a", alm.match_t.greek)
False
>>>
>>> alm.match("Ψ", alm.match_t.greek)
True
>>>
>>> alm.match("->", alm.match_t.route)
False
>>>
>>> alm.match("⇔", alm.match_t.route)
True
>>>
>>> alm.match("a", alm.match_t.letter)
True
>>>
>>> alm.match("!", alm.match_t.letter)
False
>>>
>>> alm.match("!", alm.match_t.pcards)
False
>>>
>>> alm.match("♣", alm.match_t.pcards)
True
>>>
>>> alm.match("p", alm.match_t.currency)
False
>>>
>>> alm.match("$", alm.match_t.currency)
True
>>>
>>> alm.match("€", alm.match_t.currency)
True
>>>
>>> alm.match("₽", alm.match_t.currency)
True
>>>
>>> alm.match("₿", alm.match_t.currency)
True
```
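
To illustrate how these checks might be combined, here is a minimal sketch (not part of the ALM API) that routes a token to the first `match_t` class it satisfies; the chosen classes, their order, and the sample tokens are assumptions for the sketch, not documented library behaviour.

```python
# A minimal sketch (not part of the ALM API): classify a token by probing
# a few of the match_t classes demonstrated above, in a fixed order.
import alm

alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")

# The class list and its order are an assumption made for this sketch.
CLASSES = [
    ("url", alm.match_t.url),
    ("number", alm.match_t.number),
    ("decimal", alm.match_t.decimal),
    ("abbr", alm.match_t.abbr),
    ("math", alm.match_t.math),
    ("currency", alm.match_t.currency),
    ("punct", alm.match_t.punct),
]

def classify(token):
    for name, kind in CLASSES:
        if alm.match(token, kind):
            return name
    return "other"

for token in ("23424", "₽", "т.п.", ","):
    print(token, classify(token))
```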

---

### Methods:
- **delInText** - Method for deleting characters of a given class from text

### Example:
```python
>>> import alm
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.delInText("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор??? с лязгом выкатился и остановился возле мальчика....", alm.wdel_t.punct)
'неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор с лязгом выкатился и остановился возле мальчика'
>>>
>>> alm.delInText("hello-world-hello-world", alm.wdel_t.hyphen)
'helloworldhelloworld'
>>>
>>> alm.delInText("неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор??? с лязгом выкатился и остановился возле мальчика....", alm.wdel_t.broken)
'неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор с лязгом выкатился и остановился возле мальчика'
>>>
>>> alm.delInText("«On nous dit qu'aujourd'hui c'est le cas, encore faudra-t-il l'évaluer» l'astronomie", alm.wdel_t.broken)
"On nous dit qu'aujourd'hui c'est le cas encore faudra-t-il l'valuer l'astronomie"
```
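
As a usage sketch, the hypothetical helper below chains `delInText` calls to strip punctuation first and then characters outside the configured alphabet; chaining the modes in this order is an assumption for the sketch, not documented behaviour.

```python
# A minimal sketch (not part of the ALM API): chain the delInText modes
# shown above into a single text-cleanup step.
import alm

alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")

def clean(text):
    # Drop punctuation, then drop characters the alphabet does not cover.
    text = alm.delInText(text, alm.wdel_t.punct)
    text = alm.delInText(text, alm.wdel_t.broken)
    return text

print(clean("неожиданно из подворотни в Олега ударил яркий прожектор???"))
```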

---

### Methods:
- **countsByFiles** - Method for counting the number of n-grams in a text file

### Example:
```python
>>> import alm
>>>
>>> alm.setOption(alm.options_t.debug)
>>>
>>> alm.setOption(alm.options_t.confidence)
>>>
>>> alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
>>>
>>> alm.readArpa('./lm.arpa')
>>>
>>> alm.countsByFiles("./text.txt", "./result.txt", 3)
info: 0 | Сегодня яичницей никто не завтракал как впрочем и вчера на ближайшем к нам рынке мы ели фруктовый салат со свежевыжатым соком как в старые добрые времена в Бразилии

info: 10 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор?С лязгом выкатился и остановился возле мальчика.

info: 10 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор!С лязгом выкатился и остановился возле мальчика.

info: 0 | Так как эти яйца жалко есть а хочется все больше любоваться их можно покрыть лаком даже прозрачным лаком для ногтей

info: 10 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???С лязгом выкатился и остановился возле мальчика....

Counts 3grams: 471
>>>
>>> alm.countsByFiles("./corpus", "./result.txt", 2, "txt")
info: 19 | Так как эти яйца жалко есть а хочется все больше любоваться их можно покрыть лаком даже прозрачным лаком для ногтей

info: 12 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор.с лязгом выкатился и остановился возле мальчика.

info: 12 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор!С лязгом выкатился и остановился возле мальчика.

info: 10 | кукай <unk> <unk> сари кукай <unk> <unk> сари японские каллиграфы я постоянно навещал их тайно от людей

info: 12 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор???С лязгом выкатился и остановился возле мальчика....

info: 12 | Неожиданно из подворотни в Олега ударил яркий прожектор патрульный трактор?С лязгом выкатился и остановился возле мальчика.

info: 27 | Сегодня яичницей никто не завтракал как впрочем и вчера на ближайшем к нам рынке мы ели фруктовый салат со свежевыжатым соком как в старые добрые времена в Бразилии

Counts 2grams: 20270
```

### Description
| N-gram size | Description         |
|-------------|---------------------|
| 1           | language model size |
| 2           | bigram              |
| 3           | trigram             |
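
Building on the size table above, the following sketch (not part of the ALM API) runs `countsByFiles` once per n-gram size for the same input file; the file paths are placeholders taken from the example.

```python
# A minimal sketch (not part of the ALM API): count n-grams of several
# sizes for one corpus using the countsByFiles call shown above.
import alm

alm.setOption(alm.options_t.confidence)
alm.setAlphabet("abcdefghijklmnopqrstuvwxyzабвгдеёжзийклмнопрстуфхцчшщъыьэюя")
alm.readArpa('./lm.arpa')  # placeholder path from the example above

# 2 = bigrams, 3 = trigrams (see the table); 1 would use the size of the
# loaded language model itself.
for size in (2, 3):
    alm.countsByFiles("./text.txt", "./result_%dgram.txt" % size, size)
```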
    "bugtrack_url": null,
    "license": "",
    "summary": "Smart language model",
    "version": "3.5.0",
    "split_keywords": [
        "nlp",
        "lm",
        "alm",
        "language-model"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "6ffae5079319cd6a2fdb1fde521a3d214215d625770eebf0caf21385bbcbb715",
                "md5": "ff4a8fd61d4dbb9ec23a795493c9b9d3",
                "sha256": "42c9f869e957260e564836232bbf0e37c507cc7155a8c6024beaffebf6ac1526"
            },
            "downloads": -1,
            "filename": "anyks_lm-3.5.0-cp39-cp39-macosx_10_9_universal2.whl",
            "has_sig": false,
            "md5_digest": "ff4a8fd61d4dbb9ec23a795493c9b9d3",
            "packagetype": "bdist_wheel",
            "python_version": "cp39",
            "requires_python": ">=3.6",
            "size": 1986560,
            "upload_time": "2023-01-05T00:04:23",
            "upload_time_iso_8601": "2023-01-05T00:04:23.650547Z",
            "url": "https://files.pythonhosted.org/packages/6f/fa/e5079319cd6a2fdb1fde521a3d214215d625770eebf0caf21385bbcbb715/anyks_lm-3.5.0-cp39-cp39-macosx_10_9_universal2.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "3e3d8d49d9c3ff4013891b15d7cb8b9990084ee79acd14bcf05bf43bb40a76a7",
                "md5": "b0f79966fba47aa9d40ddc1c694a6485",
                "sha256": "7cdc9f7614dcde92fda2d4f64462b957f6e2e8427744a4f7b53ecaf794fd299a"
            },
            "downloads": -1,
            "filename": "anyks-lm-3.5.0.tar.gz",
            "has_sig": false,
            "md5_digest": "b0f79966fba47aa9d40ddc1c694a6485",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.6",
            "size": 448809,
            "upload_time": "2023-01-05T00:01:56",
            "upload_time_iso_8601": "2023-01-05T00:01:56.914294Z",
            "url": "https://files.pythonhosted.org/packages/3e/3d/8d49d9c3ff4013891b15d7cb8b9990084ee79acd14bcf05bf43bb40a76a7/anyks-lm-3.5.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-01-05 00:01:56",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "github_user": "anyks",
    "github_project": "alm",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "anyks-lm"
}
        