bitermplus

Name: bitermplus
Version: 0.7.0
Summary: Biterm Topic Model
Home page: https://github.com/maximtrp/bitermplus
Documentation: https://bitermplus.readthedocs.io/
Author: Maksim Terpilovskii <maximtrp@gmail.com>
Upload time: 2023-05-30 17:50:06
Requires Python: >=3.7
License: MIT License (Copyright (c) 2021 Maksim Terpilowski)
Keywords: topic model, machine learning, nlp

# Biterm Topic Model

![GitHub Workflow Status](https://img.shields.io/github/actions/workflow/status/maximtrp/bitermplus/package-test.yml)
[![Documentation Status](https://readthedocs.org/projects/bitermplus/badge/?version=latest)](https://bitermplus.readthedocs.io/en/latest/?badge=latest)
[![Codacy Badge](https://app.codacy.com/project/badge/Grade/192b6a75449040ff868932a15ca28ce9)](https://www.codacy.com/gh/maximtrp/bitermplus/dashboard?utm_source=github.com&utm_medium=referral&utm_content=maximtrp/bitermplus&utm_campaign=Badge_Grade)
[![Issues](https://img.shields.io/github/issues/maximtrp/bitermplus.svg)](https://github.com/maximtrp/bitermplus/issues)
[![Downloads](https://img.shields.io/pypi/dm/bitermplus)](https://pypi.org/project/bitermplus/)
[![Downloads](https://pepy.tech/badge/bitermplus)](https://pepy.tech/project/bitermplus)
![PyPI](https://img.shields.io/pypi/v/bitermplus)

*Bitermplus* implements the [Biterm topic model](https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.402.4032&rep=rep1&type=pdf) for short texts introduced by Xiaohui Yan, Jiafeng Guo, Yanyan Lan, and Xueqi Cheng. It is a Cython-based reimplementation of the original [BTM](https://github.com/xiaohuiyan/BTM) code. The package can also compute *perplexity*, *semantic coherence*, and *entropy* metrics.

## Development

Please note that *bitermplus* is under active development.
Refer to the [documentation](https://bitermplus.readthedocs.io) to stay up to date.

## Requirements

* cython
* numpy
* pandas
* scipy
* scikit-learn
* tqdm

## Setup

### Linux and Windows

There should be no issues installing *bitermplus* on these operating systems. You can install the package directly from PyPI.

```bash
pip install bitermplus
```

Or from this repo:

```bash
pip install git+https://github.com/maximtrp/bitermplus.git
```

### macOS

First, you need to install the Xcode Command Line Tools and [Homebrew](https://brew.sh).
Then, install `libomp` with `brew` and install the package:

```bash
xcode-select --install
brew install libomp
pip3 install bitermplus
```

If the installation fails with a libomp error (`fatal error: 'omp.h' file not found`), inspect the `libomp` formula:

```bash
brew info libomp
```

You should see output similar to the following (version numbers and analytics will differ):

```
libomp: stable 15.0.5 (bottled) [keg-only]
LLVM's OpenMP runtime library
https://openmp.llvm.org/
/opt/homebrew/Cellar/libomp/15.0.5 (7 files, 1.6MB)
Poured from bottle on 2022-11-19 at 12:16:49
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/libomp.rb
License: MIT
==> Dependencies
Build: cmake ✘, lit ✘
==> Caveats
libomp is keg-only, which means it was not symlinked into /opt/homebrew,
because it can override GCC headers and result in broken builds.

For compilers to find libomp you may need to set:
export LDFLAGS="-L/opt/homebrew/opt/libomp/lib"
export CPPFLAGS="-I/opt/homebrew/opt/libomp/include"

==> Analytics
install: 192,197 (30 days), 373,389 (90 days), 1,285,192 (365 days)
install-on-request: 24,388 (30 days), 48,013 (90 days), 164,666 (365 days)
build-error: 0 (30 days)
```

Export `LDFLAGS` and `CPPFLAGS` as suggested in the brew output, then retry `pip3 install bitermplus`:

```bash
export LDFLAGS="-L/opt/homebrew/opt/libomp/lib"
export CPPFLAGS="-I/opt/homebrew/opt/libomp/include"
```

## Example

### Model fitting

```python
import bitermplus as btm
import numpy as np
import pandas as pd

# IMPORTING DATA
df = pd.read_csv(
    'dataset/SearchSnippets.txt.gz', header=None, names=['texts'])
texts = df['texts'].str.strip().tolist()

# PREPROCESSING
# Obtaining term frequencies as a sparse matrix and the corpus vocabulary
X, vocabulary, vocab_dict = btm.get_words_freqs(texts)
tf = np.array(X.sum(axis=0)).ravel()
# Vectorizing documents
docs_vec = btm.get_vectorized_docs(texts, vocabulary)
docs_lens = list(map(len, docs_vec))
# Generating biterms
biterms = btm.get_biterms(docs_vec)

# INITIALIZING AND RUNNING MODEL
model = btm.BTM(
    X, vocabulary, seed=12321, T=8, M=20, alpha=50/8, beta=0.01)
model.fit(biterms, iterations=20)
p_zd = model.transform(docs_vec)

# METRICS
perplexity = btm.perplexity(model.matrix_topics_words_, p_zd, X, 8)  # 8 = number of topics (T)
coherence = btm.coherence(model.matrix_topics_words_, X, M=20)
# or
perplexity = model.perplexity_
coherence = model.coherence_

# LABELS
model.labels_
# or
btm.get_docs_top_topic(texts, model.matrix_docs_topics_)
```
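
The topics-words matrix used for the metrics above can also be inspected directly. Below is a minimal sketch (plain NumPy, not a dedicated *bitermplus* call) that prints the most probable words of each topic; it reuses `model` and `vocabulary` from the example and assumes the topics vs. words orientation that `matrix_topics_words_` implies:

```python
import numpy as np

# A rough sketch: print the top-10 most probable words of each topic
# using the topics vs. words probability matrix fitted above
top_n = 10
vocab = np.asarray(vocabulary)
for topic_idx, word_probs in enumerate(model.matrix_topics_words_):
    top_words = vocab[np.argsort(word_probs)[::-1][:top_n]]
    print(f"Topic {topic_idx}: {', '.join(top_words)}")
```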

### Results visualization

You need to install [tmplot](https://github.com/maximtrp/tmplot) first.

```python
import tmplot as tmp
tmp.report(model=model, docs=texts)
```

![Report interface](images/topics_terms_plots.png)
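
To avoid refitting the model every time you want to explore the report, you can persist the fitted model and reload it later. Here is a minimal sketch using Python's standard `pickle` module (a common approach, assuming the model object is picklable; this is not a *bitermplus*-specific API, and the file name is arbitrary):

```python
import pickle

# Saving the fitted model to disk
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# Loading it back later
with open("model.pkl", "rb") as f:
    model = pickle.load(f)
```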

## Tutorial

There is a [tutorial](https://bitermplus.readthedocs.io/en/latest/tutorial.html)
in the documentation that covers the important steps of topic modeling,
including stability measures and results visualization.
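
For instance, a quick (and admittedly rough) way to compare models with different numbers of topics is to fit several of them and look at their average semantic coherence. The sketch below simply reuses the objects and calls from the example above (with the same 50/T rule of thumb for `alpha`); it is not the tutorial's exact procedure:

```python
import numpy as np

# A rough model-selection sketch: fit models with different numbers of topics
# and compare their mean semantic coherence
for T in (5, 8, 10, 15):
    m = btm.BTM(X, vocabulary, seed=12321, T=T, M=20, alpha=50/T, beta=0.01)
    m.fit(biterms, iterations=20)
    coherence = btm.coherence(m.matrix_topics_words_, X, M=20)
    print(f"T={T}: mean coherence = {np.mean(coherence):.3f}")
```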

            
