# Installation with pip3
```shell
pip3 install --verbose phrase_detective
python -m spacy download en_core_web_trf
python -m spacy download es_dep_news_trf
python -m spacy download de_dep_news_trf
```
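Before moving on, it may help to confirm which spaCy model each language code maps to. A minimal sanity-check sketch, assuming only that `PKG_INDICES` is importable as shown in the usage examples below:
```python
# Print the language-code -> spaCy model mapping used by phrase_detective.
from phrase_detective import PKG_INDICES

print(PKG_INDICES)
```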
# Usage
Please refer to the [api docs](https://qishe-nlp.github.io/phrase-detective/).
### Detect noun phrases
```python
import spacy
from spacy import Language
from phrase_detective import NounPhraseRecognizer, PKG_INDICES

# Register the noun-phrase recognizer as a spaCy pipeline component.
@Language.factory("nprecog")
def create_np_parser(nlp: Language, name: str):
    return NounPhraseRecognizer(nlp)

def noun_phrase(lang, sentence):
    nlp = spacy.load(PKG_INDICES[lang])  # load the spaCy model for the language
    nlp.add_pipe("nprecog")              # attach the noun-phrase recognizer
    doc = nlp(sentence)
    for np in doc._.noun_phrases:
        print(np.text)
```
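A minimal call of the helper above might look as follows; the `"en"` key and the sample sentence are illustrative assumptions (the key should correspond to the `en_core_web_trf` model downloaded during installation):
```python
# Hypothetical call; assumes "en" is a valid key in PKG_INDICES.
noun_phrase("en", "The quick brown fox jumps over the lazy dog.")
```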
### Detect verb phrases
```python
import spacy
from spacy import Language
from phrase_detective import VerbKnowledgeRecognizer, PKG_INDICES

# Register the verb-knowledge recognizer as a spaCy pipeline component.
@Language.factory("vkbrecog")
def create_vkb_parser(nlp: Language, name: str):
    return VerbKnowledgeRecognizer(nlp)

def verb_knowledge(lang, sentence):
    nlp = spacy.load(PKG_INDICES[lang])  # load the spaCy model for the language
    nlp.add_pipe("vkbrecog")             # attach the verb-knowledge recognizer
    doc = nlp(sentence)
    for v in doc._.verbs:
        print("TEXT: {}, TAG: {}, FORM: {}, ORIGINAL: {}".format(v.text, v.tag_, spacy.explain(v.tag_), v.lemma_))
    for pp in doc._.passive_phrases:
        print(pp.text)
    for vp in doc._.verb_phrases:
        print(vp)
```
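As above, a hedged usage sketch; the `"en"` key and the sample sentence are illustrative only:
```python
# Hypothetical call; assumes "en" is a valid key in PKG_INDICES.
verb_knowledge("en", "The report was written by the committee yesterday.")
```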
# Development
### Clone project
```
git clone https://github.com/qishe-nlp/phrase-detective.git
```
### Install [poetry](https://python-poetry.org/docs/)
### Install dependencies
```
poetry update
```
### Test and Issue
```
poetry run pytest -rP
```
which runs the tests under `tests/*`.
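A new test can follow the same pattern as the usage examples above. The sketch below is illustrative only: the file name, the sample sentence, and the `"en"` key in `PKG_INDICES` are assumptions, not part of the existing test suite.
```python
# tests/test_noun_phrases.py (hypothetical file name)
import spacy
from spacy import Language
from phrase_detective import NounPhraseRecognizer, PKG_INDICES

@Language.factory("nprecog")
def create_np_parser(nlp: Language, name: str):
    return NounPhraseRecognizer(nlp)

def test_noun_phrases_found():
    nlp = spacy.load(PKG_INDICES["en"])  # assumes "en" maps to an installed model
    nlp.add_pipe("nprecog")
    doc = nlp("The quick brown fox jumps over the lazy dog.")
    assert len(doc._.noun_phrases) > 0
```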
### Create sphinx docs
```
poetry shell
cd apidocs
sphinx-apidoc -f -o source ../phrase_detective
make html
python -m http.server -d build/html
```
### Host docs on GitHub Pages
```
cp -rf apidocs/build/html/* docs/
```
### Build
* Change `version` in `pyproject.toml` and `phrase_detective/__init__.py`
* Build the Python package with `poetry build`
### Git commit and push
### Publish from local dev env
* Set the PyPI test repository credentials as environment variables in poetry; see the [poetry doc](https://python-poetry.org/docs/repositories/)
* Publish to TestPyPI with `poetry publish -r test`
### Publish through CI
* The GitHub Action builds and publishes the package to the [test pypi repo](https://test.pypi.org/) when a version tag is pushed:
```
git tag [x.x.x]
git push origin master
```
* Manually publish to the [pypi repo](https://pypi.org/) through the [github action](https://github.com/qishe-nlp/phrase-detective/actions/workflows/pypi.yml)