chinesevocablist 0.3.8

* Summary: Programmatic interface to the Chinese Vocab List
* Home page: http://github.com/kerrickstaley/Chinese-Vocab-List
* Author: Kerrick Staley
* License: MIT
* Upload time: 2024-01-05 04:18:14
* Requirements: none recorded
# Chinese Vocab List
A list of Chinese vocabulary words with definitions, pronunciations, and example sentences, released under a CC-BY-SA license. See [chinese_vocab_list.yaml](https://raw.githubusercontent.com/kerrickstaley/Chinese-Vocab-List/master/chinese_vocab_list.yaml) for the list itself.
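
The `chinesevocablist` package is a thin programmatic wrapper around that YAML file. Here is a minimal usage sketch; the `VocabList.load()` constructor and the `words`/`simp`/`pinyin`/`defs` attribute names are assumptions for illustration, not confirmed API:

```
# Hypothetical usage; the class and attribute names here are assumptions,
# not confirmed API.
from chinesevocablist import VocabList

vocab_list = VocabList.load()  # load the bundled chinese_vocab_list.yaml
word = vocab_list.words[0]     # one entry per vocabulary word
print(word.simp, word.pinyin, word.defs)
```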

Used by the Chinese Prestudy Anki addon. See [this blog post](https://www.kerrickstaley.com/2018/09/04/chinese-prestudy) for more details.

[![Build Status](https://travis-ci.org/kerrickstaley/Chinese-Vocab-List.svg?branch=master)](https://travis-ci.org/kerrickstaley/Chinese-Vocab-List)

## Contributing
There are a few ways to contribute:
* Making changes to the source code in `src/`.
* Making changes to files in `contrib_files/`:
  * `subtlex_dupes.yaml` lists words that are redundant with other words. For example, the entry `身上: 身` means that instead of learning the word "身上", someone should just learn the word "身" (see the sketch after this list).
  * `preferred_entries.yaml` indicates which entries from CC-CEDICT are the best to use for each word. It is only needed when you increase the size of the vocab list and the build complains that it found a word with multiple definitions. Note: some words have multiple meanings that are worth learning but are split across different entries in CC-CEDICT (for example, 只 and 面); I don't have a good way to represent these in `chinese_vocab_list.yaml` yet.
* Directly modifying `chinese_vocab_list.yaml`.
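
As an illustration of the `subtlex_dupes.yaml` format described above, the sketch below loads the file and drops redundant words. It assumes the file is a flat YAML mapping from redundant word to preferred word (as the `身上: 身` example suggests); this is not the repo's actual build code:

```
import yaml  # PyYAML

# Load the redundant-word mapping, e.g. {"身上": "身"}.
with open("contrib_files/subtlex_dupes.yaml", encoding="utf-8") as f:
    dupes = yaml.safe_load(f)

def drop_redundant(words):
    """Remove words that subtlex_dupes.yaml marks as redundant with another word."""
    return [w for w in words if w not in dupes]

print(drop_redundant(["身上", "身"]))  # -> ["身"]
```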

If you change `src/` or `contrib_files/`, be sure to run `make chinese_vocab_list.yaml` and check in both your changes and the generated changes to `chinese_vocab_list.yaml`.

## Updating `reference_files`
* `cc_cedict.txt`: Run `curl https://www.mdbg.net/chinese/export/cedict/cedict_1_0_ts_utf-8_mdbg.txt.gz | gunzip > reference_files/cc_cedict.txt` (the file's line format is sketched below).
  * You may need to update `contrib_files/preferred_entries.yaml` and/or other files to handle the update. Run `make` and fix errors until the vocab list builds cleanly.
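
Each CC-CEDICT entry is one line of the form `Traditional Simplified [pin1 yin1] /definition 1/definition 2/`, with `#`-prefixed comment lines. A minimal parsing sketch, for illustration only (this is not the code the repo uses):

```
import re

# One CC-CEDICT entry: traditional, simplified, [pinyin], /slash-separated defs/.
CEDICT_LINE = re.compile(r"^(\S+) (\S+) \[([^\]]+)\] /(.+)/$")

def parse_cedict_line(line):
    """Return (traditional, simplified, pinyin, definitions) or None."""
    if line.startswith("#"):
        return None  # comment line
    match = CEDICT_LINE.match(line.strip())
    if not match:
        return None  # malformed line
    trad, simp, pinyin, defs = match.groups()
    return trad, simp, pinyin, defs.split("/")

print(parse_cedict_line("身上 身上 [shen1 shang4] /on the body/at hand/among/"))
```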

## Publishing to PyPI
If your name is Kerrick, you can publish the `chinesevocablist` package to PyPI by running these commands from the root of the repo:
```
rm -rf dist/*
python3 -m build
python3 -m twine upload dist/*
```
Note that this uploads directly to production PyPI, skipping test PyPI.

            
