langlearncopilot

Name: langlearncopilot
Version: 0.3.1
Home page: https://github.com/osm3000/LangLearnCopilot
Summary: LangLearnCopilot is a library to help you learn a 'human' language.
Upload time: 2023-09-11 05:20:20
Author: osm3000
Requires Python: >=3.9,<4.0
Keywords: langlearncopilot, gpt, copilot
            [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)  
[![Poetry](https://img.shields.io/endpoint?url=https://python-poetry.org/badge/v0.json)](https://python-poetry.org/)

# LangLearnCopilot
LangLearnCopilot is a collection of functions and tools that generate properly formatted content to help you learn a new language.

For now, I am focusing on French (my personal interest) and Spanish (for friends).

While this is a standalone library, for the best results I recommend using its output hand-in-hand with the excellent flashcard application [Anki](https://apps.ankiweb.net/).

The two main applications at the moment are extracting the unique words of a text (with their translations) and generating example phrases for a given word.

## Installation
`pip install langlearncopilot`

## Usage
You can find examples in the `./examples` folder

Note: you need to set your OpenAI API key first. If `OPENAI_API_KEY` is already declared as an environment variable, or exists in a `.env` file in the current folder, the package will find it.
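For instance, either of the following shell setups will work (the `sk-...` value is a placeholder for your actual key):

```shell
# Option 1: declare the key as an environment variable
export OPENAI_API_KEY="sk-..."

# Option 2: store it in a .env file in the current folder
echo 'OPENAI_API_KEY=sk-...' > .env
```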

Otherwise, you can set it manually by calling:
```python
from langlearncopilot.llm_calls import set_openai_key

set_openai_key("ENTER KEY VALUE HERE")
```

### Given a text, extract all the unique words and their translations
```python
from langlearncopilot import generators

generators.generate_unique_words(
    article="Bonjour, je m'appelle Jean. Je suis un étudiant à l'université de Paris."
)
```
returns
```
{
    'bonjour': 'hello',
    'je': 'i',
    "m'appelle": 'is called',
    'suis': 'am',
    'un': 'a',
    'étudiant': 'student',
    'à': 'at',
    "l'université": 'university',
    'de': 'of'
}
```
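Since the result maps each word to its translation, it is straightforward to export for Anki. As a minimal sketch (the file name and CSV layout are my own choices, not part of the library), the dictionary above can be written to a two-column CSV that Anki's importer understands:

```python
import csv

# Dictionary returned by generate_unique_words (copied from the output above)
words = {
    'bonjour': 'hello',
    'je': 'i',
    "m'appelle": 'is called',
    'suis': 'am',
    'un': 'a',
    'étudiant': 'student',
    'à': 'at',
    "l'université": 'university',
    'de': 'of',
}

# One row per word: French on the front, English on the back
with open("french_words.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows(words.items())
```

In Anki, use File > Import and map the two columns to the front and back fields of your note type.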

### Given a word, generate 3 phrases that use that word
```python
from langlearncopilot import generators

generators.generate_phrases("combien")
```

returns
```python
[
    {'combien': {'phrase': 'Combien de personnes sont venues à la fête?', 'translation': ' How many people came to the party?'}},
    {'combien': {'phrase': 'Combien coûte ce sac à dos?', 'translation': ' How much does this backpack cost?'}},
    {'combien': {'phrase': "Il m'a demandé combien de temps cela prendrait.", 'translation': ' He asked me how long it would take.'}}
]
```
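Each list item wraps the queried word around a phrase/translation pair, and the translations carry a leading space. A small sketch (the variable names are my own) to flatten this structure into front/back card pairs:

```python
# Output of generate_phrases("combien"), copied from the example above
phrases = [
    {'combien': {'phrase': 'Combien de personnes sont venues à la fête?',
                 'translation': ' How many people came to the party?'}},
    {'combien': {'phrase': 'Combien coûte ce sac à dos?',
                 'translation': ' How much does this backpack cost?'}},
    {'combien': {'phrase': "Il m'a demandé combien de temps cela prendrait.",
                 'translation': ' He asked me how long it would take.'}},
]

# Flatten into (phrase, translation) pairs, stripping the stray whitespace
cards = [
    (entry["phrase"], entry["translation"].strip())
    for item in phrases
    for entry in item.values()
]
```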

### Extract unique words from a URL, with their translations
```python
from langlearncopilot.parsers import get_text_from_webpage
from langlearncopilot.generators import generate_unique_words


def main():
    # Get text from a webpage
    text = get_text_from_webpage(
        url="https://www.lemonde.fr/planete/article/2023/08/27/comment-les-parcs-nationaux-americains-tentent-de-faire-face-aux-effets-du-rechauffement-climatique_6186696_3244.html"
    )
    # Generate unique words from the text
    words = generate_unique_words(article=text, language="french")
    print(words)


if __name__ == "__main__":
    main()

```
returns
```python
{
    'comment': 'how',
    'les': 'the',
    'parcs': 'parks',
    'nationaux': 'national',
    'américains': 'american',
    'tentent': 'try',
    'de': 'of',
    ...
}
```

            
