# LangWhat
Answer "What is it?" on the command line with the power of large language models (LLMs).
An LLM-powered take on [pyWhat](https://github.com/bee-san/pyWhat), leveraging the OpenAI API and Sydney (Bing AI Chat).
- [LangWhat](#langwhat)
  - [Screenshots](#screenshots)
    - [OpenAI](#openai)
    - [Sydney](#sydney)
    - [Caching](#caching)
  - [Features](#features)
  - [Installation](#installation)
    - [pipx](#pipx)
    - [pip](#pip)
  - [Usage](#usage)
  - [Develop](#develop)
  - [Credits](#credits)
## Screenshots
### OpenAI
```bash
langwhat 'f7316ffccd4d2d555a7522328cf792dd73bfbcd9'
langwhat 'f7316ffccd4d2d555a7522328cf792dd73bfbcd9' --zh
```
![](images/screenshot-hash.png)
### Sydney
Sydney automatically corrected my typo "marry" (for "Mary").
```bash
langwhat 'marry ball washington' -s
langwhat 'marry ball washington' -s -z
```
![](images/screenshot-sydney-mary.png)
### Caching
Responses are much faster on a cache hit, and token usage drops to 0.
Note that Sydney doesn't support counting token usage and always shows 0.
```bash
lw teddy --show-token-usage
lw teddy --show-token-usage
```
![](images/screenshot-caching-teddy.png)
## Features
- Uses few-shot prompting to reduce model misbehavior
- Prompts in English by default for faster and more accurate responses
- Accepts the API key via an environment variable or the config file at `~/.config/langwhat/api_key.txt`
- Supports [Sydney](https://www.bing.com/search?q=Bing+AI&showconv=1) (Bing AI) via the `-s` flag
- Caches responses in a local SQLite database
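Per the features above, the key can be supplied either way. The config-file path is documented in this README; the environment variable name is an assumption here (`OPENAI_API_KEY` is the conventional OpenAI SDK variable, and `sk-your-key-here` is a placeholder):

```shell
# Option 1: write the key to the documented config file
mkdir -p ~/.config/langwhat
printf '%s\n' "sk-your-key-here" > ~/.config/langwhat/api_key.txt

# Option 2: export it for the current shell session
# (variable name is an assumption; check your setup if it isn't picked up)
export OPENAI_API_KEY="sk-your-key-here"
```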
## Installation
### pipx
This is the recommended installation method.
```
$ pipx install langwhat
# Python 3.11 or higher is required. If your pipx uses an older Python by default,
# run the following to install langwhat with Python 3.11:
# pipx install --python "$(which python3.11)" langwhat
```
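Since the package requires Python >= 3.11, a quick one-liner (a sketch, not part of the official install steps) can confirm whether your default `python3` qualifies before installing:

```shell
# Prints "OK" if the default python3 satisfies langwhat's >=3.11 requirement
python3 -c 'import sys; print("OK" if sys.version_info >= (3, 11) else "too old")'
```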
### [pip](https://pypi.org/project/langwhat/)
```
$ pip install langwhat
```
## Usage
```
$ langwhat --help
usage: lw [-h] [-z] [-s] [-C] [--show-token-usage] [-V] what

positional arguments:
  what                what is it

options:
  -h, --help          show this help message and exit
  -z, --zh            Use Mandarin to prompt and answer
  -s, --sydney        Use Sydney (Bing AI) instead of OpenAI
  -C, --no-cache      Disable cache
  --show-token-usage  Show token usage
  -V, --version       show program's version number and exit
```
## Develop
```
$ git clone https://github.com/tddschn/langwhat.git
$ cd langwhat
$ poetry install
```
## Credits
- LangChain
- [EdgeGPT](https://github.com/acheong08/EdgeGPT)
- [Yihong](https://github.com/yihong0618/iWhat)