llm-xai


Name: llm-xai
Version: 0.2
Summary: LLM plugin for models hosted by OpenRouter
Author: Nick Mystic
License: Apache-2.0
Upload time: 2024-11-04 23:33:51
Requirements: none recorded
# llm-xai
LLM plugin to access xAI's models


## Installation

First, [install the LLM command-line utility](https://llm.datasette.io/en/stable/setup.html).

Now install this plugin in the same environment as LLM.
```bash
llm install llm-xai
```

## Configuration

You will need an API key from xAI. You can [obtain one here](https://console.x.ai).

You can set that as an environment variable called `XAI_KEY`, or add it to the `llm` set of saved keys using:

```bash
llm keys set xai
```
```
Enter key: <paste key here>
```
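If you prefer the environment-variable route, it can be sketched like this (the key value is a placeholder):

```shell
# Alternative to `llm keys set xai`: export the API key as an environment
# variable, e.g. in your shell profile. The key shown is a placeholder.
export XAI_KEY="xai-your-key-here"

# Confirm it is visible to child processes such as `llm`:
echo "$XAI_KEY"
```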

## Usage

To list available models, run:
```bash
llm models list
```
You should see a list that looks something like this:
```
xAI: xAI/grok-beta
xAI: xAIcompletion/grok-beta
...
```
To start an interactive chat with a model, pass its full model ID to the `-m` option, like this:
```bash
llm chat -m xAI/grok-beta
```
Enter your prompt, and have a chat:
```shell
Chatting with xAI/grok-beta
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> sup playa
Hey, what's up?
>
```

xAI also offers a completion endpoint, exposed here under the `xAIcompletion/` prefix:
```bash
llm -m xAIcompletion/grok-beta "You must know this about me:"
```
```shell
 I’m not a fan of being alone. I have a hard time finding peace in the silence. My own thoughts drive me crazy. But I knew I had to do this for myself. I had to prove to myself that I could be alone and be okay with it
...
 ```
You can set a shorter alias for a model using the `llm aliases` command like so:
```bash
llm aliases set grok xAI/grok-beta
```
Now you can prompt Grok using the alias:
```bash
cat llm_xai.py | llm -m grok -s 'write some pytest tests for this'
```
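Beyond the CLI, the models can also be called from Python through `llm`'s Python API. This is a minimal sketch, assuming the plugin is installed and `XAI_KEY` is configured; the model ID comes from the `llm models list` output above, and `ask_grok` is an illustrative helper name:

```python
import os

def ask_grok(prompt: str) -> str:
    """Send a prompt to xAI's grok-beta via the llm Python API and return the text."""
    import llm  # imported lazily so the sketch can be read without llm installed
    model = llm.get_model("xAI/grok-beta")
    response = model.prompt(prompt)
    return response.text()

if __name__ == "__main__" and os.environ.get("XAI_KEY"):
    print(ask_grok("sup playa"))
```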
## Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:
```bash
cd llm-xai
python3 -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:
```bash
pip install -e '.[test]'
```
To run the tests:
```bash
pytest
```

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "llm-xai",
    "maintainer": null,
    "docs_url": null,
    "requires_python": null,
    "maintainer_email": null,
    "keywords": null,
    "author": "Nick Mystic",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/59/79/880bd549d97d772fdb58c14637450f41e96740e0e252fa1a1807f097c346/llm_xai-0.2.tar.gz",
    "platform": null,
    "bugtrack_url": null,
    "license": "Apache-2.0",
    "summary": "LLM plugin for models hosted by OpenRouter",
    "version": "0.2",
    "project_urls": {
        "CI": "https://github.com/ghostofpokemon/llm-xai/actions",
        "Changelog": "https://github.com/ghostofpokemon/llm-xai/releases",
        "Homepage": "https://github.com/ghostofpokemon/llm-xai",
        "Issues": "https://github.com/ghostofpokemon/llm-xai/issues"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "bac824a5d3830906f3e324023cfd9d5e9be09a86083744495165e87bb382144f",
                "md5": "8b86ccf20bfd19b259758154745960b4",
                "sha256": "da10217998467efe8db51e0f2fe4dc11f5e01b17bc2bd2eb9acbc2b4cc97c817"
            },
            "downloads": -1,
            "filename": "llm_xai-0.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "8b86ccf20bfd19b259758154745960b4",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 4358,
            "upload_time": "2024-11-04T23:33:50",
            "upload_time_iso_8601": "2024-11-04T23:33:50.425717Z",
            "url": "https://files.pythonhosted.org/packages/ba/c8/24a5d3830906f3e324023cfd9d5e9be09a86083744495165e87bb382144f/llm_xai-0.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "5979880bd549d97d772fdb58c14637450f41e96740e0e252fa1a1807f097c346",
                "md5": "2a9b2709e60fc0bad2b07c6c439eddb7",
                "sha256": "3ff29ac518478994cdd1d731d00276ddace2e78f2ea6e792185a6f846817610f"
            },
            "downloads": -1,
            "filename": "llm_xai-0.2.tar.gz",
            "has_sig": false,
            "md5_digest": "2a9b2709e60fc0bad2b07c6c439eddb7",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 4026,
            "upload_time": "2024-11-04T23:33:51",
            "upload_time_iso_8601": "2024-11-04T23:33:51.704062Z",
            "url": "https://files.pythonhosted.org/packages/59/79/880bd549d97d772fdb58c14637450f41e96740e0e252fa1a1807f097c346/llm_xai-0.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-11-04 23:33:51",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "ghostofpokemon",
    "github_project": "llm-xai",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "llm-xai"
}
        