openplayground

Name: openplayground
Version: 0.1.5
Home page: https://nat.dev/
Summary: An LLM playground you can run on your laptop.
Author: Nat Friedman
Requires Python: >=3.9,<4.0
Keywords: llm, playground, openplayground
Upload time: 2023-04-13 18:19:00
# openplayground

An LLM playground you can run on your laptop.

https://user-images.githubusercontent.com/111631/227399583-39b23f48-9823-4571-a906-985dbe282b20.mp4

#### Features

- Use any model from OpenAI, Anthropic, Cohere, Forefront, HuggingFace, Aleph Alpha, and llama.cpp.
- Full playground UI, including history, parameter tuning, keyboard shortcuts, and logprobs.
- Compare models side-by-side with the same prompt, individually tune model parameters, and retry with different parameters.
- Automatically detects local models in your HuggingFace cache, and lets you install new ones.
- Works OK on your phone.
- Probably won't kill everyone.

## Try on nat.dev

Try the hosted version: [nat.dev](https://nat.dev).

## How to install and run

```sh
$ pip install openplayground
$ openplayground run
```

Alternatively, run it as a docker container:
```sh
$ docker run --name openplayground -p 5432:5432 -d --volume openplayground:/web/config natorg/openplayground
```

This runs a Flask process, so you can pass the usual flags, such as `openplayground run -p 1235` to use a different port.

## How to run for development

```sh
$ git clone https://github.com/nat/openplayground
$ cd openplayground
# Terminal 1: build the frontend and watch for changes
$ cd app && npm install && npx parcel watch src/index.html --no-cache
# Terminal 2: install server dependencies and start the backend (from the repo root)
$ cd server && pip3 install -r requirements.txt && cd .. && python3 -m server.app
```

## Docker

```sh
$ docker build . --tag "openplayground"
$ docker run --name openplayground -p 5432:5432 -d --volume openplayground:/web/config openplayground
```

The volume mount is optional; it is used to persist API keys and model settings.

## Ideas for contributions

- Add a token counter to the playground
- Add a cost counter to the playground and the compare page
- Measure and display time to first token
- Set up automatic builds with GitHub Actions
- The default parameters for each model are configured in the `server/models.json` file. If you find better default parameters for a model, please submit a pull request!
- Help us make a Homebrew package and a Dockerfile
- An easier way to install open source models directly from openplayground, via `openplayground install <model>` or in the UI.
- Find and fix bugs
- ChatGPT UI, with turn-by-turn, markdown rendering, chatgpt plugin support, etc.
- We will probably need multimodal inputs and outputs at some point in 2023

## Adding models to openplayground

Models and providers in openplayground fall into three types:

- Searchable
- Local inference
- API

You can add models in `server/models.json` with the following schema:
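As a sketch of how the three types might be told apart when reading such a config (the key names follow the schema examples in this section; the loader itself is illustrative, not the actual server code):

```python
import json

def classify_providers(config):
    """Group providers from a models.json-style dict into the three types.

    Illustrative only: a "search" block marks a searchable provider,
    api_key=true marks an API provider, and the rest are local.
    """
    types = {"searchable": [], "api": [], "local": []}
    for name, provider in config.items():
        if "search" in provider:
            types["searchable"].append(name)
        elif provider.get("api_key"):
            types["api"].append(name)
        else:
            types["local"].append(name)
    return types

# Example usage with an inline config mirroring the schema:
config = json.loads("""
{
  "llama": {"api_key": false, "models": {}},
  "cohere": {"api_key": true, "models": {}},
  "hf_remote": {"api_key": true, "search": {"endpoint": "ENDPOINT_URL"}}
}
""")
print(classify_providers(config))
```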

#### Local inference

For models running locally on your device, add them to openplayground as follows (a minimal example):

```json
"llama": {
    "api_key" : false,
    "models" : {
        "llama-70b": {
            "parameters": {
                "temperature": {
                    "value": 0.5,
                    "range": [
                        0.1,
                        1.0
                    ]
                }
            }
        }
    }
}
```

Keep in mind you will need to add a generation method for your model in `server/app.py`. Take a look at `local_text_generation()` as an example.
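A generation method will typically read the defaults from `server/models.json` and clamp any user overrides into the declared range. A minimal, hypothetical helper to illustrate that step (`resolve_parameters` is not part of the actual server code):

```python
def resolve_parameters(model_entry, overrides=None):
    """Merge user overrides into a model's default parameters,
    clamping each value into its declared [min, max] range.

    model_entry follows the models.json schema above; this helper
    is illustrative, server/app.py structures this differently.
    """
    overrides = overrides or {}
    resolved = {}
    for name, spec in model_entry["parameters"].items():
        value = overrides.get(name, spec["value"])
        low, high = spec["range"]
        resolved[name] = min(max(value, low), high)
    return resolved

entry = {"parameters": {"temperature": {"value": 0.5, "range": [0.1, 1.0]}}}
print(resolve_parameters(entry, {"temperature": 2.0}))  # clamped to 1.0
```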

#### API Provider Inference

This is for model providers like OpenAI, Cohere, Forefront, and more. You can connect them to openplayground easily (a minimal example):

```json
"cohere": {
    "api_key" : true,
    "models" : {
        "xlarge": {
            "parameters": {
                "temperature": {
                    "value": 0.5,
                    "range": [
                        0.1,
                        1.0
                    ]
                }
            }
        }
    }
}
```

Keep in mind you will need to add a generation method for your model in `server/app.py`. Take a look at `openai_text_generation()` or `cohere_text_generation()` as examples.
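An API-backed generation method essentially turns the resolved parameters into an HTTP request. The sketch below only assembles a generic request dict; the URL and field names are placeholders, since each real provider (OpenAI, Cohere, ...) has its own request shape:

```python
def build_completion_request(provider, model, prompt, params):
    """Assemble a generic completion request for an API provider.

    Field names and the URL here are illustrative placeholders,
    not any provider's documented API.
    """
    return {
        "url": "https://api.%s.example/generate" % provider,  # placeholder URL
        "headers": {"Authorization": "Bearer <API_KEY>"},     # key from settings
        "json": {"model": model, "prompt": prompt, **params},
    }

req = build_completion_request("cohere", "xlarge", "Hello", {"temperature": 0.5})
print(req["json"])
```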

#### Searchable models

We use this for HuggingFace remote-inference models; the search endpoint makes it practical to scale to N models in the settings page.

```json
"provider_name": {
    "api_key": true,
    "search": {
        "endpoint": "ENDPOINT_URL"
    },
    "parameters": {
        "parameter": {
            "value": 1.0,
            "range": [
                0.1,
                1.0
            ]
        }
    }
}
```
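Client-side, the search endpoint's response has to be turned into a list of installable model identifiers. A hypothetical sketch (the `{"models": [{"id": ...}]}` response shape is an assumption for illustration, not a documented contract):

```python
def parse_search_response(payload):
    """Extract model ids from a search endpoint response.

    Assumes a {"models": [{"id": ...}, ...]} shape, which is
    illustrative; check the actual endpoint you configure.
    """
    return [m["id"] for m in payload.get("models", [])]

sample = {"models": [{"id": "gpt2"}, {"id": "bloom"}]}
print(parse_search_response(sample))
```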

#### Credits

Instigated by Nat Friedman. Initial implementation by [Zain Huda](https://github.com/zainhuda) as a repl.it bounty. Many features and extensive refactoring by Alex Lourenco.

            
