speck 0.1.8

Home page: https://github.com/speckai/speck-llm-observability
Summary: Speck - Development and observability toolkit for LLM apps.
Upload time: 2024-03-03 01:35:13
Keywords: speck, openai, llm, ai, chat, bot, gpt, gpt-3, gpt-4, anthropic, replicate, litellm, observability
Requirements: No requirements were recorded.
<p align="center">
    <img src="https://raw.githubusercontent.com/speckai/speck/main/assets/speck_banner.jpg">
</p>
<p align="center">
    <a href="https://pypi.org/project/speck/">
        <img src="https://img.shields.io/pypi/dm/speck" />
    </a>
    <a href="https://discord.gg/speck">
        <img src="https://dcbadge.vercel.app/api/server/frnaYYaKj3?style=flat" />
    </a>
    <a href="https://github.com/speckai/speck">
        <img src="https://img.shields.io/github/commit-activity/m/speckai/speck" />
    </a>
    <a href="https://linkedin.com/company/speck">
        <img src="https://img.shields.io/badge/LinkedIn-0077B5?logo=linkedin&logoColor=white" />
    </a>
</p>

---

<b>Speck</b> is a live-trace debugging and metrics-tracking platform for LLM apps.

Speck streamlines LLM app development with live debugging and metrics tracking, and it simplifies prompt engineering and testing across any supported LLM.

### Features

Speck's main features include:

1. Live LLM debugging
2. [LLM observability](https://getspeck.ai/dash/home)
3. Developer framework for calling models
4. [OpenAI proxy](https://docs.getspeck.ai/development/openai)
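
A hypothetical sketch of the proxy pattern: it assumes the common approach of pointing the OpenAI SDK's `base_url` at the proxy, and the endpoint shown is a placeholder rather than a real Speck URL; the actual setup is described in the [proxy docs](https://docs.getspeck.ai/development/openai).

```python
# Hypothetical sketch only: the base_url below is a placeholder, not a real
# Speck endpoint. See https://docs.getspeck.ai/development/openai for setup.
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",                              # your usual OpenAI key
    base_url="https://<speck-proxy-endpoint>/v1",  # placeholder proxy address
)

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "system", "content": "Count to 5."}],
)
print(completion.choices[0].message.content)
```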

### Support

| Model                                         | Support |
| --------------------------------------------- | :-----: |
| OpenAI                                        |   ✅    |
| AzureOpenAI                                   |   ✅    |
| Anthropic                                     |   ✅    |
| Replicate                                     |   ✅    |
| [LiteLLM](https://github.com/BerriAI/litellm) |   ✅    |
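
Because models are addressed with a single `provider:model` string (see Getting Started below), switching between these providers is just a config change. A minimal sketch reusing the client from that example; the `openai:gpt-3.5-turbo` identifier is an assumption that follows the same naming convention as `anthropic:claude-2`.

```python
from speck import Speck

client = Speck(api_key=None, api_keys={"openai": "sk-...", "anthropic": "sk-..."})

# "openai:gpt-3.5-turbo" is assumed to follow the same provider:model naming
# convention as the "anthropic:claude-2" call in Getting Started.
for model in ("openai:gpt-3.5-turbo", "anthropic:claude-2"):
    response = client.chat.create(
        prompt=[{"role": "user", "content": "Count to 5."}],
        config={"model": model},
    )
```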

---

The [dashboard](https://getspeck.ai/dash/home) on the Speck website has four main sections:

- Home: Dashboard for LLM usage metrics
- Logs: Inspect recent LLM calls
- Playground: Iterate on prompts with any supported model
- Live Debug: Test prompts with on-the-fly debugging

If you have any feature requests or want to stay up to date, please join our [Discord community](https://discord.gg/speck)!

---

## Getting Started

### Python

```shell
pip install speck
```

Then, you can run something like:

```python
from speck import Speck

# api_key (here None) is separate from the per-provider keys passed in api_keys.
client = Speck(api_key=None, api_keys={"openai": "sk-...", "anthropic": "sk-..."})

# Models are addressed as "provider:model".
response = client.chat.create(
    prompt=[{"role": "system", "content": "Count to 5."}],
    config={"model": "anthropic:claude-2"}
)
```

Each call is now logged for inspection and testing. Read more in our [documentation](https://docs.getspeck.ai)!

            

## Raw data

            {
    "_id": null,
    "home_page": "https://github.com/speckai/speck-llm-observability",
    "name": "speck",
    "maintainer": "",
    "docs_url": null,
    "requires_python": "",
    "maintainer_email": "",
    "keywords": "speck,openai,llm,ai,chat,bot,gpt,gpt-3,gpt-4,anthropic,replicate,litellm,observability",
    "author": "",
    "author_email": "Lucas Jaggernauth <luke@getspeck.ai>, Raghav Pillai <raghav@getspeck.ai>",
    "download_url": "https://files.pythonhosted.org/packages/60/98/bc3e73463ec429f8de9704c30d6b633b5a1fc363cf609125d33e64a6b1b5/speck-0.1.8.tar.gz",
    "platform": null,
    "description": "<p align=\"center\">\n    <img src=\"https://raw.githubusercontent.com/speckai/speck/main/assets/speck_banner.jpg\">\n</p>\n<p align=\"center\">\n    <a href=\"https://pypi.org/project/speck/\">\n        <img src=\"https://img.shields.io/pypi/dm/speck\" />\n    </a>\n    <a href=\"https://discord.gg/speck\">\n        <img src=\"https://dcbadge.vercel.app/api/server/frnaYYaKj3?style=flat\" />\n    </a>\n    <a href=\"https://github.com/speckai/speck\">\n        <img src=\"https://img.shields.io/github/commit-activity/m/speckai/speck\" />\n    </a>\n    <a href=\"https://linkedin.com/company/speck\">\n        <img src=\"https://img.shields.io/badge/LinkedIn-0077B5?logo=linkedin&logoColor=white\" />\n    </a>\n</p>\n\n---\n\n<b>Speck</b> is a livetrace debugging and metrics tracking platform for LLM apps.\n\nSpeck streamlines LLM app development with its live debugging and metrics tracking. It simplifies prompt engineering and testing across any LLM, saving you time and enhancing your workflow.\n\n### Features\n\nSpeck's main features include:\n\n1. [Live LLM debugging]\n2. [LLM observability](https://getspeck.ai/dash/home)\n3. Developer framework for calling models\n4. [OpenAI proxy](https://docs.getspeck.ai/development/openai)\n\n### Support\n\n| Model                                         | Support |\n| --------------------------------------------- | :-----: |\n| OpenAI                                        |   \u2705    |\n| AzureOpenAI                                   |   \u2705    |\n| Anthropic                                     |   \u2705    |\n| Replicate                                     |   \u2705    |\n| [LiteLLM](https://github.com/BerriAI/litellm) |   \u2705    |\n\n---\n\nThe [dashboard](https://getspeck.ai/dash/home) on the Speck website has 4 main features:\n\n- Home: Dashboard for LLM usage metrics\n- Logs: Inspect recent LLM calls\n- Playground: Prompt engineer with any model\n- Live Debug: Test prompts with on-the-fly debugging\n\nIf you have any feature requests or want to stay up to date, please join our [Discord community](https://discord.gg/speck)!\n\n---\n\n## Getting Started\n\n### Python\n\n```shell\npip install speck\n```\n\nThen, you can run something like:\n\n```python\nfrom speck import Speck\nclient = Speck(api_key=None, api_keys={\"openai\": \"sk-...\", \"anthropic\": \"sk-...\"})\nresponse: Response = client.chat.create(\n    prompt=[{\"role\": \"system\", \"content\": \"Count to 5.\"}],\n    config={\"model\": \"anthropic:claude-2\"}\n)\n```\n\nNow, each call will be logged for testing. Read more on our [documentation](https://docs.getspeck.ai)!\n",
    "bugtrack_url": null,
    "license": "",
    "summary": "Speck - Development and observability toolkit for LLM apps.",
    "version": "0.1.8",
    "project_urls": {
        "Download": "https://github.com/speckai/speck-llm-observability/archive/refs/tags/v0.1.8.tar.gz",
        "Homepage": "https://github.com/speckai/speck-llm-observability"
    },
    "split_keywords": [
        "speck",
        "openai",
        "llm",
        "ai",
        "chat",
        "bot",
        "gpt",
        "gpt-3",
        "gpt-4",
        "anthropic",
        "replicate",
        "litellm",
        "observability"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "8505ea86c795813ce405886cadd391539dd4610a0ae13cc1d537ae42a1aeafe6",
                "md5": "a94532adb110821b802795ac5b95819a",
                "sha256": "84c61178d3804cbdee1f00cab413d5d84b63d3c18b8b44b50e00181b96aaea43"
            },
            "downloads": -1,
            "filename": "speck-0.1.8-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "a94532adb110821b802795ac5b95819a",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 31803,
            "upload_time": "2024-03-03T01:35:11",
            "upload_time_iso_8601": "2024-03-03T01:35:11.438292Z",
            "url": "https://files.pythonhosted.org/packages/85/05/ea86c795813ce405886cadd391539dd4610a0ae13cc1d537ae42a1aeafe6/speck-0.1.8-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "6098bc3e73463ec429f8de9704c30d6b633b5a1fc363cf609125d33e64a6b1b5",
                "md5": "b00f4bd4087dbe180004f83bfe2827b8",
                "sha256": "8fe40ceafb3bf40be3c3c851b19b73d4ef19af50a6e9181fbfc35afd7f0a6322"
            },
            "downloads": -1,
            "filename": "speck-0.1.8.tar.gz",
            "has_sig": false,
            "md5_digest": "b00f4bd4087dbe180004f83bfe2827b8",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 24595,
            "upload_time": "2024-03-03T01:35:13",
            "upload_time_iso_8601": "2024-03-03T01:35:13.252584Z",
            "url": "https://files.pythonhosted.org/packages/60/98/bc3e73463ec429f8de9704c30d6b633b5a1fc363cf609125d33e64a6b1b5/speck-0.1.8.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-03-03 01:35:13",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "speckai",
    "github_project": "speck-llm-observability",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "speck"
}
        