think-llm

Name: think-llm
Version: 0.0.5
Home page: https://github.com/senko/think
Summary: Create programs that think, using LLMs.
Upload time: 2024-01-31 02:09:02
Author: Senko Rasic
Requires Python: >=3.9,<4.0
License: MIT
Keywords: ai, llm
# Think

Think is a Python package for creating thinking programs.

It provides simple but powerful primitives for *robust* integration of Large Language
Models (LLMs) into your Python programs.

## Examples

Using AI as an ordinary function:

```python
from think.llm.openai import ChatGPT
from think.ai import ai

@ai
def haiku(topic: str) -> str:
    """
    Write a haiku about {{ topic }}
    """

llm = ChatGPT()

print(haiku(llm, topic="computers"))
```
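The docstring here doubles as a prompt template with `{{ topic }}`-style placeholders. A minimal sketch of how such substitution might work (illustrative only, using a regex rather than Think's actual template engine):

```python
import re

def render_prompt(template: str, **kwargs) -> str:
    """Substitute {{ name }} placeholders with keyword argument values."""
    # Match "{{", optional whitespace, an identifier, optional whitespace, "}}".
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(kwargs[m.group(1)]),
        template,
    )

prompt = render_prompt("Write a haiku about {{ topic }}", topic="computers")
# prompt is now "Write a haiku about computers"
```

The rendered prompt is then what gets sent to the LLM on each call.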

Allowing AI to use tools:

```python
from datetime import date

from think.llm.openai import ChatGPT
from think.chat import Chat
from think.tool import tool

@tool
def current_date() -> str:
    """
    Get the current date.

    :returns: current date in YYYY-MM-DD format
    """
    return date.today().isoformat()


llm = ChatGPT()
chat = Chat("You are a helpful assistant.")
chat.user("How old are you (in days since your knowledge cutoff)?")

print(llm(chat, tools=[current_date]))
```
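Conceptually, tool use means the model names a function to call, and the framework looks that function up and runs it, feeding the result back into the conversation. A toy dispatcher sketching that idea (the names here are illustrative, not Think's internals):

```python
from datetime import date

def current_date() -> str:
    """Return the current date in YYYY-MM-DD format."""
    return date.today().isoformat()

# Registry mapping tool names (as the model would emit them) to callables.
TOOLS = {"current_date": current_date}

def dispatch(tool_name: str) -> str:
    """Invoke a registered tool by name, as a tool-calling loop would."""
    if tool_name not in TOOLS:
        raise KeyError(f"unknown tool: {tool_name}")
    # In a real loop, this result would be appended to the chat
    # and the model asked to continue with it.
    return TOOLS[tool_name]()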

Parsing AI output:

```python
import json
from pydantic import BaseModel
from think.llm.openai import ChatGPT
from think.chat import Chat
from think.parser import JSONParser


class CityInfo(BaseModel):
    name: str
    country: str
    population: int
    latitude: float
    longitude: float


llm = ChatGPT()
parser = JSONParser(spec=CityInfo)
city = "Paris"  # the city to ask about

chat = Chat(
    "You are a helpful assistant. Your task is to answer questions about cities, "
    "to the best of your knowledge. Your output must be valid JSON conforming to "
    "this JSON schema:\n" + json.dumps(parser.schema)
).user(city)

answer = llm(chat, parser=parser)

print(f"{answer.name} is a city in {answer.country} with {answer.population} inhabitants.")
print(f"It is located at {answer.latitude} latitude and {answer.longitude} longitude.")
```
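Under the hood, this pattern amounts to asking the model for JSON matching a schema and validating the reply before use. A stdlib-only sketch of the validation step (independent of Think's `JSONParser`, with a hand-rolled check standing in for pydantic):

```python
import json

def parse_city_reply(reply: str) -> dict:
    """Parse an LLM reply and check it has the expected city fields."""
    data = json.loads(reply)  # raises ValueError on malformed JSON
    required = {
        "name": str, "country": str, "population": int,
        "latitude": float, "longitude": float,
    }
    for key, typ in required.items():
        if key not in data:
            raise ValueError(f"missing field: {key}")
        if not isinstance(data[key], typ):
            raise ValueError(f"field {key!r} should be {typ.__name__}")
    return data

reply = ('{"name": "Paris", "country": "France", "population": 2102650, '
         '"latitude": 48.8566, "longitude": 2.3522}')
city = parse_city_reply(reply)
```

A parser like this also gives the framework something concrete to report back to the model when the reply doesn't validate, so it can retry.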

## Quickstart

Install via `pip`:

```bash
pip install think-llm
```

Note that the package name is `think-llm`, *not* `think`.

Set up your LLM credentials (OpenAI or Anthropic, depending on the LLM you want to use):

```bash
export OPENAI_API_KEY=<your-openai-key>
export ANTHROPIC_API_KEY=<your-anthropic-key>
```
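The client libraries typically read these variables when the client is constructed. If you prefer to fail fast with a clearer message, a small helper (hypothetical, not part of Think) can check for them first:

```python
import os

def require_key(name: str) -> str:
    """Return the named environment variable, or fail with a clear error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before creating the client")
    return value

# e.g. require_key("OPENAI_API_KEY") before constructing ChatGPT()
```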

And you're ready to go:

```python
from think.llm.openai import ChatGPT
from think.chat import Chat

llm = ChatGPT()
chat = Chat("You are a helpful assistant.").user("Tell me a funny joke.")
print(llm(chat))
```

Explore the [examples](./examples/) directory for more usage examples, and the
source code for documentation on how to use the library (until we have proper docs;
if you'd like to help with those, see the Contributing section below).

## Roadmap

Features and capabilities that are planned for the near future:

- documentation
- full support for Anthropic (tools, parsers, AI functions)
- support for other LLM APIs via LiteLLM or similar
- support for local LLMs via HuggingFace
- more examples

If you want to help with any of these, look at the open issues, join the
conversation, and submit a PR. Please read the Contributing section below first.

## Contributing

Contributions are welcome!

To ensure that your contribution is accepted, please follow these guidelines:

- open an issue to discuss your idea before you start working on it, or if there's
  already an issue for your idea, join the conversation there and explain how you
  plan to implement it
- make sure that your code is well documented (docstrings, type annotations, comments,
  etc.) and tested (test coverage should only go up)
- make sure that your code is formatted and type-checked with `ruff` (default settings)

## Copyright

Copyright (C) 2023-2024 Senko Rasic and Think contributors. You may use and/or distribute
this project under the terms of the MIT license. See the LICENSE file for more details.

            
