llm-axe

Name: llm-axe
Version: 1.1.0
Summary: A toolkit for quickly implementing LLM-powered functionalities.
Author: Emir Sahin
License: MIT
Keywords: python, llm axe, llm toolkit, local llm, local llm internet, function caller llm, ollama
Upload time: 2024-05-18 16:20:50
Requirements: No requirements were recorded.

# llm-axe 🪓

<img alt="PyPI - Version" src="https://img.shields.io/pypi/v/llm-axe"> <img alt="PyPI - Downloads" src="https://img.shields.io/pypi/dm/llm-axe">
<img alt="Static Badge" src="https://img.shields.io/badge/clones-63/month-purple"> <img alt="GitHub forks" src="https://img.shields.io/github/forks/emirsahin1/llm-axe?style=flat">
[![Hits](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2Femirsahin1%2Fllm-axe&count_bg=%2379C83D&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=hits&edge_flat=false)](https://github.com/emirsahin1/llm-axe)

llm-axe is a handy little axe for developing LLM-powered applications.

It lets you quickly implement complex interactions for local LLMs, such as function callers, online agents, premade generic agents, and more.


## Installation



```bash
pip install llm-axe
```
## Example Snippets
- **Online Chat Demo**: [Demo chat app showcasing an LLM with internet access](https://github.com/emirsahin1/llm-axe/tree/main/examples/ex_online_chat_demo.py)

- **Function Calling**

&emsp;&emsp;A function calling LLM can be created with just **3 lines of code**:
<br>
&emsp;&emsp;No need for premade schemas, templates, special prompts, or specialized functions.
```python
# Import paths as used in the package's example scripts.
from llm_axe.agents import FunctionCaller
from llm_axe.models import OllamaChat

# get_time, get_date, get_location, add, and multiply are ordinary
# Python functions defined by you; no schemas or templates are needed.
prompt = "I have 500 coins, I just got 200 more. How many do I have?"

llm = OllamaChat(model="llama3:instruct")
fc = FunctionCaller(llm, [get_time, get_date, get_location, add, multiply])
result = fc.get_function(prompt)
```
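The functions handed to `FunctionCaller` are plain Python. As an illustration, hypothetical definitions for `add` and `multiply` (names taken from the snippet above; bodies and signatures assumed) could look like:

```python
def add(a: int, b: int) -> int:
    """Adds two numbers together."""
    return a + b

def multiply(a: int, b: int) -> int:
    """Multiplies two numbers together."""
    return a * b

# The docstrings and type hints give the LLM context for choosing a
# function; the coin prompt above would ideally resolve to add(500, 200).
```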
- **Online Agent**
```python
# Import paths as used in the package's example scripts.
from llm_axe.agents import OnlineAgent
from llm_axe.models import OllamaChat

prompt = "Tell me a bit about this website: https://toscrape.com/?"
llm = OllamaChat(model="llama3:instruct")
searcher = OnlineAgent(llm)
resp = searcher.search(prompt)

# output: Based on information from the internet, it appears that https://toscrape.com/ is a website dedicated to web scraping.
# It provides a sandbox environment for beginners and developers to learn and validate their web scraping technologies...
```
- **PDF Reader**
```python
# Import paths as used in the package's example scripts.
from llm_axe.agents import PdfReader
from llm_axe.models import OllamaChat

llm = OllamaChat(model="llama3:instruct")
files = ["../FileOne.pdf", "../FileTwo.pdf"]
agent = PdfReader(llm)
resp = agent.ask("Summarize these documents for me", files)
```

- **Data Extractor**
```python
# Import paths as used in the package's example scripts.
from llm_axe.agents import DataExtractor
from llm_axe.core import read_pdf
from llm_axe.models import OllamaChat

llm = OllamaChat(model="llama3:instruct")
info = read_pdf("../Example.pdf")
de = DataExtractor(llm, reply_as_json=True)
resp = de.ask(info, ["name", "email", "phone", "address"])

# output: {'Name': 'Frodo Baggins', 'Email': 'frodo@gmail.com', 'Phone': '555-555-5555', 'Address': 'Bag-End, Hobbiton, The Shire'}
```
[**See more complete examples**](https://github.com/emirsahin1/llm-axe/tree/main/examples)

[**How to setup llm-axe with your own LLM**](https://github.com/emirsahin1/llm-axe/blob/main/examples/ex_llm_setup.py)
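The linked example shows the exact interface for custom models; as a rough sketch, the agents only need an object exposing an `ask` method that accepts a list of chat messages and returns a string. The class name, signature defaults, and echo behavior below are illustrative assumptions, not the library's own code:

```python
# Minimal sketch of a custom LLM wrapper for llm-axe (interface assumed;
# see the linked ex_llm_setup.py for the authoritative signature).
class MyCustomLLM:
    def ask(self, prompts: list, format: str = "", temperature: float = 0.8) -> str:
        # prompts is a list of {"role": ..., "content": ...} chat messages.
        # A real implementation would forward them to your model's API;
        # this stub just echoes the last message back.
        return "stub response for: " + prompts[-1]["content"]

llm = MyCustomLLM()
reply = llm.ask([{"role": "user", "content": "hello"}])
```

Any object with this shape can then be passed wherever the examples above pass an `OllamaChat` instance.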


## Features

- Local LLM internet access with Online Agent
- PDF Document Reader Agent
- Premade utility Agents for common tasks
- Compatible with any LLM, local or externally hosted
- Built-in support for Ollama



## Important Notes

The results you get from the agents depend heavily on the capability of your LLM. An inadequate LLM will not be able to produce usable results with llm-axe.

**Testing during development was done using llama3 8b:instruct (4-bit quant).**
