# Ollama-Airflow-Operator: Because AI, Airflow, and Alliteration Are Awesome!
> 🤖 "Airflow + Ollama = Magic" - Some Data Scientist, probably
Welcome to the Ollama-Airflow-Operator, where we make running LLMs as easy as pie (and potentially more delicious)!
## What's This All About?
Ever wished you could orchestrate Large Language Models (LLMs) with the ease of scheduling a tweet? Well, now you can! This Airflow plugin lets you manage Ollama (the cool kid in the LLM playground) **with no external compute dependencies**. That is the key.
### Here is why
In the wild world of enterprise AI, Data Engineers often find themselves in a peculiar predicament. They have the skills, the ideas, and the burning desire to leverage LLMs for innovative solutions. But alas! They're often stymied by the lack of access to dedicated LLM endpoints in their organizations.
Enter the Ollama-Airflow-Operator – a glimmer of hope in the dark night of corporate red tape. Born from the frustration of Data Engineers who dared to dream, this plugin turns the tables on the traditional AI deployment model.
With this operator, Data Engineers can now leverage their existing Airflow infrastructure to spin up LLM endpoints on demand. No need for external compute resources or endpoints. No more waiting for IT approvals. Just pure, unadulterated LLM power at your fingertips.
### Is this the only reason?
No. Imagine the future of Agents. Tools and frameworks are appearing one after another, but nothing stands as tall as Airflow in the world of task orchestration. Any AI Agent is, first and foremost, an orchestrated workflow, and that is where the power of Airflow comes in. Pair it with Ollama, and Data Engineers can make magic happen: enriching data in ways that weren't possible before, making sense of unstructured data, and more. The possibilities are limitless.
## Getting Started
Get ready to embark on your LLM journey with Airflow! Here's how to get started:
### 1. Installation
First, install the Ollama-Airflow-Operator package:
```bash
pip install airflow-ollama
```
OR
```bash
pip install git+https://github.com/KarmaloopAI/Ollama-Airflow-Operator.git
```
### 2. Import the Operators
Import them the same way as any other operators you use in your Airflow DAGs:
```python
from airflow import DAG
from airflow.utils.dates import days_ago
from airflow_ollama.operators.ollama_operator import OllamaOperator
from airflow_ollama.operators.ngrok_exposer_operator import NgrokExposerOperator
```
### 3. Use it in your DAGs
Here is an example DAG that installs Ollama, exposes it via ngrok, and runs a model:
```python
with DAG('ollama_llm_setup', start_date=days_ago(1), schedule_interval=None) as dag:

    install_ollama = OllamaOperator(
        task_id='install_ollama',
        install_ollama=True,
        model_name='gemma:2b',  # Choose your model
        run_model=False
    )

    expose_ollama = NgrokExposerOperator(
        task_id='expose_ollama',
        port=11434,  # Default Ollama port
        auth_token='your_ngrok_auth_token'
    )

    run_model = OllamaOperator(
        task_id='run_model',
        model_name='gemma:2b',
        run_model=True
    )

    install_ollama >> expose_ollama >> run_model
```
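Once the DAG has run, the model is reachable over Ollama's standard REST API through the ngrok tunnel. Here's a minimal sketch of how a downstream task or external client might call it; the tunnel URL below is a placeholder you would replace with the public URL reported by the `expose_ollama` task:

```python
import requests

# Placeholder: substitute the public URL of your ngrok tunnel.
OLLAMA_URL = "https://your-tunnel.ngrok-free.app"

# Ollama's generate endpoint; stream=False returns a single JSON object.
response = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={
        "model": "gemma:2b",
        "prompt": "Summarise why orchestrating LLMs with Airflow is useful.",
        "stream": False,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```

This is what turns your existing Airflow infrastructure into an on-demand LLM endpoint for other pipelines or clients.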
### About the Author
Hello, my name is [Abi Chatterjee](https://www.linkedin.com/in/abi-chatterjee/) and you can reach me via [LinkedIn](https://www.linkedin.com/in/abi-chatterjee/).
I am in love with LLMs, for better or worse, and have been finding innovative ways of building LLM and GenAI solutions.