Name | llmer
Version | 1.0.1
Summary | llmer is a lightweight Python library designed to streamline the development of applications leveraging large language models (LLMs). It provides high-level APIs and utilities for parallel processing, runtime management, file handling, and prompt generation, reducing the overhead of repetitive tasks.
home_page | https://github.com/pydaxing/llmer
upload_time | 2024-11-20 13:19:08
maintainer | None
docs_url | None
author | pydaxing
requires_python | >=3.6
license | None
keywords | llmer, leveraging, large, language, models
requirements | No requirements were recorded.
Travis-CI | No Travis.
coveralls test coverage | No coveralls.
# LLMER: A Toolkit That Takes the Complexity Out of LLM Application Development
`llmer` is a lightweight Python library that simplifies the repetitive parts of building applications on top of large language models (LLMs). It provides high-level APIs and utilities for parallel processing, runtime management, file handling, and prompt formatting, so you don't have to re-implement the same plumbing for every project.
## Features
- **Model invocation**: Supports OpenAI-style and Azure model calls.
- **Parallel processing**: Supports thread-pool concurrency, multi-thread concurrency, async concurrency for async functions, and async concurrency for non-async functions.
- **Runtime management**: Provides a timeout decorator for time limits and a concurrency-safe lock decorator that keeps shared data consistent under concurrent modification.
- **File utilities**: Helpers for reading YAML, loading a file into a list (especially JSONL), saving a list to a file, and encoding images as Base64.
- **Prompt management**: Converts messages into ChatML format.
- **More features**: Coming soon...
## Installation
To install `llmer`, run:
```bash
pip install llmer
```
## Quick Start
```python
from llmer.parallel.thread_pool import ThreadPool
@ThreadPool(parallel_count=4)
def square(num):
return num ** 2
tasks = [{"num": i} for i in range(10)]
results = square(tasks)
print(results) # Output: [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```
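Each task is a dict whose keys match the decorated function's parameter names and are passed as keyword arguments. The sketch below generalizes this to a two-argument function; it is an assumption based on the pattern above, not behavior spelled out in this README.
```python
from llmer.parallel.thread_pool import ThreadPool

# Assumed behavior: each task dict is unpacked into keyword arguments
# of the decorated function.
@ThreadPool(parallel_count=4)
def power(base, exp):
    return base ** exp

tasks = [{"base": i, "exp": 3} for i in range(5)]
results = power(tasks)
print(results)  # Expected, if the assumption holds: [0, 1, 8, 27, 64]
```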
## Module Overview
LLMER currently provides the features below; more general-purpose utilities are under development, stay tuned:
### 1. Model Invocation
- **`Azure`**: Calls language models hosted on Microsoft Azure.
- **`OpenAI`**: Calls GPT-series models and any model deployed behind an OpenAI-style API.
### 2. Parallel Processing
- **`ThreadPool`**: Concurrency via a thread pool.
- **`MultiThread`**: Concurrency via multiple threads.
- **`AsyncParallel`**: Asynchronous parallelism for async functions.
- **`AsyncExecutor`**: Asynchronous parallelism for non-async functions.
### 3. Runtime Management
- **`timeout`**: A timeout decorator for functions (generators are supported directly as well).
- **`parallel_safe_lock`**: A concurrency-safe lock with a configurable acquisition timeout.
### 4. File Utilities
- Helpers for reading YAML, loading a file into a list (especially JSONL), saving a list to a file, and encoding images as Base64.
### 5. Prompt Management
- Converts OpenAI-format messages into ChatML format.
## API Documentation
## 1. Model Invocation
The `model` module of `llmer` provides two classes, **`Azure`** and **`OpenAI`**, for calling Azure-hosted models and OpenAI-style models (GPT models and anything deployed behind an OpenAI-style API), respectively.
### 1.1 Azure
The Azure client supports chat as well as function calling.
**Example**:
```python
from llmer.model import Azure
openai_api_key = "xxxx"
openai_model_name = "gpt-4o"
openai_api_version = 'xxxx'
openai_api_base = "xxxx"
headers = {}
gpt4o = Azure(
headers=headers,
api_key=openai_api_key,
api_version=openai_api_version,
endpoint=openai_api_base,
timeout=10,
retry=2
)
gpt_response = gpt4o.chat(
model=openai_model_name,
temperature=0.1,
stream=True,
messages=[
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "Can you tell me the weather today?"},
],
response_format=None,
)
for chunk in gpt_response:
print(chunk)
```
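The example above streams the response (`stream=True`) and iterates over chunks. The sketch below, continuing from the `gpt4o` client constructed above, assumes that `stream=False` makes `chat()` return the complete response instead of an iterator; this is not stated in the README, so verify against the library before relying on it.
```python
# Hedged sketch: gpt4o is the Azure client from the previous example.
# Assumption: with stream=False, chat() returns the full completion
# rather than an iterable of chunks.
full_response = gpt4o.chat(
    model=openai_model_name,
    temperature=0.1,
    stream=False,
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Can you tell me the weather today?"},
    ],
    response_format=None,
)
print(full_response)
```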
### 1.2 OpenAI
The OpenAI client likewise supports chat and function calling.
**Example**:
```python
from llmer.model import OpenAI
claude_api_key = "xxxx"
claud_model_name = "anthropic.claude-3-5-sonnet-20240620-v1:0"
claude_api_base = "xxxx"
headers = {}
claude = OpenAI(
headers=headers,
api_key=claude_api_key,
endpoint=claude_api_base,
timeout=10,
retry=2
)
response = claude.chat(
model=claud_model_name,
temperature=0.1,
stream=True,
messages=[
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "Can you tell me the weather today?"},
],
response_format=None,
)
for chunk in response:
print(chunk)
```
## 2. Parallel Execution
### 2.1 ThreadPool
Runs tasks in parallel using a thread pool.
**Example**:
```python
from llmer.parallel.thread_pool import ThreadPool
@ThreadPool(parallel_count=3)
def add_one(num):
return num + 1
tasks = [{"num": i} for i in range(5)]
results = add_one(tasks)
print(results) # Output: [1, 2, 3, 4, 5]
```
### 2.2 MultiThread
Runs tasks in parallel using multiple threads.
**Example**:
```python
from llmer.parallel.multi_thread import MultiThread
@MultiThread(parallel_count=2)
def multiply(num):
return num * 2
tasks = [{"num": i} for i in range(4)]
results = multiply(tasks)
print(results) # Output: [0, 2, 4, 6]
```
### 2.3 AsyncParallel
Runs async functions concurrently.
**Example**:
```python
import asyncio
from llmer.parallel.async_parallel import AsyncParallel
@AsyncParallel(parallel_count=2)
async def async_task(num):
await asyncio.sleep(1)
return num * 10
tasks = [{"num": i} for i in range(4)]
results = asyncio.run(async_task(tasks))
print(results) # Output: [0, 10, 20, 30]
```
### 2.4 AsyncExecutor
Runs non-async functions concurrently on top of asyncio.
**Example**:
```python
import asyncio
import time
from llmer.parallel.async_executor import AsyncExecutor
@AsyncExecutor(parallel_count=2)
def async_task(num):
time.sleep(1)
return num * 5
tasks = [{"num": i} for i in range(4)]
results = asyncio.run(async_task(tasks))
print(results) # Output: [0, 5, 10, 15]
```
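In practice, these parallel helpers are typically combined with the model clients from section 1 to fan a batch of prompts out to an endpoint. The sketch below pairs `ThreadPool` with the `OpenAI` client; the credentials and endpoint are placeholders, and the non-streaming return shape is an assumption (see the note in section 1.1).
```python
from llmer.model import OpenAI
from llmer.parallel.thread_pool import ThreadPool

# Placeholder credentials and endpoint: replace with your own deployment.
client = OpenAI(
    headers={},
    api_key="xxxx",
    endpoint="xxxx",
    timeout=10,
    retry=2
)

@ThreadPool(parallel_count=4)
def ask(question):
    # Assumption: stream=False returns the complete response object.
    return client.chat(
        model="gpt-4o",
        temperature=0.1,
        stream=False,
        messages=[{"role": "user", "content": question}],
        response_format=None,
    )

tasks = [{"question": q} for q in ["What is ChatML?", "What is JSONL?"]]
answers = ask(tasks)
print(answers)
```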
## 3. Runtime Management
### 3.1 The timeout Decorator
The `timeout` decorator sets a time limit on a function's execution. If the function exceeds the specified time, an `ExecutionTimeoutError` is raised.
Generator functions (those using `yield`) are supported just as seamlessly.
**Example**:
```python
import time
from llmer.runtime.context import timeout
from llmer.runtime.exceptions import ExecutionTimeoutError
@timeout(3) # Set timeout to 3 seconds
def long_running_function(x):
time.sleep(x) # Simulate a long task
try:
long_running_function(5)
except ExecutionTimeoutError as e:
print(e) # Output: Function 'long_running_function' timed out after 3 seconds
# Per-call override is also supported: timeout=2 below overrides @timeout(3)
try:
long_running_function(5, timeout=2)
except ExecutionTimeoutError as e:
print(e) # Output: Function 'long_running_function' timed out after 2 seconds
```
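The README notes that generator functions are supported as well. The sketch below assumes the timeout also bounds iteration over the generator and raises the same `ExecutionTimeoutError` when the limit is exceeded; the exact semantics (per item vs. total runtime) are not documented here.
```python
import time
from llmer.runtime.context import timeout
from llmer.runtime.exceptions import ExecutionTimeoutError

@timeout(2)  # Assumed to bound the generator's total execution time
def slow_stream():
    for i in range(5):
        time.sleep(1)  # Each item takes about one second to produce
        yield i

try:
    for item in slow_stream():
        print(item)
except ExecutionTimeoutError as e:
    print(e)
```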
### 3.2 The Parallel-Safe Lock Decorator
The `parallel_safe_lock` decorator ensures a function only runs once it has successfully acquired the lock. If the lock cannot be acquired within the specified timeout, an `AcquireLockTimeoutError` is raised. This is useful when a function works on shared resources and concurrent execution must be prevented.
**Example**:
```python
from llmer.runtime import parallel_safe_lock, AcquireLockTimeoutError
import threading
from time import sleep
lock = threading.Lock()
@parallel_safe_lock(lock, seconds=0.5)
def write_data(data: str):
print(f"Writing data: {data}")
sleep(0.6)
def task():
try:
write_data("some data")
except AcquireLockTimeoutError as e:
print(e) # Output: critical_section acquires lock, timeout exceeded 0.5
threads = []
for _ in range(3):
t = threading.Thread(target=task)
t.start()
threads.append(t)
for t in threads:
t.join()
# The timeout can be overridden here as well: calling write_data("some data", timeout=0.8) changes the original timeout.
```
## 4. File Utilities
Utilities for file handling, such as reading YAML, loading a file into a list (especially JSONL), saving a list to a file, and encoding images as Base64.
### 4.1 File to List
The `file_to_list()` function reads a JSONL file (any other file works too; each line becomes one element of the list) and returns its contents as a list of dictionaries.
#### Parameters:
- `path` (Optional[str]): Path to the JSONL file
**Example**:
```python
from llmer.file import file_to_list
# Example usage: Reading a JSONL file from the current script directory
data = file_to_list("data.jsonl")
print(data)
# Output: List of dictionaries read from the JSONL file
```
### 4.2 List to File
The `list_to_file()` function saves a list of data (dictionaries, strings, and so on) to a file. You can specify the mode used to open the file (`'w'` to write, `'a'` to append).
#### Parameters:
- `data` (List[Any]): The data to save. It should be a list; each item in the list is written to a new line in the file.
- `path` (Optional[str]): Path of the file to save
- `mode` (str, default 'w'): Mode used to open the file. `'w'` overwrites the file, `'a'` appends the data to the end of the file.
**Example**:
```python
from llmer.file import list_to_file
# Example data to save
data = [{"name": "John", "age": 30}, {"name": "Jane", "age": 25}]
# Example usage: Saving a list of dictionaries to a JSONL file
list_to_file(data, "output.jsonl")
```
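To add records to an existing file instead of overwriting it, pass the documented `mode='a'`:
```python
from llmer.file import list_to_file

# Append additional records to the same JSONL file.
more_data = [{"name": "Alice", "age": 28}]
list_to_file(more_data, "output.jsonl", mode="a")
```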
### 4.3 YAML Reading
The `yaml_reader()` function reads a YAML configuration file and returns its contents as a Python dictionary. It assumes the YAML file is well formed and returns the parsed data.
#### Parameters:
- `path` (Optional[str]): Path to the YAML file
**Example**:
```python
from llmer.file import yaml_reader
# Example YAML file path
yaml_file = "config.yaml"
# Example usage: Reading a YAML configuration file
config = yaml_reader(yaml_file)
print(config)
```
### 4.4 Image to Base64
The `image_to_base64()` function converts an image file into a Base64-encoded string. It can also prepend a `data:xxx` prefix to the Base64 string so the image can be embedded directly in web content or used elsewhere.
#### Parameters:
- `path` (Optional[str]): Path to the image file
- `prefix` (bool, optional): If `True`, a data prefix is prepended to the Base64 string. Defaults to `False`.
**Example**:
```python
from llmer.file import image_to_base64
# Example image file path
image_file = "example.png"
# Convert image to Base64 without prefix
encoded_image = image_to_base64(image_file)
print(encoded_image)
# Convert image to Base64 with prefix
encoded_image_with_prefix = image_to_base64(image_file, prefix=True)
print(encoded_image_with_prefix)
```
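A common follow-up is embedding the encoded image in an OpenAI-style multimodal message. The sketch below assumes the prefixed string returned by `image_to_base64(..., prefix=True)` is a valid data URL and that the target model accepts the standard `image_url` content format; neither is guaranteed by this README.
```python
from llmer.file import image_to_base64

# Assumption: prefix=True yields a data URL usable as an image_url value.
image_data_url = image_to_base64("example.png", prefix=True)

# Standard OpenAI-style multimodal message layout (verify against your model).
messages = [
    {
        "role": "user",
        "content": [
            {"type": "text", "text": "What is in this image?"},
            {"type": "image_url", "image_url": {"url": image_data_url}},
        ],
    }
]
print(messages[0]["content"][1]["image_url"]["url"][:50])
```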
## 5. Prompt Utilities
### 5.1 ChatML Formatting
The `chatml()` function formats a list of messages into ChatML, a format commonly used to represent conversations with language models in a structured way. It produces a single string in which each message is wrapped in the appropriate tags for the system, user, and assistant roles.
#### Parameters:
- `messages` (List[Dict[str, str]]): A list of messages, where each message is a dictionary with the following keys:
  - `role` (str): The speaker's role, e.g. `"system"`, `"user"`, or `"assistant"`.
  - `content` (str): The content of the message.
**Example**:
```python
from llmer.prompt import chatml
# Example messages
messages = [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "Can you tell me the weather today?"},
{"role": "assistant", "content": "The weather is sunny with a chance of rain."}
]
# Format messages into ChatML
formatted_message = chatml(messages)
print(formatted_message)
```
```text
<|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
Can you tell me the weather today?<|im_end|>
<|im_start|>assistant
The weather is sunny with a chance of rain.<|im_end|>
<|im_start|>assistant
```
## Contributing
Contributions to this project are welcome. You can open an issue or submit a pull request.
## Contact
You can reach us at pydaxing@gmail.com.
Raw data
{
"_id": null,
"home_page": "https://github.com/pydaxing/llmer",
"name": "llmer",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.6",
"maintainer_email": null,
"keywords": "llmer leveraging large language models",
"author": "pydaxing",
"author_email": "pydaxing@gmail.com",
"download_url": "https://files.pythonhosted.org/packages/7a/57/5fcd2096f8a031c2ff4f5350f6f243d77ac9780ddc20cd46a06988c73655/llmer-1.0.1.tar.gz",
"platform": null,
"description": "# LLMER: \u4e00\u4e2a\u5316\u7e41\u4e3a\u7b80\u7684\u5927\u6a21\u578b\uff08LLM\uff09\u5e94\u7528\u5f00\u53d1\u8005\u795e\u5668\n\n`llmer` \u662f\u4e00\u4e2a\u8f7b\u91cf\u7ea7\u7684 Python \u5e93\uff0c\u65e8\u5728\u7b80\u5316\u5927\u578b\u8bed\u8a00\u6a21\u578b\uff08LLMs\uff09\u5e94\u7528\u4e2d\u7684\u590d\u6742\u8fc7\u7a0b\u3002\u5b83\u63d0\u4f9b\u4e86\u7528\u4e8e\u5e76\u884c\u5904\u7406\u3001\u8fd0\u884c\u65f6\u7ba1\u7406\u3001\u6587\u4ef6\u5904\u7406\u548cPrompt\u683c\u5f0f\u5316\u7b49\u5e38\u7528\u7684\u9ad8\u7ea7 API \u548c\u5b9e\u7528\u5de5\u5177\uff0c\u4ece\u800c\u4e0d\u7528\u6bcf\u6b21\u90fd\u9700\u8981\u91cd\u590d\u5f00\u53d1\u76f8\u5173\u4ee3\u7801\uff0c\u7b80\u5316\u5de5\u4f5c\u3002\n\n## \u529f\u80fd\u7279\u70b9\n- **\u6a21\u578b\u8c03\u7528**: \u652f\u6301OpenAI\u98ce\u683c\u548cAzure\u6a21\u578b\u8c03\u7528\u3002\n- **\u5e76\u884c\u5904\u7406**: \u652f\u6301\u7ebf\u7a0b\u6c60\u5e76\u53d1\u3001\u591a\u7ebf\u7a0b\u5e76\u53d1\u3001\u5f02\u6b65\u51fd\u6570\u5f02\u6b65\u5e76\u53d1\u3001\u975e\u5f02\u6b65\u51fd\u6570\u5f02\u6b65\u5e76\u53d1\u3002\n- **\u8fd0\u884c\u65f6\u7ba1\u7406**: \u63d0\u4f9btimeout\u88c5\u9970\u5668\u7528\u4e8e\u8d85\u65f6\u63a7\u5236\uff0c\u5e76\u53d1\u5b89\u5168\u9501\u88c5\u9970\u5668\u4fdd\u8bc1\u5e76\u53d1\u8fc7\u7a0b\u4e2d\u5bf9\u6570\u636e\u4fee\u6539\u7684\u6b63\u786e\u6027\u3002\n- **\u6587\u4ef6\u5de5\u5177**: \u63d0\u4f9b\u5bf9 YAML\u8bfb\u53d6\u3001\u6587\u4ef6\u8f6cList\uff08\u5c24\u5176\u9002\u7528jsonl\uff09\u3001List\u5b58\u6587\u4ef6\u3001 \u56fe\u50cf\u8f6c Base64 \u7f16\u7801\u7684\u5de5\u5177\u3002\n- **\u63d0\u793a\u7ba1\u7406**: \u5c06 message \u8f6c\u6210 ChatML \u683c\u5f0f\u3002\n- **\u66f4\u591a\u529f\u80fd**: \u656c\u8bf7\u671f\u5f85...\n\n## \u5b89\u88c5\n\u8981\u5b89\u88c5 `llmer`\uff0c\u4f7f\u7528\u4ee5\u4e0b\u547d\u4ee4\uff1a\n```bash\npip install llmer\n```\n\n## \u5feb\u901f\u5f00\u59cb\n\n```python\nfrom llmer.parallel.thread_pool import ThreadPool\n\n\n@ThreadPool(parallel_count=4)\ndef square(num):\n return num ** 2\n\n\ntasks = [{\"num\": i} for i in range(10)]\nresults = square(tasks)\nprint(results) # Output: [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]\n```\n## \u529f\u80fd\u6a21\u5757\u6982\u89c8\nLLMER\u4e3b\u8981\u5305\u62ec\u4ee5\u4e0b\u529f\u80fd\uff0c\u66f4\u591a\u901a\u7528\u529f\u80fd\u6b63\u5728\u5f00\u53d1\u4e2d\uff0c\u656c\u8bf7\u671f\u5f85\uff1a\n\n### 1. \u6a21\u578b\u8c03\u7528\n- **`Azure`**: \u7528\u4e8e\u8c03\u7528\u6258\u7ba1\u4e8e Microsoft Azure \u7684\u8bed\u8a00\u6a21\u578b\u3002\n- **`OpenAI`**: \u7528\u4e8e\u8c03\u7528 GPT \u7cfb\u5217\u6a21\u578b\u4ee5\u53ca\u6309 OpenAI \u98ce\u683c\u90e8\u7f72\u7684\u6a21\u578b\u3002\n\n### 2. \u5e76\u884c\u5904\u7406\n- **`ThreadPool`**: \u4f7f\u7528\u7ebf\u7a0b\u6c60\u8fdb\u884c\u5e76\u53d1\u3002\n- **`MultiThread`**: \u4f7f\u7528\u591a\u7ebf\u7a0b\u8fdb\u884c\u5e76\u53d1\u3002\n- **`AsyncParallel`**: \u9488\u5bf9\u5f02\u6b65\u51fd\u6570\u7684\u5f02\u6b65\u5e76\u884c\u3002\n- **`AsyncExecutor`**: \u9488\u5bf9\u975e\u5f02\u6b65\u51fd\u6570\u7684\u5f02\u6b65\u5e76\u884c\u3002\n\n### 3. \u8fd0\u884c\u7ba1\u7406\n- **`timeout`**: \u51fd\u6570timeout\u88c5\u9970\u5668\uff08\u5305\u62ec\u751f\u6210\u5668\u4e5f\u53ef\u4ee5\u76f4\u63a5\u4f7f\u7528\uff09\u3002\n- **`parallel_safe_lock`**: \u652f\u6301\u81ea\u5b9a\u4e49\u8d85\u65f6\u8bbe\u7f6e\u7684\u5e76\u884c\u5b89\u5168\u9501\u3002\n\n### 4. 
\u6587\u4ef6\u5de5\u5177\n- \u63d0\u4f9b\u5bf9 YAML\u8bfb\u53d6\u3001\u6587\u4ef6\u8f6cList\uff08\u5c24\u5176\u9002\u7528jsonl\uff09\u3001List\u5b58\u6587\u4ef6\u3001 \u56fe\u50cf\u8f6c Base64 \u7f16\u7801\u7684\u5de5\u5177\u3002\n\n### 5. \u63d0\u793a\u8bcd\u7ba1\u7406\n- \u5c06openai\u683c\u5f0f\u7684message\u8f6c\u6210ChatML\u683c\u5f0f\u3002\n\n## API \u6587\u6863\n\n## 1. \u6a21\u578b\u8c03\u7528\n\n`llmer` \u7684 `model` \u6a21\u5757\u63d0\u4f9b\u4e86\u4e24\u4e2a\u7c7b**`Azure`**\u548c**`OpenAI`**\uff0c\u5206\u522b\u652f\u6301Azure\u6a21\u578b\u548cOpenAI\u98ce\u683c\u6a21\u578b\uff08GPT\u548c\u6309openai\u98ce\u683c\u90e8\u7f72\u7684\u6a21\u578b\uff09\u8c03\u7528\n\n### 1.1 Azure\n\nAzure \u53ef\u652f\u6301chat\u548c\u51fd\u6570\u8c03\u7528\u3002\n\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nfrom llmer.model import Azure\n\nopenai_api_key = \"xxxx\"\nopenai_model_name = \"gpt-4o\"\nopenai_api_version = 'xxxx'\nopenai_api_base = \"xxxx\"\n\nheaders = {}\n\ngpt4o = Azure(\n headers=headers,\n api_key=openai_api_key,\n api_version=openai_api_version,\n endpoint=openai_api_base,\n timeout=10,\n retry=2\n)\n\ngpt_response = gpt4o.chat(\n model=openai_model_name,\n temperature=0.1,\n stream=True,\n messages=[\n {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n {\"role\": \"user\", \"content\": \"Can you tell me the weather today?\"},\n ],\n response_format=None,\n)\n\nfor chunk in gpt_response:\n print(chunk)\n```\n\n\n### 1.2 OpenAI\n\n\u540c\u6837\u652f\u6301Chat\u548c\u51fd\u6570\u8c03\u7528\uff08function call\uff09\n\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nfrom llmer.model import Azure, OpenAI\n\nclaude_api_key = \"xxxx\"\nclaud_model_name = \"anthropic.claude-3-5-sonnet-20240620-v1:0\"\nclaude_api_base = \"xxxx\"\n\nheaders = {}\n\nclaude = OpenAI(\n headers=headers,\n api_key=claude_api_key,\n endpoint=claude_api_base,\n timeout=10,\n retry=2\n)\n\nresponse = claude.chat(\n model=claud_model_name,\n temperature=0.1,\n stream=True,\n messages=[\n {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n {\"role\": \"user\", \"content\": \"Can you tell me the weather today?\"},\n ],\n response_format=None,\n)\nfor chunk in response:\n print(chunk)\n```\n\n\n\n## 2. 
\u5e76\u884c\u6267\u884c\n\n### 2.1 ThreadPool\n\u4f7f\u7528\u7ebf\u7a0b\u6c60\u65b9\u5f0f\u5b9e\u73b0\u5e76\u884c\u3002\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nfrom llmer.parallel.thread_pool import ThreadPool\n\n\n@ThreadPool(parallel_count=3)\ndef add_one(num):\n return num + 1\n\n\ntasks = [{\"num\": i} for i in range(5)]\nresults = add_one(tasks)\nprint(results) # Output: [1, 2, 3, 4, 5]\n```\n\n### 2.2 MultiThread\n\u4f7f\u7528\u591a\u7ebf\u7a0b\u65b9\u5f0f\u5b9e\u73b0\u5e76\u884c\u3002\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nfrom llmer.parallel.multi_thread import MultiThread\n\n\n@MultiThread(parallel_count=2)\ndef multiply(num):\n return num * 2\n\n\ntasks = [{\"num\": i} for i in range(4)]\nresults = multiply(tasks)\nprint(results) # Output: [0, 2, 4, 6]\n```\n\n### 2.3 AsyncParallel\n\u5bf9\u5f02\u6b65\u51fd\u6570\u5b9e\u73b0\u5f02\u6b65\u5e76\u53d1\u3002\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nimport asyncio\nfrom llmer.parallel.async_parallel import AsyncParallel\n\n\n@AsyncParallel(parallel_count=2)\nasync def async_task(num):\n await asyncio.sleep(1)\n return num * 10\n\n\ntasks = [{\"num\": i} for i in range(4)]\nresults = asyncio.run(async_task(tasks))\nprint(results) # Output: [0, 10, 20, 30]\n```\n\n\n### 2.4 AsyncExecutor\n\u5bf9\u975e\u5f02\u6b65\u51fd\u6570\u5b9e\u73b0\u5f02\u6b65\u5e76\u53d1\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nimport asyncio\nimport time\nfrom llmer.parallel.async_executor import AsyncExecutor\n\n\n@AsyncExecutor(parallel_count=2)\ndef async_task(num):\n time.sleep(1)\n return num * 5\n\n\ntasks = [{\"num\": i} for i in range(4)]\nresults = asyncio.run(async_task(tasks))\nprint(results) # Output: [0, 5, 10, 15]\n```\n\n\n## 3. \u8fd0\u884c\u7ba1\u7406\n\n### 3.1 timeout\u88c5\u9970\u5668\n\n`timeout` \u88c5\u9970\u5668\u5141\u8bb8\u4e3a\u51fd\u6570\u8bbe\u7f6e\u6267\u884c\u7684\u65f6\u95f4\u9650\u5236\u3002\u5982\u679c\u51fd\u6570\u8d85\u51fa\u4e86\u6307\u5b9a\u7684\u65f6\u95f4\uff0c\u5c06\u5f15\u53d1 `ExecutionTimeoutError` \u5f02\u5e38\u3002\n\n\u5bf9\u4e8e\u751f\u6210\u5668\u51fd\u6570\uff08\u4f7f\u7528\u4e86`yield`\u7684\u51fd\u6570\uff09\u540c\u6837\u65e0\u7f1d\u652f\u6301\u3002\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nimport time\nfrom llmer.runtime.context import timeout\nfrom llmer.runtime.exceptions import ExecutionTimeoutError\n\n\n@timeout(3) # Set timeout to 3 seconds\ndef long_running_function(x):\n time.sleep(x) # Simulate a long task\n\n\ntry:\n long_running_function(5)\nexcept ExecutionTimeoutError as e:\n print(e) # Output: Function 'long_running_function' timed out after 3 seconds\n\n# \u8fd8\u652f\u6301\u8d85\u65f6\u65f6\u95f4\u8986\u5199\uff0c\u6bd4\u5982\u4e0b\u9762\u7684timeout=2\u5c06\u8986\u76d6@timeout(3)\ntry:\n long_running_function(5, timeout=2)\nexcept ExecutionTimeoutError as e:\n print(e) # Output: Function 'long_running_function' timed out after 2 seconds\n```\n\n\n### 3.2 \u5e76\u884c\u5b89\u5168\u9501\u88c5\u9970\u5668\n\n`parallel_safe_lock` \u88c5\u9970\u5668\u786e\u4fdd\u51fd\u6570\u53ea\u80fd\u5728\u6210\u529f\u83b7\u53d6\u9501\u65f6\u6267\u884c\u3002\u5982\u679c\u51fd\u6570\u5728\u6307\u5b9a\u7684\u8d85\u65f6\u65f6\u95f4\u5185\u65e0\u6cd5\u83b7\u53d6\u9501\uff0c\u5c06\u5f15\u53d1 `AcquireLockTimeoutError` \u5f02\u5e38\u3002\u8fd9\u5728\u51fd\u6570\u5904\u7406\u5171\u4eab\u8d44\u6e90\u5e76\u9700\u8981\u9632\u6b62\u5e76\u53d1\u6267\u884c\u65f6\u975e\u5e38\u6709\u7528\u3002\n\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nfrom llmer.runtime import parallel_safe_lock, 
AcquireLockTimeoutError\nimport threading\nfrom time import sleep\n\nlock = threading.Lock()\n\n\n@parallel_safe_lock(lock, seconds=0.5)\ndef write_data(data: str):\n print(f\"Writing data: {data}\")\n sleep(0.6)\n\n\ndef task():\n try:\n write_data(\"some data\")\n except AcquireLockTimeoutError as e:\n print(e) # Output: critical_section acquires lock, timeout exceeded 0.5\n\n\nthreads = []\nfor _ in range(3):\n t = threading.Thread(target=task)\n t.start()\n threads.append(t)\n\nfor t in threads:\n t.join()\n\n\n# \u540c\u6837\uff0c\u8fd9\u91cc\u7684\u8d85\u65f6\u65f6\u95f4\u4e5f\u53ef\u4ee5\u8986\u5199\uff0c\u5728\u8c03\u7528write_data\u7684\u65f6\u5019write_data(\"some data\", timeout=0.8)\u4e5f\u80fd\u591f\u4fee\u6539\u539f\u8d85\u65f6\u65f6\u95f4\u3002\n```\n\n## 4. \u6587\u4ef6\u5de5\u5177\n\n\u63d0\u4f9b\u4e86\u6587\u4ef6\u5904\u7406\u7684\u5b9e\u7528\u5de5\u5177\uff0c\u4f8b\u5982YAML\u8bfb\u53d6\u3001\u6587\u4ef6\u8f6cList\uff08\u5c24\u5176\u9002\u7528jsonl\uff09\u3001List\u5b58\u6587\u4ef6\u3001 \u56fe\u50cf\u8f6c Base64 \u7f16\u7801\u3002\n\n### 4.1 \u6587\u4ef6\u8f6c\u5217\u8868\n\n`file_to_list()` \u51fd\u6570\u8bfb\u53d6 JSONL \u6587\u4ef6\uff08\u5176\u5b83\u4efb\u4f55\u6587\u4ef6\u5747\u53ef\uff0c\u6bcf\u4e00\u884c\u8f6c\u6362\u6210list\u4e2d\u7684\u4e00\u4e2a\u5143\u7d20\uff09\uff0c\u5e76\u5c06\u5176\u5185\u5bb9\u4f5c\u4e3a\u5b57\u5178\u5217\u8868\u8fd4\u56de\u3002\n\n#### \u53c2\u6570:\n- `path` (Optional[str]): JSONL \u6587\u4ef6\u7684\u8def\u5f84\n\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nfrom llmer.file import file_to_list\n\n# Example usage: Reading a JSONL file from the current script directory\ndata = file_to_list(\"data.jsonl\")\nprint(data)\n# Output: List of dictionaries read from the JSONL file\n```\n\n\n### 4.2 \u5217\u8868\u8f6c\u6587\u4ef6\n\n`list_to_file()` \u51fd\u6570\u5c06\u6570\u636e\u5217\u8868\uff08\u4f8b\u5982\u5b57\u5178\u3001\u5b57\u7b26\u4e32\u7b49\uff09\u4fdd\u5b58\u5230\u6587\u4ef6\u4e2d\u3002\u60a8\u53ef\u4ee5\u6307\u5b9a\u6253\u5f00\u6587\u4ef6\u7684\u6a21\u5f0f\uff08`'w'` \u4e3a\u5199\u5165\uff0c`'a'` \u4e3a\u8ffd\u52a0\uff09\u3002\n\n#### \u53c2\u6570:\n- `data` (List[Any]): \u8981\u4fdd\u5b58\u5230\u6587\u4ef6\u4e2d\u7684\u6570\u636e\u3002\u5b83\u5e94\u8be5\u662f\u4e00\u4e2a\u5217\u8868\uff0c\u5217\u8868\u4e2d\u7684\u6bcf\u4e2a\u9879\u76ee\u5c06\u88ab\u5199\u5165\u6587\u4ef6\u7684\u4e00\u4e2a\u65b0\u884c\u3002\n- `path` (Optional[str]): \u4fdd\u5b58\u6587\u4ef6\u7684\u8def\u5f84\n- `mode` (str, \u9ed8\u8ba4 'w'): \u6253\u5f00\u6587\u4ef6\u7684\u6a21\u5f0f\u3002`'w'` \u5c06\u8986\u76d6\u6587\u4ef6\uff0c`'a'` \u5c06\u6570\u636e\u8ffd\u52a0\u5230\u6587\u4ef6\u672b\u5c3e\u3002\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nfrom llmer.file import list_to_file\n\n# Example data to save\ndata = [{\"name\": \"John\", \"age\": 30}, {\"name\": \"Jane\", \"age\": 25}]\n\n# Example usage: Saving a list of dictionaries to a JSONL file\nlist_to_file(data, \"output.jsonl\")\n```\n\n\n\n### 4.3 YAML \u8bfb\u53d6\n\n`yaml_reader()` \u51fd\u6570\u8bfb\u53d6\u4e00\u4e2a YAML \u914d\u7f6e\u6587\u4ef6\uff0c\u5e76\u5c06\u5176\u5185\u5bb9\u4f5c\u4e3a Python \u5b57\u5178\u8fd4\u56de\u3002\u8be5\u51fd\u6570\u5047\u5b9a YAML \u6587\u4ef6\u7ed3\u6784\u6b63\u786e\uff0c\u5e76\u8fd4\u56de\u89e3\u6790\u540e\u7684\u6570\u636e\u3002\n\n#### \u53c2\u6570:\n- `path` (Optional[str]): YAML \u6587\u4ef6\u7684\u8def\u5f84\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nfrom llmer.file import yaml_reader\n\n# Example YAML file path\nyaml_file = \"config.yaml\"\n\n# Example 
usage: Reading a YAML configuration file\nconfig = yaml_reader(yaml_file)\n\nprint(config)\n```\n\n\n### 4.4 \u56fe\u50cf\u8f6c Base64\n\n`image_to_base64()` \u51fd\u6570\u5c06\u56fe\u50cf\u6587\u4ef6\u8f6c\u6362\u4e3a Base64 \u7f16\u7801\u7684\u5b57\u7b26\u4e32\u3002\u5b83\u8fd8\u53ef\u4ee5\u4e3a Base64 \u5b57\u7b26\u4e32\u6dfb\u52a0`data:xxx`\uff0c\u4ee5\u4fbf\u76f4\u63a5\u5728\u7f51\u9875\u5185\u5bb9\u6216\u5176\u4ed6\u7528\u9014\u4e2d\u5d4c\u5165\u56fe\u50cf\u3002\n\n#### \u53c2\u6570:\n- `path` (Optional[str]): \u56fe\u50cf\u6587\u4ef6\u7684\u8def\u5f84\n- `prefix` (bool, \u53ef\u9009): \u5982\u679c\u4e3a `True`\uff0c\u5219\u5728 Base64 \u5b57\u7b26\u4e32\u524d\u6dfb\u52a0\u6570\u636e\u524d\u7f00\u3002\u9ed8\u8ba4\u503c\u4e3a `False`\u3002\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nfrom llmer.file import image_to_base64\n\n# Example image file path\nimage_file = \"example.png\"\n\n# Convert image to Base64 without prefix\nencoded_image = image_to_base64(image_file)\n\nprint(encoded_image)\n\n# Convert image to Base64 with prefix\nencoded_image_with_prefix = image_to_base64(image_file, prefix=True)\n\nprint(encoded_image_with_prefix)\n```\n\n\n## 5. \u63d0\u793a\u5de5\u5177\n\n### 5.1 ChatML \u683c\u5f0f\u5316\n\n`chatml()` \u51fd\u6570\u5c06\u4e00\u7cfb\u5217\u6d88\u606f\u683c\u5f0f\u5316\u4e3a ChatML \u683c\u5f0f\uff0c\u8fd9\u79cd\u683c\u5f0f\u901a\u5e38\u7528\u4e8e\u4ee5\u7ed3\u6784\u5316\u65b9\u5f0f\u5904\u7406\u8bed\u8a00\u6a21\u578b\u4e2d\u7684\u5bf9\u8bdd\u3002\u6b64\u51fd\u6570\u751f\u6210\u4e00\u4e2a\u5b57\u7b26\u4e32\uff0c\u5176\u4e2d\u6bcf\u6761\u6d88\u606f\u90fd\u88ab\u9002\u5f53\u5730\u5305\u88c5\u4e3a\u7cfb\u7edf\u3001\u7528\u6237\u548c\u52a9\u624b\u89d2\u8272\u7684\u6807\u7b7e\u3002\n\n#### \u53c2\u6570:\n- `messages` (List[Dict[str, str]]): \u6d88\u606f\u5217\u8868\uff0c\u5176\u4e2d\u6bcf\u6761\u6d88\u606f\u662f\u4e00\u4e2a\u5305\u542b\u4ee5\u4e0b\u952e\u7684\u5b57\u5178\uff1a\n - `role` (str): \u8bf4\u8bdd\u8005\u7684\u89d2\u8272\uff0c\u4f8b\u5982 `\"system\"`\u3001`\"user\"` \u6216 `\"assistant\"`\u3002\n - `content` (str): \u6d88\u606f\u7684\u5185\u5bb9\u3002\n\n\n**\u4f7f\u7528\u4e3e\u4f8b**:\n\n```python\nfrom llmer.prompt import chatml\n\n# Example messages\nmessages = [\n {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n {\"role\": \"user\", \"content\": \"Can you tell me the weather today?\"},\n {\"role\": \"assistant\", \"content\": \"The weather is sunny with a chance of rain.\"}\n]\n\n# Format messages into ChatML\nformatted_message = chatml(messages)\n\nprint(formatted_message)\n```\n\n```text\n<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n<|im_start|>user\nCan you tell me the weather today?<|im_end|>\n<|im_start|>assistant\nThe weather is sunny with a chance of rain.<|im_end|>\n<|im_start|>assistant\n```\n\n\n\n## \u8d21\u732e\n\n\u6b22\u8fce\u4e3a\u672c\u9879\u76ee\u505a\u51fa\u8d21\u732e\u3002\u60a8\u53ef\u4ee5\u63d0\u4ea4\u95ee\u9898\u6216\u63d0\u4ea4\u62c9\u53d6\u8bf7\u6c42\u3002\n\n## \u8054\u7cfb\n\n\u60a8\u53ef\u4ee5\u901a\u8fc7 pydaxing@gmail.com \u8054\u7cfb\u6211\u4eec\u3002\n",
"bugtrack_url": null,
"license": null,
"summary": "llmer is a lightweight Python library designed to streamline the development of applications leveraging large language models (LLMs). It provides high-level APIs and utilities for parallel processing, runtime management, file handling, and prompt generation, reducing the overhead of repetitive tasks.",
"version": "1.0.1",
"project_urls": {
"Homepage": "https://github.com/pydaxing/llmer"
},
"split_keywords": [
"llmer",
"leveraging",
"large",
"language",
"models"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "ba20a86e1b2510282a5f1e43ad4a90b6ee459ee255934b3abd785c7eda312ec7",
"md5": "b23967c8a733308d933848849d70a7d8",
"sha256": "6efd129ed8549acfd75cb919bf58e983f7d7e563f82e488e0c3df8bc430820ed"
},
"downloads": -1,
"filename": "llmer-1.0.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "b23967c8a733308d933848849d70a7d8",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.6",
"size": 127044,
"upload_time": "2024-11-20T13:19:05",
"upload_time_iso_8601": "2024-11-20T13:19:05.239314Z",
"url": "https://files.pythonhosted.org/packages/ba/20/a86e1b2510282a5f1e43ad4a90b6ee459ee255934b3abd785c7eda312ec7/llmer-1.0.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "7a575fcd2096f8a031c2ff4f5350f6f243d77ac9780ddc20cd46a06988c73655",
"md5": "5748c4372b73d37ab5561908482cb158",
"sha256": "4ae44108756df07c0bfb59e975652de0605b44b83433184e43a5363feed1463c"
},
"downloads": -1,
"filename": "llmer-1.0.1.tar.gz",
"has_sig": false,
"md5_digest": "5748c4372b73d37ab5561908482cb158",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.6",
"size": 126733,
"upload_time": "2024-11-20T13:19:08",
"upload_time_iso_8601": "2024-11-20T13:19:08.766404Z",
"url": "https://files.pythonhosted.org/packages/7a/57/5fcd2096f8a031c2ff4f5350f6f243d77ac9780ddc20cd46a06988c73655/llmer-1.0.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-11-20 13:19:08",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "pydaxing",
"github_project": "llmer",
"github_not_found": true,
"lcname": "llmer"
}