# aicodetools
Simple, lightweight AI code tools with Docker-only support. No complex dependencies.
Provides four essential tools for AI agents: **read**, **write**, **edit**, and **run** commands.
Runs in a secure Docker container with automatic setup and management.
## Installation
You can install the package using pip:
```bash
pip install aicodetools
```
Or for development:
```bash
pip install -e .
```
## Quick Start
```python
from aicodetools.client import CodeToolsClient
# Auto-starts Docker server if needed (uses python:3.11-slim + pip install)
client = CodeToolsClient(auto_start=True)
# Get simple functional tools
tools = client.tools(selection_list=["read_file", "write_file", "edit_file", "run_command"])
read, write, edit, run_cmd = tools
# Read a file with smart token management
result = read("example.py")
print(result["content"])
# Write a file (with safety checks)
write("hello.py", "print('Hello, World!')")
# Edit file using string replacement
edit("hello.py", "Hello", "Hi")
# Run commands (non-interactive)
result = run_cmd("python hello.py", interactive=False)
print(result["stdout"])
# Interactive commands still available on client
client.run_command("python -i", interactive=True)
client.send_input("2 + 2")
output = client.get_output()
# Clean up when done
client.stop_server()
```
## Docker Configuration
### Using Custom Docker Images
The framework automatically installs `aicodetools` via pip inside any Python container:
```python
from aicodetools import CodeToolsClient
# Default: uses python:3.11-slim + pip install aicodetools
client = CodeToolsClient(auto_start=True)
# Use a different Python version
client = CodeToolsClient(
    auto_start=True,
    docker_image="python:3.12-alpine"
)

# Use a custom port (default is 18080 to avoid conflicts)
client = CodeToolsClient(
    auto_start=True,
    port=19080
)

# Use your own custom Python image
client = CodeToolsClient(
    auto_start=True,
    docker_image="my-company/python-base:latest"
)
```
### Docker Image Requirements
Your custom Docker image only needs:
- Python 3.10+ installed
- `pip` available
- Internet access (to install aicodetools package)
### Example Custom Dockerfile
```dockerfile
FROM python:3.11-slim
# Install system dependencies if needed
RUN apt-get update && apt-get install -y git curl && rm -rf /var/lib/apt/lists/*
# Pre-install aicodetools (optional - will be installed automatically if not present)
RUN pip install aicodetools
# Optional: Pre-install common packages for your use case
RUN pip install numpy pandas requests beautifulsoup4
# Set working directory
WORKDIR /workspace
CMD ["/bin/bash"]
```
### Manual Docker Usage
If you prefer to manage Docker yourself:
```bash
# Use any Python image and install aicodetools
docker run -d -p 18080:8080 --name my-aicodetools --rm python:3.11-slim \
  bash -c "pip install --break-system-packages aicodetools && python -m aicodetools.server --host 0.0.0.0 --port 8080"

# Then connect without auto_start (in Python):
# client = CodeToolsClient(auto_start=False, server_url="http://localhost:18080")

# Or use a different port
docker run -d -p 19080:8080 --name my-aicodetools-alt --rm python:3.12-alpine \
  bash -c "pip install --break-system-packages aicodetools && python -m aicodetools.server --host 0.0.0.0 --port 8080"
# client = CodeToolsClient(auto_start=False, server_url="http://localhost:19080")

# With your own custom image
docker run -d -p 20080:8080 --name my-custom --rm my-company/python-base:latest \
  bash -c "pip install --break-system-packages aicodetools && python -m aicodetools.server --host 0.0.0.0 --port 8080"
```
## Core Tools
Four essential tools, designed for simplicity and reliability:
### 📖 **Read Tool**
- Smart file reading with tiered token management (4k/10k modes)
- Regex pattern matching with context lines
- Line range support for targeted reading
- Automatic compression for long lines (6k max per line)
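The line-range and long-line behavior above can be pictured with a local sketch. This is not the server implementation, and the function name and parameters are illustrative; only the 1-indexed line range and the ~6k-character cap come from the bullets above:

```python
def read_lines(path, start=1, end=None, max_line_chars=6000):
    """Local sketch of line-range reading with long-line compression.

    Mirrors the documented behavior (line ranges, ~6k-char cap per line);
    the real Read Tool runs server-side and adds token-based truncation
    on top of this.
    """
    with open(path, encoding="utf-8") as f:
        lines = f.read().splitlines()
    selected = lines[start - 1 : end]  # 1-indexed, inclusive range
    return [
        line if len(line) <= max_line_chars
        else line[:max_line_chars] + " ...[truncated]"
        for line in selected
    ]
```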
### ✏️ **Write Tool**
- Safe file writing with read-first validation for existing files
- Automatic backup creation with timestamps
- UTF-8 encoding by default (simplified for Linux containers)
- Directory creation if needed
### ✂️ **Edit Tool**
- String-based find and replace editing
- Support for single or all occurrences (replace_all flag)
- Automatic backup before editing
- Detailed change reporting with diffs
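The `replace_all` semantics amount to Python's `str.replace` with a count. The sketch below shows the replacement logic only, with assumed parameter names; the real tool additionally enforces read-first validation and writes a backup before editing:

```python
def apply_edit(text, old_string, new_string, replace_all=False):
    """Sketch of the Edit Tool's find-and-replace semantics.

    By default only the first occurrence is replaced; with
    replace_all=True every occurrence changes. Names here are
    illustrative, not the tool's exact signature.
    """
    if old_string not in text:
        raise ValueError(f"string not found: {old_string!r}")
    count = -1 if replace_all else 1  # -1 means "replace all" for str.replace
    return text.replace(old_string, new_string, count)
```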
### ⚡ **Run Tool**
- **Single function**: `run_command(command, timeout=300, interactive=False)`
- **Non-interactive**: Auto-kill on timeout, return complete results
- **Interactive**: Stream output, agent controls (get_output, send_input, stop_process)
- **Single command limit**: Only one command at a time (prevents agent confusion)
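Locally, the non-interactive path behaves roughly like `subprocess.run` with a timeout. This is a sketch only: the real tool executes inside the Docker container, and the result keys are assumed from the examples in this README:

```python
import subprocess

def run_local(command, timeout=300):
    """Local analogue of run_command(..., interactive=False):
    run to completion, auto-kill on timeout, return full results."""
    try:
        proc = subprocess.run(
            command, shell=True, capture_output=True, text=True, timeout=timeout
        )
        return {"success": proc.returncode == 0,
                "stdout": proc.stdout, "stderr": proc.stderr}
    except subprocess.TimeoutExpired:
        # Mirrors the documented auto-kill-on-timeout behavior
        return {"success": False, "stdout": "",
                "stderr": f"timed out after {timeout}s"}
```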
## Usage Examples
### Context Manager Usage
```python
from aicodetools.client import CodeToolsClient
# Recommended: use a context manager for automatic cleanup
with CodeToolsClient(auto_start=True) as client:
    # Get functional tools
    tools = client.tools(selection_list=["read_file", "write_file", "edit_file", "run_command"])
    read, write, edit, run_cmd = tools

    # Read file with regex pattern matching
    matches = read("example.py", regex=r"def \w+")

    # Safe file editing workflow
    read("config.py")  # Read first for safety
    edit("config.py", "DEBUG = False", "DEBUG = True")

    # Execute multiple commands (non-interactive)
    run_cmd("pip install requests", interactive=False)
    result = run_cmd("python -c 'import requests; print(requests.__version__)'", interactive=False)
    print(f"Requests version: {result['stdout']}")

# Server automatically stops when exiting the context
```
### Interactive Command Example
```python
from aicodetools import CodeToolsClient
import time
client = CodeToolsClient(auto_start=True)
# Start a Python REPL (interactive mode)
result = client.run_command("python -i", interactive=True)
print(f"Python REPL started: {result['success']}")
# Send commands and get output
client.send_input("x = 10")
client.send_input("y = 20")
client.send_input("print(x + y)")
# Get accumulated output
time.sleep(1) # Wait for commands to execute
output = client.get_output()
print("Python REPL output:", output["recent_stdout"])
# Stop the process
client.stop_process()
client.stop_server()
```
### AI Agent Integration
```python
from aicodetools.client import CodeToolsClient
def create_tool_functions():
    """Create tool functions for AI agent integration."""
    client = CodeToolsClient(auto_start=True)

    # Get the simplified functional tools
    tools = client.tools(selection_list=["read_file", "write_file", "edit_file", "run_command"])
    read, write, edit, run_cmd = tools

    return [read, write, edit, run_cmd], client
# Use with your favorite AI framework
tools, client = create_tool_functions()
read, write, edit, run_cmd = tools
# Your AI agent can now use these simple functions
# agent = YourAIAgent(tools=tools)
# response = agent.run("Create a Python script that calculates fibonacci numbers")
# Example usage:
content = read("example.py") # Read file content
write("fibonacci.py", "def fib(n): return n if n < 2 else fib(n-1) + fib(n-2)") # Write file
edit("fibonacci.py", "fib", "fibonacci") # Edit file
result = run_cmd("python fibonacci.py", timeout=10) # Run command
# Clean up when done
client.stop_server()
```
## Multi-Agent Support with ClientManager
The `ClientManager` enables multiple AI agents to work concurrently, each with isolated Docker environments.
### Basic Multi-Agent Setup
```python
from aicodetools import ClientManager
# Create manager with organized logging
manager = ClientManager(
    docker_image="python:3.11-slim",
    base_log_dir="./agent_logs"
)
# Get clients for different agents
data_agent = manager.get_client("data_processor") # Logs: ./agent_logs/data_processor/
code_agent = manager.get_client("code_reviewer") # Logs: ./agent_logs/code_reviewer/
test_agent = manager.get_client("test_writer") # Logs: ./agent_logs/test_writer/
# Each agent gets isolated Docker container with unique ports
# Container names: aicodetools-data_processor-abc123, etc.
# Use agents normally - each has separate environment
data_tools = data_agent.tools(["read_file", "write_file", "run_command"])
code_tools = code_agent.tools(["read_file", "edit_file", "run_command"])
# Clean up when done
manager.close_all_clients()
```
### Parallel Agent Execution
```python
import threading
from aicodetools import ClientManager
def agent_worker(manager, agent_id, task):
    """Worker function for parallel agent execution."""
    client = manager.get_client(agent_id)
    tools = client.tools(["read_file", "write_file", "edit_file", "run_command"])
    read, write, edit, run_cmd = tools

    # Agent performs its task
    write(f"{agent_id}_output.py", f"# Task: {task}\nprint('Completed by {agent_id}')")
    result = run_cmd(f"python {agent_id}_output.py")
    print(f"{agent_id}: {result['stdout'].strip()}")
# Create manager for parallel execution
with ClientManager(base_log_dir="./parallel_logs") as manager:
    # Define agents and their tasks
    agents = [
        ("frontend_dev", "Build UI components"),
        ("backend_dev", "Implement API endpoints"),
        ("database_dev", "Design database schema"),
        ("tester", "Write comprehensive tests")
    ]

    # Start all agents in parallel
    threads = []
    for agent_id, task in agents:
        thread = threading.Thread(
            target=agent_worker,
            args=(manager, agent_id, task)
        )
        threads.append(thread)
        thread.start()

    # Wait for all agents to complete
    for thread in threads:
        thread.join()

    print("All agents completed!")

# Auto-cleanup when exiting the context manager
```
### ClientManager Features
```python
from aicodetools import ClientManager
manager = ClientManager(base_log_dir="./my_logs")
# Client lifecycle management
client = manager.get_client("worker_1")
info = manager.get_client_info("worker_1")
print(f"Worker 1: port={info['port']}, container={info['container_name']}")
# List all active clients
clients = manager.list_clients()
for client_id, info in clients.items():
    status = "✅ Running" if info['is_running'] else "❌ Stopped"
    print(f"{client_id}: {status} (port {info['port']})")
# Selective cleanup
manager.close_client("worker_1") # Stop specific client
manager.close_all_clients() # Stop all clients
# Thread-safe operations
# Multiple threads can safely call get_client() simultaneously
```
### Key Benefits
- **Isolation**: Each agent runs in its own Docker container with unique ports
- **Threading**: Thread-safe client creation and management
- **Organized Logs**: Separate log directories per agent (`{base_dir}/{agent_id}/tool_calls.txt`)
- **Zero Conflicts**: Automatic port allocation prevents conflicts
- **Backward Compatible**: Existing `CodeToolsClient` code works unchanged
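The conflict-free port allocation described above can be done by probing for a free port. This is a sketch under assumptions: the manager's actual strategy is not documented here, and only the default starting port of 18080 comes from this README:

```python
import socket

def find_free_port(start=18080, attempts=100):
    """Probe for an unused local TCP port, starting at the documented
    default of 18080. Illustrative only; not ClientManager's code."""
    for port in range(start, start + attempts):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind(("127.0.0.1", port))
                return port  # bind succeeded, so the port is free
            except OSError:
                continue  # port in use, try the next one
    raise RuntimeError("no free port found in range")
```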
## Architecture
### 🐳 **Docker-Only Design**
- Simplified deployment: Only Docker containers supported
- Auto-fallback: Creates base container if Docker not running
- Secure isolation: All operations run in containerized environment
- No complex environment management
### 🏗️ **Server-Client Model**
- **Server**: Runs in Docker container, handles tool execution
- **Client**: Python interface, communicates via HTTP/JSON API
- **Auto-start**: Client automatically manages Docker server lifecycle
- **Stateless**: Clean separation between client and execution environment
### 🎯 **Key Benefits**
- **Simplicity**: 4 core tools vs 14+ complex tools in v1
- **Reliability**: Docker-only, predictable environment
- **Maintainability**: Simple codebase, clear architecture
- **Performance**: Lightweight, fast startup
- **Agent-Friendly**: Better error messages, token awareness
## Requirements
- Python 3.10+
- Docker (required - no local fallback)
- Minimal dependencies: `requests`, `tiktoken`
## Development
### Code Quality 🧹
- `make style` to format the code
- `make check_code_quality` to check code quality (essentially PEP 8)
- Or run the tools directly: `black .` and `ruff . --fix`
### Tests 🧪
[`pytest`](https://docs.pytest.org/en/7.1.x/) is used to run the tests.
### Publishing 🚀
```bash
poetry build
poetry publish
```
## License
MIT