swarmcode

Name: swarmcode
Version: 0.1.1
Summary: Open-source AI coding assistant powered by Cerebras - fast inference for Llama 3.3 70B and Qwen3 Coder 480B
Upload time: 2025-08-17 20:52:07
Author: Michael Pfaffenberger
Requires Python: >=3.10
License: MIT
Keywords: ai, assistant, cerebras, code-generation, coding, development-tools, llama, qwen
# 🚀 SwarmCode 🚀

*The open-source AI coding assistant that actually works with Cerebras and other providers*

![Build Status](https://img.shields.io/badge/build-passing-brightgreen)
![Coverage](https://img.shields.io/badge/coverage-95%25-brightgreen)
![Python](https://img.shields.io/badge/python-3.10%2B-blue)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

*"Who needs expensive AI IDEs when you have SwarmCode?"* 

## Overview

SwarmCode is a powerful AI-powered code generation agent that works with **Cerebras**, OpenAI, Gemini, and other providers. Built as a response to expensive proprietary tools, SwarmCode gives you complete control over your AI coding assistant.

## ✨ Features

- **🧠 Cerebras Integration**: Fast inference with Cerebras models (Llama 3.3 70B, Qwen3 Coder 480B)
- **🌐 Multi-Provider Support**: Works with OpenAI, Gemini, Anthropic, Together AI, and custom endpoints
- **💻 Interactive CLI**: Beautiful command-line interface with rich formatting
- **📚 RAG Support**: Built-in embeddings and vector search for context-aware coding
- **🔧 MCP Server Support**: Extensible with Model Context Protocol servers
- **🚫 No Vendor Lock-in**: Use any model provider you want

## 🎬 Demo

![SwarmCode Demo](code_puppy.gif)

## 🚀 Quick Start

### Option 1: One-Line Install (Recommended) 🎯

```bash
curl -sSL https://raw.githubusercontent.com/rinadelph/swarmcode/main/install.sh | bash
```

That's it! The installer will:
- ✅ Check your Python version
- ✅ Install SwarmCode and all dependencies
- ✅ Prompt for your Cerebras API key
- ✅ Configure your preferred model
- ✅ Set up the `swm` command globally

### Option 2: Install from PyPI (Coming Soon)

```bash
pip install swarmcode
swm --interactive
```

### Option 3: Install from Source

```bash
# Clone and install
git clone https://github.com/rinadelph/swarmcode.git
cd swarmcode
python3 install.py

# Start coding!
swm --interactive
```

## 📋 Requirements

- **Python 3.10+**
- **API Keys** (at least one):
  - Cerebras API key (recommended) - [Get it here](https://cerebras.ai)
  - OpenAI API key
  - Gemini API key
  - Anthropic API key
  - Or any custom endpoint

## 🎯 Usage Examples

### Interactive Mode
```bash
# Start interactive session with Cerebras
swarmcode --interactive

# Or use a specific model
export MODEL_NAME=Cerebras-Qwen3-Coder-480b
swarmcode --interactive
```

### Direct Task Execution
```bash
# Execute a task directly
swarmcode "write a Python FastAPI server with user authentication"

# Generate and run code
swarmcode "create a React component for a todo list and set up the project"
```

### Using Different Providers

#### Cerebras (Recommended)
```bash
export CEREBRAS_API_KEY='your-key'
export MODEL_NAME='Cerebras-Llama-3.3-70b'
swarmcode --interactive
```

#### OpenAI
```bash
export OPENAI_API_KEY='your-key'
export MODEL_NAME='gpt-4.1'
swarmcode --interactive
```

#### Gemini
```bash
export GEMINI_API_KEY='your-key'
export MODEL_NAME='gemini-2.5-flash-preview-05-20'
swarmcode --interactive
```

#### Custom Endpoints
```bash
export MODEL_NAME='my-custom-model'
export MODELS_JSON_PATH='/path/to/custom/models.json'
swarmcode --interactive
```

## 🔧 Configuration

### Environment Variables
Create a `.env` file in your project root:

```bash
# Primary configuration
MODEL_NAME=Cerebras-Llama-3.3-70b
CEREBRAS_API_KEY=your_cerebras_api_key

# Optional providers
OPENAI_API_KEY=your_openai_key
GEMINI_API_KEY=your_gemini_key
ANTHROPIC_API_KEY=your_anthropic_key

# Behavior
YOLO_MODE=true  # Skip confirmation prompts for commands
```
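As a rough illustration of how these variables might be consumed, the sketch below reads the names above with Python's `os.getenv`. The `resolve_config` helper and its provider-prefix logic are hypothetical, shown only to clarify how `MODEL_NAME`, the API keys, and `YOLO_MODE` fit together; SwarmCode's actual internals may differ.

```python
import os

# Default model name taken from the README; everything else here is
# an illustrative sketch, not SwarmCode's real configuration code.
DEFAULT_MODEL = "Cerebras-Llama-3.3-70b"

def resolve_config():
    model = os.getenv("MODEL_NAME", DEFAULT_MODEL)
    # Pick the API key matching the selected model's provider prefix
    # (hypothetical mapping for illustration).
    if model.startswith("Cerebras-"):
        key = os.getenv("CEREBRAS_API_KEY")
    elif model.startswith("gpt-"):
        key = os.getenv("OPENAI_API_KEY")
    elif model.startswith("gemini-"):
        key = os.getenv("GEMINI_API_KEY")
    else:
        key = os.getenv("ANTHROPIC_API_KEY")
    # YOLO_MODE=true skips confirmation prompts for shell commands.
    yolo = os.getenv("YOLO_MODE", "false").lower() == "true"
    return {"model": model, "api_key": key, "yolo_mode": yolo}
```

A tool like python-dotenv would typically load the `.env` file into the environment before a resolver like this runs.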

### Available Cerebras Models
- `Cerebras-Llama-3.3-70b` - Fast, general purpose (default)
- `Cerebras-Qwen3-Coder-480b` - Specialized for coding
- `Cerebras-Qwen3-235b-a22b-instruct-2507` - Large instruction model
- `Cerebras-gpt-oss-120b` - Open source GPT variant

See `cerebras_models.json` for full list.

## ๐Ÿ› ๏ธ Advanced Features

### RAG (Retrieval Augmented Generation)
SwarmCode includes built-in RAG support for better context awareness:

```bash
# Index your codebase
python index_code_puppy.py

# Use with embeddings
swarmcode --interactive --use-rag
```
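The indexing/search pipeline above boils down to embedding code snippets and ranking them by similarity to the query. Here is a minimal, self-contained sketch of that retrieval step using toy vectors; SwarmCode's actual embedding model and index format are not shown here.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_vec, index, k=2):
    """Return the k snippets whose embeddings are closest to the query.

    index: list of (snippet, embedding) pairs, as a real vector index
    would provide after embedding your codebase.
    """
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [snippet for snippet, _ in ranked[:k]]

# Toy index with 2-dimensional "embeddings" for illustration.
index = [
    ("def login(user): ...", [1.0, 0.0]),
    ("CREATE TABLE users ...", [0.0, 1.0]),
    ("def auth_token(): ...", [0.9, 0.1]),
]
print(top_k([1.0, 0.0], index, k=2))
```

The retrieved snippets are then prepended to the model prompt, which is what makes the generation "context-aware."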

### MCP Server Integration
Connect to external tools and services via Model Context Protocol servers. See the "Using MCP Servers for External Tools" section below for setup details.
## Puppy Rules
Puppy rules allow you to define and enforce coding standards and styles that your code should comply with. These rules can cover various aspects such as formatting, naming conventions, and even design guidelines.

### Example of a Puppy Rule
For instance, if you want to ensure that your application follows a specific design guideline, like using a dark mode theme with teal accents, you can define a puppy rule like this:

```plaintext
# Puppy Rule: Dark Mode with Teal Accents

  - theme: dark
  - accent-color: teal
  - background-color: #121212
  - text-color: #e0e0e0

Ensure that all components follow these color schemes to promote consistency in design.
```

## Using MCP Servers for External Tools

SwarmCode supports **MCP (Model Context Protocol) servers** to give you access to external code tools and advanced features like code search, documentation lookups, and more, including Context7 (https://context7.com/) integration for deep docs and search!

### What is an MCP Server?
An MCP server is a standalone process (local or remote) that offers specialized functionality (plugins, doc search, code analysis, etc.). SwarmCode can connect to one or more MCP servers at startup, unlocking these extra commands inside your coding agent.
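Under the hood, MCP messages are JSON-RPC 2.0. For example, a client asking a server to enumerate its available tools sends a request shaped like the one below (shape taken from the MCP specification; SwarmCode exchanges these messages for you, so you never write them by hand):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list",
  "params": {}
}
```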

### Configuration
Create a config file at `~/.code_puppy/mcp_servers.json`. Here's an example that connects to the hosted Context7 MCP server:

```json
{
  "mcp_servers": {
     "context7": { 
        "url": "https://mcp.context7.com/sse"
     }
  }
}
```

You can list multiple entries (one per server).
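For instance, a config with two servers might look like this; the `my-local-tools` name and its localhost URL are placeholders for illustration, not real endpoints:

```json
{
  "mcp_servers": {
    "context7": {
      "url": "https://mcp.context7.com/sse"
    },
    "my-local-tools": {
      "url": "http://localhost:8080/sse"
    }
  }
}
```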

### How to Use
- Drop the config file in `~/.code_puppy/mcp_servers.json`.
- Start your MCP server (such as context7, or anything compatible).
- Run SwarmCode as usual. It'll discover and use all configured MCP servers.

#### Example usage
```bash
swarmcode --interactive
# Then ask: Use context7 to look up FastAPI docs!
```

That's it!
If you need to run more exotic setups or connect to remote MCPs, just update your `mcp_servers.json` accordingly.

**NOTE:** Want to add your own server or tool? Just follow the config pattern above; no code changes needed!

---

## Conclusion
By using SwarmCode, you can maintain code quality and adhere to design guidelines with ease.

            
