fractalic

- **Package:** fractalic 0.1.1 (PyPI)
- **Summary:** The Agentic Development Environment - Turn simple documents into powerful, AI-native applications
- **Uploaded:** 2025-09-13 14:40:40
- **Requires Python:** >=3.8
- **License:** MIT
- **Keywords:** ai, agents, automation, markdown, llm, agentic, development, mcp, model-context-protocol, workflow, orchestration
- **Requirements:** anthropic, aiohttp, aiohttp-cors, python-dotenv, groq, openai, litellm, requests, replicate, uvicorn, fastapi, gitpython, jsonschema, Pillow, PyYAML, rich, toml, mcp, fastmcp, tiktoken, httpx, pydantic
<p align="center">
  <img src="https://raw.githubusercontent.com/fractalic-ai/fractalic/main/docs/images/fractalic_hero.png" alt="Fractalic Hero Image">
</p>

<h1 align="center">Fractalic</h1>

<p align="center">
  <strong>The Agentic Development Environment.</strong>
  <br>
  Turn simple documents into powerful, AI-native applications.
</p>

<p align="center">
    <a href="https://github.com/fractalic-ai/fractalic/blob/main/LICENSE.txt"><img alt="License" src="https://img.shields.io/github/license/fractalic-ai/fractalic?style=flat-square&color=blue"></a>
    <a href="https://fractalic.ai/docs/"><img alt="Documentation" src="https://img.shields.io/website?url=https%3A%2F%2Ffractalic.ai%2Fdocs&label=docs&style=flat-square&up_message=online&color=brightgreen"></a>
    <a href="https://discord.gg/DHbnvxAT"><img alt="Discord" src="https://img.shields.io/badge/discord-join%20chat-5865F2?style=flat-square&logo=discord&logoColor=white"></a>
</p>

---

**Fractalic** is a new kind of development environment where your document *is* the program. Instead of writing complex scripts, you build AI-powered workflows using plain Markdown and simple YAML commands. The document evolves as operations run, creating a transparent, readable, and powerful way to orchestrate AI.

## Why Fractalic

- Build agents by writing Markdown + minimal YAML. No frameworks to wire.
- Deploy the same documents as services with a REST API.
- Use your tools via MCP or local shell/Python.
- Git-backed history, token/cost tracking, and diffs by default.

### Core hooks (what’s uniquely valuable)
1) Deterministic execution with embedded operations — runs top-to-bottom with explicit append/replace semantics.
2) Complete execution trace — every turn, tool call, and token cost recorded; easy to diff and audit.
3) Agentic tool use — governed tool loops with MCP services or local tools.
4) Documents are agents — call docs with parameters, return structured results, isolate context.
5) Structured tree context — documents are block trees with IDs; select with `blocks:` and CRUD them.

See also: `docs/core-concepts.md` (§ Blocks and IDs), `docs/advanced-llm-features.md` (§ Tool loops), `docs/context-management.md`.
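Blocks and IDs in miniature: a heading tagged `{id=...}` becomes an addressable node in the document tree, and `blocks:` pulls it into an operation's context. A minimal sketch, using only the syntax shown in this README:

```markdown
# Research Notes {id=notes}
Key facts gathered so far.

@llm
prompt: Summarize the notes in two sentences.
blocks: notes
use-header: "# Summary {id=summary}"
```

The generated summary is itself appended as a block with its own ID (`summary`), so later operations can select it the same way.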

## Quick Start

Run with Docker:

```bash
docker run -d --name fractalic \
  --network bridge \
  -p 3000:3000 -p 8000:8000 -p 8001:8001 -p 5859:5859 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  --env HOST=0.0.0.0 \
  ghcr.io/fractalic-ai/fractalic:main
```
Then open http://localhost:3000. More options: `docs/quick-start.md`.
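If you prefer Compose, the same run can be expressed as a `docker-compose.yml`. This is a sketch that mirrors the flags above, not an officially shipped file:

```yaml
services:
  fractalic:
    image: ghcr.io/fractalic-ai/fractalic:main
    network_mode: bridge
    ports:
      - "3000:3000"   # UI
      - "8000:8000"   # ADE UI
      - "8001:8001"   # AI Server (/execute)
      - "5859:5859"   # MCP Manager
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      HOST: 0.0.0.0
```

Start it with `docker compose up -d` and open the same URL.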

### Your first program (`hello-fractalic.md`)

```markdown
# Goal: Create a simple greeting {id=goal}
I want to see how Fractalic can use an LLM to generate text based on a prompt and context from this document.

@llm
prompt: Create a friendly, one-sentence greeting that mentions how Fractalic turns documents into programs.
blocks: goal
use-header: "# AI-Generated Greeting"
```

Run it in the UI or call the API:

```bash
curl -X POST http://localhost:8001/execute \
  -H "Content-Type: application/json" \
  -d '{"filename": "./hello-fractalic.md"}'
```
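For scripting, the same call can be made from Python. A minimal standard-library sketch; the response shape is not documented here, so check the Swagger UI at `/docs` on the AI Server for the actual schema:

```python
import json
import urllib.request


def build_execute_request(filename: str, base_url: str = "http://localhost:8001") -> urllib.request.Request:
    """Prepare a POST request for the Fractalic /execute endpoint."""
    payload = json.dumps({"filename": filename}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/execute",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    req = build_execute_request("./hello-fractalic.md")
    # Requires a running Fractalic AI Server on port 8001.
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode())
```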

## What you can build (real use cases)

All examples assume the service is enabled in `mcp_servers.json` and authenticated if needed. Learn more: `docs/mcp-integration.md`.

### Memory notes with MCP Memory

```markdown
# Task {id=task}
Store a fact, list keys, then summarize changes.

@llm
prompt: |
  1) Store the note: "Fractalic launched an MCP-powered agent".
  2) List current memory keys.
  3) Summarize the change in one sentence.
blocks: task
use-header: "# Memory Result {id=memory-result}"
tools:
  - mcp/memory-stdio-server
tools-turns-max: 2

@return
block: memory-result
```

### Create a Notion page (MCP Notion)

```markdown
# Task {id=task}
Create a project note in Notion with today’s date and a short status.

@llm
prompt: |
  Create a Notion page titled "AI Weekly Status — {{today}}" with a bullet list of 3 highlights.
  If a database is required, use a reasonable default the server exposes.
blocks: task
use-header: "# Notion Result {id=notion-result}"
tools:
  - mcp/notion
tools-turns-max: 3

@return
block: notion-result
```

### Add a Google Calendar event (MCP Zapier)

```markdown
# Task {id=task}
Schedule a 30-minute sync tomorrow at 10:00 for the AI team.

@llm
prompt: |
  Create a calendar event titled "AI Team Sync" for tomorrow 10:00–10:30.
  Include Meet link if available.
blocks: task
use-header: "# Calendar Result {id=calendar-result}"
tools:
  - mcp/Zapier
tools-turns-max: 3

@return
block: calendar-result
```

### When a tool is enough (local shell_tool)

```markdown
# Task {id=task}
List current directory and summarize files.

@llm
prompt: |
  1) List files in the current directory.
  2) Summarize by type and count.
blocks: task
use-header: "# Shell Summary {id=shell-summary}"
tools:
  - shell_tool
tools-turns-max: 2

@return
block: shell-summary
```

More patterns: `docs/advanced-llm-features.md` (§ Tool loops, JSON-only outputs).

## Documents as agents (call docs with docs)

- Use `@run` for static, synchronous sub-workflows. See `docs/core-concepts.md` (§ @run).
- Use the `fractalic_run` tool to let an LLM dynamically choose and execute another document. See `docs/agent-modular-workflows.md` (§ Dynamic Agent Calling).

Example: router calls a specialized agent with parameters and returns a clean result.

```markdown
# Router Task {id=router-task}
Decide which agent to call and return only the agent’s final result.

@llm
prompt: |
  If the request mentions "notion", call the notion agent.
  Otherwise call the memory agent.
  Return only the callee’s final result.
blocks: router-task
use-header: "# Router Result {id=router-result}"
tools:
  - fractalic_run
tools-turns-max: 2
```
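A callee agent is just another document that does its work and hands back a named block via `@return`. For example, a hypothetical `memory-agent.md` the router might dispatch to, built from the same operations shown above:

```markdown
# Memory Agent Task {id=task}
Store the incoming fact and report what changed.

@llm
prompt: Store the provided fact in memory and summarize the change in one sentence.
blocks: task
use-header: "# Memory Agent Result {id=agent-result}"
tools:
  - mcp/memory-stdio-server
tools-turns-max: 2

@return
block: agent-result
```

Only the `agent-result` block flows back to the caller, keeping the router's context clean.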

Learn: `docs/operations-reference.md` (§ Calling another agent via fractalic_run).

## Integrations and credits

- Model routing via LiteLLM — one OpenAI-compatible interface to 100+ providers. Project: https://github.com/BerriAI/litellm
- MCP client and manager built on FastMCP — high-level, Pythonic MCP tooling. Project: https://github.com/jlowin/fastmcp

Fractalic integrates with and credits these upstream projects. See licenses in their repos.

## Out-of-the-box tools (packs)

Explore the tools marketplace: https://github.com/fractalic-ai/fractalic-tools
- HubSpot automation suite (CRM): 40+ tools, including contacts, deals, tickets, and process discovery
- Discovery & Process Mining: event discovery, analytics
- OS pack: file, shell, text
- Telegram automation
- Web scraping/search

Add packs as MCP services or as local tools per repo instructions.

## Screenshots

<table>
  <tr>
    <td width="50%">
      <img src="docs/images/editor.png" alt="Fractalic Editor - Notebook-style UI with Markdown and YAML operations" />
      <p align="center"><em>Main Editor Interface</em></p>
    </td>
    <td width="50%">
      <img src="docs/images/notebook.png" alt="Notebook View - Interactive document execution with live results" />
      <p align="center"><em>Notebook Execution View</em></p>
    </td>
  </tr>
  <tr>
    <td width="50%">
      <img src="docs/images/tools.png" alt="MCP Tools Integration - Access external services via Model Context Protocol" />
      <p align="center"><em>MCP Tools Integration</em></p>
    </td>
    <td width="50%">
      <img src="docs/images/mcp.png" alt="MCP Manager - Unified tool and service management interface" />
      <p align="center"><em>MCP Manager Interface</em></p>
    </td>
  </tr>
  <tr>
    <td width="50%">
      <img src="docs/images/diff.png" alt="Git-backed Diffs - Complete execution trace with version control" />
      <p align="center"><em>Git-backed Execution Diffs</em></p>
    </td>
    <td width="50%">
      <img src="docs/images/inspector.png" alt="Debug Inspector - Deep inspection of execution state and variables" />
      <p align="center"><em>Debug Inspector</em></p>
    </td>
  </tr>
  <tr>
    <td width="50%">
      <img src="docs/images/inspector-messages.png" alt="Message Inspector - Detailed view of AI conversation turns and tool calls" />
      <p align="center"><em>Message Inspector</em></p>
    </td>
    <td width="50%">
      <img src="docs/images/markdown.png" alt="Markdown Editor - Clean document editing with syntax highlighting" />
      <p align="center"><em>Markdown Editor</em></p>
    </td>
  </tr>
  <tr>
    <td colspan="2" align="center">
      <img src="docs/images/deploy.png" alt="Deployment Dashboard - One-click containerization and service deployment" width="50%" />
      <p align="center"><em>Deployment Dashboard</em></p>
    </td>
  </tr>
</table>

## Architecture (short)

- ADE UI (port 8000) — notebook-like UI with streaming, diffs, debug
- AI Server (port 8001+) — FastAPI `/execute` with Swagger `/docs`
- Unified MCP Manager (port 5859) — OAuth, caching, tool catalog
- Git-backed sessions, token/cost tracker, Docker-first deploy

Details: `docs/ui-server-api.md`, `docs/mcp-integration.md`.

## See also

- Introduction: `docs/introduction.md`
- Core Concepts: `docs/core-concepts.md`
- Operations Reference: `docs/operations-reference.md`
- Advanced LLM Features: `docs/advanced-llm-features.md`
- MCP Integration: `docs/mcp-integration.md`
- Context Management: `docs/context-management.md`
- Deployment Guide: `docs/deployment-guide.md`




            
