multiai


Name: multiai
Version: 1.0.0
Summary: A Python library for text-based AI interactions
Author: Katsutoshi Seki
License: MIT
Requires-Python: >=3.10
Keywords: ai
Documentation: https://sekika.github.io/multiai/
Source: https://github.com/sekika/multiai
Upload time: 2024-12-06 06:16:18
# multiai

`multiai` is a Python library and command-line tool designed to interact with text-based generative AI models from the following providers:

| AI Provider  | Web Service                        | Models Available                                               |
|--------------|------------------------------------|----------------------------------------------------------------|
| **OpenAI**   | [ChatGPT](https://chat.openai.com/) | [GPT Models](https://platform.openai.com/docs/models) |
| **Anthropic**| [Claude](https://claude.ai/) | [Claude Models](https://docs.anthropic.com/en/docs/about-claude/models) |
| **Google**   | [Gemini](https://gemini.google.com/)| [Gemini Models](https://ai.google.dev/gemini-api/docs/models/gemini) |
| **Perplexity** | [Perplexity](https://www.perplexity.ai/) | [Perplexity Models](https://docs.perplexity.ai/docs/model-cards) |
| **Mistral**  | [Mistral](https://chat.mistral.ai/chat) | [Mistral Models](https://docs.mistral.ai/getting-started/models/) |

## Key Features

- **Interactive Chat:** Communicate with AI directly from your terminal.
- **Multi-Line Input:** Supports multi-line prompts for complex queries.
- **Pager for Long Responses:** View lengthy responses conveniently using a pager.
- **Continuation Handling:** Automatically handle and request continuations if responses are cut off.
- **Automatic Chat Logging:** Automatically save your chat history for future reference.

## Usage

Install `multiai`, then configure the API keys for your chosen AI providers, either as environment variables or in a user settings file. Once that's done, you can start interacting with the AI.
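As a sketch of the environment-variable route, the lines below use the variable names that each provider's own SDK conventionally reads; treat the exact names `multiai` expects as an assumption and check its manual for the definitive list.

```shell
# Export an API key for each provider you plan to use.
# Variable names follow each provider's SDK convention; the names
# multiai actually reads are documented in its manual.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
```

Add these to your shell profile (e.g. `~/.bashrc`) so they persist across sessions.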

- To send a simple query:

  ```bash
  ai hi
  ```

  You should see a response like:

  ```text
  gpt-4o-mini>
  Hello! How can I assist you today?
  ```

- For an interactive session, enter interactive mode:

  ```bash
  ai
  ```

  In this mode, you can continue the conversation:

  ```text
  user> hi
  gpt-4o-mini>
  Hello! How can I assist you today?
  user> how are you
  gpt-4o-mini>
  I'm just a program, so I don't have feelings, but I'm here and ready to help you! How about you? How are you doing?
  user>
  ```

To see a list of all command-line options, use:

```bash
ai -h
```

For more detailed documentation, you can open the [manual](https://sekika.github.io/multiai/) in a web browser with:

```bash
ai -d
```

## Using `multiai` as a Python Library

`multiai` can also be used as a Python library. Here’s a simple example:

```python
import multiai

# Initialize the client
client = multiai.Prompt()
client.set_model('openai', 'gpt-4o')  # Set model
client.temperature = 0.5  # Set temperature

# Send a prompt and get a response
answer = client.ask('hi')
print(answer)

# Continue the conversation with context
answer = client.ask('how are you')
print(answer)

# Clear the conversation context
client.clear()
```

The manual includes the following code samples:

- A script that translates a text file into English.
- A local chat app that lets you pick from the models offered by the different providers and chat with them.
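As a rough sketch of the first kind of sample, the script below wraps a file's contents in a translation instruction and sends it through the `Prompt` API shown in the example above. The instruction wording, the helper function, and the file handling are illustrative assumptions, not the manual's actual code.

```python
import sys

def build_translation_prompt(text: str) -> str:
    """Wrap the file contents in a translation instruction (assumed wording)."""
    return "Translate the following text into English:\n\n" + text

def main(path: str) -> None:
    # Requires an API key configured as described above.
    import multiai
    client = multiai.Prompt()
    with open(path, encoding="utf-8") as f:
        source = f.read()
    print(client.ask(build_translation_prompt(source)))

if __name__ == "__main__" and len(sys.argv) > 1:
    main(sys.argv[1])
```

Run it as `python translate.py somefile.txt`; the translated text is printed to standard output.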

            
