multiai 0.8

- **Summary:** A Python library for text-based AI interactions
- **Author:** Katsutoshi Seki
- **License:** MIT
- **Requires Python:** >=3.10
- **Keywords:** AI
- **Uploaded:** 2024-08-18 17:19:45
- **Documentation:** https://sekika.github.io/multiai/
- **Source:** https://github.com/sekika/multiai
# multiai

`multiai` is a Python library and command-line tool designed to interact with text-based generative AI models from the following providers:

| AI Provider  | Web Service                        | Models Available                                               |
|--------------|------------------------------------|----------------------------------------------------------------|
| **OpenAI**   | [ChatGPT](https://chat.openai.com/) | [GPT Models](https://platform.openai.com/docs/models) |
| **Anthropic**| [Claude](https://claude.ai/) | [Claude Models](https://docs.anthropic.com/en/docs/about-claude/models) |
| **Google**   | [Gemini](https://gemini.google.com/)| [Gemini Models](https://ai.google.dev/gemini-api/docs/models/gemini) |
| **Perplexity** | [Perplexity](https://www.perplexity.ai/) | [Perplexity Models](https://docs.perplexity.ai/docs/model-cards) |
| **Mistral**  | [Mistral](https://chat.mistral.ai/chat) | [Mistral Models](https://docs.mistral.ai/getting-started/models/) |

## Key Features

- **Interactive Chat:** Communicate with AI directly from your terminal.
- **Multi-Line Input:** Supports multi-line prompts for complex queries.
- **Pager for Long Responses:** View lengthy responses conveniently using a pager.
- **Continuation Handling:** Automatically requests a continuation when a response is cut off.
- **Automatic Chat Logging:** Saves your chat history for future reference.

## Usage

Install `multiai`, then configure API keys for your chosen AI providers, either as environment variables or in a user settings file. Once that's done, you can start interacting with the AI.
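For example, keys can be exported as environment variables. The variable names below follow the common provider conventions and are an assumption here; check the manual for the exact names `multiai` reads:

```bash
# Hypothetical key names; confirm them in the manual (ai -d).
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GOOGLE_API_KEY="..."
```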

- To send a simple query:

  ```bash
  ai hi
  ```

  You should see a response like:

  ```bash
  gpt-4o-mini>
  Hello! How can I assist you today?
  ```

- For an interactive session, enter interactive mode:

  ```bash
  ai
  ```

  In this mode, you can continue the conversation:

  ```bash
  user> hi
  gpt-4o-mini>
  Hello! How can I assist you today?
  user> how are you
  gpt-4o-mini>
  I'm just a program, so I don't have feelings, but I'm here and ready to help you! How about you? How are you doing?
  user>
  ```

To see a list of all command-line options, use:

```bash
ai -h
```

For more detailed documentation, you can open the [manual](https://sekika.github.io/multiai/) in a web browser with:

```bash
ai -d
```

## Using `multiai` as a Python Library

`multiai` can also be used as a Python library. Here’s a simple example:

```python
import multiai

# Initialize the client
client = multiai.Prompt()
client.set_model('openai', 'gpt-4o')  # Set model
client.temperature = 0.5  # Set temperature

# Send a prompt and get a response
answer = client.ask('hi')
print(answer)

# Continue the conversation with context
answer = client.ask('how are you')
print(answer)

# Clear the conversation context
client.clear()
```

The manual includes the following sample code:

- A script that translates a text file into English.
- A local chat app that lets you pick among AI models from different providers and chat with them.
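The first of those samples can be sketched roughly as follows. This is a minimal illustration under assumptions, not the manual's code: `chunk_paragraphs` and `translate_file` are hypothetical helpers, and the prompt wording and default model are guesses.

```python
def chunk_paragraphs(text, max_chars=2000):
    """Split text into paragraph-aligned chunks no longer than max_chars."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks

def translate_file(path, provider="openai", model="gpt-4o-mini"):
    """Translate a text file into English, one chunk per request."""
    import multiai  # requires an API key configured for the provider
    client = multiai.Prompt()
    client.set_model(provider, model)
    text = open(path, encoding="utf-8").read()
    parts = []
    for chunk in chunk_paragraphs(text):
        parts.append(client.ask(f"Translate the following into English:\n\n{chunk}"))
        client.clear()  # translate each chunk independently
    return "\n\n".join(parts)
```

Each paragraph-sized chunk is sent as an independent request, with `client.clear()` preventing earlier chunks from inflating the context window.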

            
