persona-chat

Name: persona-chat
Version: 0.1.1
Summary: A sophisticated CLI application for generating authentic multi-turn conversations using 50+ distinct user personas
Home page: https://github.com/your-org/persona-chat
Author: Persona Chat Team
Requires Python: >=3.8
License: MIT
Keywords: conversation generation, personas, multi-language, CLI, AI, NLP
Upload time: 2025-07-21 03:26:56
# Persona-Based Conversation Generator CLI

A sophisticated Python CLI application that generates authentic multi-turn conversations using 50+ distinct user personas across multiple languages.

## Installation

### Option 1: Install from PyPI (Recommended)

You can install the package directly from PyPI without cloning the repository:

```bash
pip install persona-chat
```

After installation, the `persona-chat` CLI command will be available on your PATH:

```bash
persona-chat --help
```

You can now use all CLI features as described below, including batch and multi-persona generation.

### Option 2: Clone the Repository (Development/Latest)

```bash
git clone <repository-url>
cd persona-chat

# Install dependencies
pip install -r requirements.txt

# Install the package in editable mode
pip install -e .
```

## Usage

After installation (from PyPI or from source), you can use the CLI:

```bash
persona-chat generate --help
persona-chat batch --help
```

See below for example commands and advanced usage.

## Batch Generation Examples

### Multi-Persona Batch Generation (NEW)
The `batch` command now supports all persona options, including multi-persona mode, matching the `generate` command.

```bash
persona-chat batch \
  --conversations 3 --turns 3 --language ar \
  --multi-persona \
  --persona-names "Teacher,Student,Parent" \
  --persona-roles "primary,participant,observer" \
  --llm-backend openai --llm-model gemma2:2b \
  --llm-api-url http://localhost:11434/v1 \
  --output json --output-filename test_multi_persona_batch
```

### Arabic Batch with a Single Dynamic Technology Persona (All Conversations)
```bash
persona-chat batch \
  --conversations 3 --turns 4 --language ar \
  --create-persona \
  --persona-name "مهندس برمجيات" \
  --persona-role "مهندس برمجيات" \
  --persona-domain technology \
  --persona-expertise expert \
  --persona-communication-style technical \
  --llm-backend openai --llm-model gemma2:2b \
  --llm-api-url http://localhost:11434/v1 \
  --output json --output-filename test_arabic_batch_dynamic_it
```

### Multi-Persona Conversation (Single)
```bash
persona-chat generate \
  --multi-persona \
  --persona-names "Chef,Waiter,Customer" \
  --persona-roles "primary,participant,observer" \
  --language ar --turns 3 \
  --llm-backend openai --llm-model gemma2:2b \
  --llm-api-url http://localhost:11434/v1 \
  --output markdown --output-filename test_multi_persona_restaurant
```

> **Note:** All persona options (including multi-persona, persona names, roles, and dynamic persona creation) are now supported in both batch and generate commands.

<!--
### Arabic Batch with Random Personas (NOT SUPPORTED)
```bash
persona-chat batch --conversations 3 --turns 4 --language ar --personas random --llm-backend openai --llm-model gemma2:2b --llm-api-url http://localhost:11434/v1 --output json --output-filename test_arabic_batch_random
```

### Arabic Batch with IT Personas (Random per Conversation, NOT SUPPORTED)
```bash
persona-chat batch --conversations 3 --turns 4 --language ar --personas technology --llm-backend openai --llm-model gemma2:2b --llm-api-url http://localhost:11434/v1 --output json --output-filename test_arabic_batch_it
```
-->

> **Troubleshooting:**
> - If you see `No valid persona available for conversation generation`, use the `--create-persona` and `--persona-name` options as shown above.
> - If you want random personas per conversation, you must restore the static persona files or update the code to support dynamic random persona generation.

## Export Formats
- `--output json` (default): Structured JSON
- `--output csv`: Tabular CSV
- `--output markdown`: Human-readable Markdown
- `--output text`: Plain text
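
The structured JSON export can be post-processed with standard tooling. The snippet below is a minimal sketch of reading an export back in Python; the field names (`conversation_id`, `turns`, `role`, `content`) are illustrative assumptions about the schema, not the documented format.

```python
import json

# Hypothetical shape of a JSON export; actual field names may differ.
sample = {
    "conversation_id": "demo-001",
    "language": "ar",
    "turns": [
        {"role": "user", "content": "..."},
        {"role": "assistant", "content": "..."},
    ],
}

# Write a sample export, then read it back as a downstream consumer would.
with open("demo_export.json", "w", encoding="utf-8") as f:
    json.dump(sample, f, ensure_ascii=False)

with open("demo_export.json", encoding="utf-8") as f:
    data = json.load(f)

print(len(data["turns"]))  # number of messages in the conversation
```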

## LLM Backend Support (OpenAI, Ollama, Transformers)

You can use OpenAI-compatible APIs (including local Ollama) or HuggingFace Transformers as the backend for all message generation.

### Configuration
- `--llm-backend`: Choose the backend (`openai`, `transformers`, or `none`)
- `--llm-model`: Model name (e.g., `gemma2:2b` for Ollama)
- `--llm-api-key`: (For OpenAI) Your OpenAI API key (use any dummy value for local Ollama)
- `--llm-api-url`: (For OpenAI) Custom API URL for self-hosted or alternative OpenAI-compatible APIs (e.g., `http://localhost:11434/v1` for Ollama)

### Example: Local Ollama API
```bash
persona-chat generate \
  --llm-backend openai --llm-model gemma2:2b \
  --llm-api-key dummy \
  --llm-api-url http://localhost:11434/v1 \
  --persona-name "Senior Software Engineer" --turns 5
```

### Notes
- All user and assistant messages are generated using the selected LLM backend if enabled.
- For local Ollama, use `--llm-backend openai` and provide a dummy API key.
- For Transformers, the model must be available locally or downloadable from HuggingFace.
- You can set defaults in a config file or via environment variables for convenience.
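
Under the hood, an OpenAI-compatible backend (such as a local Ollama server) receives standard chat-completions requests. The sketch below builds such a request payload by hand; the field names follow the public OpenAI chat-completions API, but the persona prompt text is illustrative and this is not the tool's internal code.

```python
# Minimal sketch of a chat-completions request for an OpenAI-compatible
# endpoint. persona-chat presumably sends something similar when
# --llm-backend openai is selected; the prompt content here is invented.
payload = {
    "model": "gemma2:2b",
    "messages": [
        {"role": "system", "content": "You are a Senior Software Engineer persona."},
        {"role": "user", "content": "Describe your typical workday."},
    ],
}

# Ollama exposes the OpenAI-compatible API under /v1.
api_url = "http://localhost:11434/v1/chat/completions"

print(payload["model"])
```

A POST of this payload to `api_url` (with any dummy bearer token, since local Ollama does not check keys) returns a standard chat-completions response.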

## Multi-Persona Conversations
- Use `--multi-persona` to enable multi-persona mode
- Specify persona names and roles with `--persona-names` and `--persona-roles`
- Example:
  ```bash
  persona-chat generate --multi-persona --persona-names "Artist,Designer,Client" --persona-roles "primary,participant,observer" --language ar --turns 3
  ```
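
The comma-separated `--persona-names` and `--persona-roles` values pair up positionally. The sketch below mirrors that pairing; the validation logic is an illustrative assumption, not the tool's actual implementation.

```python
# Pair each persona name with its role by position, as the CLI flags imply.
names = "Artist,Designer,Client".split(",")
roles = "primary,participant,observer".split(",")

# The two lists must be the same length for positional pairing to work.
if len(names) != len(roles):
    raise ValueError("--persona-names and --persona-roles must have the same count")

personas = list(zip(names, roles))
print(personas)
# → [('Artist', 'primary'), ('Designer', 'participant'), ('Client', 'observer')]
```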

## Documentation

- [Installation Guide](docs/installation.md)
- [Quick Start Guide](docs/quick_start.md)
- [User Guide](docs/user_guide/)
- [Developer Guide](docs/developer_guide/)
- [Examples](docs/examples/)

## Project Structure

```
persona-chat/
├── persona_chat/          # Main application package
├── config/               # Configuration files
├── output/               # Output files
├── tests/                # Test suite
├── docs/                 # Documentation
├── examples/             # Example files
└── tools/                # Development tools
```

## Contributing

Please read [CONTRIBUTING.md](docs/developer_guide/contributing.md) for details on our code of conduct and the process for submitting pull requests.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details. 

            
