# ChatLite 🤖
A lightweight, extensible framework for building AI-powered chat interfaces. ChatLite makes it easy to wire various language models into web-based chat applications.
## ✨ Features
- 🔄 Real-time WebSocket communication
- 🎯 Multi-model support (Llama, Qwen, etc.)
- 🌐 Web search integration
- 🎨 Customizable UI with modern design
- 🔌 Plugin architecture for easy extensions
- 💬 Chat history management
- 🎭 Multiple agent types support
- 📱 Responsive design
## 🚀 Quick Start
### Installation
```bash
pip install chatlite
```
### Basic Usage
```python
import chatlite

# Start a simple chat server with Llama 3.2
chatlite.local_llama3p2()

# Or use Qwen 2.5
chatlite.local_qwen2p5()

# Custom configuration
server = chatlite.create_server(
    model_type="local",
    model_name="llama3.2:latest",
    temperature=0.7,
    max_tokens=4000
)
server.run()
```
### Pre-configured Models
ChatLite comes with several pre-configured models:
```python
# Use different models directly
from chatlite import mistral_7b_v3, mixtral_8x7b, qwen_72b

# Start a Mistral 7B server
mistral_7b_v3()

# Start a Mixtral 8x7B server
mixtral_8x7b()

# Start a Qwen 72B server
qwen_72b()
```
## 💻 Frontend Integration
ChatLite includes a Flutter-based frontend that can be easily customized. Here's a basic example of connecting to the ChatLite server:
```dart
final channel = WebSocketChannel.connect(
  Uri.parse('ws://localhost:8143/ws/$clientId'),
);

// Send a message
channel.sink.add(json.encode({
  'message': 'Hello!',
  'model': 'llama3.2:latest',
  'system_prompt': 'You are a helpful assistant',
  'agent_type': 'WebSearchAgent',
  'is_websearch_chat': true,
}));

// Listen for responses
channel.stream.listen(
  (message) {
    final data = jsonDecode(message);
    if (data['type'] == 'stream') {
      print(data['message']);
    }
  },
  onError: (error) => print('Error: $error'),
  onDone: () => print('Connection closed'),
);
```
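For quick experiments without the Flutter app, the same wire protocol can be exercised from Python. The sketch below only assumes the JSON shapes shown in the Dart example (`message`, `model`, `agent_type`, etc. on the way in; `{'type': 'stream', 'message': ...}` frames on the way out); the `websockets` package mentioned in the comment is a third-party dependency, not part of ChatLite.

```python
import json


def build_payload(message, model="llama3.2:latest",
                  system_prompt="You are a helpful assistant",
                  agent_type="WebSearchAgent", websearch=True):
    """Build a ChatLite request frame matching the Dart example above."""
    return json.dumps({
        "message": message,
        "model": model,
        "system_prompt": system_prompt,
        "agent_type": agent_type,
        "is_websearch_chat": websearch,
    })


def extract_stream_text(frame):
    """Return the text of a 'stream' frame, or None for other frame types."""
    data = json.loads(frame)
    return data["message"] if data.get("type") == "stream" else None


if __name__ == "__main__":
    # Sending the payload needs any WebSocket client, e.g. the third-party
    # `websockets` package (assumed here, not bundled with ChatLite):
    #
    #   import asyncio, websockets
    #   async def chat():
    #       async with websockets.connect("ws://localhost:8143/ws/demo") as ws:
    #           await ws.send(build_payload("Hello!"))
    #           async for frame in ws:
    #               text = extract_stream_text(frame)
    #               if text is not None:
    #                   print(text, end="")
    #   asyncio.run(chat())
    print(build_payload("Hello!"))
```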
## 🔧 Configuration
ChatLite supports various configuration options:
```python
from chatlite import create_server
server = create_server(
    model_type="local",                    # local, huggingface, etc.
    model_name="llama3.2:latest",
    api_key="your-api-key",                # if needed
    temperature=0.7,                       # model temperature
    max_tokens=4000,                       # max response length
    base_url="http://localhost:11434/v1",  # model API endpoint
)
```
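One way to keep the endpoint and API key out of source code is to read these options from the environment before calling `create_server`. This is just a sketch: the `CHATLITE_*` variable names are made up for illustration, not an official ChatLite convention, and the fallbacks mirror the defaults shown above.

```python
import os


def server_kwargs_from_env():
    """Collect create_server() keyword arguments from the environment,
    falling back to the values shown in the example above."""
    return {
        "model_type": os.environ.get("CHATLITE_MODEL_TYPE", "local"),
        "model_name": os.environ.get("CHATLITE_MODEL", "llama3.2:latest"),
        "api_key": os.environ.get("CHATLITE_API_KEY"),  # None if unset
        "temperature": float(os.environ.get("CHATLITE_TEMPERATURE", "0.7")),
        "max_tokens": int(os.environ.get("CHATLITE_MAX_TOKENS", "4000")),
        "base_url": os.environ.get("CHATLITE_BASE_URL",
                                   "http://localhost:11434/v1"),
    }


# Then: server = create_server(**server_kwargs_from_env())
```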
## 🧩 Available Agents
ChatLite supports different agent types for specialized tasks:
- `WebSearchAgent`: Internet-enabled chat with web search capabilities
- `RawWebSearchAgent`: Direct web search results without summarization
- `EmailAssistantFeature`: Email composition and analysis
- `DefaultChatFeature`: Standard chat functionality
Example usage:
```python
# Client-side configuration
message_data = {
"message": "What's the latest news about AI?",
"model": "llama3.2:latest",
"agent_type": "WebSearchAgent",
"is_websearch_chat": True
}
```
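A client can switch between the agent types listed above per request. The helper below is purely illustrative (the task keys and selection logic are not part of ChatLite); only the agent names come from the list above.

```python
# Agent names as listed above; the task -> agent mapping is illustrative only.
AGENTS = {
    "search": "WebSearchAgent",
    "raw_search": "RawWebSearchAgent",
    "email": "EmailAssistantFeature",
    "chat": "DefaultChatFeature",
}


def make_request(message, task="chat", model="llama3.2:latest"):
    """Build the client-side message dict for a given task."""
    agent = AGENTS[task]
    return {
        "message": message,
        "model": model,
        "agent_type": agent,
        "is_websearch_chat": agent in ("WebSearchAgent", "RawWebSearchAgent"),
    }
```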
## 🎨 UI Customization
The included Flutter frontend supports extensive customization:
```dart
ThemeData(
  brightness: Brightness.dark,
  scaffoldBackgroundColor: const Color(0xFF1C1C1E),
  primaryColor: const Color(0xFF1C1C1E),
  colorScheme: const ColorScheme.dark(
    primary: Color(0xFFFF7762),
    secondary: Color(0xFFFF7762),
  ),
)
```
## 📦 Project Structure
```
chatlite/
├── __init__.py          # Main package initialization
├── core/                # Core functionality
│   ├── config.py        # Configuration handling
│   ├── model_service.py # Model interaction
│   └── features/        # Feature implementations
├── ui/                  # Flutter frontend
└── examples/            # Usage examples
```
## 🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
- Built with FastAPI and Flutter
- Inspired by modern chat applications
- Uses various open-source language models
## ⚠️ Disclaimer
This is an open-source project and should be used responsibly. Please ensure compliance with all model licenses and usage terms.