| Field | Value |
| --- | --- |
| Name | lg2slack |
| Version | 0.1.3 |
| Summary | Simple package to connect LangGraph applications to Slack |
| Upload time | 2025-10-23 19:25:06 |
| Requires Python | >=3.10 |
| License | MIT |
| Keywords | langgraph, slack, chatbot, ai, langchain |
# lg2slack
[PyPI version](https://badge.fury.io/py/lg2slack)
[Python 3.10+](https://www.python.org/downloads/)
[MIT License](https://opensource.org/licenses/MIT)
Simple, minimal package to connect LangGraph applications to Slack with just a few lines of code.
## Installation
```bash
pip install lg2slack
```
## Quick Start
### 1. Create a LangGraph App
First, create a simple LangGraph chatbot that will power your Slack bot:
```python
# agent.py
from langchain_anthropic import ChatAnthropic
from langgraph.graph import MessagesState, StateGraph, START, END
from langgraph.checkpoint.memory import MemorySaver


# Define the chatbot function
def chatbot(state: MessagesState):
    model = ChatAnthropic(model="claude-3-5-sonnet-20241022")
    return {"messages": [model.invoke(state["messages"])]}


# Build the graph
graph = StateGraph(MessagesState)
graph.add_node("chatbot", chatbot)
graph.add_edge(START, "chatbot")
graph.add_edge("chatbot", END)

# Compile with memory to maintain conversation history
app = graph.compile(checkpointer=MemorySaver())
```
This creates a simple chatbot that maintains conversation history across messages.
### 2. Create Your Slack Bot Server
This is where the magic happens.
Create a `slack/server.py` file in your project directory:
```python
# slack/server.py
from lg2slack import SlackBot

bot = SlackBot()

# Export the app for langgraph.json
app = bot.app
```
That's it! Just 3 lines of code.
### 3. Configure Environment Variables
Create a `.env` file with your credentials:
```bash
# Slack credentials (from https://api.slack.com/apps -> Your App)
SLACK_BOT_TOKEN=xoxb-your-bot-token
SLACK_SIGNING_SECRET=your-signing-secret
# LangGraph agent name
# This is the key you will set in langgraph.json
ASSISTANT_ID=my-assistant
```
### 4. Configure LangGraph Deployment
Add your agent and Slack server paths to `langgraph.json`:
```json
{
  "dependencies": ["lg2slack", "."],
  "graphs": {
    "my-assistant": "./agent.py:app"
  },
  "env": ".env",
  "http": {
    "/events/slack": "slack/server:app"
  }
}
```
## Local Testing
Before deploying to production, test your bot locally using ngrok.
**Important:** You'll need **separate Slack apps** for local development and production deployment, since each environment has its own request URL where Slack sends events.
### 1. Create a Slack App for Local Development
- Go to https://api.slack.com/apps
- Click "Create New App" → "From a manifest"
- Copy the contents of `slack_manifest.yaml` from this repo
- Replace the placeholder values:
  - `your-app-name` → Your app name (e.g., "My Bot - Local")
  - `your-deployment-url` → Your ngrok or LangGraph deployment URL (leave as a placeholder for now)
- Install the app to your workspace
- Copy the Bot Token and Signing Secret to your `.env` file
### 2. Start LangGraph Dev Server
```bash
langgraph dev
# Runs on http://localhost:2024 and automatically mounts your FastAPI app
```
Note: You don't need to run a separate server! LangGraph dev automatically imports and serves the FastAPI app from your `langgraph.json`.
### 3. Expose with ngrok
Install ngrok if you haven't already:
```bash
# macOS
brew install ngrok
# Or download from https://ngrok.com/download
```
Start ngrok to expose your local server:
```bash
ngrok http 2024
```
This will output something like:
```
Forwarding https://abc123.ngrok.io -> http://localhost:2024
```
**Tip:** View all requests in ngrok's web interface at http://localhost:4040
### 4. Update Slack App Event URL
Go to your Slack app settings → Event Subscriptions:
- Request URL: `https://abc123.ngrok.io/events/slack` (use YOUR ngrok URL)
- Slack will verify the URL - you should see a green checkmark
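For reference, the verification handshake itself is simple: Slack POSTs a `url_verification` payload and expects the `challenge` value echoed back. lg2slack handles this internally; the standalone handler below (a hypothetical name, not lg2slack's API) just illustrates what the green checkmark means:

```python
# Sketch of Slack's url_verification handshake. lg2slack handles this
# for you; this handler only illustrates the exchange.
def handle_slack_event(payload: dict) -> dict:
    """Return the response body Slack expects for an incoming event."""
    if payload.get("type") == "url_verification":
        # Echo the challenge token back verbatim to pass verification
        return {"challenge": payload["challenge"]}
    # Real events would be dispatched to the bot from here
    return {"ok": True}

probe = {"token": "t", "type": "url_verification", "challenge": "abc123"}
print(handle_slack_event(probe))  # {'challenge': 'abc123'}
```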
### 5. Test Your Bot
Send a DM to your bot or @mention it in a channel! You'll see requests in both:
- LangGraph dev console
- ngrok web interface (http://localhost:4040)
## Production Deployment
Once local testing looks good, deploy to LangGraph Platform.
### 1. Create a Production Slack App
Create a **new** Slack app for production (separate from your local dev app):
1. Go to https://api.slack.com/apps
2. Click "Create New App" → "From a manifest"
3. Use the same manifest, but name it differently (e.g., "My Bot - Production")
4. After deployment, you'll update the request URL to your LangGraph Platform URL
### 2. Update Environment Variables
Update your `.env` file with the **production** Slack app credentials:
```bash
# Production Slack credentials
SLACK_BOT_TOKEN=xoxb-your-production-bot-token
SLACK_SIGNING_SECRET=your-production-signing-secret
# LangGraph configuration
ASSISTANT_ID=my-assistant
```
### 3. Deploy to LangGraph Platform
```bash
langgraph deploy
```
After deployment, you'll receive a URL like: `https://your-deployment.langgraph.app`
### 4. Update Production Slack App URL
Go to your **production** Slack app settings → Event Subscriptions:
- Request URL: `https://your-deployment.langgraph.app/events/slack`
Your bot is now live! Chat with it by:
- Sending a DM to the bot
- @mentioning the bot in a channel
## Advanced Usage
### Configuration Options
The `SlackBot` class accepts many parameters to customize behavior:
```python
bot = SlackBot(
    # LangGraph settings
    assistant_id="my-assistant",       # Or from env: ASSISTANT_ID
    langgraph_url=None,                # Or from env: LANGGRAPH_URL (None = loopback)

    # Response settings
    streaming=True,                    # Stream responses token-by-token (default: True)
    reply_in_thread=True,              # Always reply in threads (default: True)

    # Slack credentials (or from env)
    slack_bot_token=None,              # From env: SLACK_BOT_TOKEN
    slack_signing_secret=None,         # From env: SLACK_SIGNING_SECRET

    # Feedback integration
    show_feedback_buttons=False,       # Show thumbs up/down buttons (default: False)
    enable_feedback_comments=False,    # Allow text feedback on negative reactions (default: False)
    show_thread_id=False,              # Show LangGraph thread_id in footer (default: False)

    # Image handling
    extract_images=True,               # Convert markdown images to Slack blocks (default: True)
    max_image_blocks=5,                # Max images per message (default: 5)

    # Metadata tracking
    include_metadata=True,             # Pass Slack context to LangSmith (default: True)

    # Visual feedback
    processing_reaction="eyes",        # Show emoji while processing (default: None)
                                       # Examples: "eyes", "hourglass", "robot_face"

    # Message filtering (streaming only)
    message_types=["AIMessageChunk"],  # Which message types to stream (default: ["AIMessageChunk"])
                                       # Options: "AIMessageChunk", "ai", "tool", "human", "system"
)
```
### Input/Output Transformers
Customize message processing with transformers:
```python
from lg2slack import SlackBot

bot = SlackBot()

# Transform user input before sending to LangGraph
@bot.transform_input
async def add_context(message: str, context) -> str:
    return f"User {context.user_id} asks: {message}"

# Transform AI output before sending to Slack
@bot.transform_output
async def add_footer(response: str, context) -> str:
    return f"{response}\n\n_Powered by LangGraph_"

app = bot.app
```
**Multiple transformers** are applied in registration order:
```python
@bot.transform_input
async def first_transform(message: str, context) -> str:
    return f"[1] {message}"

@bot.transform_input
async def second_transform(message: str, context) -> str:
    return f"[2] {message}"

# Input "hello" becomes: "[2] [1] hello"
```
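Under the hood this is just ordered function composition. A minimal synchronous sketch (the registry and function names here are illustrative, not lg2slack internals):

```python
# Illustrative sketch of registration-order transformer composition
# (synchronous for brevity; lg2slack's transformers are async).
transformers = []

def transform_input(func):
    transformers.append(func)  # the decorator records registration order
    return func

@transform_input
def first_transform(message):
    return f"[1] {message}"

@transform_input
def second_transform(message):
    return f"[2] {message}"

def apply_all(message):
    for t in transformers:  # applied first-registered first
        message = t(message)
    return message

print(apply_all("hello"))  # [2] [1] hello
```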
### Metadata Transformers
Customize what Slack context gets passed to LangSmith:
```python
bot = SlackBot(include_metadata=True)

@bot.transform_metadata
async def custom_metadata(context) -> dict:
    """Customize metadata sent to LangSmith."""
    return {
        "channel_id": context.channel_id,
        "is_dm": context.is_dm,
        "user_id_hash": hash(context.user_id),  # Hash PII for privacy
    }
```
By default, the following fields are passed:
- `slack_user_id`
- `slack_channel_id`
- `slack_message_ts`
- `slack_thread_ts`
- `slack_channel_type`
- `slack_is_dm`
- `slack_is_thread`
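A plausible sketch of how those defaults could be assembled from the request context (the `context` attribute names are assumptions, mirroring the transformer examples above):

```python
from types import SimpleNamespace

def default_metadata(context) -> dict:
    """Sketch of the default Slack-context -> LangSmith metadata mapping."""
    return {
        "slack_user_id": context.user_id,
        "slack_channel_id": context.channel_id,
        "slack_message_ts": context.message_ts,
        "slack_thread_ts": context.thread_ts,
        "slack_channel_type": context.channel_type,
        "slack_is_dm": context.is_dm,
        "slack_is_thread": context.is_thread,
    }

# Hypothetical context for a DM
ctx = SimpleNamespace(
    user_id="U123", channel_id="C456", message_ts="1700000000.000100",
    thread_ts=None, channel_type="im", is_dm=True, is_thread=False,
)
print(default_metadata(ctx)["slack_user_id"])  # U123
```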
### Streaming Mode Control
Control which message types to stream to users:
```python
# Stream only AI responses (default)
bot = SlackBot(message_types=["AIMessageChunk"])
# Stream AI responses AND tool calls
bot = SlackBot(message_types=["AIMessageChunk", "tool"])
# Stream everything (verbose!)
bot = SlackBot(message_types=["AIMessageChunk", "ai", "tool", "system"])
```
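Conceptually, the filter is an allow-list check on each streamed chunk's type. A sketch, with an assumed `(kind, content)` chunk shape:

```python
# Sketch of message-type filtering: forward only chunks whose type is in
# the allow-list. The (kind, content) tuple shape is an assumption.
def filter_chunks(chunks, message_types=("AIMessageChunk",)):
    return [content for kind, content in chunks if kind in message_types]

stream = [
    ("AIMessageChunk", "Hel"),
    ("tool", "search(query='ferns')"),
    ("AIMessageChunk", "lo!"),
]
print("".join(filter_chunks(stream)))  # Hello!
```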
### Processing Reaction
Show a visual indicator while the bot is thinking:
```python
# Show eyes emoji while processing
bot = SlackBot(processing_reaction="eyes")
# Other options: "hourglass", "robot_face", "thinking_face", etc.
# Must be emoji NAME, not the emoji character itself
```
The reaction is automatically removed when the response is ready.
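The pattern is an add/remove pair wrapped around the agent call. A sketch using Slack's `reactions_add` / `reactions_remove` methods (any client exposing them works, e.g. `slack_sdk.WebClient`); the context-manager wrapper itself is illustrative, not lg2slack's implementation:

```python
from contextlib import contextmanager

@contextmanager
def processing_reaction(client, channel, ts, name="eyes"):
    """Add an emoji reaction while work is in progress, then remove it."""
    client.reactions_add(channel=channel, timestamp=ts, name=name)
    try:
        yield
    finally:
        # Removed even if the agent call raises
        client.reactions_remove(channel=channel, timestamp=ts, name=name)

# Demo with a fake client that just records the calls
class FakeClient:
    def __init__(self):
        self.calls = []
    def reactions_add(self, **kw):
        self.calls.append(("add", kw["name"]))
    def reactions_remove(self, **kw):
        self.calls.append(("remove", kw["name"]))

fake = FakeClient()
with processing_reaction(fake, "C123", "1700000000.000100"):
    pass  # the agent call would happen here
print(fake.calls)  # [('add', 'eyes'), ('remove', 'eyes')]
```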
### Image Support
The bot automatically extracts markdown images and renders them as Slack image blocks:
```python
# Enable image extraction (default)
bot = SlackBot(extract_images=True, max_image_blocks=5)
# Disable image extraction
bot = SlackBot(extract_images=False)
```
When enabled, markdown like `![alt text](https://example.com/image.png)` in AI responses will:
1. Appear as text in the message
2. Render as a native Slack image block below the text
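The extraction step can be approximated with a regex over markdown image syntax. The block structure below follows Slack's image block shape, but treat the details as an illustration rather than lg2slack's exact implementation:

```python
import re

# Sketch: pull ![alt](url) images out of a response and build Slack image blocks.
IMAGE_RE = re.compile(r"!\[([^\]]*)\]\((https?://[^)\s]+)\)")

def extract_image_blocks(text: str, max_image_blocks: int = 5) -> list:
    blocks = []
    for alt, url in IMAGE_RE.findall(text)[:max_image_blocks]:
        blocks.append({"type": "image", "image_url": url, "alt_text": alt or "image"})
    return blocks

reply = "Here you go: ![a fern](https://example.com/fern.png)"
print(extract_image_blocks(reply))
```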
### Feedback Integration
Collect user feedback and send it to LangSmith:
```python
bot = SlackBot(
    show_feedback_buttons=True,      # Show thumbs up/down
    enable_feedback_comments=True,   # Allow text feedback for negative reactions
    show_thread_id=True,             # Show thread ID for debugging
)
```
## How It Works
### Architecture
```
Slack → lg2slack → [INPUT TRANSFORMERS] → LangGraph
                                              ↓
Slack ← lg2slack ← [OUTPUT TRANSFORMERS] ← LangGraph
```
### Message Flow
1. **User sends message** in Slack (DM or @mention)
2. **Slack sends event** to `/events/slack` endpoint
3. **Input transformers** process the message
4. **Message sent to LangGraph** as HumanMessage with thread_id
5. **LangGraph processes** and generates response
6. **Streaming mode:** Each token immediately forwarded to Slack
7. **Output transformers** process the complete response
8. **Final message** displayed in Slack with optional feedback buttons
### Thread Management
lg2slack automatically manages conversation continuity:
- **Deterministic thread IDs:** Same Slack thread always maps to same LangGraph conversation using UUID5
- **Conversation history:** LangGraph's MessagesState maintains full context
- **Thread participation:** Bot auto-responds in threads where it has participated
- **No database needed:** Thread mapping is deterministic and stateless
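The deterministic-ID idea fits in a few lines of standard-library code. The namespace and key format below are assumptions; the property that matters is that the same channel/thread pair always maps to the same `thread_id`:

```python
import uuid

# Sketch of deterministic Slack-thread -> LangGraph-thread mapping via UUID5.
NAMESPACE = uuid.uuid5(uuid.NAMESPACE_DNS, "lg2slack")  # assumed namespace

def thread_id(channel_id: str, thread_ts: str) -> str:
    return str(uuid.uuid5(NAMESPACE, f"{channel_id}:{thread_ts}"))

# Same inputs always give the same ID, so no mapping table is needed
assert thread_id("C123", "1700000000.000100") == thread_id("C123", "1700000000.000100")
assert thread_id("C123", "1700000000.000100") != thread_id("C999", "1700000000.000100")
```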
### Streaming vs Non-Streaming
**Streaming mode (default):**
- True low-latency streaming
- Each token forwarded immediately to Slack
- Better user experience with instant feedback
- Uses Slack's `chat_startStream`, `chat_appendStream`, `chat_stopStream` APIs
**Non-streaming mode:**
```python
bot = SlackBot(streaming=False)
```
- Waits for complete response
- Sends entire message at once
- Useful for debugging or if streaming causes issues
## Feature Highlights
- **Low-latency streaming** - Instant token-by-token responses
- **LangSmith feedback integration** - Thumbs up/down buttons with optional text feedback
- **Automatic image handling** - Converts markdown images to native Slack blocks
- **Flexible transformers** - Customize inputs, outputs, and metadata
- **Smart thread management** - Automatic conversation continuity across messages
- **Processing reactions** - Visual feedback while bot is thinking
- **Message type filtering** - Control verbosity by filtering AI chunks, tools, etc.
- **Metadata tracking** - Automatic Slack context passed to LangSmith for analytics
## Examples
Check out the [`examples/plant_bot`](examples/plant_bot/) directory for a complete working example:
- **[plant_agent.py](examples/plant_bot/plant_agent.py)** - LangGraph agent with conditional image search
- **[slack_server.py](examples/plant_bot/slack_server.py)** - SlackBot setup with transformers
- **[langgraph.json](examples/plant_bot/langgraph.json)** - Full deployment configuration
## Requirements
- Python 3.10+
- LangGraph deployment with `messages` state key
- Slack workspace with bot permissions
## License
MIT
## Contributing
Contributions welcome! Please open an issue or PR.
## Raw data

```json
{
  "_id": null,
  "home_page": null,
  "name": "lg2slack",
  "maintainer": null,
  "docs_url": null,
  "requires_python": ">=3.10",
  "maintainer_email": null,
  "keywords": "langgraph, slack, chatbot, ai, langchain",
  "author": null,
  "author_email": "Siavash Yasini <siavash.yasini@gmail.com>",
  "download_url": "https://files.pythonhosted.org/packages/65/6e/2d53bdfad3d7656b5fc60fa2d596ca40d12483ce7ac74877729d787786cb/lg2slack-0.1.3.tar.gz",
  "platform": null,
  "bugtrack_url": null,
  "license": "MIT",
  "summary": "Simple package to connect LangGraph applications to Slack",
  "version": "0.1.3",
  "project_urls": {
    "Documentation": "https://github.com/syasini/lg2slack#readme",
    "Homepage": "https://github.com/syasini/lg2slack",
    "Issues": "https://github.com/syasini/lg2slack/issues",
    "Repository": "https://github.com/syasini/lg2slack"
  },
  "split_keywords": [
    "langgraph",
    " slack",
    " chatbot",
    " ai",
    " langchain"
  ],
  "urls": [
    {
      "comment_text": null,
      "digests": {
        "blake2b_256": "0784d272e40f4fa774c6bf91418335f01c53ad0c7ac57a99018a1eeea7ab121c",
        "md5": "16c8fcadf9185c901a98062b149c8cca",
        "sha256": "56b7cf2c08641d4891adfba32089a251f9740815a0c78cb1191b323ff260c78e"
      },
      "downloads": -1,
      "filename": "lg2slack-0.1.3-py3-none-any.whl",
      "has_sig": false,
      "md5_digest": "16c8fcadf9185c901a98062b149c8cca",
      "packagetype": "bdist_wheel",
      "python_version": "py3",
      "requires_python": ">=3.10",
      "size": 30390,
      "upload_time": "2025-10-23T19:25:05",
      "upload_time_iso_8601": "2025-10-23T19:25:05.281207Z",
      "url": "https://files.pythonhosted.org/packages/07/84/d272e40f4fa774c6bf91418335f01c53ad0c7ac57a99018a1eeea7ab121c/lg2slack-0.1.3-py3-none-any.whl",
      "yanked": false,
      "yanked_reason": null
    },
    {
      "comment_text": null,
      "digests": {
        "blake2b_256": "656e2d53bdfad3d7656b5fc60fa2d596ca40d12483ce7ac74877729d787786cb",
        "md5": "10defc665265d7d9ea28dc140ceb11f6",
        "sha256": "f1f7819ec9352c381b41ab945150e16a575826f6daeb3acd5e0a201140b7f8fc"
      },
      "downloads": -1,
      "filename": "lg2slack-0.1.3.tar.gz",
      "has_sig": false,
      "md5_digest": "10defc665265d7d9ea28dc140ceb11f6",
      "packagetype": "sdist",
      "python_version": "source",
      "requires_python": ">=3.10",
      "size": 29953,
      "upload_time": "2025-10-23T19:25:06",
      "upload_time_iso_8601": "2025-10-23T19:25:06.427964Z",
      "url": "https://files.pythonhosted.org/packages/65/6e/2d53bdfad3d7656b5fc60fa2d596ca40d12483ce7ac74877729d787786cb/lg2slack-0.1.3.tar.gz",
      "yanked": false,
      "yanked_reason": null
    }
  ],
  "upload_time": "2025-10-23 19:25:06",
  "github": true,
  "gitlab": false,
  "bitbucket": false,
  "codeberg": false,
  "github_user": "syasini",
  "github_project": "lg2slack#readme",
  "github_not_found": true,
  "lcname": "lg2slack"
}
```