# AgentBill LangChain Integration
Automatic usage tracking and billing for LangChain applications.
[PyPI](https://pypi.org/project/agentbill-langchain/)
[License: MIT](https://opensource.org/licenses/MIT)
## Installation
Install via pip:
```bash
pip install agentbill-langchain
```
With OpenAI support (the quotes keep shells such as zsh from interpreting the brackets):
```bash
pip install "agentbill-langchain[openai]"
```
With Anthropic support:
```bash
pip install "agentbill-langchain[anthropic]"
```
## Quick Start
```python
from agentbill_langchain import AgentBillCallback
from langchain_openai import ChatOpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

# 1. Initialize AgentBill callback
callback = AgentBillCallback(
    api_key="agb_your_api_key_here",  # Get from AgentBill dashboard
    base_url="https://bgwyprqxtdreuutzpbgw.supabase.co",
    customer_id="customer-123",
    debug=True
)

# 2. Create LangChain chain with callback
llm = ChatOpenAI(model="gpt-4o-mini")
prompt = PromptTemplate.from_template("Tell me a joke about {topic}")
chain = LLMChain(llm=llm, prompt=prompt)

# 3. Run - everything is auto-tracked!
result = chain.invoke(
    {"topic": "programming"},
    config={"callbacks": [callback]}
)

print(result["text"])

# ✅ Automatically captured:
# - Prompt text (hashed for privacy)
# - Model name (gpt-4o-mini)
# - Provider (openai)
# - Token usage (prompt + completion)
# - Latency (ms)
# - Costs (calculated automatically)
```
## Features
- ✅ **Zero-config instrumentation** - Just add the callback
- ✅ **Automatic token tracking** - Captures all LLM calls
- ✅ **Multi-provider support** - OpenAI, Anthropic, any LangChain LLM
- ✅ **Chain tracking** - Tracks entire chain executions
- ✅ **Cost calculation** - Auto-calculates costs per model
- ✅ **Prompt profitability** - Compare costs vs revenue
- ✅ **OpenTelemetry compatible** - Standard observability
## Advanced Usage
### Track Custom Revenue
```python
# Track revenue for profitability analysis
callback.track_revenue(
    event_name="chat_completion",
    revenue=0.50,  # What you charged the customer
    metadata={"subscription_tier": "pro"}
)
```
### Use with Agents
```python
from langchain.agents import initialize_agent, load_tools

tools = load_tools(["serpapi", "llm-math"], llm=llm)
agent = initialize_agent(
    tools,
    llm,
    agent="zero-shot-react-description",
    callbacks=[callback]  # Add callback here
)

# All agent steps auto-tracked!
response = agent.run("What is 25% of 300?")
```
### Use with Sequential Chains
```python
from langchain.chains import SimpleSequentialChain

# All chain steps tracked automatically
overall_chain = SimpleSequentialChain(
    chains=[chain1, chain2, chain3],
    callbacks=[callback]
)

result = overall_chain.run(input_text)
```
## Configuration
```python
callback = AgentBillCallback(
    api_key="agb_...",            # Required - get from dashboard
    base_url="https://...",       # Required - your AgentBill instance
    customer_id="customer-123",   # Optional - for multi-tenant apps
    account_id="account-456",     # Optional - for account-level tracking
    debug=True,                   # Optional - enable debug logging
    batch_size=10,                # Optional - batch signals before sending
    flush_interval=5.0            # Optional - flush interval in seconds
)
```
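The `batch_size` and `flush_interval` options control when buffered usage signals are sent. As a rough sketch of that batching behavior (this is illustrative, not the library's actual implementation — `SignalBuffer` and `send` are hypothetical names):

```python
import time

class SignalBuffer:
    """Illustrative buffer: flushes when batch_size is reached
    or flush_interval seconds have elapsed since the last flush."""

    def __init__(self, batch_size=10, flush_interval=5.0, send=print):
        self.batch_size = batch_size
        self.flush_interval = flush_interval
        self.send = send  # stand-in for the HTTP call to AgentBill
        self.pending = []
        self.last_flush = time.monotonic()

    def add(self, signal):
        self.pending.append(signal)
        interval_due = time.monotonic() - self.last_flush >= self.flush_interval
        if len(self.pending) >= self.batch_size or interval_due:
            self.flush()

    def flush(self):
        if self.pending:
            self.send(self.pending)
            self.pending = []
        self.last_flush = time.monotonic()

sent = []
buf = SignalBuffer(batch_size=3, flush_interval=60.0, send=sent.append)
for i in range(7):
    buf.add({"event": i})

print(len(sent))  # two full batches of 3 sent automatically
buf.flush()       # drain the remaining partial batch
print(len(sent))  # 3
```

Batching like this keeps per-call overhead low: most LLM invocations only append to an in-memory list, and the network round-trip is amortized over the batch.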
## How It Works
The callback hooks into LangChain's lifecycle:
1. **on_llm_start** - Captures prompt, model, provider
2. **on_llm_end** - Captures tokens, latency, response
3. **on_llm_error** - Captures errors and retries
4. **on_chain_start** - Tracks chain execution start
5. **on_chain_end** - Tracks chain completion
All captured data is sent to AgentBill via the `record-signals` API endpoint, authenticated with your API key.
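The lifecycle above can be sketched in plain Python. The real `AgentBillCallback` subclasses LangChain's `BaseCallbackHandler`; this toy class only mirrors what the start/end hooks plausibly capture (the field names are assumptions for illustration):

```python
import hashlib
import time

class LifecycleSketch:
    """Toy handler mirroring the hook names above - not the real
    AgentBillCallback, which plugs into LangChain's callback system."""

    def __init__(self):
        self.records = []
        self._start = None
        self._record = None

    def on_llm_start(self, model, prompt):
        self._start = time.monotonic()
        # Hash the prompt so the raw text never leaves the process
        self._record = {
            "model": model,
            "prompt_sha256": hashlib.sha256(prompt.encode()).hexdigest(),
        }

    def on_llm_end(self, prompt_tokens, completion_tokens):
        self._record["latency_ms"] = (time.monotonic() - self._start) * 1000
        self._record["tokens"] = prompt_tokens + completion_tokens
        self.records.append(self._record)

handler = LifecycleSketch()
handler.on_llm_start("gpt-4o-mini", "Tell me a joke about programming")
handler.on_llm_end(prompt_tokens=12, completion_tokens=30)
print(handler.records[0]["tokens"])  # 42
```

Pairing the start and end hooks is what makes latency measurement possible without touching the LLM call itself.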
## Supported Models
Auto-cost calculation for:
- OpenAI: GPT-4, GPT-4o, GPT-3.5-turbo, etc.
- Anthropic: Claude 3.5 Sonnet, Claude 3 Opus, etc.
- Any LangChain-compatible LLM
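Per-model cost calculation reduces to a lookup of per-token prices. A minimal sketch, with example per-million-token prices that are illustrative only (check your provider's current pricing; this is not AgentBill's internal table):

```python
# Illustrative per-million-token prices in USD - examples, not current rates
PRICES = {
    "gpt-4o-mini": {"input": 0.15, "output": 0.60},
    "claude-3-5-sonnet": {"input": 3.00, "output": 15.00},
}

def estimate_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Cost in USD for one call, given input/output token counts."""
    p = PRICES[model]
    return (prompt_tokens * p["input"] + completion_tokens * p["output"]) / 1_000_000

cost = estimate_cost("gpt-4o-mini", 1000, 500)
print(f"${cost:.6f}")  # $0.000450
```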
## Troubleshooting
### Not seeing data in dashboard?
1. Verify your API key is correct
2. Enable `debug=True` to see request logs
3. Confirm `base_url` matches your AgentBill instance
4. Check network connectivity to AgentBill
### Token counts are zero?
- Some LLMs do not report token usage in their responses
- When counts are missing, the callback estimates them from response length
- OpenAI and Anthropic return accurate counts
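The fallback estimate is necessarily rough. A common heuristic, shown here as an assumption about how such an estimate could work rather than the callback's exact formula, is about four characters per token for English text:

```python
def estimate_tokens(text: str) -> int:
    """Rough fallback: ~4 characters per token for English text.
    Prefer real counts from provider responses when available."""
    return max(1, len(text) // 4)

print(estimate_tokens("Tell me a joke about programming"))  # 8
```

Because this is only an approximation, billing built on estimated counts will drift from provider invoices; treat estimated usage as indicative rather than exact.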
## License
MIT