# TRACER
[CI](https://github.com/Chatbot-TRACER/TRACER/actions/workflows/ci.yml)
[License](https://github.com/Chatbot-TRACER/TRACER/blob/main/LICENSE)
[PyPI](https://pypi.org/project/chatbot-tracer/)
> Task Recognition and Chatbot ExploreR
A tool for automatically exploring and analyzing chatbots, generating a model of the functionalities and user profiles for testing.
## Installation
The easiest way to install TRACER is from PyPI:
```bash
pip install chatbot-tracer
```
This will install the latest released version and all required Python dependencies.
You must also have [Graphviz](https://graphviz.gitlab.io/download/) installed on your system for TRACER to work properly.
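If Graphviz is not already available, it can usually be installed through your system's package manager (package names may vary by platform); on Windows, use the installer from the Graphviz download page linked above.
```bash
# Debian/Ubuntu
sudo apt-get install graphviz
# Fedora
sudo dnf install graphviz
# macOS (Homebrew)
brew install graphviz
```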
---
If you want to contribute or develop TRACER, see the section below.
## 1. Overview & Goals
TRACER is a tool designed to automatically interact with and analyze target chatbots. It uses Large Language Models (LLMs) to conduct multiple conversational sessions, identify the chatbot's core functionalities, limitations, and interaction flows, and generate structured outputs for testing and analysis.
The main goals of TRACER are:
- **Workflow Modeling:** Model the user's journey through the chatbot as a directed graph, capturing sequential dependencies, branching logic, and optional steps, and adapting the modeling approach based on whether the chatbot is primarily transactional or informational.
- **Profile Generation:** Generate standardized YAML user profiles based on discovered functionalities and workflows, suitable for [Sensei](https://github.com/satori-chatbots/user-simulator).
## 2. Core Functionality
The system follows a multi-phase approach implemented via a LangGraph structure:
1. **Chatbot Interaction:** Connect to and converse with target chatbots (initially Taskyto, Ada-UAM) via provided connectors.
2. **Session Preparation:** Before starting the conversations, the chatbot's language and fallback message are detected by sending a few deliberately confusing messages.
3. **Exploration Sessions:**
   1. Conduct multiple conversational sessions to probe different aspects of the chatbot.
   2. If a fallback is received during a conversation, the LLM rephrases the message; if the fallback is received again, the topic is changed.
   3. After each conversation, the LLM tries to extract functionalities so they can be explored further in subsequent sessions.
4. **Bot Classification:** After all sessions have run, the conversations and the discovered functionalities are passed to an LLM, which determines whether the chatbot is transactional or informational.
   - **Transactional:** chatbots that allow you to perform actions, such as booking a flight or ordering food.
   - **Informational:** chatbots that provide information, such as answering questions or providing customer support.
   - The detected chatbot type is stored in the application state for use in subsequent processing steps.
5. **Functionality Analysis (LLM-based):** A different prompt is used depending on the chatbot's type. In this phase the LLM receives the conversations and the extracted functionalities, merges duplicate functionalities, possibly identifies new ones, and infers relationships between them. The output is a structured representation of the discovered functionalities, including parent/child relationships and unique root nodes.
   - **Transactional:** The LLM looks for sequential dependencies, branching logic, and optional steps.
   - **Informational:** The LLM looks for independent topics and creates a separate root node for each topic.
6. **Profile Generation (LLM-based):** Once the functionalities have been identified and the workflow built, the LLM creates the Sensei profiles based on these findings. This is done in stages, with different prompts generating the goals, context, parameters, and so on.
7. **YAML Validation & Correction:** Validate the generated YAML profiles with a script; any error found is passed back to the LLM for correction.
8. **Output Generation:**
   - Save validated YAML profiles to disk.
   - Generate a text report (`report.txt`).
   - Generate a visual workflow graph (`workflow_graph.png`) using Graphviz.
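Taken together, these phases run from a single command. The invocation below is an illustrative sketch (flag values are arbitrary) that explores a locally hosted Taskyto bot and produces the outputs listed in step 8; `-v` lets you follow the conversations as they happen.
```bash
# Illustrative end-to-end run: session preparation, exploration sessions,
# classification, functionality analysis, profile generation, and output generation.
TRACER -t taskyto -u http://localhost:5000 -s 3 -n 8 -v -o output
```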
## 3. Workflow Graph Generation
One of the main outputs of this tool is a visual graph (`workflow_graph.png`) showing how users interact with the chatbot. Although the primary focus of the tool is generating profiles, the graph was added to help visualize the discovered functionalities and their relationships.
As explained above, the system uses different approaches for transactional and informational chatbots.
**Example Desired Flow (Transactional - Pizza Bot):**
The goal is to capture flows like this: a user starts and sees the menu items, then either selects a predefined pizza or customizes one, then orders drinks, and finally the chatbot confirms the order.
```mermaid
graph LR
Start((•)) --> F
Start --> A
Start --> E
A[provide opening hours]
E[provide price info]
F[provide menu items] --> G[order predefined pizza
Params: pizza_type];
F --> H[order custom pizza
Params: toppings, size];
G --> I[order drinks
Params: drink_type];
H --> I;
I --> D[provide order information];
```
**Example Desired Flow (Informational - Ada-UAM Bot):**
For an informational bot, the goal is to represent the different topics the user can inquire about independently. There are typically no required sequences between these topics. The structuring logic should default to creating separate root nodes.
```mermaid
graph LR
Start((•)) --> A
Start --> B
Start --> C
Start --> D
Start --> E
Start --> F
Start --> G
A[provide_contact_info];
B[provide_opening_hours];
C[explain_service_catalog];
D[explain_ticketing_process];
E[explain_wifi_setup];
F[explain_software_access];
G[handle_unclear_request];
```
> Note: The Mermaid diagrams above are illustrative of the desired logical flow. The actual implementation uses Graphviz.
## 4. Usage
Execution:
```bash
TRACER --help
```
### Arguments
All arguments are optional.
- `-s, --sessions`: Number of exploration sessions (default: 3).
- `-n, --turns`: Maximum turns per session (default: 8).
- `-t, --technology`: Chatbot technology connector to use (default: `taskyto`). See available technologies below.
- `-u, --url`: Chatbot URL (default: `http://localhost:5000`). Only necessary for technologies like `taskyto` that require an explicit endpoint. Others may have the URL embedded in their connector.
- `-m, --model`: Model for exploration (default: `gpt-4o-mini`). Supports both OpenAI models (e.g., `gpt-4o`) and Google Gemini models (e.g., `gemini-2.0-flash`). **Recommended**: Use a more powerful model (e.g., `gpt-4o`) for better exploration quality.
- `-pm, --profile-model`: Model for profile generation (default: same as exploration model). Supports both OpenAI models (e.g., `gpt-4o`) and Google Gemini models (e.g., `gemini-2.0-flash`). **Recommended**: Use a cheaper model (e.g., `gpt-4o-mini`) for cost optimization.
- `-o, --output`: Output directory for generated files (default: `output`).
- `-v` or `-vv`: Verbosity level; with no flag only key information is shown, `-v` also shows the conversations, and `-vv` shows debug information.
- `-gfs`, `--graph-font-size`: Font size for the graph.
- `-c`, `--compact`: Compact mode for the graph.
- `-td`, `--top-down`: Top-down layout for the graph.
- `--graph-format`: Export format for the graph (choices: pdf, png, svg, all). Default is pdf. Use 'all' to export in all formats.
- `-nf`, `--nested-forward`: Nest all variables. This creates more exhaustive profiles, but the number of conversations grows exponentially.
- `-h, --help`: Show help message and exit.
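As an example of combining the graph-related flags above, the following invocation (values are illustrative) renders a compact, top-down graph with a larger font in every supported format:
```bash
TRACER -t taskyto -o output/taskyto \
  --compact --top-down --graph-font-size 14 --graph-format all
```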
### Supported Chatbot Technologies
- `taskyto`: Custom chatbot framework (requires self-hosting and initialization).
- `ada-uam`: MillionBot instance for Universidad Autónoma de Madrid (UAM).
### Environment Variables
- For OpenAI models: Set the `OPENAI_API_KEY` environment variable with your API key.
- For Gemini models: Set the `GOOGLE_API_KEY` environment variable with your API key from Google.
## 5. Input/Output
- **Input:**
  - Command-line arguments (see Usage).
  - Target chatbot accessible via its connector/URL.
- **Output (in the specified `--output` directory, organized by technology):**
  - Multiple `.yaml` files (one per generated user profile).
  - `report.txt` (structured text report summarizing findings).
  - `workflow_graph.png` (visual graph representation of the interaction flow).
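For orientation, a quick way to inspect the generated artifacts after a run with `-t taskyto -o output` is shown below; the exact layout and file names depend on the technology and options used, so treat them as illustrative.
```bash
ls -R output/taskyto
#   profile_*.yaml       one YAML file per generated user profile
#   report.txt           structured text report
#   workflow_graph.png   workflow graph rendered with Graphviz
```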
### Example Commands
```bash
# Using OpenAI models
TRACER -t ada-uam -n 8 -s 12 -o generated_profiles/ada-uam -m gpt-4o-mini
# Using Gemini models
TRACER -t taskyto -n 10 -s 5 -o generated_profiles/taskyto -m gemini-2.0-flash
# Using different models for exploration and profile generation (recommended)
TRACER -t ada-uam -n 8 -s 12 -m gpt-4o -pm gpt-4o-mini -o generated_profiles/ada-uam
# Mix different model providers (explore with OpenAI, generate profiles with Gemini)
TRACER -t taskyto -m gpt-4o -pm gemini-2.0-flash -o generated_profiles/taskyto
```
## Development & Contributing
If you want to contribute to TRACER or work on its development, follow these steps:
1. Ensure Python 3.11+ and Graphviz are installed.
2. Clone the repository:
   ```bash
   git clone https://github.com/Chatbot-TRACER/TRACER.git
   cd TRACER
   ```
3. Install dependencies (choose one):
   ```bash
   pip install -r requirements.txt
   # or, using uv (recommended for speed)
   uv sync
   ```
4. (Optional, for CLI development) Install in editable mode to use the `tracer` command:
   ```bash
   pip install -e .
   ```
5. Make sure the required environment variables are set for OpenAI or Google Gemini models:
   ```bash
   export OPENAI_API_KEY=your_openai_api_key
   export GOOGLE_API_KEY=your_google_api_key
   ```
To run TRACER in development mode:
- As a module (recommended for development):
  ```bash
  python -m tracer.main --help
  ```
- Or, if installed in editable mode, use the CLI:
  ```bash
  tracer --help
  ```