# Basalt SDK
Basalt is a powerful tool for managing AI prompts and their release workflows, and for monitoring AI applications. This SDK is the official Python package for interacting with your Basalt prompts and monitoring your AI applications.
## Installation
Install the Basalt SDK via pip:
```bash
pip install basalt-sdk
```
## Usage
### Importing and Initializing the SDK
To get started, import the `Basalt` class and initialize it with your API key:
```python
from basalt import Basalt

# Basic initialization with API key
basalt = Basalt(api_key="my-dev-api-key")

# Specify a log_level
basalt = Basalt(api_key="my-dev-api-key", log_level="debug")

# Or with an environment variable
import os
basalt = Basalt(api_key=os.getenv("BASALT_API_KEY"))
```
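If you read the key from an environment variable, it can be worth failing fast when the variable is missing rather than letting a later API call fail. A minimal sketch, assuming the `BASALT_API_KEY` variable from the example above; the explicit check is our own addition, not something the SDK requires:

```python
import os

from basalt import Basalt

api_key = os.getenv("BASALT_API_KEY")
if not api_key:
    # Surface a clear configuration error instead of a failed API call later on
    raise RuntimeError("BASALT_API_KEY is not set")

basalt = Basalt(api_key=api_key)
```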
## Prompt SDK
The Prompt SDK allows you to interact with your Basalt prompts.
For a complete working example, check out our [Prompt SDK Demo Notebook](./examples/prompt_sdk_demo.ipynb).
### Available Methods
#### Prompts
Your Basalt instance exposes a `prompt` property for interacting with your Basalt prompts:
- **Get a Prompt**
Retrieve a specific prompt by its slug, with optional `tag` and `version` filters. Without a tag or version, the production version of your prompt is selected by default.
**Example Usage:**
```python
error, result = basalt.prompt.get('prompt-slug')

# With optional tag or version parameters
error, result = basalt.prompt.get(slug='prompt-slug', tag='latest')
error, result = basalt.prompt.get(slug='prompt-slug', version='1.0.0')

# If your prompt has variables,
# pass them when fetching your prompt
error, result = basalt.prompt.get(slug='prompt-slug', variables={'name': 'John Doe'})

# Handle the result by unwrapping the error / value
if error:
    print('Could not fetch prompt', error)
else:
    # Use the prompt with your AI provider of choice
    # Example: OpenAI
    openai_client.chat.completions.create(
        model='gpt-4',
        messages=[{'role': 'user', 'content': result.prompt}]
    )
```
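Because `get` returns an `(error, result)` tuple instead of raising, it can be convenient to wrap it in a small helper that raises on failure and returns the prompt text directly. A minimal sketch, assuming the `basalt` instance from above; the `fetch_prompt_text` helper and its exception are our own addition, not part of the SDK:

```python
def fetch_prompt_text(slug: str, **kwargs) -> str:
    """Fetch a prompt by slug and raise if Basalt returns an error."""
    error, result = basalt.prompt.get(slug=slug, **kwargs)
    if error:
        raise RuntimeError(f"Could not fetch prompt '{slug}': {error}")
    return result.prompt

# Usage: fetch the production version with a variable filled in
text = fetch_prompt_text('prompt-slug', variables={'name': 'John Doe'})
```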
## Monitor SDK
The Monitor SDK allows you to track and monitor your AI application's execution through traces, logs, and generations.
For a complete working example, check out our [Monitor SDK Demo Notebook](./examples/monitor_sdk_demo.ipynb).
### Creating a Trace
A trace represents a complete execution flow in your application:
```python
# Create a trace
trace = basalt.monitor.create_trace(
    "slug",  # Chain slug - identifies this type of workflow
    {
        "input": "What are the benefits of AI in healthcare?",
        "user": {"id": "user123", "name": "John Doe"},
        "organization": {"id": "org123", "name": "Healthcare Inc"},
        "metadata": {"source": "web", "priority": "high"}
    }
)
```
### Adding Logs to a Trace
Logs represent individual steps or operations within a trace:
```python
# Create a log for content moderation
moderation_log = trace.create_log({
    "type": "span",
    "name": "content-moderation",
    "input": trace.input,
    "metadata": {"model": "text-moderation-latest"},
    "user": {"id": "user123", "name": "John Doe"},
    "organization": {"id": "org123", "name": "Healthcare Inc"}
})

# Update and end the log
moderation_log.update({"metadata": {"completed": True}})
moderation_log.end({"flagged": False, "categories": [], "scores": {}})
```
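Since a span stays open until `end()` is called, one option is to guard the work so the log is always closed, even when the step raises. A minimal sketch reusing the moderation log above; `moderate()` is a hypothetical placeholder for your own moderation call, not part of the SDK:

```python
moderation_log = trace.create_log({
    "type": "span",
    "name": "content-moderation",
    "input": trace.input
})

try:
    # moderate() stands in for your own moderation logic (not part of the SDK)
    verdict = moderate(trace.input)
    moderation_log.end(verdict)
except Exception as exc:
    # Close the span with an error marker so the trace is never left dangling
    moderation_log.update({"metadata": {"error": str(exc)}})
    moderation_log.end("moderation failed")
    raise
```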
### Creating and Managing Generations
Generations are special types of logs specifically for AI model interactions:
```python
# Create a log for the main processing
main_log = trace.create_log({
    "type": "span",
    "name": "main-processing",
    "user": {"id": "user123", "name": "John Doe"},
    "organization": {"id": "org123", "name": "Healthcare Inc"},
    "input": trace.input
})

# Create a generation within the main log using a prompt from Basalt
generation = main_log.create_generation({
    "name": "healthcare-benefits-generation",
    "input": trace.input,
    "prompt": {
        "slug": "prompt-slug",  # This tells the SDK to fetch the prompt from Basalt
        "version": "0.1"  # This specifies the version to use
    }
})

# Or create a generation not managed in Basalt
generation = main_log.create_generation({
    "name": "healthcare-benefits-generation",
    "user": {"id": "user123", "name": "John Doe"},
    "organization": {"id": "org123", "name": "Healthcare Inc"},
    "input": trace.input
})

# End the generation with the response
generation.end("AI generated response")

# End the log and trace
main_log.end("Final output")
trace.end("End of trace")
```
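The Prompt and Monitor SDKs can be combined: fetch the prompt text, reference the same prompt on the generation, call your model, then close everything with the real outputs. A minimal end-to-end sketch under the assumptions above; the OpenAI client and the model call are illustrative, not part of Basalt:

```python
from openai import OpenAI

openai_client = OpenAI()  # illustrative provider client, not part of Basalt

# Fetch the prompt text from Basalt
error, result = basalt.prompt.get(slug="prompt-slug", variables={"name": "John Doe"})
if error:
    raise RuntimeError(error)

# Record the model call as a generation that references the same Basalt prompt
generation = main_log.create_generation({
    "name": "healthcare-benefits-generation",
    "input": trace.input,
    "prompt": {"slug": "prompt-slug", "version": "0.1"}
})

# Call the model with the fetched prompt text (illustrative)
response = openai_client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": result.prompt}]
)
answer = response.choices[0].message.content

# Close the generation, the log and the trace with the final outputs
generation.end(answer)
main_log.end(answer)
trace.end(answer)
```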
### Complex Workflows with Nested Logs
You can create complex workflows with nested logs and multiple generations:
```python
# Create a nested log
nested_log = parent_log.create_log({
    "type": "span",
    "name": "nested-process",
    "metadata": {"key": "value"},
    "input": parent_log.input
})

# Create generations within nested logs
nested_generation = nested_log.create_generation({
    "name": "nested-generation",
    "input": nested_log.input,
    "prompt": {"slug": "another-prompt", "version": "0.1"},
    "variables": {"variable_example": "test variable"}
})

# End all logs in reverse order
nested_generation.end("Generation output")
nested_log.end("Nested log output")
parent_log.end("Parent log output")
trace.end("End of trace")
```
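Because nested spans must be closed innermost-first, one way to make that automatic is a small context manager that ends each log on exit, including when an exception propagates. A minimal sketch; the `step` helper is our own addition, not part of the SDK:

```python
from contextlib import contextmanager

@contextmanager
def step(parent, name, **params):
    """Open a child span on `parent` and guarantee it is ended on exit."""
    log = parent.create_log({"type": "span", "name": name, **params})
    try:
        yield log
        log.end("done")  # default output on success
    except Exception as exc:
        log.end(f"failed: {exc}")  # close the span even on errors
        raise

# Usage: spans are closed automatically, innermost first
with step(trace, "parent-process", input=trace.input) as parent_log:
    with step(parent_log, "nested-process", input=parent_log.input) as nested_log:
        nested_log.update({"metadata": {"key": "value"}})
trace.end("End of trace")
```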
## License
MIT