[![PyPI version](https://badge.fury.io/py/gravixlayer.svg)](https://badge.fury.io/py/gravixlayer)
[![Python 3.7+](https://img.shields.io/badge/python-3.7+-blue.svg)](https://www.python.org/downloads/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
The official Python SDK for the [GravixLayer API](https://gravixlayer.com). This library provides convenient access to the GravixLayer REST API from any Python 3.7+ application, includes type definitions for all request parameters and response fields, and offers both synchronous and asynchronous clients powered by [httpx](https://github.com/encode/httpx).
## Installation
### PyPI
```bash
pip install gravixlayer
```
## Quick Start
The GravixLayer Python SDK is designed to be compatible with OpenAI's interface, making it easy to switch between providers.
### Synchronous Usage
```python
import os
from gravixlayer import GravixLayer
client = GravixLayer(api_key=os.environ.get("GRAVIXLAYER_API_KEY"))
completion = client.chat.completions.create(
    model="llama3.1:8b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What are the three most popular programming languages?"}
    ]
)
print(completion.choices[0].message.content)
```
### Asynchronous Usage
```python
import asyncio
import os
from gravixlayer import AsyncGravixLayer
async def main():
    client = AsyncGravixLayer(api_key=os.environ.get("GRAVIXLAYER_API_KEY"))

    completion = await client.chat.completions.create(
        model="llama3.1:8b",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "What's the capital of France?"}
        ]
    )

    print(completion.choices[0].message.content)

asyncio.run(main())
```
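Because the async client is non-blocking, several requests can be issued concurrently. The following is a minimal sketch of fanning out prompts with `asyncio.gather`; the `ask` helper, the prompts, and the model name are illustrative:

```python
import asyncio
import os
from gravixlayer import AsyncGravixLayer

async def ask(client, prompt):
    # One chat completion per prompt (illustrative helper).
    completion = await client.chat.completions.create(
        model="llama3.1:8b",
        messages=[{"role": "user", "content": prompt}]
    )
    return completion.choices[0].message.content

async def main():
    client = AsyncGravixLayer(api_key=os.environ.get("GRAVIXLAYER_API_KEY"))
    prompts = ["What is Python?", "What is httpx?", "What is asyncio?"]
    # Run all requests concurrently instead of one at a time.
    answers = await asyncio.gather(*(ask(client, p) for p in prompts))
    for prompt, answer in zip(prompts, answers):
        print(f"{prompt}\n{answer}\n")

asyncio.run(main())
```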
## API Reference
### Chat Completions
Create chat completions with various models available on GravixLayer.
```python
completion = client.chat.completions.create(
    model="llama3.1:8b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me a fun fact about space"}
    ],
    temperature=0.7,
    max_tokens=150,
    top_p=1.0,
    frequency_penalty=0,
    presence_penalty=0,
    stop=None,
    stream=False
)
print(completion.choices[0].message.content)
```
#### Available Parameters
| Parameter | Type | Description |
|-----------|------|-------------|
| `model` | `str` | Model to use for completion |
| `messages` | `List[Dict]` | List of messages in the conversation |
| `temperature` | `float` | Controls randomness (0.0 to 2.0) |
| `max_tokens` | `int` | Maximum number of tokens to generate |
| `top_p` | `float` | Nucleus sampling parameter |
| `frequency_penalty` | `float` | Penalizes tokens in proportion to how often they have already appeared, reducing repetition |
| `presence_penalty` | `float` | Penalizes tokens that have already appeared at all, encouraging new topics |
| `stop` | `str \| List[str]` | Stop sequences |
| `stream` | `bool` | Enable streaming responses |
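Because `messages` carries the whole conversation, multi-turn chat works by appending each assistant reply to the history before the next request. A minimal sketch, reusing the `client` from the Quick Start (the prompts are illustrative):

```python
messages = [{"role": "system", "content": "You are a helpful assistant."}]

for user_input in ["What is nucleus sampling?", "How does it differ from temperature?"]:
    messages.append({"role": "user", "content": user_input})
    completion = client.chat.completions.create(
        model="llama3.1:8b",
        messages=messages,
        temperature=0.7,
        max_tokens=200
    )
    reply = completion.choices[0].message.content
    print(reply)
    # Keep the assistant's reply in the history so the next turn has context.
    messages.append({"role": "assistant", "content": reply})
```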
### Streaming Responses
Stream responses in real time for a more responsive user experience:
```python
stream = client.chat.completions.create(
    model="llama3.1:8b",
    messages=[
        {"role": "user", "content": "Tell me about the Eiffel Tower"}
    ],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="", flush=True)
```
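If you also need the complete response after streaming finishes, collect the chunks as they arrive. A small sketch under the same assumptions as the example above:

```python
stream = client.chat.completions.create(
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "Tell me about the Eiffel Tower"}],
    stream=True
)

collected = []
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta is not None:
        collected.append(delta)
        print(delta, end="", flush=True)

# The chunks joined together form the complete assistant message.
full_text = "".join(collected)
```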
#### Async Streaming
```python
async def stream_chat():
    client = AsyncGravixLayer(api_key="your_api_key")

    # With the async client, create() must be awaited; with stream=True it
    # yields an async iterator of chunks.
    stream = await client.chat.completions.create(
        model="llama3.1:8b",
        messages=[{"role": "user", "content": "Tell me about Python"}],
        stream=True
    )

    async for chunk in stream:
        if chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)

asyncio.run(stream_chat())
```
### Error Handling
The SDK exposes typed exception classes so failures can be handled precisely:
```python
from gravixlayer import GravixLayer
from gravixlayer.types.exceptions import (
    GravixLayerError,
    GravixLayerAuthenticationError,
    GravixLayerRateLimitError,
    GravixLayerBadRequestError
)

client = GravixLayer(api_key="your_api_key")

try:
    completion = client.chat.completions.create(
        model="llama3.1:8b",
        messages=[{"role": "user", "content": "Hello!"}]
    )
except GravixLayerAuthenticationError:
    print("Invalid API key")
except GravixLayerRateLimitError:
    print("Rate limit exceeded")
except GravixLayerBadRequestError as e:
    print(f"Bad request: {e}")
except GravixLayerError as e:
    print(f"API error: {e}")
```
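Rate-limit errors are usually transient, so a simple retry with exponential backoff is often enough. A minimal sketch using the exception types and `client` from above; the attempt count and delays are illustrative:

```python
import time

def create_with_retry(client, max_attempts=3, **kwargs):
    for attempt in range(1, max_attempts + 1):
        try:
            return client.chat.completions.create(**kwargs)
        except GravixLayerRateLimitError:
            if attempt == max_attempts:
                raise
            # Back off 1s, 2s, 4s, ... before retrying.
            time.sleep(2 ** (attempt - 1))

completion = create_with_retry(
    client,
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "Hello!"}]
)
```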
### Command Line Interface
The SDK includes a CLI for quick testing:
```bash
# Basic chat completion
python -m gravixlayer.cli --model "llama3.1:8b" --user "Hello, how are you?"
# Streaming response
python -m gravixlayer.cli --model "llama3.1:8b" --user "Tell me a story" --stream
# With system message
python -m gravixlayer.cli --model "llama3.1:8b" --system "You are a poet" --user "Write a haiku"
```
## Configuration
### API Key
Set your API key using environment variables:
#### Set API key (Linux/macOS)
```bash
export GRAVIXLAYER_API_KEY="your_api_key_here"
```
or
#### Set API key (Windows PowerShell)
```powershell
$env:GRAVIXLAYER_API_KEY="your_api_key_here"
```
Or pass it directly when initializing the client:
```python
client = GravixLayer(api_key="your_api_key_here")
```
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Changelog
See [CHANGELOG.md](CHANGELOG.md) for a detailed history of changes.