# fal.ai Python client
This is a Python client library for interacting with ML models deployed on [fal.ai](https://fal.ai).
## Getting started
To install the client, run:
```bash
pip install fal-client
```
To use the client, you need to have an API key. You can get one by signing up at [fal.ai](https://fal.ai). Once you have it, set
it as an environment variable:
```bash
export FAL_KEY=your-api-key
```
Now you can use the client to interact with your models. Here's an example of how to use it:
```python
import fal_client

response = fal_client.run("fal-ai/fast-sdxl", arguments={"prompt": "a cute cat, realistic, orange"})
print(response["images"][0]["url"])
```
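`run` blocks until the model finishes and returns the parsed response as a dictionary. The exact schema depends on the model; `fal-ai/fast-sdxl` returns a list of generated images with URLs, so a small script can download the result with just the standard library. A minimal sketch (the output filename is only an example):

```python
import urllib.request

import fal_client

response = fal_client.run(
    "fal-ai/fast-sdxl",
    arguments={"prompt": "a cute cat, realistic, orange"},
)

# fast-sdxl returns a list of generated images; grab the first one and
# save it locally. The file name here is arbitrary.
image_url = response["images"][0]["url"]
urllib.request.urlretrieve(image_url, "cat.png")
print(f"Saved {image_url} to cat.png")
```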
## Asynchronous requests
The client also supports asynchronous requests out of the box. Here's an example:
```python
import asyncio
import fal_client

async def main():
    response = await fal_client.run_async("fal-ai/fast-sdxl", arguments={"prompt": "a cute cat, realistic, orange"})
    print(response["images"][0]["url"])

asyncio.run(main())
```
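Because `run_async` is an ordinary coroutine, you can fan out several requests concurrently with standard `asyncio` tooling. A minimal sketch using `asyncio.gather` (the prompts are just examples):

```python
import asyncio

import fal_client

async def generate(prompt: str) -> str:
    response = await fal_client.run_async(
        "fal-ai/fast-sdxl",
        arguments={"prompt": prompt},
    )
    return response["images"][0]["url"]

async def main():
    # Run several generations concurrently instead of one after another.
    prompts = ["a cute cat, realistic, orange", "a cute dog, realistic, brown"]
    urls = await asyncio.gather(*(generate(p) for p in prompts))
    for prompt, url in zip(prompts, urls):
        print(prompt, "->", url)

asyncio.run(main())
```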
## Uploading files
If the model requires files as input, you can upload them directly to fal.media (our CDN) and pass the URLs to the client. Here's an example:
```python
import fal_client

audio_url = fal_client.upload_file("path/to/audio.wav")
response = fal_client.run("fal-ai/whisper", arguments={"audio_url": audio_url})
print(response["text"])
```
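Inside async code you can upload without blocking the event loop. The sketch below assumes the `upload_file_async` coroutine that current versions of the client ship alongside `upload_file`:

```python
import asyncio

import fal_client

async def main():
    # Assumes fal_client.upload_file_async, the coroutine counterpart of
    # upload_file: upload first, then pass the resulting URL to the model.
    audio_url = await fal_client.upload_file_async("path/to/audio.wav")
    response = await fal_client.run_async("fal-ai/whisper", arguments={"audio_url": audio_url})
    print(response["text"])

asyncio.run(main())
```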
## Encoding files as in-memory data URLs
If you don't want to upload your file to our CDN service (for latency reasons, for example), you can encode it as a data URL and pass it directly to the client. Here's an example:
```python
import fal_client

audio_data_url = fal_client.encode_file("path/to/audio.wav")
response = fal_client.run("fal-ai/whisper", arguments={"audio_url": audio_data_url})
print(response["text"])
```
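A data URL embeds the file contents (base64-encoded, together with a MIME type) directly in the request payload, so nothing is stored on the CDN; the trade-off is a larger request body, which makes this best suited to small files. Conceptually it is equivalent to building the URL yourself with the standard library, as in this sketch (the `to_data_url` helper is hypothetical, shown only to illustrate what `encode_file` produces):

```python
import base64
import mimetypes

def to_data_url(path: str) -> str:
    # Guess the MIME type from the file extension, falling back to a
    # generic binary type if it cannot be determined.
    content_type = mimetypes.guess_type(path)[0] or "application/octet-stream"
    with open(path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return f"data:{content_type};base64,{encoded}"

audio_data_url = to_data_url("path/to/audio.wav")
```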
## Queuing requests
When you want to send a request and keep receiving updates on its status, you can use the `submit` method (`submit_async` in asynchronous code). Here's an example:
```python
import asyncio
import fal_client

async def main():
    response = await fal_client.submit_async("fal-ai/fast-sdxl", arguments={"prompt": "a cute cat, realistic, orange"})

    logs_index = 0
    async for event in response.iter_events(with_logs=True):
        if isinstance(event, fal_client.Queued):
            print("Queued. Position:", event.position)
        elif isinstance(event, (fal_client.InProgress, fal_client.Completed)):
            new_logs = event.logs[logs_index:]
            for log in new_logs:
                print(log["message"])
            logs_index = len(event.logs)

    result = await response.get()
    print(result["images"][0]["url"])

asyncio.run(main())
```
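The handle returned by `submit_async` also carries a `request_id`, so you do not have to keep the handle (or the process) alive for the lifetime of the request. The sketch below assumes the module-level `status_async` and `result_async` helpers available in current versions of the client to poll for completion and fetch the result later by id:

```python
import asyncio

import fal_client

async def main():
    handle = await fal_client.submit_async(
        "fal-ai/fast-sdxl",
        arguments={"prompt": "a cute cat, realistic, orange"},
    )
    request_id = handle.request_id  # persist this to look the request up later

    # Poll until the request has completed (assumes fal_client.status_async
    # and the Completed status class; adjust to the client version you use).
    while True:
        status = await fal_client.status_async("fal-ai/fast-sdxl", request_id)
        if isinstance(status, fal_client.Completed):
            break
        await asyncio.sleep(1)

    result = await fal_client.result_async("fal-ai/fast-sdxl", request_id)
    print(result["images"][0]["url"])

asyncio.run(main())
```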