# zenoml-next

- **Version**: 0.6.8
- **Home page**: https://github.com/trojblue
- **Summary**: Upkeeping the now-archived Zeno, the AI Data Management & Evaluation Platform
- **Upload time**: 2024-12-21 21:36:55
- **Author**: Ángel Alexander Cabrera
- **Requires Python**: <4.0, >=3.8
- **License**: MIT
- **Keywords**: ml, testing, evaluation, machine learning, ai
# ZenoML-Next

This fork of ZenoML aims to maintain the existing functionality while adding small, practical features for personal use.

## Features

- **New entry point**: `zeno_next()` simplifies notebook usage with a fixed set of inputs (DataFrame, data column, mount path, port).
- **S3 URI support**: the image column can contain S3 URIs instead of local paths (requires `aws configure` on the host machine).

### Example: Using `zeno_next`

```python
import pandas as pd
from zeno import zeno_next

df = pd.read_parquet("tmp_df.parquet")
print(len(df))

# Use local paths:
# data_column = "filename"
# MOUNT_PATH = "/root/danbooru_images"

# Use S3 URI:
data_column = "s3_uri"
MOUNT_PATH = ""

ZENO_PORT = 8019

# Clear the cache if needed:
# !rm -rf .zeno_cache/
zeno_next(df, data_column, MOUNT_PATH, ZENO_PORT)
```
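The parquet file above is just a pandas frame whose `s3_uri` column holds full S3 object URIs. A minimal sketch of producing one, with made-up bucket and key names:

```python
import pandas as pd

# Hypothetical metadata frame: each row points at an image object in S3.
# The bucket and keys below are placeholders, not real resources.
df = pd.DataFrame({
    "s3_uri": [
        "s3://my-bucket/images/0001.png",
        "s3://my-bucket/images/0002.png",
    ],
    "tag": ["cat", "dog"],
})
df.to_parquet("tmp_df.parquet")
```

Because the URIs are absolute, `MOUNT_PATH` stays empty and credentials come from the host's `aws configure` profile.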


## Upkeep

- Bumped `pydantic` to version 2.x (see the migration sketch below).
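The v2 bump matters for any downstream code that touched Zeno's pydantic models, since pydantic 2.x renames several core methods. A minimal sketch of the v1 → v2 pattern, using a hypothetical `SliceSpec` model (not a real Zeno class):

```python
from pydantic import BaseModel, field_validator

class SliceSpec(BaseModel):
    # Hypothetical model for illustration; not part of Zeno's API.
    name: str
    size: int

    # pydantic 2.x: @field_validator replaces v1's @validator
    @field_validator("size")
    @classmethod
    def size_nonnegative(cls, v: int) -> int:
        if v < 0:
            raise ValueError("size must be >= 0")
        return v

spec = SliceSpec(name="all", size=10)
print(spec.model_dump())  # v2 replacement for v1's .dict()
print(SliceSpec.model_validate({"name": "all", "size": 3}))  # replaces .parse_obj()
```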

## Building the Repo

```bash
# see DEVELOPMENT.md for repo setup
python -m pip install build twine
python -m build
twine check dist/*
twine upload dist/*
```

<br>
<br>
<br>


## Original README


<img src="https://zenoml.com/img/zeno.png" width="250px"/>

[![PyPI version](https://badge.fury.io/py/zenoml-next.svg)](https://badge.fury.io/py/zenoml-next)
[![MIT license](https://img.shields.io/badge/License-MIT-blue.svg)](https://lbesson.mit-license.org/)
[![DOI](https://img.shields.io/badge/doi-10.1145%2F3544548.3581268-red)](https://cabreraalex.com/paper/zeno)

Zeno is a general-purpose framework for evaluating machine learning models.
It combines a **Python API** with an **interactive UI** to allow users to discover, explore, and analyze the performance of their models across diverse use cases.
Zeno can be used for any data type or task with [modular views](https://zenoml.com/docs/views/) for everything from object detection to audio transcription.

### Demos

|                                    **Image Classification**                                     |                                         **Audio Transcription**                                          |                                       **Image Generation**                                       |                                        **Dataset Chatbot**                                        |                                       **Sensor Classification**                                        |
| :---------------------------------------------------------------------------------------------: | :------------------------------------------------------------------------------------------------------: | :----------------------------------------------------------------------------------------------: | :-----------------------------------------------------------------------------------------------: | :----------------------------------------------------------------------------------------------------: |
|                                           Imagenette                                            |                                          Speech Accent Archive                                           |                                           DiffusionDB                                            |                                        LangChain + Notion                                         |                                              MotionSense                                               |
| [![Try with Zeno](https://zenoml.com/img/zeno-badge.svg)](https://zeno-ml-imagenette.hf.space/) | [![Try with Zeno](https://zenoml.com/img/zeno-badge.svg)](https://zeno-ml-audio-transcription.hf.space/) | [![Try with Zeno](https://zenoml.com/img/zeno-badge.svg)](https://zeno-ml-diffusiondb.hf.space/) | [![Try with Zeno](https://zenoml.com/img/zeno-badge.svg)](https://zeno-ml-langchain-qa.hf.space/) | [![Try with Zeno](https://zenoml.com/img/zeno-badge.svg)](https://zeno-ml-imu-classification.hf.space) |
|               [code](https://huggingface.co/spaces/zeno-ml/imagenette/tree/main)                |               [code](https://huggingface.co/spaces/zeno-ml/audio-transcription/tree/main)                |               [code](https://huggingface.co/spaces/zeno-ml/diffusiondb/tree/main)                |            [code](https://huggingface.co/spaces/zeno-ml/audio-transcription/tree/main)            |               [code](https://huggingface.co/spaces/zeno-ml/imu-classification/tree/main)               |

<br />

https://user-images.githubusercontent.com/4563691/220689691-1ad7c184-02db-4615-b5ac-f52b8d5b8ea3.mp4

## Quickstart

Install the Zeno Python package from PyPI:

```bash
pip install zenoml
```

### Command Line

To get started, run the following command to initialize a Zeno project. It will walk you through creating the `zeno.toml` configuration file:

```bash
zeno init
```

Take a look at the [configuration documentation](https://zenoml.com/docs/configuration) for additional `toml` file options like adding model functions.

Start Zeno with `zeno zeno.toml`.
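For reference, a minimal `zeno.toml` might look like the sketch below. The keys mirror the Python options shown in the Jupyter example further down; treat this as an illustrative assumption, not a verified template:

```toml
# Hypothetical zeno.toml sketch; see the configuration docs for the real schema.
view = "audio-transcription"   # the view for this data/task
metadata = "metadata.csv"      # file with a row for each instance
data_path = "./data"           # folder with raw data (images, audio, etc.)
data_column = "id"             # column holding paths relative to data_path
```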

### Jupyter Notebook

You can also run Zeno directly from Jupyter Notebook or JupyterLab. The `zeno` function takes a dictionary of configuration options as input. See [the docs](https://zenoml.com/docs/configuration) for a full list of options. In this example we pass the minimum options for exploring a non-tabular dataset:

```python
import pandas as pd
from zeno import zeno

df = pd.read_csv("/path/to/metadata/file.csv")

zeno({
    "metadata": df, # Pandas DataFrame with a row for each instance
    "view": "audio-transcription", # The type of view for this data/task
    "data_path": "/path/to/raw/data/", # The folder with raw data (images, audio, etc.)
    "data_column": "id" # The column in the metadata file that contains the relative paths of files in data_path
})

```

You can pass a list of decorated function references directly to Zeno as you add models and metrics.
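A rough sketch of that decorator pattern follows. The `model` and `metric` decorator names come from Zeno's API, but the function signatures and return shapes here are simplified assumptions; consult the docs for the real interface:

```python
import pandas as pd
from zeno import metric, model  # Zeno decorators; signatures below are assumptions

@model
def load_model(model_path: str):
    # A model function returns a predict callable over the metadata frame.
    def predict(df: pd.DataFrame, ops):
        return ["" for _ in range(len(df))]  # stub: one output per row
    return predict

@metric
def avg_output_length(df: pd.DataFrame, ops):
    # A metric function reduces a (possibly sliced) frame to a single number.
    return df["output"].str.len().mean()
```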

## Citation

Please cite our [CHI'23 paper](https://arxiv.org/pdf/2302.04732.pdf):

```bibtex
@inproceedings{cabrera23zeno,
  author = {Cabrera, Ángel Alexander and Fu, Erica and Bertucci, Donald and Holstein, Kenneth and Talwalkar, Ameet and Hong, Jason I. and Perer, Adam},
  title = {Zeno: An Interactive Framework for Behavioral Evaluation of Machine Learning},
  year = {2023},
  isbn = {978-1-4503-9421-5/23/04},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3544548.3581268},
  doi = {10.1145/3544548.3581268},
  booktitle = {CHI Conference on Human Factors in Computing Systems},
  location = {Hamburg, Germany},
  series = {CHI '23}
}
```

## Community

Chat with us on our [Discord channel](https://discord.gg/km62pDKAkE) or open an issue on this repository if you run into problems or have a request!


            
