unirec

Name: unirec
Version: 0.0.1a4
Summary: A compact recommender library for universal recommendation systems
Upload time: 2023-10-24 09:16:41
Author: Jianxun Lian
Requires Python: >=3.8
License: MIT
Keywords: recommender system, deep learning, pytorch
# UniRec

## Introduction

UniRec is an easy-to-use, lightweight, and scalable implementation of recommender systems. Its primary objective is to enable users to swiftly construct a comprehensive ecosystem of recommenders using a minimal set of robust and practical recommendation models. These models are designed to deliver scalable and competitive performance, encompassing a majority of real-world recommendation scenarios.


Note that this goal differs from those of other well-known public libraries, such as Recommenders and RecBole, whose missions include providing an extensive range of recommendation algorithms or offering a variety of datasets.


The term "Uni-" carries several implications:

- Unit: Our aim is to employ a minimal set of models to facilitate the recommendation service onboarding process across most real-world scenarios. By maintaining a lightweight and extensible architecture, users can effortlessly modify and incorporate customized models into UniRec, catering to their specific future requirements.


- United: In contrast to the Natural Language Processing (NLP) domain, it is challenging to rely on a single model to serve end-to-end business applications in recommender systems. It is desirable that various modules or stages (such as retrieval and ranking) within a recommender system are not isolated and trained independently but are closely interconnected.


- Unified: While we acknowledge that model parameters cannot be unified, we believe there is potential to unify model structures. Consequently, we are exploring the possibility of utilizing a unified Transformer structure to serve different modules within recommender systems.


- Universal: We aspire for UniRec to support a wide range of recommendation scenarios, including gaming, music, movies, ads, and e-commerce, using a universal data model.



## Installation 


### Installation from PyPI

1. Ensure that [PyTorch](https://pytorch.org/get-started/previous-versions/) with CUDA support (version 1.10.0-1.13.1) is installed:


    ```shell
    pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu113

    python -c "import torch; print(torch.__version__)"
    ```

2. Install `unirec` with pip:

    ```shell
    pip install unirec
    ```
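    To confirm that the package imports correctly, you can run the same quick check used after the wheel-based install below:

    ```shell
    python -c "from unirec.utils import general; print(general.get_local_time_str())"
    ```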

### Installation from Wheel Locally

1. Ensure that [PyTorch](https://pytorch.org/get-started/previous-versions/) with CUDA support (version 1.10.0-1.13.1) is installed:


    ```shell
    pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu113

    python -c "import torch; print(torch.__version__)"
    ```

2. Clone Git Repo

    ```shell
    git clone https://github.com/microsoft/UniRec.git
    ```

3. Build

    ```shell
    cd UniRec
    pip install --user --upgrade setuptools wheel twine
    python setup.py sdist bdist_wheel
    ```
    After building, the wheel package can be found in `UniRec/dist`.

4. Install

    ```shell
    pip install dist/unirec-*.whl 
    ```
    The exact wheel filename can be found in `UniRec/dist`.

    Check that `unirec` is installed successfully:

    ```shell
    python -c "from unirec.utils import general; print(general.get_local_time_str())"
    ```


## Algorithms

| Algorithm | Type | Paper | Code |
|-----------|------|-------|------|
| MF |  Collaborative Filtering | [BPR](https://dl.acm.org/doi/10.5555/1795114.1795167) | [unirec/model/cf/mf.py](./unirec/model/cf/mf.py) |
| UserCF | Collaborative Filtering | - | [unirec/model/cf/usercf.py](./unirec/model/cf/usercf.py) |
| SLIM | Collaborative Filtering | [SLIM](https://ieeexplore.ieee.org/document/6137254) | [unirec/model/cf/slim.py](./unirec/model/cf/slim.py) |
| AdmmSLIM | Collaborative Filtering | [ADMMSLIM](https://dl.acm.org/doi/10.1145/3336191.3371774) | [unirec/model/cf/admmslim.py](./unirec/model/cf/admmslim.py) |
| SAR | Collaborative Filtering | [ItemCF](https://dl.acm.org/doi/10.1145/371920.372071), [SAR](https://github.com/recommenders-team/recommenders/blob/main/examples/02_model_collaborative_filtering/sar_deep_dive.ipynb) | [unirec/model/cf/sar.py](./unirec/model/cf/sar.py) |
| EASE | Collaborative Filtering | [EASE](https://dl.acm.org/doi/abs/10.1145/3308558.3313710) | [unirec/model/cf/ease.py](./unirec/model/cf/ease.py) |
| MultiVAE | Collaborative Filtering | [MultiVAE](https://dl.acm.org/doi/10.1145/3178876.3186150) | [unirec/model/cf/multivae.py](./unirec/model/cf/multivae.py) |
| SVDPlusPlus | Sequential Model | [SVD++](https://dl.acm.org/doi/10.1145/1644873.1644874) | [unirec/model/sequential/svdplusplus.py](./unirec/model/sequential/svdplusplus.py) |
| AvgHist | Sequential Model | - | [unirec/model/sequential/avghist.py](./unirec/model/sequential/avghist.py) |
| AttHist | Sequential Model | - | [unirec/model/sequential/atthist.py](./unirec/model/sequential/atthist.py) |
| GRU4Rec | Sequential Model | [GRU4Rec](https://dl.acm.org/doi/10.1145/2988450.2988452)  | [unirec/model/sequential/gru4rec.py](./unirec/model/sequential/gru4rec.py) |
| SASRec | Sequential Model | [SASRec](https://ieeexplore.ieee.org/abstract/document/8594844)  | [unirec/model/sequential/sasrec.py](./unirec/model/sequential/sasrec.py) |
| ConvFormer | Sequential Model | [ConvFormer](https://arxiv.org/abs/2308.02925)  | [unirec/model/sequential/convformer.py](./unirec/model/sequential/convformer.py) |
| FastConvFormer | Sequential Model | [ConvFormer](https://arxiv.org/abs/2308.02925) | [unirec/model/sequential/fastconvformer.py](./unirec/model/sequential/fastconvformer.py) |
| FM | Ranking Model | [Factorization Machine](https://ieeexplore.ieee.org/document/5694074)  | [unirec/model/rank/fm.py](./unirec/model/rank/fm.py) |
| BST | Ranking Model | [Behavior sequence transformer](https://dl.acm.org/doi/10.1145/3326937.3341261) | [unirec/model/rank/bst.py](./unirec/model/rank/bst.py) |


## Examples

To go through all the examples listed below, we provide a [script](./examples/preprocess/download_split_ml100k.py) for downloading and splitting the [ml-100k](https://grouplens.org/datasets/movielens/100k/) dataset. Run it from the `examples/preprocess` directory:

```shell
python download_split_ml100k.py
```

The raw dataset files will be saved under your home directory: `~/.unirec/dataset/ml-100k`.

Next, the raw dataset must be converted into a format compatible with UniRec. Use the [script](./examples/preprocess/preprocess_ml100k.sh) to process the data and save the resulting files in `UniRec/data/ml-100k`.


```shell
cd examples/preprocess
bash preprocess_ml100k.sh
```
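
Before training, you can quickly check that the preprocessed files are in place (run from the repository root; the exact file names depend on the preprocessing script):

```shell
ls data/ml-100k
```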


### General Training
To train an existing model in UniRec, for instance SASRec on the ml-100k dataset, refer to the script provided in [examples/training/train_seq_ml100k.sh](./examples/training/train_seq_ml100k.sh).
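
A minimal way to run that example, assuming you start from the repository root and have completed the data preparation above (you may need to adjust paths or hyperparameters inside the script first):

```shell
cd examples/training
bash train_seq_ml100k.sh
```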


### Multi-GPU Training
UniRec supports multi-GPU training through the integration of [Accelerate](https://huggingface.co/docs/accelerate). An example script is available at [examples/training/multi_gpu_train_ml100k.sh](./examples/training/multi_gpu_train_ml100k.sh). The key arguments appear in lines 3-12 of the script:

```shell
# Specify the GPU indices to use, e.g. "0,1"
GPU_INDICES="0,1"

# Specify the number of nodes to use (one node may have multiple GPUs)
NUM_NODES=1

# Specify the number of processes in each node (the number should equal the number of GPU_INDICES)
NPROC_PER_NODE=2
```
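
These variables typically map onto an `accelerate launch` invocation along the lines of the sketch below; this is illustrative only, and the entry script name `main.py` and its arguments are placeholders rather than the exact command used in the example script:

```shell
accelerate launch \
    --multi_gpu \
    --num_machines=$NUM_NODES \
    --num_processes=$NPROC_PER_NODE \
    --gpu_ids="$GPU_INDICES" \
    main.py  # placeholder entry point; see the example script for the real command
```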

For more details about the launching command, please refer to [Accelerate Docs](https://huggingface.co/docs/accelerate/basic_tutorials/launch).

### Hyperparameter Tuning with wandb

UniRec supports hyperparameter tuning (or hyperparameter optimization, HPO) through the integration of [WandB](https://wandb.ai). There are three major steps to starting a wandb experiment.


 1. Compose a training script and enable `wandb`. An example is provided in [examples/training/train_ml100k_with_wandb.sh](./examples/training/train_ml100k_with_wandb.sh). The key arguments are:

     - `--use_wandb=1`: enable wandb in the training process
     - `--wandb_file=/path/to/configuration_file`: the configuration file for wandb, covering the command, metrics, method, and search space.
 2. Define the sweep configuration. Write a YAML-format configuration file that sets the command, the metrics to monitor, the tuning method, and the search space. An example is available at [examples/training/wandb.yaml](./examples/training/wandb.yaml), and an illustrative sketch is given after this list. For more details about the configuration file, refer to the [WandB Docs](https://docs.wandb.ai/guides/sweeps/define-sweep-configuration).
 3. Initialize sweeps and start sweep agents. To start an experiment with wandb, first initialize a sweep controller, which selects hyperparameters and issues instructions; an agent then actually performs the runs. An example for launching wandb experiments is provided in [examples/training/wandb_start.sh](./examples/training/wandb_start.sh). Note that we offer a pipeline command in the script to start the agent automatically after sweep initialization. However, we recommend the simpler manual two-step process:

 ```shell
## Step 1. Initialize sweeps with CLI using configuration file. 
## For more details, please refer to https://docs.wandb.ai/guides/sweeps/initialize-sweeps

wandb sweep config.yaml

## Step 2. After `wandb sweep`, you will get a sweep ID and a hint to run `wandb agent`, like:

## wandb: Creating sweep from: ./wandb.yaml
## wandb: Created sweep with ID: xxx
## wandb: View sweep at: https://wandb.ai/xxx/xxx/xxx/xxx
## wandb: Run sweep agent with: wandb agent xxx/xxx/xxx/xxx

wandb agent entity/project/sweep_ID
```
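
For reference, a minimal sweep configuration might look like the sketch below. The keys (`program`, `method`, `metric`, `parameters`) are standard wandb sweep fields, but the program name, metric name, and search space here are placeholders rather than UniRec's actual settings; the repository's [examples/training/wandb.yaml](./examples/training/wandb.yaml) is the authoritative example.

```shell
# Write a minimal, illustrative sweep configuration (placeholder values throughout).
cat > config.yaml << 'EOF'
program: train.py            # placeholder training entry point
method: bayes                # search strategy: grid, random, or bayes
metric:
  name: valid_metric         # placeholder name of the metric reported by the training run
  goal: maximize
parameters:
  learning_rate:
    values: [0.0001, 0.001, 0.01]
EOF
```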

### Serving with C# and Java

UniRec supports C# and Java inference based on the [ONNX](https://onnxruntime.ai/docs/) format. We provide inference for user embeddings, item embeddings, and user-item scores.

For more details, please refer to [examples/serving/README](./examples/serving/README).
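
Before wiring up the C# or Java side, you can sanity-check an exported ONNX model from Python with `onnxruntime` (the model file name below is a placeholder; see the serving README for the actual export steps and input/output names):

```shell
pip install onnxruntime

# Load the exported model and print its input names and shapes (placeholder file name).
python -c "import onnxruntime as ort; sess = ort.InferenceSession('user_embedding.onnx'); print([(i.name, i.shape) for i in sess.get_inputs()])"
```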



## Contributing

This project welcomes contributions and suggestions.  Most contributions require you to agree to a
Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us
the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide
a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions
provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or
contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.

## Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft 
trademarks or logos is subject to and must follow 
[Microsoft's Trademark & Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks/usage/general).
Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship.
Any use of third-party trademarks or logos is subject to those third parties' policies.

            
