timecopilot-tirex

Name: timecopilot-tirex
Version: 0.1.0 (PyPI)
Home page: none
Summary: A pre-trained Time Series Forecasting Model based on xLSTM supporting zero-shot forecasting
Upload time: 2025-07-14 18:02:28
Maintainer: none
Docs URL: none
Author: none
Requires Python: >=3.11
License: NXAI COMMUNITY LICENSE AGREEMENT

Preamble 1

We are proud to present the NXAI TiRex time series model and software, demonstrating the strength of xLSTM for time series. While TiRex is freely available for open research and development, we believe that organizations significantly benefiting from our technology should contribute back. Our goal is to support research, small and medium-sized enterprises (SMEs), and open innovation, while ensuring that large enterprises who incorporate TiRex into commercial products or services fairly compensate the creators for their research and development efforts.

Linz, May 27, 2025.

Preamble 2

The NXAI COMMUNITY LICENSE AGREEMENT is based on the META LLAMA 3 COMMUNITY LICENSE AGREEMENT and contains some modifications; especially Section 2, “Additional Commercial Terms”, is different.

“Agreement” means the terms and conditions for use, reproduction, distribution and modification of the NXAI Materials set forth herein. “Documentation” means the specifications, manuals and documentation accompanying NXAI Materials distributed by NXAI at https://github.com/NX-AI/. “Licensee” or “you” means you, or your employer or any other person or entity (if you are entering into this Agreement on such person or entity’s behalf), of the age required under applicable laws, rules or regulations to provide legal consent and that has legal authority to bind your employer or such other person or entity if you are entering in this Agreement on their behalf. “NXAI Materials” means, collectively, NXAI’s proprietary models, algorithms and any Software, including machine-learning model code, trained model weights, inference-enabling code, training-enabling code, fine-tuning enabling code and all other work of NXAI in the field of neural networks, Documentation (and any portion thereof) made available under this Agreement. “NXAI” or “we” means NXAI GmbH, Linz, Austria. Contact: license@nx-ai.com

By using or distributing any portion or element of the NXAI Materials, you agree to be bound by this Agreement.

1. License Rights and Redistribution.

a. Grant of Rights. You are granted a non-exclusive, worldwide, non-transferable and royalty-free limited license under NXAI’s intellectual property embodied in the NXAI Materials to use, reproduce, distribute, copy, create derivative works of, and make modifications to the NXAI Materials.

b. Redistribution and Use.

i. If you distribute or make available the NXAI Materials (or any derivative works thereof), or a product or service that uses any of them, including another AI model, you shall (A) provide a copy of this Agreement with any such NXAI Materials; and (B) prominently display “Built with technology from NXAI” on a related website, user interface, blogpost, about page, or product documentation.

ii. If you receive NXAI Materials, or any derivative works thereof, from a Licensee as part of an integrated end user product, then Section 2 of this Agreement will not apply to you.

iii. You must retain in all copies of the NXAI Materials that you distribute the following attribution notice within a “Notice” text file distributed as a part of such copies: “This product includes materials developed at NXAI that are licensed under the NXAI Community License, Copyright © NXAI GmbH, All Rights Reserved.”

2. Additional Commercial Terms. If (a) the Licensee, on a consolidated basis (including parent, subsidiaries, and affiliates), exceeds the annual revenue of one hundred million Euros (€100,000,000), and (b) the Licensee incorporates NXAI Material, in whole or in part, into a Commercial Product or Service, then the Licensee must obtain a commercial license from NXAI, which NXAI may grant to you in its sole discretion, and you are not authorized to exercise any of the rights under this Agreement unless or until NXAI otherwise expressly grants you such rights.

3. Disclaimer of Warranty. UNLESS REQUIRED BY APPLICABLE LAW, THE NXAI MATERIALS AND ANY OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN “AS IS” BASIS, WITHOUT WARRANTIES OF ANY KIND, AND NXAI DISCLAIMS ALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE NXAI MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE NXAI MATERIALS AND ANY OUTPUT AND RESULTS.

4. Limitation of Liability. IN NO EVENT WILL NXAI OR ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN IF NXAI OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF ANY OF THE FOREGOING.

5. Intellectual Property.

a. No trademark licenses are granted under this Agreement, and in connection with the NXAI Materials, neither NXAI nor Licensee may use any name or mark owned by or associated with the other or any of its affiliates, except as required for reasonable and customary use in describing and redistributing the NXAI Materials or as set forth in this Section 5(a). NXAI hereby grants you a license to use “NXAI” (the “Mark”) solely as required to comply with the last sentence of Section 1.b.i. All goodwill arising out of your use of the Mark will insure to the benefit of NXAI.

b. Subject to NXAI’s ownership of NXAI Materials and derivatives made by or for NXAI, with respect to any derivative works and modifications of the NXAI Materials that are made by you, as between you and NXAI, you are and will be the owner of such derivative works and modifications.

c. If you institute litigation or other proceedings against NXAI or any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the NXAI Materials or models released by NXAI outputs or results, or any portion of any of the foregoing, constitutes infringement of intellectual property or other rights owned or licensable by you, then any licenses granted to you under this Agreement shall terminate as of the date such litigation or claim is filed or instituted. You will indemnify and hold harmless NXAI from and against any claim by any third party arising out of or related to your use or distribution of the NXAI Materials.

6. Term and Termination. The term of this Agreement will commence upon your acceptance of this Agreement or access to the NXAI Materials and will continue in full force and effect until terminated in accordance with the terms and conditions herein. NXAI may terminate this Agreement if you are in breach of any term or condition of this Agreement. Upon termination of this Agreement, you shall delete and cease use of the NXAI Materials. Sections 3, 4 and 7 shall survive the termination of this Agreement.

7. Governing Law and Jurisdiction. This Agreement shall be governed by and construed in accordance with the laws of the Republic of Austria, without regard to its conflict of laws principles. The courts located in Linz, Austria shall have exclusive jurisdiction over any disputes arising out of or in connection with this Agreement.
Keywords: tirex, xlstm, time series, zero-shot, deep learning
Requirements: no requirements were recorded.
# TiRex: Zero-Shot Forecasting across Long and Short Horizons

[Paper](https://arxiv.org/abs/2505.23719) | [TiRex Huggingface Model Card](https://huggingface.co/NX-AI/TiRex)


This repository provides the pre-trained forecasting model TiRex introduced in the paper
[TiRex: Zero-Shot Forecasting across Long and Short Horizons with Enhanced In-Context Learning](https://arxiv.org/abs/2505.23719).


## TiRex Model

TiRex is a 35M parameter pre-trained time series forecasting model based on [xLSTM](https://github.com/NX-AI/xlstm).

### Key Facts:

- **Zero-Shot Forecasting**:
  TiRex performs forecasting without any training on your data. Just download and forecast.

- **Quantile Predictions**:
  TiRex provides not only point estimates but also quantile estimates.

- **State-of-the-art Performance over Long and Short Horizons**:
  TiRex achieves top scores on various time series forecasting benchmarks; see [GiftEval](https://huggingface.co/spaces/Salesforce/GIFT-Eval) and [ChronosZS](https://huggingface.co/spaces/autogluon/fev-leaderboard).
  These benchmarks show that TiRex performs strongly for both long- and short-term forecasting.
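As a sketch of how quantile estimates can be consumed downstream, the snippet below turns per-quantile forecasts into a per-step 80% prediction interval. The quantile levels and the plain list-of-lists layout are illustrative assumptions, not the exact TiRex output format:

```python
# Sketch: deriving an 80% prediction interval from quantile forecasts.
# The 0.1/0.5/0.9 levels and the layout below are illustrative assumptions.
quantile_forecast = {
    0.1: [96.0, 97.5, 99.0],     # lower decile, per horizon step
    0.5: [100.0, 101.0, 102.0],  # median, usable as the point forecast
    0.9: [104.0, 105.5, 107.0],  # upper decile, per horizon step
}

point = quantile_forecast[0.5]
# Pair the 0.1 and 0.9 quantiles to get an 80% interval for each step.
interval_80 = list(zip(quantile_forecast[0.1], quantile_forecast[0.9]))

print(point)        # [100.0, 101.0, 102.0]
print(interval_80)  # [(96.0, 104.0), (97.5, 105.5), (99.0, 107.0)]
```

The same pattern applies whatever container the model returns, as long as the quantile levels are known.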

## Installation
TiRex is currently only tested on *Linux systems* and Nvidia GPUs with compute capability >= 8.0.
If you want to use a different system, please check the [FAQ](#faq--troubleshooting).
It's best to install TiRex in the specified conda environment.
The respective conda dependency file is [requirements_py26.yaml](./requirements_py26.yaml).

```sh
# 1) Setup and activate conda env from ./requirements_py26.yaml
git clone https://github.com/NX-AI/tirex
conda env create --file ./tirex/requirements_py26.yaml
conda activate tirex

# 2) [Mandatory] Install Tirex

## 2a) Install from source
git clone https://github.com/NX-AI/tirex  # if not already cloned before
cd tirex
pip install -e .

# 2b) Install from PyPi (will be available soon)

# 3) Optional: install additional dependencies
pip install .[gluonts]      # enable gluonTS in/output API
pip install .[hfdataset]    # enable HuggingFace datasets in/output API
pip install .[notebooks]    # To run the example notebooks
```
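Since the CUDA kernels require compute capability >= 8.0, it can help to verify your GPU before installing. The helper below is a sketch (the function name is our own); it uses the standard `torch.cuda` API and degrades gracefully when PyTorch is not installed:

```python
def cuda_capability_ok(min_major: int = 8):
    """Return True if a CUDA GPU with compute capability >= min_major is
    visible, False otherwise, and None when PyTorch is not installed."""
    try:
        import torch
    except ImportError:
        return None  # PyTorch missing entirely
    if not torch.cuda.is_available():
        return False  # no usable CUDA device
    major, _minor = torch.cuda.get_device_capability()
    return major >= min_major

print(cuda_capability_ok())
```

If this prints `False` or `None`, see the [FAQ](#faq--troubleshooting) for the CPU fallback options.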


## Quick Start

```python
import torch
from tirex import load_model, ForecastModel

model: ForecastModel = load_model("NX-AI/TiRex")
data = torch.rand((5, 128))  # Sample Data (5 time series with length 128)
forecast = model.forecast(context=data, prediction_length=64)
```

We provide an extended quick start example in [examples/quick_start_tirex.ipynb](./examples/quick_start_tirex.ipynb).
This notebook also shows how to use the different input and output types for your time series data.
If you don't have suitable hardware, you can also run the extended quick start example in Google Colab:

<a target="_blank" href="https://colab.research.google.com/github/NX-AI/tirex/blob/main/examples/quick_start_tirex.ipynb">
  <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open Quick Start In Colab"/>
</a>

###  Example Notebooks

We provide notebooks to run the benchmarks: [GiftEval](./examples/gifteval/gifteval.ipynb) and [Chronos-ZS](./examples/chronos_zs/chronos_zs.ipynb).



## FAQ / Troubleshooting:

- **Can I run TiRex on CPU?**:
  > At the moment CPU support **is experimental**.
  Running on CPU slows the model down considerably and may degrade forecast results.
  To enable TiRex on CPU you need to disable the CUDA kernels (see section [CUDA Kernels](#cuda-kernels)).
  You can also test TiRex with [Google Colab](https://colab.research.google.com/github/NX-AI/tirex/blob/main/examples/quick_start_tirex.ipynb).
  If you are interested in running TiRex on more resource constrained or embedded devices get in touch with us.

- **Can I run TiRex on Windows?**:
  > We don't support Windows at the moment.
  You might still be able to run TiRex on Windows.
  In this case you can skip the conda environment installation.
  For troubleshooting on Windows you can find relevant discussions on the [xLSTM GitHub repository](https://github.com/NX-AI/xlstm/issues?q=is%3Aissue%20state%3Aopen%20windows).
  You can also test TiRex with [Google Colab](https://colab.research.google.com/github/NX-AI/tirex/blob/main/examples/quick_start_tirex.ipynb).


- **Can I run TiRex on macOS?**:
  > macOS is not officially supported yet, but TiRex can run on CPU (see above) and hence on macOS.
  MPS has the same limitations as CPU and is also experimental.
  You can also test TiRex with [Google Colab](https://colab.research.google.com/github/NX-AI/tirex/blob/main/examples/quick_start_tirex.ipynb).


- **Can I run TiRex on Nvidia GPU with CUDA compute capability < 8.0?**:
  > The custom CUDA kernels require a GPU with CUDA compute capability >= 8.0.
  You can deactivate the custom CUDA kernels (see section [CUDA Kernels](#cuda-kernels) for more details).
  However, at the moment this **is experimental** and can result in NaNs or degraded forecasts!
  If you are interested in running TiRex on more resource constrained or embedded devices get in touch with us.

- **Can I train / finetune TiRex for my own data?**:
  > TiRex already provides state-of-the-art performance for zero-shot prediction, so you can use it without training on your own data.
  However, we plan to provide fine-tuning support in the future.
  If you are interested in models fine-tuned on your data, get in touch with us.

- **Error during the installation of the conda environment**:
  > If you encounter errors while installing the conda environment, your system most likely does not support CUDA.
  Please skip the conda environment installation and install TiRex directly in a Python environment.

- **When loading TiRex I get error messages regarding sLSTM or CUDA**:
  > Please check the section on [CUDA kernels](#cuda-kernels) in the Readme.
  If you cannot fix the problem, you can use a fallback implementation in pure PyTorch.
  However, this can slow down TiRex considerably and may degrade results!
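The CPU/PyTorch fallback mentioned in several answers above is controlled by the `TIREX_NO_CUDA` environment variable, which must be set before the model is loaded. A minimal sketch (the actual `tirex` calls are shown as comments so this runs standalone):

```python
import os

# Disable the custom sLSTM CUDA kernels *before* loading the model,
# so the experimental, slower pure-PyTorch fallback is used instead.
os.environ["TIREX_NO_CUDA"] = "1"

# from tirex import load_model          # import after setting the variable
# model = load_model("NX-AI/TiRex")     # now loads without the CUDA kernels
```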



## CUDA Kernels

TiRex uses custom CUDA kernels for the sLSTM cells.
These CUDA kernels are compiled the first time the model is loaded.
The CUDA kernels require GPU hardware that supports CUDA compute capability 8.0 or later.
We also strongly recommend using the provided [conda environment spec](./requirements_py26.yaml).
If you don't have such a device, or you have unresolvable problems with the kernels, you can use a fallback implementation in pure PyTorch.
**However, this is at the moment _experimental_, slows down TiRex considerably, and will likely degrade forecasting results!**
To disable the CUDA kernels, set the environment variable
```bash
export TIREX_NO_CUDA=1
```
or within python:

```python
import os
os.environ['TIREX_NO_CUDA'] = '1'
```

### Troubleshooting CUDA

**This information is taken from the
[xLSTM repository](https://github.com/NX-AI/xlstm) - See this for further details**:

For the CUDA version of sLSTM, you need Compute Capability >= 8.0, see [https://developer.nvidia.com/cuda-gpus](https://developer.nvidia.com/cuda-gpus). If you have problems with the compilation, please try:
```bash
export TORCH_CUDA_ARCH_LIST="8.0;8.6;9.0"
```

For all kinds of custom setups with torch and CUDA, keep in mind that versions have to match. Also, to make sure the correct CUDA libraries are included you can use the `XLSTM_EXTRA_INCLUDE_PATHS` environment variable now to inject different include paths, for example:

```bash
export XLSTM_EXTRA_INCLUDE_PATHS='/usr/local/include/cuda/:/usr/include/cuda/'
```

or within python:

```python
import os
os.environ['XLSTM_EXTRA_INCLUDE_PATHS']='/usr/local/include/cuda/:/usr/include/cuda/'
```


## Cite

If you use TiRex in your research, please cite our work:

```bibtex
@article{auerTiRexZeroShotForecasting2025,
  title = {{{TiRex}}: {{Zero-Shot Forecasting Across Long}} and {{Short Horizons}} with {{Enhanced In-Context Learning}}},
  author = {Auer, Andreas and Podest, Patrick and Klotz, Daniel and B{\"o}ck, Sebastian and Klambauer, G{\"u}nter and Hochreiter, Sepp},
  journal = {ArXiv},
  volume = {2505.23719},
  year = {2025}
}
```


## License

TiRex is licensed under the [NXAI community license](./LICENSE).

            
