Name | mlstm-kernels |
Version | 1.0.3 |
home_page | None |
Summary | A library providing fast and efficient mLSTM kernels for the xLSTM. |
upload_time | 2024-12-11 17:11:34 |
maintainer | None |
docs_url | None |
author | None |
requires_python | >=3.11 |
license | NXAI COMMUNITY LICENSE AGREEMENT Preamble 1 We are proud to present the NXAI xLSTM 7B model and software, demonstrating the strength of next-generation RNN-based large language models, delivering high-quality performance and fast inference speeds. While xLSTM 7B is freely available for open research and development, we believe that organizations significantly benefiting from our technology should contribute back. Our goal is to support research, small and medium-sized enterprises (SMEs), and open innovation, while ensuring that large enterprises who incorporate xLSTM 7B into commercial products or services fairly compensate the creators for their research and development efforts. Linz, December 12, 2024. Preamble 2 The NXAI COMMUNITY LICENSE AGREEMENT is based on the META LLAMA 3 COMMUNITY LICENSE AGREEMENT and contains some modifications, especially Section 2, “Additional Commercial Terms” is different. “Agreement” means the terms and conditions for use, reproduction, distribution and modification of the NXAI Materials set forth herein. “Documentation” means the specifications, manuals and documentation accompanying NXAI Materials distributed by NXAI at https://github.com/NX-AI/. “Licensee” or “you” means you, or your employer or any other person or entity (if you are entering into this Agreement on such person or entity’s behalf), of the age required under applicable laws, rules or regulations to provide legal consent and that has legal authority to bind your employer or such other person or entity if you are entering in this Agreement on their behalf. “NXAI Materials” means, collectively, NXAI’s proprietary large language models, algorithms and any Software, including machine-learning model code, trained model weights, inference-enabling code, training-enabling code, fine-tuning enabling code and all other work of NXAI in the field of neural networks, Documentation (and any portion thereof) made available under this Agreement. “NXAI” or “we” means NXAI GmbH, Linz, Austria. By using or distributing any portion or element of the NXAI Materials, you agree to be bound by this Agreement. 1. License Rights and Redistribution. a. Grant of Rights. You are granted a non-exclusive, worldwide, non-transferable and royalty-free limited license under NXAI’s intellectual property embodied in the NXAI Materials to use, reproduce, distribute, copy, create derivative works of, and make modifications to the NXAI Materials. b. Redistribution and Use. i. If you distribute or make available the NXAI Materials (or any derivative works thereof), or a product or service that uses any of them, including another AI model, you shall (A) provide a copy of this Agreement with any such NXAI Materials; and (B) prominently display “Built with technology from NXAI” on a related website, user interface, blogpost, about page, or product documentation. ii. If you receive NXAI Materials, or any derivative works thereof, from a Licensee as part of an integrated end user product, then Section 2 of this Agreement will not apply to you. iii. You must retain in all copies of the NXAI Materials that you distribute the following attribution notice within a “Notice” text file distributed as a part of such copies: “This product includes materials developed at NXAI that are licensed under the NXAI Community License, Copyright © NXAI GmbH, All Rights Reserved.” 2. Additional Commercial Terms. 
If (a) the Licensee, on a consolidated basis (including parent, subsidiaries, and affiliates), exceeds the annual revenue of one hundred million Euros (€100,000,000) or more, and (b) the Licensee incorporates NXAI Material, in whole or in part, into a Commercial Product or Service, then the Licensee must obtain a commercial license from NXAI, which NXAI may grant to you in its sole discretion, and you are not authorized to exercise any of the rights under this Agreement unless or until NXAI otherwise expressly grants you such rights 3. Disclaimer of Warranty. UNLESS REQUIRED BY APPLICABLE LAW, THE NXAI MATERIALS AND ANY OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN “AS IS” BASIS, WITHOUT WARRANTIES OF ANY KIND, AND NXAI DISCLAIMS ALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE NXAI MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE NXAI MATERIALS AND ANY OUTPUT AND RESULTS. 4. Limitation of Liability. IN NO EVENT WILL NXAI OR ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN IF NXAI OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF ANY OF THE FOREGOING. 5. Intellectual Property. a. No trademark licenses are granted under this Agreement, and in connection with the NXAI Materials, neither NXAI nor Licensee may use any name or mark owned by or associated with the other or any of its affiliates, except as required for reasonable and customary use in describing and redistributing the NXAI Materials or as set forth in this Section 5(a). NXAI hereby grants you a license to use “NXAI” (the “Mark”) solely as required to comply with the last sentence of Section 1.b.i. All goodwill arising out of your use of the Mark will insure to the benefit of NXAI. b. Subject to NXAI’s ownership of NXAI Materials and derivatives made by or for NXAI, with respect to any derivative works and modifications of the NXAI Materials that are made by you, as between you and NXAI, you are and will be the owner of such derivative works and modifications. c. If you institute litigation or other proceedings against NXAI or any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the NXAI Materials or models released by NXAI outputs or results, or any portion of any of the foregoing, constitutes infringement of intellectual property or other rights owned or licensable by you, then any licenses granted to you under this Agreement shall terminate as of the date such litigation or claim is filed or instituted. You will indemnify and hold harmless NXAI from and against any claim by any third party arising out of or related to your use or distribution of the NXAI Materials. 6. Term and Termination. The term of this Agreement will commence upon your acceptance of this Agreement or access to the NXAI Materials and will continue in full force and effect until terminated in accordance with the terms and conditions herein. NXAI may terminate this Agreement if you are in breach of any term or condition of this Agreement. Upon termination of this Agreement, you shall delete and cease use of the NXAI Materials. 
Sections 3, 4 and 7 shall survive the termination of this Agreement. 7. Governing Law and Jurisdiction. This Agreement shall be governed by and construed in accordance with the laws of the Republic of Austria, without regard to its conflict of laws principles. The courts located in Linz, Austria shall have exclusive jurisdiction over any disputes arising out of or in connection with this Agreement. ==================================================================================================== This product includes software licensed under the MIT License: MIT License Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ==================================================================================================== This product includes software licensed under the BSD-3-Clause License. BSD 3-Clause License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. |
keywords | mlstm, xlstm, lstm, transformer, machine learning, deep learning, state space models |
VCS | |
bugtrack_url | |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
# mLSTM Kernels
This library provides fast and efficient mLSTM kernels for the parallel, recurrent, and chunkwise forms. We provide PyTorch and JAX wrappers for our kernels.
Paper coming soon! Stay tuned 📺🎧⏳✨
## Kernel Overview
This library contains three different types of kernels:
- `parallel`: Parallel kernels that process a sequence in parallel (like Attention).
- `chunkwise`: Chunkwise kernels that process chunks of the sequence in parallel.
- `recurrent`: Recurrent step kernels for inference.
## Benchmark
Runtime comparison of the mLSTM chunkwise kernels `triton_limit_chunk` (Triton) and `triton_xl_chunk` (Triton XL) against other baselines:
![xLSTM Figure](./res/plot_sequence_length_consttok_nh8_hd512_line.svg)
**Left**: Forward pass
**Right**: Forward and backward pass
## Usage PyTorch
### Available Kernels
You can view all available kernels for the mLSTM by calling
```python
from mlstm_kernels.torch import (
get_available_mlstm_kernels,
get_available_mlstm_sequence_kernels,
get_available_mlstm_step_kernels,
)
print(get_available_mlstm_kernels())
print(get_available_mlstm_sequence_kernels())
print(get_available_mlstm_step_kernels())
```
and then use one of
```python
from mlstm_kernels.torch import (
get_mlstm_kernel,
get_mlstm_sequence_kernel,
get_mlstm_step_kernel,
)
```
to access the specific kernel function.
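For example, a minimal sketch of resolving a kernel by name (assuming the registry returns a list of name strings; any name printed by the calls above can be passed to `get_mlstm_kernel`):
```python
from mlstm_kernels.torch import get_available_mlstm_kernels, get_mlstm_kernel

# Assumption: the registry returns a list of kernel-name strings; we simply take
# the first one here. Any name printed by get_available_mlstm_kernels() works.
kernel_name = get_available_mlstm_kernels()[0]
mlstm_kernel = get_mlstm_kernel(kernel_name)
# mlstm_kernel is a callable that follows the training kernel interface below.
```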
### Direct Import
You can also import a specific kernel directly, for example the chunkwise `triton_limit_chunk` kernel:
```python
from mlstm_kernels.torch.chunkwise import mlstm_chunkwise__limit_chunk
```
### Backend Module
For PyTorch, we provide a backend module for easy integration into existing architectures.
```python
from mlstm_kernels.torch.backend_module import mLSTMBackendConfig, mLSTMBackend
```
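A minimal sketch of embedding the backend into an existing model. Both the no-argument `mLSTMBackendConfig()` construction and the `(q, k, v, i, f)` forward signature are assumptions based on the training kernel interface documented below, not a verified listing of the backend module's API:
```python
from torch import nn

from mlstm_kernels.torch.backend_module import mLSTMBackend, mLSTMBackendConfig


class MyBlock(nn.Module):
    """Hypothetical block that wraps the mLSTM backend."""

    def __init__(self):
        super().__init__()
        # Assumption: the default config selects a usable kernel combination.
        self.mlstm = mLSTMBackend(mLSTMBackendConfig())

    def forward(self, q, k, v, i, f):
        # q/k: (B, NH, S, DHQK), v: (B, NH, S, DHHV), i/f: (B, NH, S)
        return self.mlstm(q=q, k=k, v=v, i=i, f=f)
```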
### Training Kernel Interface
This is the interface used for the chunkwise and parallel kernels.
```python
def mlstm_interface(
q: torch.Tensor, # (B, NH, S, DHQK)
k: torch.Tensor, # (B, NH, S, DHQK)
v: torch.Tensor, # (B, NH, S, DHHV)
i: torch.Tensor, # (B, NH, S)
f: torch.Tensor, # (B, NH, S)
c_initial: torch.Tensor = None, # (B, NH, DHQK, DHHV)
n_initial: torch.Tensor = None, # (B, NH, DHQK)
m_initial: torch.Tensor = None, # (B, NH, 1)
return_last_states: bool = False,
eps: float = 1e-6,
autocast_kernel_dtype: torch.dtype = torch.bfloat16,
chunk_size: int = 64,
**kwargs,
) -> torch.Tensor | tuple[torch.Tensor, tuple[torch.Tensor, torch.Tensor, torch.Tensor]]:
# (B, NH, S, DHHV) | ((B, NH, S, DHHV), ((B, NH, DHQK, DHHV), (B, NH, DHQK), (B, NH)))
"""
Returns:
torch.Tensor: matH outputs (no n and m values, no last states)
tuple[torch.Tensor, tuple[torch.Tensor, torch.Tensor, torch.Tensor]]: matH, (matC_last, vecN_last, scaM_last)
"""
pass
```
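A minimal usage sketch following this interface: the shapes are taken from the signature comments above, the kernel is the `triton_limit_chunk` one imported earlier, and a CUDA device is assumed since the kernel is implemented in Triton.
```python
import torch

from mlstm_kernels.torch.chunkwise import mlstm_chunkwise__limit_chunk

B, NH, S, DHQK, DHHV = 1, 2, 256, 64, 128
q = torch.randn(B, NH, S, DHQK, device="cuda")
k = torch.randn(B, NH, S, DHQK, device="cuda")
v = torch.randn(B, NH, S, DHHV, device="cuda")
i = torch.randn(B, NH, S, device="cuda")
f = torch.randn(B, NH, S, device="cuda")

# With return_last_states=True the kernel returns the output and the last states.
h, (c_last, n_last, m_last) = mlstm_chunkwise__limit_chunk(
    q=q, k=k, v=v, i=i, f=f, return_last_states=True, chunk_size=64
)
# h: (B, NH, S, DHHV); (c_last, n_last, m_last) are the last states
# (shapes as in the return comment of the interface above).
```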
### Step Kernel Interface
This is the interface for the mLSTM step kernels.
```python
def mlstm_step_interface(
q: torch.Tensor, # (B, NH, DHQK)
k: torch.Tensor, # (B, NH, DHQK)
v: torch.Tensor, # (B, NH, DHHV)
i: torch.Tensor, # (B, NH, 1)
f: torch.Tensor, # (B, NH, 1)
c: torch.Tensor, # (B, NH, DHQK, DHHV)
n: torch.Tensor, # (B, NH, DHQK)
m: torch.Tensor, # (B, NH, 1)
eps: float = 1e-6,
**kwargs,
) -> tuple[
torch.Tensor, tuple[torch.Tensor, torch.Tensor, torch.Tensor]
]: # vecH, (matC_state_new (B, NH, DHQK, DHHV), vecN_state_new (B, NH, DHQK), vecM_state_new (B, NH, 1))
pass
```
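A minimal sketch of a single decoding step following this interface. Picking the first registered step kernel, zero-initializing the states for the first token, and placing everything on a CUDA device are assumptions for illustration:
```python
import torch

from mlstm_kernels.torch import get_available_mlstm_step_kernels, get_mlstm_step_kernel

# Assumption: take the first registered step kernel.
step_kernel = get_mlstm_step_kernel(get_available_mlstm_step_kernels()[0])

B, NH, DHQK, DHHV = 1, 2, 64, 128
q = torch.randn(B, NH, DHQK, device="cuda")
k = torch.randn(B, NH, DHQK, device="cuda")
v = torch.randn(B, NH, DHHV, device="cuda")
i = torch.randn(B, NH, 1, device="cuda")
f = torch.randn(B, NH, 1, device="cuda")
# Zero states for the first step (assumption).
c = torch.zeros(B, NH, DHQK, DHHV, device="cuda")
n = torch.zeros(B, NH, DHQK, device="cuda")
m = torch.zeros(B, NH, 1, device="cuda")

h, (c, n, m) = step_kernel(q=q, k=k, v=v, i=i, f=f, c=c, n=n, m=m)
# h: (B, NH, DHHV); feed the updated (c, n, m) back in for the next token.
```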
## Usage JAX
The JAX module `mlstm_kernels.jax` mirrors the PyTorch module `mlstm_kernels.torch` and can be used in the same way.
We will also provide a backend module for Flax soon.
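Assuming the mirrored layout, the same getter functions should be importable from `mlstm_kernels.jax`; this is a sketch based on that assumption, not a verified listing of the JAX module's API:
```python
from mlstm_kernels.jax import (
    get_available_mlstm_kernels,
    get_available_mlstm_step_kernels,
    get_mlstm_kernel,
    get_mlstm_step_kernel,
)

# The returned kernels are called with jax.numpy arrays of the same
# (B, NH, S, DHQK/DHHV) shapes as in the PyTorch interface above.
print(get_available_mlstm_kernels())
print(get_available_mlstm_step_kernels())
```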
## Running the unit tests
The unit tests cross-check the different kernel implementations for numerical deviations across different dtypes.
You can run all of them with the following command:
```bash
pytest -s tests/torch
# make sure you are in a JAX GPU environment
pytest -s tests/jax
```
The `-s` flag disables log capturing, so you see the results directly on the command line.
Each test logs its outputs to a new, timestamp-named folder in the `test_outputs/` directory.
Example:
Each test starts with the line
`Test chunkwise-triton_xl_chunk target=triton_chunkwise_xl_chunk vs. baseline=native_parallel_stablef_custbw with S=256, B=1, NH=2, DHQK=64, DHHV=128, DTYPE=torch.float32`.
This test compares the chunkwise Triton kernel `triton_chunkwise_xl_chunk` against the `native_parallel_stablef_custbw` baseline and runs `triton_chunkwise_xl_chunk` in dtype float32. The errors are compared against the baseline in the same dtype (i.e. float32 here) and, if specified, also in float64.
## Citation
Our paper is currently under preparation. We will announce it soon.
In the meantime, if you use this codebase or otherwise find our work valuable, please use these citations:
```
@article{beck:25unlocking,
title={Unlocking the Power of Recurrence for Efficient xLSTM Kernels},
author={Maximilian Beck and Korbinian Pöppel and Sepp Hochreiter},
journal = {Under preparation},
year={2025},
}
@software{beck:24mlstmkernels,
title = {mLSTM Kernels: A Library for Efficient mLSTM Kernels},
author = {Maximilian Beck and Korbinian Pöppel and Phillip Lippe},
url = {https://github.com/NXAI/mlstm_kernels},
month = dec,
year = {2024}
}
```
Raw data
{
"_id": null,
"home_page": null,
"name": "mlstm-kernels",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.11",
"maintainer_email": null,
"keywords": "mLSTM, xLSTM, LSTM, Transformer, Machine Learning, Deep Learning, State Space Models",
"author": null,
"author_email": "Maximilian Beck <beck@ml.jku.at>, Korbinian Poeppel <poeppel@ml.jku.at>, Phillip Lippe <phillip.lippe@gmail.com>, Sebastian Boeck <sebastian.boeck@nx-ai.com>",
"download_url": "https://files.pythonhosted.org/packages/70/26/d236409e787387cab5d16cc19157934961890e17e847448df126e3b19b84/mlstm_kernels-1.0.3.tar.gz",
"platform": null,
"description": "# mLSTM Kernels\n\nThis library provides fast and efficient mLSTM kernels for the parallel, recurrent and chunkwise form. We provide PyTorch and JAX wrappers for our kernels.\n\nPaper coming soon! Stay tuned \ud83d\udcfa\ud83c\udfa7\u23f3\u2728\n\n## Kernel Overview\n\nThis library contains three different types of kernels:\n\n- `parallel`: Parallel kernels that process a sequence in parallel (like Attention).\n- `chunkwise`: Chunkwise kernels, that process chunks of the sequence in parallel.\n- `recurrent`: Recurrent step kernels for inference.\n\n## Benchmark\n\nRuntime comparison of mLSTM chunkwise kernel (triton) [`triton_limit_chunk`] and (triton XL) [`triton_xl_chunk`] against other baselines:\n\n![xLSTM Figure](./res/plot_sequence_length_consttok_nh8_hd512_line.svg)\n\n**Left**: Forward pass \n**Right**: Forward and backward pass\n\n\n\n## Usage PyTorch\n\n### Available Kernels\n\nYou can view all available kernels for the mLSTM by calling\n\n```python\nfrom mlstm_kernels.torch import (\n get_available_mlstm_kernels,\n get_available_mlstm_sequence_kernels,\n get_available_mlstm_step_kernels,\n)\n\nprint(get_available_mlstm_kernels())\nprint(get_available_mlstm_sequence_kernels())\nprint(get_available_mlstm_step_kernels())\n```\n\nand then use one of \n\n```python\nfrom mlstm_kernels.torch import (\n get_mlstm_kernel,\n get_mlstm_sequence_kernel,\n get_mlstm_step_kernel,\n)\n```\nto access the specific kernel function.\n\n### Direct Import\n\nYou can directly import the specific kernel for example the chunkwise `triton_limit_chunk` kernel via:\n\n```python\nfrom mlstm_kernels.torch.chunkwise import mlstm_chunkwise__limit_chunk\n```\n\n### Backend Module\n\nFor PyTorch we provide a backend module for an easy integration into existing architectures. 
\n\n```python\nfrom mlstm_kernels.torch.backend_module import mLSTMBackendConfig, mLSTMBackend\n```\n\n### Training Kernel Interface \n\nThis is the interface used for the chunkwise and parallel kernels.\n\n```python\ndef mlstm_interface(\n q: torch.Tensor, # (B, NH, S, DHQK)\n k: torch.Tensor, # (B, NH, S, DHQK)\n v: torch.Tensor, # (B, NH, S, DHHV)\n i: torch.Tensor, # (B, NH, S)\n f: torch.Tensor, # (B, NH, S)\n c_initial: torch.Tensor = None, # (B, NH, DHQK, DHHV)\n n_initial: torch.Tensor = None, # (B, NH, DHQK)\n m_initial: torch.Tensor = None, # (B, NH, 1)\n return_last_states: bool = False,\n eps: float = 1e-6,\n autocast_kernel_dtype: torch.dtype = torch.bfloat16,\n chunk_size: int = 64,\n **kwargs,\n) -> torch.Tensor | tuple[torch.Tensor, tuple[torch.Tensor, torch.Tensor, torch.Tensor]]:\n # (B, NH, S, DHHV) | ((B, NH, S, DHHV), ((B, NH, DHQK, DHHV), (B, NH, DHQK), (B, NH)))\n \"\"\"\n Returns:\n torch.Tensor: matH outputs (no n and m values, no last states)\n tuple[torch.Tensor, tuple[torch.Tensor, torch.Tensor, torch.Tensor]]: matH, (matC_last, vecN_last, scaM_last)\n \"\"\"\n pass\n\n```\n\n### Step Kernel interface\n\nThis is the interface for the mlstm step kernels.\n\n```python\ndef mlstm_step_interface(\n q: torch.Tensor, # (B, NH, DHQK)\n k: torch.Tensor, # (B, NH, DHQK)\n v: torch.Tensor, # (B, NH, DHHV)\n i: torch.Tensor, # (B, NH, 1)\n f: torch.Tensor, # (B, NH, 1)\n c: torch.Tensor, # (B, NH, DHQK, DHHV)\n n: torch.Tensor, # (B, NH, DHQK)\n m: torch.Tensor, # (B, NH, 1)\n eps: float = 1e-6,\n **kwargs,\n) -> tuple[\n torch.Tensor, tuple[torch.Tensor, torch.Tensor, torch.Tensor]\n]: # vecH, (matC_state_new (B, NH, DHQK, DHHV), vecN_state_new (B, NH, DHQK), vecM_state_new (B, NH, 1))\n pass\n```\n\n## Usage JAX\n\nThe JAX module `mlstm_kernels.jax` mirrors the PyTorch module `mlstm_kernels.torch` and can be used in the same way. \n\nWe will also provide a backend module for Flax soon. \n\n## Running the unit tests\n\nThe unit tests cross-check the different kernel implementations on numerical deviations for different dtypes.\nYou can run all of them with the following command:\n\n```bash\npytest -s tests/torch\n# make sure you are in a JAX GPU environment\npytest -s tests/jax\n```\n\nThe `-s` disables the log capturing so you see the results directly on the command line.\nEach test will log the outputs to a new folder with the timestamp as name in the `test_outputs/` directory.\n\nExample:\nEach test starts with the line\n`Test chunkwise-triton_xl_chunk target=triton_chunkwise_xl_chunk vs. baseline=native_parallel_stablef_custbw with S=256, B=1, NH=2, DHQK=64, DHHV=128, DTYPE=torch.float32`.\n\nThis test tests the chunkwise triton kernel `triton_chunkwise_xl_chunk` against the `native_parallel_stablef_custbw` baseline and runs the `triton_chunkwise_xl_chunk` in dtype float32. It will compare the errors against the baseline in the same dtype (i.e. float32 here) and in float64 if specified.\n\n## Citation\n\nOur paper is currently under preparation. 
We will announce it soon.\nIn the meantime if you use this codebase, or otherwise find our work valuable, please use this citations:\n\n```\n@article{beck:25unlocking,\n title={Unlocking the Power of Recurrence for Efficient xLSTM Kernels}, \n author={Maximilian Beck and Korbinian P\u00f6ppel and Sepp Hochreiter},\n booktitle = {Under preparation},\n year={2025},\n}\n@software{beck:24mlstmkernels,\n title = {mLSTM Kernels: A Library for Efficient mLSTM Kernels},\n author = {Maximilian Beck and Korbinian P\u00f6ppel and Phillip Lippe},\n url = {https://github.com/NXAI/mlstm_kernels},\n month = dec,\n year = {2024}\n}\n```\n",
"bugtrack_url": null,
"license": "NXAI COMMUNITY LICENSE AGREEMENT Preamble 1 We are proud to present the NXAI xLSTM 7B model and software, demonstrating the strength of next-generation RNN-based large language models, delivering high-quality performance and fast inference speeds. While xLSTM 7B is freely available for open research and development, we believe that organizations significantly benefiting from our technology should contribute back. Our goal is to support research, small and medium-sized enterprises (SMEs), and open innovation, while ensuring that large enterprises who incorporate xLSTM 7B into commercial products or services fairly compensate the creators for their research and development efforts. Linz, December 12, 2024. Preamble 2 The NXAI COMMUNITY LICENSE AGREEMENT is based on the META LLAMA 3 COMMUNITY LICENSE AGREEMENT and contains some modifications, especially Section 2, \u201cAdditional Commercial Terms\u201d is different. \u201cAgreement\u201d means the terms and conditions for use, reproduction, distribution and modification of the NXAI Materials set forth herein. \u201cDocumentation\u201d means the specifications, manuals and documentation accompanying NXAI Materials distributed by NXAI at https://github.com/NX-AI/. \u201cLicensee\u201d or \u201cyou\u201d means you, or your employer or any other person or entity (if you are entering into this Agreement on such person or entity\u2019s behalf), of the age required under applicable laws, rules or regulations to provide legal consent and that has legal authority to bind your employer or such other person or entity if you are entering in this Agreement on their behalf. \u201cNXAI Materials\u201d means, collectively, NXAI\u2019s proprietary large language models, algorithms and any Software, including machine-learning model code, trained model weights, inference-enabling code, training-enabling code, fine-tuning enabling code and all other work of NXAI in the field of neural networks, Documentation (and any portion thereof) made available under this Agreement. \u201cNXAI\u201d or \u201cwe\u201d means NXAI GmbH, Linz, Austria. By using or distributing any portion or element of the NXAI Materials, you agree to be bound by this Agreement. 1. License Rights and Redistribution. a. Grant of Rights. You are granted a non-exclusive, worldwide, non-transferable and royalty-free limited license under NXAI\u2019s intellectual property embodied in the NXAI Materials to use, reproduce, distribute, copy, create derivative works of, and make modifications to the NXAI Materials. b. Redistribution and Use. i. If you distribute or make available the NXAI Materials (or any derivative works thereof), or a product or service that uses any of them, including another AI model, you shall (A) provide a copy of this Agreement with any such NXAI Materials; and (B) prominently display \u201cBuilt with technology from NXAI\u201d on a related website, user interface, blogpost, about page, or product documentation. ii. If you receive NXAI Materials, or any derivative works thereof, from a Licensee as part of an integrated end user product, then Section 2 of this Agreement will not apply to you. iii. You must retain in all copies of the NXAI Materials that you distribute the following attribution notice within a \u201cNotice\u201d text file distributed as a part of such copies: \u201cThis product includes materials developed at NXAI that are licensed under the NXAI Community License, Copyright \u00a9 NXAI GmbH, All Rights Reserved.\u201d 2. Additional Commercial Terms. 
If (a) the Licensee, on a consolidated basis (including parent, subsidiaries, and affiliates), exceeds the annual revenue of one hundred million Euros (\u20ac100,000,000) or more, and (b) the Licensee incorporates NXAI Material, in whole or in part, into a Commercial Product or Service, then the Licensee must obtain a commercial license from NXAI, which NXAI may grant to you in its sole discretion, and you are not authorized to exercise any of the rights under this Agreement unless or until NXAI otherwise expressly grants you such rights 3. Disclaimer of Warranty. UNLESS REQUIRED BY APPLICABLE LAW, THE NXAI MATERIALS AND ANY OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN \u201cAS IS\u201d BASIS, WITHOUT WARRANTIES OF ANY KIND, AND NXAI DISCLAIMS ALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE NXAI MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE NXAI MATERIALS AND ANY OUTPUT AND RESULTS. 4. Limitation of Liability. IN NO EVENT WILL NXAI OR ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN IF NXAI OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF ANY OF THE FOREGOING. 5. Intellectual Property. a. No trademark licenses are granted under this Agreement, and in connection with the NXAI Materials, neither NXAI nor Licensee may use any name or mark owned by or associated with the other or any of its affiliates, except as required for reasonable and customary use in describing and redistributing the NXAI Materials or as set forth in this Section 5(a). NXAI hereby grants you a license to use \u201cNXAI\u201d (the \u201cMark\u201d) solely as required to comply with the last sentence of Section 1.b.i. All goodwill arising out of your use of the Mark will insure to the benefit of NXAI. b. Subject to NXAI\u2019s ownership of NXAI Materials and derivatives made by or for NXAI, with respect to any derivative works and modifications of the NXAI Materials that are made by you, as between you and NXAI, you are and will be the owner of such derivative works and modifications. c. If you institute litigation or other proceedings against NXAI or any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the NXAI Materials or models released by NXAI outputs or results, or any portion of any of the foregoing, constitutes infringement of intellectual property or other rights owned or licensable by you, then any licenses granted to you under this Agreement shall terminate as of the date such litigation or claim is filed or instituted. You will indemnify and hold harmless NXAI from and against any claim by any third party arising out of or related to your use or distribution of the NXAI Materials. 6. Term and Termination. The term of this Agreement will commence upon your acceptance of this Agreement or access to the NXAI Materials and will continue in full force and effect until terminated in accordance with the terms and conditions herein. NXAI may terminate this Agreement if you are in breach of any term or condition of this Agreement. Upon termination of this Agreement, you shall delete and cease use of the NXAI Materials. 
Sections 3, 4 and 7 shall survive the termination of this Agreement. 7. Governing Law and Jurisdiction. This Agreement shall be governed by and construed in accordance with the laws of the Republic of Austria, without regard to its conflict of laws principles. The courts located in Linz, Austria shall have exclusive jurisdiction over any disputes arising out of or in connection with this Agreement. ==================================================================================================== This product includes software licensed under the MIT License: MIT License Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ==================================================================================================== This product includes software licensed under the BSD-3-Clause License. BSD 3-Clause License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ",
"summary": "A library providing fast and efficient mLSTM kernels for the xLSTM.",
"version": "1.0.3",
"project_urls": null,
"split_keywords": [
"mlstm",
" xlstm",
" lstm",
" transformer",
" machine learning",
" deep learning",
" state space models"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "e65a1d750516049d4fa9326a58f38e77653c288d5293c1d485c385d8051e241b",
"md5": "5f83a077ffe0249b9281fc20453b94c0",
"sha256": "0216d86d227cee45e20188bfc2d18dc3260c8d4eb6ea600f01e89daa6314e37c"
},
"downloads": -1,
"filename": "mlstm_kernels-1.0.3-py3-none-any.whl",
"has_sig": false,
"md5_digest": "5f83a077ffe0249b9281fc20453b94c0",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.11",
"size": 209151,
"upload_time": "2024-12-11T17:11:27",
"upload_time_iso_8601": "2024-12-11T17:11:27.973409Z",
"url": "https://files.pythonhosted.org/packages/e6/5a/1d750516049d4fa9326a58f38e77653c288d5293c1d485c385d8051e241b/mlstm_kernels-1.0.3-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "7026d236409e787387cab5d16cc19157934961890e17e847448df126e3b19b84",
"md5": "d78cfdcce1ff9b21d1517336c722f61e",
"sha256": "96fa4f67fefb4154d0fa3eac1957d1e02581be36335998409dfffb070dad948a"
},
"downloads": -1,
"filename": "mlstm_kernels-1.0.3.tar.gz",
"has_sig": false,
"md5_digest": "d78cfdcce1ff9b21d1517336c722f61e",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.11",
"size": 119307,
"upload_time": "2024-12-11T17:11:34",
"upload_time_iso_8601": "2024-12-11T17:11:34.680124Z",
"url": "https://files.pythonhosted.org/packages/70/26/d236409e787387cab5d16cc19157934961890e17e847448df126e3b19b84/mlstm_kernels-1.0.3.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-12-11 17:11:34",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "mlstm-kernels"
}