tabicl

name: tabicl
version: 0.1.3
summary: TabICL: A Tabular Foundation Model for In-Context Learning on Large Data
author: Jingang Qu, David Holzmüller, Marine Le Morvan, Gaël Varoquaux
license: BSD 3-Clause License (Copyright (c) 2025, Soda team @ Inria)
requires_python: >=3.9,<3.13
keywords: tabicl, foundation model, in-context learning, tabular data
upload_time: 2025-07-08 09:48:28
[![test](https://github.com/soda-inria/tabicl/actions/workflows/testing.yml/badge.svg)](https://github.com/soda-inria/tabicl/actions/workflows/testing.yml)
[![PyPI version](https://badge.fury.io/py/tabicl.svg)](https://badge.fury.io/py/tabicl)
[![Downloads](https://img.shields.io/pypi/dm/tabicl)](https://pypistats.org/packages/tabicl)

# TabICL: A Tabular Foundation Model for In-Context Learning on Large Data (ICML 2025)

This repo is the official implementation of ["TabICL: A Tabular Foundation Model for In-Context Learning on Large Data"](https://arxiv.org/pdf/2502.05564) and its follow-up work. TabICL is a tabular foundation model; it currently supports classification tasks only.

## Updates

***05/06/2025***

### Better-performing checkpoint 😄

We are continuously improving TabICL, and as a by-product (many thanks to [David Holzmüller](https://github.com/dholzmueller) for his efforts!), we have released a better-performing checkpoint. `TabICLClassifier` now accepts a new parameter, `checkpoint_version`, to specify which pretrained checkpoint to use; a short example follows the list below. The available options are:

- `'tabicl-classifier-v1.1-0506.ckpt'` (default): The latest and best-performing version.
- `'tabicl-classifier-v1-0208.ckpt'`: The version used in the original TabICL paper. Use this if you need to reproduce the results reported in the paper.
- `'tabicl-classifier.ckpt'`: A legacy alias for `'tabicl-classifier-v1-0208.ckpt'`; it will be removed in a future release.
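
Selecting a checkpoint is a one-liner, using the `checkpoint_version` values listed above:

```python
from tabicl import TabICLClassifier

# Default: the latest, best-performing checkpoint (v1.1).
clf = TabICLClassifier()

# Pin the original paper checkpoint to reproduce the reported results.
clf_paper = TabICLClassifier(checkpoint_version="tabicl-classifier-v1-0208.ckpt")
```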

<div style="margin-top: 30px;"></div>
<img src="./figures/TabICLv1.1_performance.png" width="70%" alt="Ranking of tabICLv1.1" style="display: block; margin: auto;">
<div style="margin-top: 30px;"></div>

<div style="margin-top: 30px;"></div>
<img src="./figures/TabICLv1.1_perf_wrt_samples.png" width="90%" alt="Ranking vs. number of samples" style="display: block; margin: auto;">
<div style="margin-top: 30px;"></div>

***05/05/2025***

### Open-source pretraining code 🥳

After intensive refactoring, we have fully open-sourced our pretraining code so that the results in our paper can be reproduced. The scripts folder provides the commands for [stage 1](./scripts/train_stage1.sh), [stage 2](./scripts/train_stage2.sh), and [stage 3](./scripts/train_stage3.sh) of curriculum learning.

***05/01/2025***

### Accepted to ICML 2025 🎉

## Architecture

TabICL processes tabular data through three sequential stages:

1. **Column-wise Embedding**: Creates distribution-aware embeddings for each feature
2. **Row-wise Interaction**: Captures interactions between features within each row
3. **Dataset-wise In-Context Learning**: Learns patterns from labeled examples to make predictions

<img src="./figures/architecture.png" width="90%" alt="The architecture of TabICL" style="display: block; margin: auto;">

## Installation

### From [PyPI](https://pypi.org/project/tabicl)

```bash
pip install tabicl
```

### From source

#### Option 1: Installing `tabicl` from a Local Clone

```bash
cd tabicl
pip install -e .
```

#### Option 2: Installing `tabicl` Directly from the Git Remote

```bash
pip install git+https://github.com/soda-inria/tabicl.git
```

## Usage

### Basic Usage

```python
from tabicl import TabICLClassifier

clf = TabICLClassifier()
clf.fit(X_train, y_train)  # this is cheap
clf.predict(X_test)  # in-context learning happens here
```

The code above automatically downloads the pretrained checkpoint (~100MB) from Hugging Face Hub on first use and selects a GPU if one is available.
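
For a self-contained first run, here is a minimal sketch on a synthetic scikit-learn dataset (scikit-learn is already used elsewhere in this README; any tabular `X`/`y` works the same way):

```python
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from tabicl import TabICLClassifier

# Synthetic stand-in for a real tabular dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = TabICLClassifier()
clf.fit(X_train, y_train)    # cheap: stores the training context
preds = clf.predict(X_test)  # the full model runs here
print(f"test accuracy: {accuracy_score(y_test, preds):.3f}")
```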

### Advanced Configuration

TabICL offers a set of parameters to customize its behavior. The following example shows all available parameters with their default values and brief descriptions:

```python
from tabicl import TabICLClassifier

clf = TabICLClassifier(
  n_estimators=32,                                        # number of ensemble members
  norm_methods=["none", "power"],                         # normalization methods to try
  feat_shuffle_method="latin",                            # feature permutation strategy
  class_shift=True,                                       # whether to apply cyclic shifts to class labels
  outlier_threshold=4.0,                                  # z-score threshold for outlier detection and clipping
  softmax_temperature=0.9,                                # controls prediction confidence
  average_logits=True,                                    # whether ensemble averaging is done on logits or probabilities
  use_hierarchical=True,                                  # enable hierarchical classification for datasets with many classes
  batch_size=8,                                           # process this many ensemble members together (reduces memory usage)
  use_amp=True,                                           # use automatic mixed precision for faster inference
  model_path=None,                                        # where the model checkpoint is stored
  allow_auto_download=True,                               # whether automatic download to the specified path is allowed
  checkpoint_version="tabicl-classifier-v1.1-0506.ckpt",  # the version of pretrained checkpoint to use
  device=None,                                            # specify device for inference
  random_state=42,                                        # random seed for reproducibility
  n_jobs=None,                                            # number of threads to use for PyTorch
  verbose=False,                                          # print detailed information during inference
  inference_config=None,                                  # inference configuration for fine-grained control
)
```
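
In practice you usually override only a few of these. Here is a hedged sketch for memory-constrained hardware, using only the parameters documented above (`"cpu"`/`"cuda"` are standard PyTorch device strings):

```python
from tabicl import TabICLClassifier

clf = TabICLClassifier(
    n_estimators=8,   # smaller ensemble: faster, possibly slightly less accurate
    batch_size=2,     # fewer ensemble members per forward pass to save memory
    device="cpu",     # force CPU inference; use "cuda" to pick a GPU
    random_state=0,
)
```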

## Memory-Efficient Inference

TabICL includes memory management to handle large datasets:

- **Memory Profiling**: Built-in memory estimators for different components of the model
- **Batch Size Estimation**: Dynamically determines optimal batch sizes based on available GPU memory
- **CPU Offloading**: Automatically offloads intermediate results to CPU when beneficial
- **OOM Recovery**: Recovers gracefully from out-of-memory errors by reducing the batch size (a generic sketch of this retry pattern follows)
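
The OOM recovery above follows the familiar batch-halving retry pattern. As a generic illustration (this is not TabICL's internal code, just the idea):

```python
import torch

def run_with_oom_recovery(run_batch, items, batch_size):
    """Run `run_batch` over slices of `items`, halving the batch size on CUDA OOM."""
    outputs = []
    i = 0
    while i < len(items):
        try:
            outputs.append(run_batch(items[i:i + batch_size]))
            i += batch_size
        except torch.cuda.OutOfMemoryError:
            if batch_size == 1:
                raise                 # cannot shrink further
            torch.cuda.empty_cache()  # release cached blocks
            batch_size //= 2          # retry the same slice with a smaller batch
    return outputs
```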

## Preprocessing

### Simple built-in preprocessing
If the input `X` to TabICL is a pandas DataFrame, TabICL will automatically:
- Detect and ordinal encode categorical columns (including string, object, category, and boolean types)
- Create a separate category for missing values in categorical features
- Perform mean imputation for missing numerical values (encoded as NaN)

If the input `X` is a numpy array, TabICL assumes that ordinal encoding and missing value imputation have already been performed.

For both input types, TabICL applies additional preprocessing:
- Outlier detection and removal
- Feature scaling and normalization
- Feature shuffling for ensemble diversity
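
As an illustration of the built-in preprocessing, a mixed-type DataFrame with missing values can be passed in directly (a minimal sketch; the columns and data are made up):

```python
import numpy as np
import pandas as pd
from tabicl import TabICLClassifier

X = pd.DataFrame({
    "color": ["red", "blue", None, "green"] * 30,  # categorical, with missing values
    "size": [1.0, np.nan, 3.5, 2.2] * 30,          # numerical, with NaNs
    "member": [True, False, True, True] * 30,      # boolean
})
y = pd.Series([0, 1, 0, 1] * 30)

clf = TabICLClassifier()
clf.fit(X.iloc[:100], y.iloc[:100])  # ordinal encoding and imputation happen internally
print(clf.predict(X.iloc[100:]))
```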

### Advanced data preprocessing with skrub <img src="https://skrub-data.github.io/stable/_static/skrub.svg" width="8%" alt="skrub logo" style="display: inline; margin-left: 5px; margin-right: 5px;">

Real-world datasets often contain complex heterogeneous data that benefits from more sophisticated preprocessing. For these scenarios, we recommend [skrub](https://skrub-data.org/stable/index.html), a powerful library designed specifically for advanced tabular data preparation.

**Why use skrub?**
- Handles diverse data types (numerical, categorical, text, datetime, etc.)
- Provides robust preprocessing for dirty data
- Offers sophisticated feature engineering capabilities
- Supports multi-table integration and joins

#### Installation

```bash
pip install skrub -U
```

#### Basic Integration

Use skrub's [TableVectorizer](https://skrub-data.org/stable/reference/generated/skrub.TableVectorizer.html) to transform your raw data before passing it to TabICLClassifier:

```python
from skrub import TableVectorizer
from tabicl import TabICLClassifier
from sklearn.pipeline import make_pipeline

pipeline = make_pipeline(
    TableVectorizer(),  # Automatically handles various data types
    TabICLClassifier()
)

pipeline.fit(X_train, y_train)  # X should be a DataFrame
predictions = pipeline.predict(X_test)
```
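
Since `TabICLClassifier` follows the scikit-learn estimator API, the pipeline composes with the usual model-selection tools; for instance, continuing the snippet above:

```python
from sklearn.model_selection import cross_val_score

scores = cross_val_score(pipeline, X_train, y_train, cv=3)
print(scores.mean())
```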


## Key Features and Considerations

- **Number of samples**:
  - TabICL is pretrained on datasets with up to 60K samples.
  - TabICL can handle datasets beyond 100K samples thanks to memory-efficient inference.
  - TabPFN (v2) is on average better than TabICL on small datasets with <10K samples, while TabICL is better on larger datasets.
  - Classical methods may catch up with TabICL at around 40K samples, but they are much slower because they require extensive hyperparameter tuning.

<div style="margin-top: 30px;"></div>
<img src="./figures/perf_wrt_samples.png" width="80%" alt="Ranking vs. number of samples" style="display: block; margin: auto;">
<div style="margin-top: 30px;"></div>

- **Number of features**:
  - TabICL is pretrained on datasets with up to 100 features.
  - In principle, TabICL can accommodate any number of features.

- **Number of classes**:
  - TabICL is pretrained on datasets with up to 10 classes, so it natively supports at most 10 classes.
  - However, TabICL can handle any number of classes thanks to its built-in hierarchical classification (see the sketch below).
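
A hedged sketch of the many-class case (a synthetic 15-class problem; `use_hierarchical=True` is the default, so no extra configuration is needed):

```python
from sklearn.datasets import make_classification
from tabicl import TabICLClassifier

# 15 classes exceed the 10 natively supported,
# triggering hierarchical classification under the hood.
X, y = make_classification(n_samples=500, n_features=20, n_informative=15,
                           n_classes=15, random_state=0)

clf = TabICLClassifier()
clf.fit(X[:400], y[:400])
print(clf.predict(X[400:]))
```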

- **Inference speed**:
  - Like TabPFN, `fit()` does minimal work while `predict()` runs the full model
  - At the same `n_estimators`, TabICL is usually 1x-5x faster than TabPFN
  - TabICL benefits more from larger `n_estimators`, hence the default of 32
  - Automatic mixed precision (AMP) provides further speed improvements on compatible GPUs

- **No tuning required**: TabICL produces good predictions without hyperparameter tuning, unlike classical methods that require extensive tuning for optimal performance.

## Performance

TabICL has achieved excellent results on the [TALENT](https://github.com/qile2000/LAMDA-TALENT) benchmark.

<img src="./figures/performance.png" width="100%" alt="Performance on TALENT" style="display: block; margin: auto;">
<div style="margin-top: 30px;"></div>

## Citation
If you use TabICL for research purposes, please cite our **[paper](https://arxiv.org/abs/2502.05564)**:
```bibtex
@article{qu2025tabicl,
  title={TabICL: A Tabular Foundation Model for In-Context Learning on Large Data},
  author={Qu, Jingang and Holzm{\"u}ller, David and Varoquaux, Ga{\"e}l and Le Morvan, Marine},
  journal={arXiv preprint arXiv:2502.05564},
  year={2025}
}
```

## Contributors

- [Jingang Qu](https://github.com/jingangQu)
- [David Holzmüller](https://github.com/dholzmueller)
- [Marine Le Morvan](https://github.com/marineLM)

            
