uniception

Name: uniception
Version: 0.1.4
Summary: Generalizable Perception Stack for 3D, 4D, spatial AI and scene understanding
Homepage: https://github.com/castacks/UniCeption
Author email: AirLab <airlab-dev@lists.andrew.cmu.edu>
Upload time: 2025-08-22 22:46:50
Requires Python: >=3.10
License: BSD 3-Clause
Keywords: computer-vision, 3d-vision, spatial-ai, perception, deep-learning, pytorch

# UniCeption

UniCeption houses modular building blocks for developing and training generalizable perception models for all things related to 3D, 4D, spatial AI and scene understanding.
It is designed to be flexible and extensible, allowing researchers to easily experiment with different architectures and configurations.

Please refer to the [Developer Guidelines](#contributing--developer-guidelines) for contributing to the project.

## Installation

### Install from PyPI

The easiest way to install UniCeption is from PyPI:

```bash
# Install with base dependencies
pip install uniception

# Optional: Install with XFormers support
pip install "uniception[xformers]"

# Optional: Install with development tools
pip install "uniception[dev]"

# Optional: Install all optional dependencies
pip install "uniception[all]"
```
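
To confirm the install from Python, a quick version check with the standard library is enough; this is just a minimal sketch and assumes nothing beyond the distribution name `uniception`:

```python
# Confirm that the "uniception" distribution is installed and print its version.
# Uses only the standard library (Python >= 3.10, matching requires-python).
from importlib import metadata

try:
    print("uniception version:", metadata.version("uniception"))
except metadata.PackageNotFoundError:
    print("uniception is not installed in this environment")
```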

### Install from Source

Clone the repository to your local machine by running the following commands:

```bash
git clone git@github.com:castacks/UniCeption.git
cd UniCeption
```

### Standard Installation

Install the `uniception` package in development mode by running the following commands:

```bash
# Use Conda or a Python virtual environment, depending on your preference
# For a Conda environment
conda create --name uniception python=3.12
conda activate uniception
# For a Python virtual environment (named .venv so it does not clash with the uniception/ package directory)
virtualenv .venv
source .venv/bin/activate

# Install UniCeption with base dependencies (includes PyTorch)
pip install -e .

# Optional: Install with XFormers support
pip install -e ".[xformers]"

# Optional: Install with development tools
pip install -e ".[dev]"

# Optional: Install all optional dependencies
pip install -e ".[all]"

# Setup pre-commit hooks for development
pre-commit install
```

### Optional: CroCo RoPE Extension Installation

To use CroCo models with the custom RoPE kernel:

```bash
# Recommended: Use the console script
uniception-install-croco

# Alternative: Set environment variable during installation
INSTALL_CROCO_ROPE=true pip install -e .

# Manual compilation (if needed)
cd uniception/models/libs/croco/curope
python setup.py build_ext --inplace
cd ../../../../../
```
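
To check whether the extension is importable afterwards, a small probe like the one below can help. This is a sketch under two assumptions: the compiled module is importable under the same dotted path as its source directory (`uniception.models.libs.croco.curope`), and resolving that path does not by itself guarantee the CUDA kernel compiled successfully.

```python
# Probe for the CroCo RoPE extension without importing the full package.
# The dotted path below mirrors the source directory and is an assumption;
# adjust it if your build places the compiled extension elsewhere.
import importlib.util

try:
    spec = importlib.util.find_spec("uniception.models.libs.croco.curope")
except ModuleNotFoundError:
    spec = None

if spec is not None:
    print("CroCo RoPE extension importable from:", spec.origin)
else:
    print("CroCo RoPE extension not found; re-run one of the installation steps above")
```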

### Installation Validation and Dependency Checking

After installation, use these console scripts to validate your setup:

```bash
# Validate installation and check dependencies
uniception-validate

# Check which optional dependencies are available
uniception-check-deps
```
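
If you prefer to drive these checks from Python (for example in a CI step), the console scripts can be located and invoked with the standard library; the sketch below only wraps the two commands above and assumes nothing else about their behaviour:

```python
# Locate and run the documented validation console scripts from Python.
import shutil
import subprocess

for script in ("uniception-validate", "uniception-check-deps"):
    path = shutil.which(script)
    if path is None:
        print(f"{script}: not found on PATH (is uniception installed in this environment?)")
        continue
    result = subprocess.run([path], capture_output=True, text=True)
    print(f"{script} exited with code {result.returncode}")
```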

### Advanced Installation Options

#### Docker Installation (No Internet Access)

If you are working in a Docker container that already has the Python dependencies installed but has no internet access, you can install UniCeption in development mode without triggering any network requests:

```bash
# Install only the package structure without dependencies
pip install --no-index --no-deps --no-build-isolation -e .
```

**Note:** This command assumes your Docker image already contains all required dependencies (PyTorch, etc.). Use `uniception-validate` after installation to verify all dependencies are available.
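
As a lighter-weight complement to `uniception-validate`, you can also spot-check a few imports directly inside the container; this sketch only assumes that PyTorch is part of the base dependencies (as noted above) and that your image is expected to provide it:

```python
# Spot-check that key dependencies resolve inside the offline container.
# "torch" is listed because the base install includes PyTorch; extend the
# tuple with whatever else your image is expected to provide.
import importlib.util

for module in ("torch", "uniception"):
    status = "ok" if importlib.util.find_spec(module) else "MISSING"
    print(f"{module}: {status}")
```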

#### Offline Installation

For environments without internet access:

```bash
# 1. On a machine with internet access, prepare offline wheels
uniception-prepare-offline --output-dir offline_wheels --extras all

# 2. Copy the offline_wheels directory to your offline environment
# 3. Run the offline installation
cd offline_wheels
INSTALL_CROCO_ROPE=true INSTALL_XFORMERS=true ./install_offline.sh
```
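
Before running the installer in the offline environment, it can be worth confirming that the copied directory actually contains the wheels and the install script. The sketch below assumes the layout produced by `uniception-prepare-offline`, i.e. `*.whl` files alongside `install_offline.sh` inside `offline_wheels/`; adjust the path if your layout differs:

```python
# Sanity-check the copied offline_wheels directory before running the installer.
from pathlib import Path

offline_dir = Path("offline_wheels")  # adjust if you copied it somewhere else
wheels = sorted(offline_dir.glob("*.whl"))

print(f"found {len(wheels)} wheel(s) in {offline_dir}/")
print("install_offline.sh present:", (offline_dir / "install_offline.sh").exists())
```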

#### Downloading Checkpoints

Download custom checkpoints in the UniCeption format:

```bash
# Download all available checkpoints
uniception-download-checkpoints

# Download specific folders only (e.g., encoders and prediction heads)
uniception-download-checkpoints --folders encoders prediction_heads

# Specify custom destination
uniception-download-checkpoints --destination /path/to/checkpoints
```

**Available options:**
- `--folders`: Specify which folders to download. Choices: `encoders`, `info_sharing`, `prediction_heads`, `examples` (default: all folders)
- `--destination`: Custom destination folder for downloaded checkpoints (default: current directory)
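
After a download finishes, a quick way to see what landed on disk is to walk the destination directory. The sketch below only uses the folder names listed above and assumes they are created directly under the destination (current directory by default); adjust the path if your layout differs:

```python
# Report which checkpoint folders are present and how many files each contains.
from pathlib import Path

destination = Path(".")  # or the path you passed via --destination
for folder in ("encoders", "info_sharing", "prediction_heads", "examples"):
    path = destination / folder
    if path.is_dir():
        n_files = sum(1 for p in path.rglob("*") if p.is_file())
        print(f"{folder}: {n_files} file(s)")
    else:
        print(f"{folder}: not found")
```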

---

## Currently Supported Components

### Encoders

Please refer to the `uniception/models/encoders` directory for the supported encoders and for documentation on adding new encoders. The supported encoders can be listed by running:

```bash
python3 -m uniception.models.encoders.list
```
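
The same listing can also be triggered from inside Python, since `python3 -m <module>` corresponds to running the module via the standard-library `runpy`; this adds nothing beyond the command above:

```python
# Programmatic equivalent of `python3 -m uniception.models.encoders.list`.
import runpy

runpy.run_module("uniception.models.encoders.list", run_name="__main__")
```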

---

### Information Sharing Blocks

Please refer to the `uniception/models/info_sharing` directory for the supported information sharing blocks.

---

### Prediction Heads

Please refer to the `uniception/models/prediction_heads` directory for the supported prediction heads.

---

## Codebases built on top of UniCeption

Check out the following codebases built on top of UniCeption:
- [UFM: A Simple Path towards Unified Dense Correspondence with Flow](https://uniflowmatch.github.io/)

## License

The code in this repository is licensed under a fully open-source [BSD 3-Clause License](LICENSE).

## Acknowledgements

We thank the following projects for their open-source code: [DUSt3R](https://github.com/naver/dust3r), [MASt3R](https://github.com/naver/mast3r), [MoGe](https://github.com/microsoft/moge), [HF PyTorch Image Models](https://github.com/huggingface/pytorch-image-models), and all the other pre-trained image encoders featured in this repo.

## Contributing & Developer Guidelines

If you find our work useful, please consider giving it a star ⭐. We welcome contributions to UniCeption! Whether it's fixing bugs, adding new features, or improving documentation, your help is appreciated.

Please follow these guidelines when contributing to UniCeption:
- **Code Style**: Follow the [Google Python Style Guide](https://google.github.io/styleguide/pyguide.html) for code style.
- **Documentation**: Add docstrings to all classes and methods.
- **Unit Tests**: Add necessary unit tests to the `tests` folder.
- **Linting**: Run `black` & `isort` on your code before committing. For example, you can run `black . && isort .`.

Please open a pull request for any changes you make, and ensure that all tests pass before merging. For larger features or changes, we encourage you to open an issue for discussion first. Feel free to add further unit tests to the `tests` folder to ensure the correctness of your changes.

## Maintainers

UniCeption is maintained by the [AirLab](https://theairlab.org/). Feel free to reach out to the following maintainers with any questions or issues (GitHub issues are preferred):
- [Nikhil Keetha](https://nik-v9.github.io/)
- [Yuchen Zhang](https://infinity1096.github.io/)

            
