autodistill-altclip


Name: autodistill-altclip
Version: 0.1.2
Home page: https://github.com/autodistill/autodistill-altclip
Summary: AltCLIP model for use with Autodistill.
Author: Roboflow
Requires Python: >=3.7
Upload time: 2023-12-05 09:16:36
            <div align="center">
  <p>
    <a align="center" href="" target="_blank">
      <img
        width="850"
        src="https://media.roboflow.com/open-source/autodistill/autodistill-banner.png"
      >
    </a>
  </p>
</div>

# Autodistill AltCLIP Module

This repository contains the code supporting the AltCLIP base model for use with [Autodistill](https://github.com/autodistill/autodistill).

[AltCLIP](https://arxiv.org/abs/2211.06679v2) is a multi-modal vision model. With AltCLIP, you can compare the similarity between text and images, or the similarity between two images. AltCLIP was trained on multilingual text-image pairs, which means it can be used for zero-shot classification with text prompts in different languages. [Read the AltCLIP paper for more information](https://arxiv.org/pdf/2211.06679v2.pdf).

The Autodistill AltCLIP module enables you to use AltCLIP for zero-shot classification.

Read the full [Autodistill documentation](https://autodistill.github.io/autodistill/).

Read the [CLIP Autodistill documentation](https://autodistill.github.io/autodistill/base_models/clip/).
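Under the hood, CLIP-style models such as AltCLIP embed the image and each caption into a shared vector space, then pick the caption whose embedding is most similar to the image's. The following is a conceptual sketch of that scoring step only, using toy low-dimensional vectors in place of the high-dimensional embeddings the real model produces:

```python
import math

def cosine_similarity(a, b):
    # cosine similarity between two embedding vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def classify(image_embedding, caption_embeddings):
    # score the image against every caption and return the best match
    scores = {
        caption: cosine_similarity(image_embedding, emb)
        for caption, emb in caption_embeddings.items()
    }
    return max(scores, key=scores.get), scores

# toy 3-dimensional embeddings, purely illustrative
image = [0.9, 0.1, 0.2]
captions = {
    "person": [0.8, 0.2, 0.1],
    "a forklift": [0.1, 0.9, 0.3],
}

best, scores = classify(image, captions)
print(best)  # the caption most similar to the image
```

In the real pipeline, the vectors come from AltCLIP's text and image encoders; the comparison step is the same idea.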

## Installation

To use AltCLIP with Autodistill, you need to install the following dependency:

```bash
pip3 install autodistill-altclip
```

## Quickstart

```python
from autodistill_altclip import AltCLIP
from autodistill.detection import CaptionOntology

# define an ontology to map prompts to class names
# the ontology dictionary has the format {caption: class},
# where caption is the prompt sent to the base model and class is the label
# that will be saved for that caption in the generated results
# then, load the model
base_model = AltCLIP(
    ontology=CaptionOntology(
        {
            "person": "person",
            "a forklift": "forklift"
        }
    )
)

results = base_model.predict("construction.jpg")

print(results)
```
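Beyond single-image prediction, Autodistill base models also expose a `label` method for auto-labeling a whole folder of images, which is the usual distillation workflow. A sketch, where the `./images` input folder, `.jpg` extension, and `./dataset` output folder are placeholder values:

```python
from autodistill_altclip import AltCLIP
from autodistill.detection import CaptionOntology

base_model = AltCLIP(
    ontology=CaptionOntology({"person": "person", "a forklift": "forklift"})
)

# classify every .jpg in ./images and write a labeled dataset to ./dataset
base_model.label(
    input_folder="./images",
    extension=".jpg",
    output_folder="./dataset",
)
```

The resulting labeled dataset can then be used to train a smaller target model; see the Autodistill documentation for the full workflow.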

## License

The AltCLIP model is licensed under an [Apache 2.0 license](LICENSE). See the [model README](https://github.com/FlagAI-Open/FlagAI/blob/master/examples/AltCLIP/README.md) for more information.

## 🏆 Contributing

We love your input! Please see the core Autodistill [contributing guide](https://github.com/autodistill/autodistill/blob/main/CONTRIBUTING.md) to get started. Thank you 🙏 to all our contributors!
