**aiconfig-extension-llama-guard**

- Name: aiconfig-extension-llama-guard
- Version: 0.0.3 (PyPI)
- Summary: An extension for using LLaMA Guard with aiconfig
- Upload time: 2023-12-28 21:28:02
- Author: LastMile AI
- Requires Python: >=3.10
- Requirements: none recorded
# LLaMA Guard with AIConfig

LLaMA Guard is a 7B safety-classification model released by Meta. This extension lets you use it with AIConfig.

LLaMA Guard allows you to define your own “safety taxonomy” — custom policies to determine which interactions are safe vs. unsafe between humans (prompts) and AI models (responses). What makes this cool is that it allows you to enforce your own policies _ON TOP_ of the standard guardrails that a model ships with (instead of merely overriding them).

> [!NOTE]
> This extension loads the entire model into memory.
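To make the "safety taxonomy" idea concrete, here is a small sketch of what a custom policy might look like. The category names, wording, and the `build_guard_prompt` helper below are all illustrative assumptions of ours, not Meta's default taxonomy or part of this extension; LLaMA Guard-style prompts wrap numbered policy categories between category markers.

```python
# Illustrative only: a custom "safety taxonomy" in the style LLaMA Guard
# expects -- numbered categories with short policy descriptions. These
# category names and rules are hypothetical, not Meta's defaults.
CUSTOM_TAXONOMY = """\
O1: Financial Advice.
Should not
- Provide specific investment recommendations.
O2: Medical Diagnosis.
Should not
- Diagnose conditions or prescribe treatment.
"""

def build_guard_prompt(taxonomy: str, user_message: str) -> str:
    """Assemble a LLaMA Guard-style classification prompt (sketch)."""
    return (
        "Task: Check if there is unsafe content in the conversation "
        "according to our safety policy with the below categories.\n\n"
        "<BEGIN UNSAFE CONTENT CATEGORIES>\n"
        f"{taxonomy}"
        "<END UNSAFE CONTENT CATEGORIES>\n\n"
        f"User: {user_message}\n"
    )

prompt = build_guard_prompt(CUSTOM_TAXONOMY, "Which stock should I buy?")
```

Because the taxonomy is just text prepended to the classification prompt, your policies sit on top of whatever the model already learned, rather than replacing it.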

## Part 1: Installing, importing, and using this extension

1. Install this module: run `pip3 install aiconfig_extension_llama_guard` in your terminal
2. Add these lines to your code:

```python
from aiconfig_extension_llama_guard import LLamageGuardParser
from aiconfig.registry import ModelParserRegistry
```

3. In code, register the model parser from this extension with the registry: `ModelParserRegistry.register_model_parser(LLamageGuardParser())`. You can read the docstrings on the `ModelParserRegistry` class for more info on what this does.
4. Use the `LLamageGuardParser` model parser however you please. Check out our tutorial to get started: [video walkthrough](https://www.youtube.com/watch?v=XxggqoqIVdg), [Jupyter notebook](https://github.com/lastmile-ai/aiconfig/tree/v1.1.8/cookbooks/LLaMA-Guard).
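Once the parser returns a response, you still need to interpret it. LLaMA Guard is reported to reply with `safe`, or `unsafe` followed by the violated category codes on the next line (e.g. `unsafe\nO3`). The helper below is our own illustrative sketch for interpreting that text, not part of this extension's API:

```python
# Interpret a LLaMA Guard verdict string. Returns (is_safe, violated
# category codes). This helper is illustrative, not part of the extension.
def parse_guard_verdict(text: str) -> tuple[bool, list[str]]:
    lines = [ln.strip() for ln in text.strip().splitlines() if ln.strip()]
    if not lines or lines[0].lower() == "safe":
        return True, []
    # Remaining lines list violated categories, possibly comma-separated.
    cats: list[str] = []
    for ln in lines[1:]:
        cats.extend(c.strip() for c in ln.split(",") if c.strip())
    return False, cats

assert parse_guard_verdict("safe") == (True, [])
```

A helper like this makes it easy to gate a downstream model call on the verdict: run the guard on the prompt, and only forward it if `is_safe` is true.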

## Part 2: Updating & Developing this extension

If you are not developing this extension locally (i.e., you are just using the published extension), feel free to skip this part.

1. Navigate to `extensions/LLama-Guard/python` and run this command: `pip3 install -e .` (this creates an editable install of the Python module, linked to this directory)
2. Edit and test the extension as you please. Feel free to submit a pull request on GitHub!
3. After you're done testing, be sure to uninstall the local link to this directory if you ever want to use the published version: `pip3 uninstall aiconfig_extension_llama_guard`

            
