neon-skill-fallback-llm

Name: neon-skill-fallback-llm
Version: 2.1.0
Home page: https://github.com/NeonGeckoCom/skill-fallback_llm
Summary: None
Upload time: 2025-08-12 01:22:47
Maintainer: None
Docs URL: None
Author: Neongecko
Requires Python: None
License: BSD-3-Clause
Keywords: None
Requirements: neon-utils, ovos-utils, ovos-bus-client, ovos-workshop, neon-mq-connector
CI: GitHub Actions (no Travis CI, no Coveralls coverage)
            # <img src='./logo.svg' card_color="#FF8600" width="50" style="vertical-align:bottom">LLM Fallback
  
## Summary
Get an LLM response from the Neon Diana backend.

## Description
Converse with an LLM and enable LLM responses when Neon doesn't have a better
response.

To send a single query to an LLM, ask Neon to "ask Chat GPT <something>".
To start a conversation with an LLM, ask to "talk to Chat GPT"; all of your input
will then be sent to the LLM until you say goodbye or stop talking for a while.

Enable fallback behavior by asking to "enable LLM fallback skill" or disable it
by asking to "disable LLM fallback".

To have a copy of LLM interactions sent via email, ask Neon to 
"email me a copy of our conversation".

## Examples 

* "Explain quantum computing in simple terms"
* "Ask chat GPT what an LLM is"
* "Talk to chat GPT"
* "Enable LLM fallback skill"
* "Disable LLM fallback skill"
* "Email me a copy of our conversation"

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/NeonGeckoCom/skill-fallback_llm",
    "name": "neon-skill-fallback-llm",
    "maintainer": null,
    "docs_url": null,
    "requires_python": null,
    "maintainer_email": null,
    "keywords": null,
    "author": "Neongecko",
    "author_email": "developers@neon.ai",
    "download_url": "https://files.pythonhosted.org/packages/5d/03/ad615afbfe6c549f9242b2aca325d994de8f98f2c3238df6eeadc0eea441/neon-skill-fallback_llm-2.1.0.tar.gz",
    "platform": null,
    "description": "# <img src='./logo.svg' card_color=\"#FF8600\" width=\"50\" style=\"vertical-align:bottom\" style=\"vertical-align:bottom\">LLM Fallback  \n  \n## Summary\nGet an LLM response from the Neon Diana backend.\n\n## Description\nConverse with an LLM and enable LLM responses when Neon doesn't have a better\nresponse.\n\nTo send a single query to an LLM, you can ask Neon to \"ask Chat GPT <something>\".\nTo start conversing with an LLM, ask to \"talk to Chat GPT\" and have all of your input\nsent to an LLM until you say goodbye or stop talking for a while.\n\nEnable fallback behavior by asking to \"enable LLM fallback skill\" or disable it\nby asking to \"disable LLM fallback\".\n\nTo have a copy of LLM interactions sent via email, ask Neon to \n\"email me a copy of our conversation\".\n\n## Examples \n\n* \"Explain quantum computing in simple terms\"\n* \"Ask chat GPT what an LLM is\"\n* \"Talk to chat GPT\"\n* \"Enable LLM fallback skill\"\n* \"Disable LLM fallback skill\"\n* \"Email me a copy of our conversation\"\n",
    "bugtrack_url": null,
    "license": "BSD-3-Clause",
    "summary": null,
    "version": "2.1.0",
    "project_urls": {
        "Homepage": "https://github.com/NeonGeckoCom/skill-fallback_llm"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "bfe9d857023aa1c7f75f4c7a3265b6ee10f788be58075176f890d61e760f5111",
                "md5": "aeed4e671b454deb7d45d02dcb6ee5f3",
                "sha256": "b6dc6918bb9eef838680fb95e5398d238ff2ab74acf41674cdce50da60e7acf8"
            },
            "downloads": -1,
            "filename": "neon_skill_fallback_llm-2.1.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "aeed4e671b454deb7d45d02dcb6ee5f3",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 36146,
            "upload_time": "2025-08-12T01:22:46",
            "upload_time_iso_8601": "2025-08-12T01:22:46.173556Z",
            "url": "https://files.pythonhosted.org/packages/bf/e9/d857023aa1c7f75f4c7a3265b6ee10f788be58075176f890d61e760f5111/neon_skill_fallback_llm-2.1.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "5d03ad615afbfe6c549f9242b2aca325d994de8f98f2c3238df6eeadc0eea441",
                "md5": "c16ebcd3b17545fc61ae9b62e7c87843",
                "sha256": "1006562bb3d9ca6f192c1993bb4e9103dc25cbe3c1e2427ca79ecc9606594455"
            },
            "downloads": -1,
            "filename": "neon-skill-fallback_llm-2.1.0.tar.gz",
            "has_sig": false,
            "md5_digest": "c16ebcd3b17545fc61ae9b62e7c87843",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 17184,
            "upload_time": "2025-08-12T01:22:47",
            "upload_time_iso_8601": "2025-08-12T01:22:47.273082Z",
            "url": "https://files.pythonhosted.org/packages/5d/03/ad615afbfe6c549f9242b2aca325d994de8f98f2c3238df6eeadc0eea441/neon-skill-fallback_llm-2.1.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-08-12 01:22:47",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "NeonGeckoCom",
    "github_project": "skill-fallback_llm",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "requirements": [
        {
            "name": "neon-utils",
            "specs": [
                [
                    "~=",
                    "1.12"
                ]
            ]
        },
        {
            "name": "ovos-utils",
            "specs": [
                [
                    ">=",
                    "0.0.28"
                ],
                [
                    "~=",
                    "0.0"
                ]
            ]
        },
        {
            "name": "ovos-bus-client",
            "specs": [
                [
                    ">=",
                    "0.0.3"
                ],
                [
                    "~=",
                    "0.0"
                ]
            ]
        },
        {
            "name": "ovos-workshop",
            "specs": [
                [
                    "~=",
                    "0.1"
                ]
            ]
        },
        {
            "name": "neon-mq-connector",
            "specs": [
                [
                    "~=",
                    "0.7"
                ]
            ]
        }
    ],
    "lcname": "neon-skill-fallback-llm"
}
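The `requirements` array in the raw data above stores each dependency as a name plus a list of `[operator, version]` pairs; a small sketch flattening that structure into pip-style requirement strings:

```python
def to_pip_specifiers(requirements: list[dict]) -> list[str]:
    """Flatten PyPI-style requirement records into pip requirement strings."""
    lines = []
    for req in requirements:
        # Join multiple constraints with commas, e.g. ">=0.0.28,~=0.0".
        spec = ",".join(f"{op}{ver}" for op, ver in req["specs"])
        lines.append(f"{req['name']}{spec}")
    return lines


# Two entries taken from the raw data above.
requirements = [
    {"name": "neon-utils", "specs": [["~=", "1.12"]]},
    {"name": "ovos-utils", "specs": [[">=", "0.0.28"], ["~=", "0.0"]]},
]
print(to_pip_specifiers(requirements))
# → ['neon-utils~=1.12', 'ovos-utils>=0.0.28,~=0.0']
```

The resulting strings follow PEP 440 specifier syntax and can be written directly into a `requirements.txt`.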
        