neon-skill-fallback-llm


Name: neon-skill-fallback-llm
Version: 2.0.0
Home page: https://github.com/NeonGeckoCom/skill-fallback_llm
Summary: None
Upload time: 2025-03-20 01:19:04
Maintainer: None
Docs URL: None
Author: Neongecko
Requires Python: None
License: BSD-3-Clause
Keywords: None
Requirements: neon-utils, ovos-utils, ovos-bus-client, ovos-workshop, neon-mq-connector
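
The raw metadata at the bottom of this page records exact version specifiers for these requirements; written out as a requirements.txt they read:

```
neon-utils~=1.12
ovos-utils~=0.0,>=0.0.28
ovos-bus-client>=0.0.3,~=0.0
ovos-workshop~=0.1
neon-mq-connector~=0.7
```
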
# <img src='./logo.svg' card_color="#FF8600" width="50" style="vertical-align:bottom">LLM Fallback
  
## Summary
Get an LLM response from the Neon Diana backend.

## Description
Converse with an LLM and enable LLM responses when Neon doesn't have a better
response.

To send a single query to an LLM, ask Neon to "ask Chat GPT <something>".
To start a conversation with an LLM, ask to "talk to Chat GPT"; everything you say
will then be sent to the LLM until you say goodbye or stop talking for a while.

Enable fallback behavior by asking to "enable LLM fallback skill" or disable it
by asking to "disable LLM fallback".

To have a copy of LLM interactions sent via email, ask Neon to 
"email me a copy of our conversation".

## Examples 

* "Explain quantum computing in simple terms"
* "Ask chat GPT what an LLM is"
* "Talk to chat GPT"
* "Enable LLM fallback skill"
* "Disable LLM fallback skill"
* "Email me a copy of our conversation"


Raw data
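
This record appears to extend the standard metadata that PyPI publishes for the package. A minimal sketch for fetching the upstream record straight from PyPI's JSON API (the conventional https://pypi.org/pypi/<name>/json endpoint):

```python
import json
from urllib.request import urlopen

# Fetch the published metadata for this package from PyPI's JSON API.
url = "https://pypi.org/pypi/neon-skill-fallback-llm/json"
with urlopen(url) as resp:
    info = json.load(resp)["info"]

print(info["name"], info["version"])  # neon-skill-fallback-llm 2.0.0
print(info["home_page"])              # https://github.com/NeonGeckoCom/skill-fallback_llm
```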

            {
    "_id": null,
    "home_page": "https://github.com/NeonGeckoCom/skill-fallback_llm",
    "name": "neon-skill-fallback-llm",
    "maintainer": null,
    "docs_url": null,
    "requires_python": null,
    "maintainer_email": null,
    "keywords": null,
    "author": "Neongecko",
    "author_email": "developers@neon.ai",
    "download_url": "https://files.pythonhosted.org/packages/ef/54/d62618553b5cc0c0bd6833be63a0600c3eb0176f0fa7d7afcacb0ffa7739/neon-skill-fallback_llm-2.0.0.tar.gz",
    "platform": null,
    "description": "# <img src='./logo.svg' card_color=\"#FF8600\" width=\"50\" style=\"vertical-align:bottom\" style=\"vertical-align:bottom\">LLM Fallback  \n  \n## Summary\nGet an LLM response from the Neon Diana backend.\n\n## Description\nConverse with an LLM and enable LLM responses when Neon doesn't have a better\nresponse.\n\nTo send a single query to an LLM, you can ask Neon to \"ask Chat GPT <something>\".\nTo start conversing with an LLM, ask to \"talk to Chat GPT\" and have all of your input\nsent to an LLM until you say goodbye or stop talking for a while.\n\nEnable fallback behavior by asking to \"enable LLM fallback skill\" or disable it\nby asking to \"disable LLM fallback\".\n\nTo have a copy of LLM interactions sent via email, ask Neon to \n\"email me a copy of our conversation\".\n\n## Examples \n\n* \"Explain quantum computing in simple terms\"\n* \"Ask chat GPT what an LLM is\"\n* \"Talk to chat GPT\"\n* \"Enable LLM fallback skill\"\n* \"Disable LLM fallback skill\"\n* \"Email me a copy of our conversation\"\n",
    "bugtrack_url": null,
    "license": "BSD-3-Clause",
    "summary": null,
    "version": "2.0.0",
    "project_urls": {
        "Homepage": "https://github.com/NeonGeckoCom/skill-fallback_llm"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "8147a5625955255942a67b1e30e502ea2ab71812b4b11723741fb414979415a9",
                "md5": "a3df73abc4d7ef79455dcf731cfb63e3",
                "sha256": "0207e74708f0445b756ada423224f647bc7db3b9ab957241d395885839a6e83e"
            },
            "downloads": -1,
            "filename": "neon_skill_fallback_llm-2.0.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "a3df73abc4d7ef79455dcf731cfb63e3",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 29808,
            "upload_time": "2025-03-20T01:19:03",
            "upload_time_iso_8601": "2025-03-20T01:19:03.260995Z",
            "url": "https://files.pythonhosted.org/packages/81/47/a5625955255942a67b1e30e502ea2ab71812b4b11723741fb414979415a9/neon_skill_fallback_llm-2.0.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "ef54d62618553b5cc0c0bd6833be63a0600c3eb0176f0fa7d7afcacb0ffa7739",
                "md5": "11379734210d25ea5b8eb9265757726e",
                "sha256": "d39ba7dde0cba452a4c96eba9584c237ad23ddbd6fd7c3237577198b46e93efc"
            },
            "downloads": -1,
            "filename": "neon-skill-fallback_llm-2.0.0.tar.gz",
            "has_sig": false,
            "md5_digest": "11379734210d25ea5b8eb9265757726e",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 15053,
            "upload_time": "2025-03-20T01:19:04",
            "upload_time_iso_8601": "2025-03-20T01:19:04.442807Z",
            "url": "https://files.pythonhosted.org/packages/ef/54/d62618553b5cc0c0bd6833be63a0600c3eb0176f0fa7d7afcacb0ffa7739/neon-skill-fallback_llm-2.0.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-03-20 01:19:04",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "NeonGeckoCom",
    "github_project": "skill-fallback_llm",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "requirements": [
        {
            "name": "neon-utils",
            "specs": [
                [
                    "~=",
                    "1.12"
                ]
            ]
        },
        {
            "name": "ovos-utils",
            "specs": [
                [
                    "~=",
                    "0.0"
                ],
                [
                    ">=",
                    "0.0.28"
                ]
            ]
        },
        {
            "name": "ovos-bus-client",
            "specs": [
                [
                    ">=",
                    "0.0.3"
                ],
                [
                    "~=",
                    "0.0"
                ]
            ]
        },
        {
            "name": "ovos-workshop",
            "specs": [
                [
                    "~=",
                    "0.1"
                ]
            ]
        },
        {
            "name": "neon-mq-connector",
            "specs": [
                [
                    "~=",
                    "0.7"
                ]
            ]
        }
    ],
    "lcname": "neon-skill-fallback-llm"
}
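
The sha256 digests in the record above can be used to verify a downloaded artifact before installing it. A minimal sketch with Python's standard hashlib, checked against the sdist digest from the record:

```python
import hashlib

# sha256 recorded above for neon-skill-fallback_llm-2.0.0.tar.gz
EXPECTED = "d39ba7dde0cba452a4c96eba9584c237ad23ddbd6fd7c3237577198b46e93efc"

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

assert sha256_of("neon-skill-fallback_llm-2.0.0.tar.gz") == EXPECTED
print("sdist digest verified")
```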
        