symposium

Name: symposium
Version: 0.2.1 (PyPI)
Summary: Interaction of multiple language models
Homepage: https://github.com/multilogue/symposium
Upload time: 2024-04-27 16:20:38
Requires Python: >=3.10
Keywords: symposium, conversations, ai
# Symposium
Interactions with multiple language models require at least a somewhat 'unified' interface. The `symposium` package is an attempt to provide one. It is a work in progress and will change without notice. If you need recording capabilities, install the `grammateus` package and pass a Grammateus instance (recorder) in your calls to connectors.
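A hypothetical sketch of such a call (the `Grammateus` constructor signature and the keyword used to pass the recorder are assumptions; consult the `grammateus` documentation):
```python
from grammateus import Grammateus  # assumed import path
from symposium.connectors import anthropic_rest as ant

recorder = Grammateus("./records")  # assumed constructor signature
messages = [
    {"role": "human", "name": "alex", "content": "Can we change human nature?"}
]
# the keyword name 'recorder' is an assumption:
response = ant.claud_message(messages, recorder=recorder,
                             model="claude-3-sonnet-20240229", max_tokens=5)
```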
## Unification
One of the motivations for this package was the need for a unified format for messaging language models, which is particularly useful if you are going to experiment with interactions between them.

The unified standard used by this package is as follows.
### 'System' messages
```python
messages = [
    {"role": "world", "name": "openai", "content": "Be an Antagonist."}
]
```
The `name` field should be set to 'openai', 'anthropic', 'google_gemini' or 'google_palm'.
For the 'anthropic' name, the last 'system' message will be used as the 'system' parameter of the request. For the PaLM v1beta3 messages format, this message will be used as the 'context' parameter.
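For example, if several 'world' messages are present for Anthropic, only the last one reaches the request:
```python
messages = [
    {"role": "world", "name": "anthropic", "content": "Be a Protagonist."},
    # with the 'anthropic' name, only this last 'world' message
    # becomes the 'system' parameter:
    {"role": "world", "name": "anthropic", "content": "Be an Antagonist."}
]
```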
### 'User' messages
```python
messages = [
    {"role": "human", "name": "Alex", "content": "Let's discuss human nature."}
]
```
The utility functions in the `adapters` sub-package transform incoming and outgoing messages of a particular model from this format into the model-specific format, and transform the model's responses back into the output format below. This includes text synthesis with the older completion-style models (see the 'Completion' sections below).
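A hypothetical sketch of such an adapter (the function name and role mapping are illustrative, not the package's actual `adapters` API):
```python
# Hypothetical sketch, not the actual `adapters` API:
# map unified roles to Anthropic's 'user'/'assistant' roles.
ROLE_MAP = {"human": "user", "machine": "assistant"}

def to_anthropic(messages: list[dict]) -> list[dict]:
    """Map unified-format messages to Anthropic-style role/content pairs."""
    return [
        {"role": ROLE_MAP[m["role"]], "content": m["content"]}
        for m in messages
        if m["role"] in ROLE_MAP  # 'world' messages become the 'system' parameter instead
    ]
```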
## Output format
The unified standard used by this package is:
```python
message = {
    "role": "machine", "name": "claude",  
    "content": " ... ", 
    "tags": [{}],   # optional, if in the response, then returned
    "other": [{}]   # optional, if n > 1
}
```
The `name` field will be set to 'chatgpt', 'claude', 'gemini' or 'palm'.<br>
Tags are extracted from the text and collected into a list; the placeholder for a tag in the text is `(tag_name)`.<br>
If there is more than one response, the `other` field will contain the rest of them (transformed as well).
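A minimal sketch of what that tag extraction could look like, assuming tags appear in the text as `(tag_name)` placeholders (an illustration, not the package's actual implementation):
```python
import re

def extract_tags(text: str) -> tuple[str, list[dict]]:
    """Pull '(tag_name)' placeholders out of a text into a list of dicts."""
    tags = [{"tag": name} for name in re.findall(r"\((\w+)\)", text)]
    cleaned = re.sub(r"\s*\(\w+\)", "", text)
    return cleaned, tags

content, tags = extract_tags("Yes. (agreement) Human nature can change.")
# content == "Yes. Human nature can change."
# tags == [{"tag": "agreement"}]
```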
## Anthropic
There are two ways of interacting with the Anthropic API: through the REST API or through the native Anthropic Python library with a 'client'. If you don't want the extra dependency (and uncertainty), use the `anthropic_rest` connector. If you do want to install this dependency, run `pip install symposium[anthropic_native]`.
#### Messages
REST version:
```python
from symposium.connectors import anthropic_rest as ant

messages = [
    {"role": "human", "name": "alex", "content": "Can we change human nature?"}
]
kwargs = {
    "model":                "claude-3-sonnet-20240229",
    "system":               "answer concisely",
    # "messages":             [],
    "max_tokens":           5,
    "stop_sequences":       ["stop", ant.HUMAN_PREFIX],
    "stream":               False,
    "temperature":          0.5,
    "top_k":                250,
    "top_p":                0.5
}
response = ant.claud_message(messages, **kwargs)
```
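Assuming the connector returns the unified output format described above, the reply text can be read from the `content` field:
```python
print(response["name"])     # 'claude'
print(response["content"])  # the generated text
```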
Native version:
```python
from symposium.connectors import anthropic_native as ant

ant_client = ant.get_claud_client()
messages = [
    {"role": "human", "name": "alex", "content": "Can we change human nature?"}
]
# 'kwargs' as in the REST 'Messages' example above
anthropic_message = ant.claud_message(
    client=ant_client,
    messages=messages,
    **kwargs
)
```
#### Completion
Again, there is a REST version and a native version.
REST version:
```python
from symposium.connectors import anthropic_rest as ant

messages = [
    {"role": "human", "name": "alex", "content": "Can we change human nature?"}
]
kwargs = {
    "model":                "claude-instant-1.2",
    "max_tokens":           5,
    # "prompt":               prompt,
    "stop_sequences":       [ant.HUMAN_PREFIX],
    "temperature":          0.5,
    "top_k":                250,
    "top_p":                0.5
}
response = ant.claud_complete(messages, **kwargs)
```
## OpenAI
Import:
```python
from symposium.connectors import openai_rest as oai
```
#### Messages
```python
from symposium.connectors import openai_rest as oai

messages = [
  {"role": "user", "content": "Can we change human nature?"}
]
kwargs = {
    "model":                "gpt-3.5-turbo",
    # "messages":             [],
    "max_tokens":           5,
    "n":                    1,
    "stop_sequences":       ["stop"],
    "seed":                 None,
    "frequency_penalty":    None,
    "presence_penalty":     None,
    "logit_bias":           None,
    "logprobs":             None,
    "top_logprobs":         None,
    "temperature":          0.5,
    "top_p":                0.5,
    "user":                 None
}
responses = oai.gpt_message(messages, **kwargs)
```
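If `n` were set above 1 and the result follows the unified output format, the additional choices should appear in the `other` field; a hedged sketch:
```python
print(responses["content"])               # first choice
for extra in responses.get("other", []):  # remaining choices when n > 1
    print(extra["content"])
```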
#### Completion
```python
from symposium.connectors import openai_rest as oai

prompt = "Can we change human nature?"
kwargs = {
    "model":                "gpt-3.5-turbo-instruct",
    # "prompt":               str,
    "suffix":               str,
    "max_tokens":           5,
    "n":                    1,
    "best_of":              None,
    "stop_sequences":       ["stop"],
    "seed":                 None,
    "frequency_penalty":    None,
    "presence_penalty":     None,
    "logit_bias":           None,
    "logprobs":             None,
    "top_logprobs":         None,
    "temperature":          0.5,
    "top_p":                0.5,
    "user":                 None
}
responses = oai.gpt_complete(prompt, **kwargs)
```
## Gemini
Import:
```python
from symposium.connectors import gemini_rest as gem
```
#### Messages
```python
from symposium.connectors import gemini_rest as gem

messages = [
        {
            "role": "user",
            "parts": [
                {"text": "Human nature can not be changed, because..."},
                {"text": "...and that is why human nature can not be changed."}
            ]
        },{
            "role": "model",
            "parts": [
                {"text": "Should I synthesize a text that will be placed between these two statements and follow the previous instruction while doing that?"}
            ]
        },{
            "role": "user",
            "parts": [
                {"text": "Yes, please do."},
                {"text": "Create a most concise text possible, preferably just one sentence}"}
            ]
        }
]
kwargs = {
    "model":                "gemini-1.0-pro",
    # "messages":             [],
    "stop_sequences":       ["STOP","Title"],
    "temperature":          0.5,
    "max_tokens":           5,
    "n":                    1,
    "top_p":                0.9,
    "top_k":                None
}
response = gem.gemini_content(messages, **kwargs)
```
 
## PaLM
Import:
```python
from symposium.connectors import palm_rest as palm
```
#### Completion
```python
from symposium.connectors import palm_rest as palm

prompt = "Can we change human nature?"
kwargs = {
    "model": "text-bison-001",
    # "prompt": prompt,  # passed positionally below
    "temperature": 0.5,
    "n": 1,
    "max_tokens": 10,
    "top_p": 0.5,
    "top_k": None
}
responses = palm.palm_complete(prompt, **kwargs)
```
#### Messages
```python
from symposium.connectors import palm_rest as palm

context = "This conversation will be happening between Albert and Niels"
examples = [
        {
            "input": {"author": "Albert", "content": "We didn't talk about quantum mechanics lately..."},
            "output": {"author": "Niels", "content": "Yes, indeed."}
        }
]
messages = [
        {
            "author": "Albert",
            "content": "Can we change human nature?"
        }, {
            "author": "Niels",
            "content": "Not clear..."
        }, {
            "author": "Albert",
            "content": "Seriously, can we?"
        }
]
kwargs = {
    "model": "chat-bison-001",
    # "context": str,
    # "examples": [],
    # "messages": [],
    "temperature": 0.5,
    # no 'max_tokens', beware the effects of that!
    "n": 1,
    "top_p": 0.5,
    "top_k": None
}
responses = palm.palm_content(context, examples, messages, **kwargs)
```

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "symposium",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.10",
    "maintainer_email": null,
    "keywords": "symposium, conversations, ai",
    "author": null,
    "author_email": "Alexander Fedotov <alex.fedotov@aol.com>",
    "download_url": "https://files.pythonhosted.org/packages/e2/32/34186a57a4210aab67f96525ec92dcbd61451f945b684efe33bbeb2340ee/symposium-0.2.1.tar.gz",
    "platform": null,
    "description": "# Symposium\nInteractions with multiple language models require at least a little bit of a 'unified' interface. The 'symposium' packagee is an attempt to do that. It is a work in progress and will change without notice. If you need a recording capabilities, install the `grammateus` package and pass an instance of Grammateus/recorder in your calls to connectors.\n## Unification\nOne of the motivations for this package was the need in a unified format for messaging language models, which is particularly useful if you are going to experiment with interactions between them.\n\nThe unified standard used by this package is as follows.\n### 'System' messages\n```python\nmessages = [\n    {\"role\": \"world\", \"name\": \"openai\", \"content\": \"Be an Antagonist.\"}\n]\n```\nName field should be set to 'openai', 'anthropic', 'google_gemini' or 'google_palm'.\nFor the 'anthropic' name, the last \n'system' message will be used as the 'system' parameter in the request. For palm_messages v1beta3 format this message will be used in the 'context' parameter.\n### 'User' messages\n```python\nmessages = [\n    {\"role\": \"human\", \"name\": \"Alex\", \"content\": \"Let's discuss human nature.\"}\n]\n```\nThe utility functions stored in the `adapters` sub-package transform incoming and outgoing messages of particular model from this format to a model-specific format and back from the format of its' response to the following output format. This includes the text synthesis with older (but in)\n## Output format\nThe unified standard used by this package is:\n```python\nmessage = {\n    \"role\": \"machine\", \"name\": \"claude\",  \n    \"content\": \" ... \", \n    \"tags\": [{}],   # optional, if in the response, then returned\n    \"other\": [{}]   # optional, if n > 1\n}\n```\n`name` field will be set to 'chatgpt', 'claude', 'gemini' or 'palm'.<br>\nTags are extracted from the text and put into a list. The placeholder for the tags is: (tag_name).<br>\nIf there are more than one response, the other field will contain the list of the rest (transformed too).\n## Anthropic\nThere are two ways of interaction with Anthropic API, through the REST API and through the native Anthropic Python library with 'client'. If you don't want any dependencies (and uncertainty) use `anthropic_rest` connector. 
If you want to install this dependency do `pip install symposium[anthropic_native]`.\n#### Messages\nREST version:\n```python\nfrom symposium.connectors import anthropic_rest as ant\n\nmessages = [\n    {\"role\": \"human\", \"name\": \"alex\", \"content\": \"Can we change human nature?\"}\n]\nkwargs = {\n    \"model\":                \"claude-3-sonnet-20240229\",\n    \"system\":               \"answer concisely\",\n    # \"messages\":             [],\n    \"max_tokens\":           5,\n    \"stop_sequences\":       [\"stop\", ant.HUMAN_PREFIX],\n    \"stream\":               False,\n    \"temperature\":          0.5,\n    \"top_k\":                250,\n    \"top_p\":                0.5\n}\nresponse = ant.claud_message(messages,**kwargs)\n```\nNative version:\n```python\nfrom symposium.connectors import anthropic_native as ant\n\nant_client = ant.get_claud_client()\nmessages = [\n    {\"role\": \"human\", \"name\": \"alex\", \"content\": \"Can we change human nature?\"}\n]\nanthropic_message = ant.claud_message(\n    client=ant_client,\n    messages=messages,\n    **kwargs\n)\n```\n#### Completion\nAgain, there is a REST version and a native version.\nREST version:\n```python\nfrom symposium.connectors import anthropic_rest as ant\n\nmessages = [\n    {\"role\": \"human\", \"name\": \"alex\", \"content\": \"Can we change human nature?\"}\n]\nkwargs = {\n    \"model\":                \"claude-instant-1.2\",\n    \"max_tokens\":           5,\n    # \"prompt\":               prompt,\n    \"stop_sequences\":       [ant.HUMAN_PREFIX],\n    \"temperature\":          0.5,\n    \"top_k\":                250,\n    \"top_p\":                0.5\n}\nresponse = ant.claud_complete(messages, **kwargs)\n```\n## OpenAI\nImport:\n```python\nfrom symposium.connectors import openai_rest as oai\n```\n#### Messages\n```python\nfrom symposium.connectors import openai_rest as oai\n\nmessages = [\n  {\"role\": \"user\", \"content\": \"Can we change human nature?\"}\n]\nkwargs = {\n    \"model\":                \"gpt-3.5-turbo\",\n    # \"messages\":             [],\n    \"max_tokens\":           5,\n    \"n\":                    1,\n    \"stop_sequences\":       [\"stop\"],\n    \"seed\":                 None,\n    \"frequency_penalty\":    None,\n    \"presence_penalty\":     None,\n    \"logit_bias\":           None,\n    \"logprobs\":             None,\n    \"top_logprobs\":         None,\n    \"temperature\":          0.5,\n    \"top_p\":                0.5,\n    \"user\":                 None\n}\nresponses = oai.gpt_message(messages, **kwargs)\n```\n#### Completion\n```python\nfrom symposium.connectors import openai_rest as oai\n\nprompt = \"Can we change human nature?\"\nkwargs = {\n    \"model\":                \"gpt-3.5-turbo-instruct\",\n    # \"prompt\":               str,\n    \"suffix\":               str,\n    \"max_tokens\":           5,\n    \"n\":                    1,\n    \"best_of\":              None,\n    \"stop_sequences\":       [\"stop\"],\n    \"seed\":                 None,\n    \"frequency_penalty\":    None,\n    \"presence_penalty\":     None,\n    \"logit_bias\":           None,\n    \"logprobs\":             None,\n    \"top_logprobs\":         None,\n    \"temperature\":          0.5,\n    \"top_p\":                0.5,\n    \"user\":                 None\n}\nresponses = oai.gpt_complete(prompt, **kwargs)\n```\n## Gemini\nImport:\n```python\nfrom symposium.connectors import gemini_rest as gem\n```\n#### Messages\n```python\nfrom symposium.connectors import gemini_rest as 
gem\n\nmessages = [\n        {\n            \"role\": \"user\",\n            \"parts\": [\n                {\"text\": \"Human nature can not be changed, because...\"},\n                {\"text\": \"...and that is why human nature can not be changed.\"}\n            ]\n        },{\n            \"role\": \"model\",\n            \"parts\": [\n                {\"text\": \"Should I synthesize a text that will be placed between these two statements and follow the previous instruction while doing that?\"}\n            ]\n        },{\n            \"role\": \"user\",\n            \"parts\": [\n                {\"text\": \"Yes, please do.\"},\n                {\"text\": \"Create a most concise text possible, preferably just one sentence}\"}\n            ]\n        }\n]\nkwargs = {\n    \"model\":                \"gemini-1.0-pro\",\n    # \"messages\":             [],\n    \"stop_sequences\":       [\"STOP\",\"Title\"],\n    \"temperature\":          0.5,\n    \"max_tokens\":           5,\n    \"n\":                    1,\n    \"top_p\":                0.9,\n    \"top_k\":                None\n}\nresponse = gem.gemini_content(messages, **kwargs)\n```\n \n## PaLM\nImport:\n```python\nfrom symposium.connectors import palm_rest as path\n```\n#### Completion\n```python\nfrom symposium.connectors import palm_rest as path\n\nkwargs = {\n    \"model\": \"text-bison-001\",\n    \"prompt\": str,\n    \"temperature\": 0.5,\n    \"n\": 1,\n    \"max_tokens\": 10,\n    \"top_p\": 0.5,\n    \"top_k\": None\n}\nresponses = path.palm_complete(prompt, **kwargs)\n```\n#### Messages\n```python\nfrom symposium.connectors import palm_rest as path\n\ncontext = \"This conversation will be happening between Albert and Niels\"\nexamples = [\n        {\n            \"input\": {\"author\": \"Albert\", \"content\": \"We didn't talk about quantum mechanics lately...\"},\n            \"output\": {\"author\": \"Niels\", \"content\": \"Yes, indeed.\"}\n        }\n]\nmessages = [\n        {\n            \"author\": \"Albert\",\n            \"content\": \"Can we change human nature?\"\n        }, {\n            \"author\": \"Niels\",\n            \"content\": \"Not clear...\"\n        }, {\n            \"author\": \"Albert\",\n            \"content\": \"Seriously, can we?\"\n        }\n]\nkwargs = {\n    \"model\": \"chat-bison-001\",\n    # \"context\": str,\n    # \"examples\": [],\n    # \"messages\": [],\n    \"temperature\": 0.5,\n    # no 'max_tokens', beware the effects of that!\n    \"n\": 1,\n    \"top_p\": 0.5,\n    \"top_k\": None\n}\nresponses = path.palm_content(context, examples, messages, **kwargs)\n```\n",
    "bugtrack_url": null,
    "license": null,
    "summary": "Interaction of multiple language models",
    "version": "0.2.1",
    "project_urls": {
        "Bug Tracker": "https://github.com/multilogue/sumposium/issues",
        "Homepage": "https://github.com/multilogue/symposium"
    },
    "split_keywords": [
        "symposium",
        " conversations",
        " ai"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "e29176d8f407be2e71f34b04c3264e0708bde932dc6721a516a53b7dd2914bf4",
                "md5": "c18c6597358409b5e444dc5011da8a51",
                "sha256": "01f90f3ac44bcfc3c7ff78c954ea055225b78a816b490a36864fddfe6f5214d4"
            },
            "downloads": -1,
            "filename": "symposium-0.2.1-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "c18c6597358409b5e444dc5011da8a51",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.10",
            "size": 48351,
            "upload_time": "2024-04-27T16:20:35",
            "upload_time_iso_8601": "2024-04-27T16:20:35.586019Z",
            "url": "https://files.pythonhosted.org/packages/e2/91/76d8f407be2e71f34b04c3264e0708bde932dc6721a516a53b7dd2914bf4/symposium-0.2.1-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "e23234186a57a4210aab67f96525ec92dcbd61451f945b684efe33bbeb2340ee",
                "md5": "464ded401c1aef719c925a0784636d20",
                "sha256": "b4cb23d3c163f0b9e97a9177d636bed40f5a6f1312dd254ca591c0fc491c7543"
            },
            "downloads": -1,
            "filename": "symposium-0.2.1.tar.gz",
            "has_sig": false,
            "md5_digest": "464ded401c1aef719c925a0784636d20",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.10",
            "size": 26301,
            "upload_time": "2024-04-27T16:20:38",
            "upload_time_iso_8601": "2024-04-27T16:20:38.119964Z",
            "url": "https://files.pythonhosted.org/packages/e2/32/34186a57a4210aab67f96525ec92dcbd61451f945b684efe33bbeb2340ee/symposium-0.2.1.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-04-27 16:20:38",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "multilogue",
    "github_project": "sumposium",
    "github_not_found": true,
    "lcname": "symposium"
}
        