aix

Name: aix
Version: 0.0.4
Home page: https://github.com/thorwhalen/aix
Summary: Artificial Intelligence eXtensions
Upload time: 2024-09-18 12:23:08
Author: Thor Whalen
License: apache-2.0

# aix
Artificial Intelligence eXtensions

Fast access to your favorite A.I. tools. 

To install:	```pip install aix```



# AI chat

`aix.chat_funcs` gathers the models you have access to in your environment
(depending on what you have installed, e.g. `google.generativeai`, or `oa`, itself a facade to `openai` functionality).


```python
from aix import chat_funcs

list(chat_funcs)
```




    ['gemini-1.5-flash',
     'gpt-4',
     'gpt-4-32k',
     'gpt-4-turbo',
     'gpt-3.5-turbo',
     'o1-preview',
     'o1-mini',
     'gpt-4o',
     'gpt-4o-mini']
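Which models show up depends on which provider packages are importable. A minimal sketch of that kind of detection (the package mapping and logic here are illustrative, not aix's actual code):

```python
from importlib.util import find_spec

# Hypothetical provider-to-package mapping, for illustration only;
# the detection logic inside aix may differ.
PROVIDER_PACKAGES = {
    'google': 'google.generativeai',
    'openai': 'oa',
}

def available_providers(packages=PROVIDER_PACKAGES):
    """Return the providers whose top-level client package is importable."""
    return {
        name for name, pkg in packages.items()
        if find_spec(pkg.split('.')[0]) is not None
    }
```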



`chat_funcs` is a dictionary whose keys are the names of the available models and
whose values are `chat` functions with the model preset to that name.


```python
chat_funcs['o1-mini']
```




    functools.partial(<function chat at 0x1355f68c0>, model='o1-mini')
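The pattern shown -- one dict entry per model, each a `functools.partial` with `model` pre-bound -- can be sketched independently of any real provider (the `chat` stub below is a stand-in, not aix's implementation):

```python
from functools import partial

def chat(prompt, *, model):
    """Stand-in for a real provider call."""
    return f"[{model}] reply to: {prompt}"

model_names = ['gpt-4o', 'gemini-1.5-flash']  # illustrative names
chat_funcs_sketch = {name: partial(chat, model=name) for name in model_names}

print(chat_funcs_sketch['gpt-4o']('hello'))  # → [gpt-4o] reply to: hello
```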



Note that different providers have different interfaces, but the functions that 
`chat_funcs` provides all have `prompt` as their first argument. 


```python
from inspect import signature

signature(chat_funcs['o1-mini'])
```




    <Sig (prompt=None, *, model='o1-mini', messages=None, frequency_penalty: 'Optional[float] | NotGiven' = NOT_GIVEN, function_call: 'completion_create_params.FunctionCall | NotGiven' = NOT_GIVEN, functions: 'Iterable[completion_create_params.Function] | NotGiven' = NOT_GIVEN, logit_bias: 'Optional[Dict[str, int]] | NotGiven' = NOT_GIVEN, logprobs: 'Optional[bool] | NotGiven' = NOT_GIVEN, max_tokens: 'Optional[int] | NotGiven' = NOT_GIVEN, n: 'Optional[int] | NotGiven' = NOT_GIVEN, parallel_tool_calls: 'bool | NotGiven' = NOT_GIVEN, presence_penalty: 'Optional[float] | NotGiven' = NOT_GIVEN, response_format: 'completion_create_params.ResponseFormat | NotGiven' = NOT_GIVEN, seed: 'Optional[int] | NotGiven' = NOT_GIVEN, service_tier: "Optional[Literal['auto', 'default']] | NotGiven" = NOT_GIVEN, stop: 'Union[Optional[str], List[str]] | NotGiven' = NOT_GIVEN, stream: 'Optional[Literal[False]] | Literal[True] | NotGiven' = NOT_GIVEN, stream_options: 'Optional[ChatCompletionStreamOptionsParam] | NotGiven' = NOT_GIVEN, temperature: 'Optional[float] | NotGiven' = NOT_GIVEN, tool_choice: 'ChatCompletionToolChoiceOptionParam | NotGiven' = NOT_GIVEN, tools: 'Iterable[ChatCompletionToolParam] | NotGiven' = NOT_GIVEN, top_logprobs: 'Optional[int] | NotGiven' = NOT_GIVEN, top_p: 'Optional[float] | NotGiven' = NOT_GIVEN, user: 'str | NotGiven' = NOT_GIVEN, extra_headers: 'Headers | None' = None, extra_query: 'Query | None' = None, extra_body: 'Body | None' = None, timeout: 'float | httpx.Timeout | None | NotGiven' = NOT_GIVEN)>




```python
signature(chat_funcs['gemini-1.5-flash'])
```




    <Sig (prompt: str, *, model='gemini-1.5-flash', generation_config: 'generation_types.GenerationConfigType | None' = None, safety_settings: 'safety_types.SafetySettingOptions | None' = None, stream: 'bool' = False, tools: 'content_types.FunctionLibraryType | None' = None, tool_config: 'content_types.ToolConfigType | None' = None, request_options: 'helper_types.RequestOptionsType | None' = None)>



For tab-completion convenience, the (Python-identifier versions of the) model names
are also available as attributes of `chat_funcs`, so you can call them directly:


```python
print(chat_funcs.gemini_1_5_flash('What is the capital of France?'))
```

    The capital of France is **Paris**. 
    



```python
print(chat_funcs.gpt_3_5_turbo('What is the capital of France?'))
```

    The capital of France is Paris.
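Attribute access like this can be provided by a small dict subclass that maps sanitized identifiers back to keys. A minimal sketch (aix's actual implementation may differ):

```python
class AttrDict(dict):
    """Dict whose keys are also reachable as identifier-sanitized attributes."""

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails, so dict methods
        # like .keys() are unaffected.
        for key in self:
            if key.replace('-', '_').replace('.', '_') == name:
                return self[key]
        raise AttributeError(name)

funcs = AttrDict({'gemini-1.5-flash': 'flash_func', 'gpt-4o': 'gpt4o_func'})
print(funcs.gemini_1_5_flash)  # → flash_func
```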


There's also a dictionary called `chat_models` that contains the same keys:


```python
from aix import chat_models

list(chat_models)
```




    ['gemini-1.5-flash',
     'gpt-4',
     'gpt-4-32k',
     'gpt-4-turbo',
     'gpt-3.5-turbo',
     'o1-preview',
     'o1-mini',
     'gpt-4o',
     'gpt-4o-mini']



But here the values are useful metadata about the model, such as pricing:


```python
chat_models['gpt-4o']
```




    {'price_per_million_tokens': 5.0,
     'pages_per_dollar': 804,
     'performance_on_eval': 'Efficiency-optimized version of GPT-4 for better performance on reasoning tasks',
     'max_input': 8192,
     'provider': 'openai'}
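Such metadata makes quick back-of-the-envelope estimates easy. For instance, a sketch of a cost estimate based on the `price_per_million_tokens` field (prices change; treat the numbers as illustrative):

```python
def estimated_cost(n_tokens, model_info):
    """Dollar cost of n_tokens at the model's per-million-token price."""
    return n_tokens * model_info['price_per_million_tokens'] / 1_000_000

gpt_4o_info = {'price_per_million_tokens': 5.0}  # value copied from above
print(estimated_cost(100_000, gpt_4o_info))  # → 0.5
```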



The corresponding attributes simply return the model's name (the key itself):


```python
chat_models.gpt_4
```



    'gpt-4'



This is convenient for supplying a model name in another context, with fewer
errors than typing the name as a string.
For example, you can pass it to the general `chat` function:


```python
from aix import chat, chat_models

chat('How many Rs in "Strawberry"?', model=chat_models.gpt_4o, frequency_penalty=0.5)  
```




    'The word "Strawberry" contains two instances of the letter \'R\'.'




# Extras (older version -- might deprecate or change interface)

Want all your faves at your fingertips?

Never remember where to import that learner from?

Say `LinearDiscriminantAnalysis`?

... was it `from sklearn`?

... was it `from sklearn.linear_model`?

... ah no! It was `from sklearn.discriminant_analysis import LinearDiscriminantAnalysis`.

Sure, you can do that. Or you can simply type `from aix.Lin...`, hit tab, and there it is!
Select, press enter, and move on with real work.

*Note: This is meant to help you get off the ground quickly
-- once your code is stable, you should import your tools directly from their origin.*
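The flat re-export trick can be sketched with `importlib` (the module list below is illustrative, and this is not aix's actual mechanism):

```python
import importlib

def flatten(module_names, namespace):
    """Copy each module's public names (its __all__) into one flat namespace."""
    for mod_name in module_names:
        mod = importlib.import_module(mod_name)
        for attr in getattr(mod, '__all__', []):
            namespace.setdefault(attr, getattr(mod, attr))

# Demo with a stdlib module (aix does this over ML libraries instead):
ns = {}
flatten(['json'], ns)
print('dumps' in ns)  # → True
```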


# Coming up

Now that the AI revolution is underway, we'll add the ability to find -- and, one day,
use -- the right AI tool, until the day AI does even that for us...

            
