langchain-fireworks

Name: langchain-fireworks
Version: 0.2.6
Home page: https://github.com/langchain-ai/langchain
Summary: An integration package connecting Fireworks and LangChain
Upload time: 2024-12-19 16:06:19
Requires Python: <4.0,>=3.9
License: MIT
# LangChain-Fireworks

This is the partner package connecting Fireworks.ai and LangChain. Fireworks strives to provide good support for LangChain use cases, so if you run into any issues, please let us know. You can reach us [in our Discord channel](https://discord.com/channels/1137072072808472616/).


## Installation

To use the `langchain-fireworks` package, follow these installation steps:

```bash
pip install langchain-fireworks
```



## Basic usage

### Setting up

1. Sign in to [Fireworks AI](http://fireworks.ai/) to obtain an API Key to access the models, and make sure it is set as the `FIREWORKS_API_KEY` environment variable.

    Once you've signed in and obtained an API key, follow these steps to set the `FIREWORKS_API_KEY` environment variable:
    - **Linux/macOS:** Open your terminal and execute the following command:
    ```bash
    export FIREWORKS_API_KEY='your_api_key'
    ```
    **Note:** To make this environment variable persistent across terminal sessions, add the above line to your `~/.bashrc`, `~/.bash_profile`, or `~/.zshrc` file.

    - **Windows:** For Command Prompt, use:
    ```cmd
    set FIREWORKS_API_KEY=your_api_key
    ```
    **Note:** `set` only applies to the current Command Prompt session. To persist the variable across sessions, use `setx FIREWORKS_API_KEY "your_api_key"` (it takes effect in newly opened sessions).

2. Set up your model using a model ID. If no model is specified, the default is `fireworks-llama-v2-7b-chat`. See the full, most up-to-date model list on [fireworks.ai](https://fireworks.ai/models).

```python
from langchain_fireworks import Fireworks

# Initialize a Fireworks model with an explicit model ID and completions endpoint
llm = Fireworks(
    model="accounts/fireworks/models/mixtral-8x7b-instruct",
    base_url="https://api.fireworks.ai/inference/v1/completions",
)
```
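If you prefer not to export the key in your shell, you can resolve it at runtime instead. The helper below is a small sketch (the function name `ensure_fireworks_key` is ours, not part of `langchain-fireworks`): it returns the `FIREWORKS_API_KEY` environment variable if set, and otherwise prompts for it interactively.

```python
import getpass
import os

def ensure_fireworks_key() -> str:
    """Return the Fireworks API key, prompting interactively if it is unset."""
    if not os.environ.get("FIREWORKS_API_KEY"):
        os.environ["FIREWORKS_API_KEY"] = getpass.getpass("Fireworks API key: ")
    return os.environ["FIREWORKS_API_KEY"]
```

Call `ensure_fireworks_key()` once before constructing the model; `Fireworks` will then pick the key up from the environment.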


### Calling the Model Directly

You can call the model directly with string prompts to get completions.

```python
# Single prompt
output = llm.invoke("Who's the best quarterback in the NFL?")
print(output)
```

```python
# Calling multiple prompts
output = llm.generate(
    [
        "Who's the best cricket player in 2016?",
        "Who's the best basketball player in the league?",
    ]
)
print(output.generations)
```
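When calling a hosted model repeatedly, transient failures such as rate limits are common. A minimal retry sketch is shown below; `with_retries` is a hypothetical helper of ours (not part of `langchain-fireworks`) that works with any zero-argument callable, e.g. `with_retries(lambda: llm.invoke(prompt))`.

```python
import time

def with_retries(call, attempts=3, delay=0.5):
    """Invoke `call`, retrying on any exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            time.sleep(delay * 2 ** attempt)
```

In production you would likely narrow the `except` clause to the specific transient errors you expect rather than retrying on everything.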





## Advanced usage
### Tool use: LangChain Agent + Fireworks function calling model
Please check out how to teach a Fireworks function calling model to use a [calculator here](https://github.com/fw-ai/cookbook/blob/main/examples/function_calling/fireworks_langchain_tool_usage.ipynb).

Fireworks focuses on delivering the best experience for fast model inference as well as tool use. You can check out [our blog](https://fireworks.ai/blog/firefunction-v1-gpt-4-level-function-calling) for more details on how it compares to GPT-4: the punchline is that it is on par with GPT-4 for function calling use cases, but much faster and much cheaper.

### RAG: LangChain agent + Fireworks function calling model + MongoDB + Nomic AI embeddings
Please check out the [cookbook here](https://github.com/fw-ai/cookbook/blob/main/examples/rag/mongodb_agent.ipynb) for an end-to-end flow.

            
