langchain-fireworks


Name: langchain-fireworks
Version: 0.2.5
Home page: https://github.com/langchain-ai/langchain
Summary: An integration package connecting Fireworks and LangChain
Upload time: 2024-10-31 18:32:38
Maintainer: None
Docs URL: None
Author: None
Requires Python: <4.0,>=3.9
License: MIT
# LangChain-Fireworks

This is the partner package connecting Fireworks.ai and LangChain. Fireworks strives to provide good support for LangChain use cases, so if you run into any issues, please let us know. You can reach us [in our Discord channel](https://discord.com/channels/1137072072808472616/).


## Installation

To use the `langchain-fireworks` package, follow these installation steps:

```bash
pip install langchain-fireworks
```



## Basic usage

### Setting up

1. Sign in to [Fireworks AI](http://fireworks.ai/) to obtain an API Key to access the models, and make sure it is set as the `FIREWORKS_API_KEY` environment variable.

    Once you've signed in and obtained an API key, follow these steps to set the `FIREWORKS_API_KEY` environment variable:
    - **Linux/macOS:** Open your terminal and execute the following command:
    ```bash
    export FIREWORKS_API_KEY='your_api_key'
    ```
    **Note:** To make this environment variable persistent across terminal sessions, add the above line to your `~/.bashrc`, `~/.bash_profile`, or `~/.zshrc` file.

    - **Windows:** For Command Prompt, use:
    ```cmd
    set FIREWORKS_API_KEY=your_api_key
    ```

2. Set up your model using a model ID. If no model is set, the default is `fireworks-llama-v2-7b-chat`. See the full, up-to-date model list on [fireworks.ai](https://fireworks.ai/models).

```python
import getpass
import os

from langchain_fireworks import Fireworks

# Prompt for the API key if it is not already set in the environment
if "FIREWORKS_API_KEY" not in os.environ:
    os.environ["FIREWORKS_API_KEY"] = getpass.getpass("Fireworks API key: ")

# Initialize a Fireworks model
llm = Fireworks(
    model="accounts/fireworks/models/mixtral-8x7b-instruct",
    base_url="https://api.fireworks.ai/inference/v1/completions",
)
```
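
The package also exposes a chat-model class, `ChatFireworks`, if you prefer working with messages instead of raw completions. A minimal sketch, reusing the same Mixtral model ID (any chat-capable model from the model list should work):

```python
from langchain_fireworks import ChatFireworks

# Chat-model interface: takes a prompt or messages, returns an AIMessage
chat = ChatFireworks(
    model="accounts/fireworks/models/mixtral-8x7b-instruct",
)

response = chat.invoke("Name three open models hosted on Fireworks.")
print(response.content)
```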


### Calling the Model Directly

You can call the model directly with string prompts to get completions.

```python
# Single prompt
output = llm.invoke("Who's the best quarterback in the NFL?")
print(output)
```

```python
# Calling multiple prompts
output = llm.generate(
    [
        "Who's the best cricket player in 2016?",
        "Who's the best basketball player in the league?",
    ]
)
print(output.generations)
```
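
Because `Fireworks` implements LangChain's standard Runnable interface, you can also stream tokens as they are generated rather than waiting for the full completion. A minimal sketch:

```python
# Stream the completion token-by-token as it is generated
for chunk in llm.stream("Tell me a short fun fact about fireworks."):
    print(chunk, end="", flush=True)
```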





## Advanced usage
### Tool use: LangChain Agent + Fireworks function calling model
Please check out how to teach the Fireworks function-calling model to use a [calculator here](https://github.com/fw-ai/cookbook/blob/main/examples/function_calling/fireworks_langchain_tool_usage.ipynb).
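
For a rough idea of what tool use looks like outside the notebook, here is a minimal sketch that binds a calculator tool to `ChatFireworks` via `bind_tools`. The `firefunction-v1` model ID is an assumption based on the blog post below and may have been superseded by newer function-calling models:

```python
from langchain_core.tools import tool
from langchain_fireworks import ChatFireworks


@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b


# Model ID is an assumption; check fireworks.ai/models for current function-calling models
llm_with_tools = ChatFireworks(
    model="accounts/fireworks/models/firefunction-v1",
).bind_tools([multiply])

ai_msg = llm_with_tools.invoke("What is 1024 times 7?")
# The model responds with a structured tool call rather than plain text,
# e.g. [{"name": "multiply", "args": {"a": 1024, "b": 7}, ...}]
print(ai_msg.tool_calls)
```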

Fireworks focuses on delivering the best experience for fast model inference as well as tool use. You can check out [our blog](https://fireworks.ai/blog/firefunction-v1-gpt-4-level-function-calling) for more details on how it compares to GPT-4; the punchline is that it is on par with GPT-4 for function-calling use cases, while being significantly faster and cheaper.

### RAG: LangChain agent + Fireworks function calling model + MongoDB + Nomic AI embeddings
Please check out the [cookbook here](https://github.com/fw-ai/cookbook/blob/main/examples/rag/mongodb_agent.ipynb) for an end-to-end flow.
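
If you just want to see the retrieval piece without MongoDB, here is a minimal sketch using `FireworksEmbeddings` (which can serve the Nomic embedding model hosted on Fireworks) together with LangChain's in-memory vector store as a stand-in for MongoDB Atlas. This is not the cookbook's agent flow, only the embed-and-retrieve step:

```python
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_fireworks import FireworksEmbeddings

# Embed a few documents with the Nomic model hosted on Fireworks
embeddings = FireworksEmbeddings(model="nomic-ai/nomic-embed-text-v1.5")
vector_store = InMemoryVectorStore.from_texts(
    [
        "Fireworks serves open-weight models over a fast inference API.",
        "LangChain provides building blocks for retrieval-augmented generation.",
    ],
    embedding=embeddings,
)

# Retrieve the most relevant document for a question
retriever = vector_store.as_retriever(search_kwargs={"k": 1})
print(retriever.invoke("What does Fireworks do?"))
```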
