langchain-core


Name: langchain-core
Version: 0.3.28
Home page: https://github.com/langchain-ai/langchain
Summary: Building applications with LLMs through composability
Upload time: 2024-12-19 22:56:02
Requires Python: <4.0,>=3.9
License: MIT

# 🦜🍎️ LangChain Core

[![Downloads](https://static.pepy.tech/badge/langchain_core/month)](https://pepy.tech/project/langchain_core)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

## Quick Install

```bash
pip install langchain-core
```

## What is it?

LangChain Core contains the base abstractions that power the rest of the LangChain ecosystem.

These abstractions are designed to be as modular and simple as possible. Examples of these abstractions include those for language models, document loaders, embedding models, vectorstores, retrievers, and more.

The benefit of having these abstractions is that any provider can implement the required interface and then easily be used in the rest of the LangChain ecosystem.
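
As an illustration (a minimal sketch, not part of this README), here is what a provider-style component can look like: a toy retriever that implements `langchain_core`'s `BaseRetriever` interface and therefore composes with the rest of the ecosystem like any other retriever. The `KeywordRetriever` name and its matching logic are hypothetical.

```python
from typing import List

from langchain_core.callbacks import CallbackManagerForRetrieverRun
from langchain_core.documents import Document
from langchain_core.retrievers import BaseRetriever


class KeywordRetriever(BaseRetriever):
    """Toy retriever: returns documents whose text contains the query."""

    documents: List[Document]

    def _get_relevant_documents(
        self, query: str, *, run_manager: CallbackManagerForRetrieverRun
    ) -> List[Document]:
        return [
            doc for doc in self.documents
            if query.lower() in doc.page_content.lower()
        ]


retriever = KeywordRetriever(
    documents=[Document(page_content="LangChain Core defines the Runnable interface.")]
)
print(retriever.invoke("runnable"))  # retrievers are Runnables, so .invoke works
```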

For full documentation see the [API reference](https://python.langchain.com/api_reference/core/index.html).

## 1️⃣ Core Interface: Runnables

The concept of a Runnable is central to LangChain Core – it is the interface that most LangChain Core components implement, giving them

- a common invocation interface (invoke, batch, stream, etc.)
- built-in utilities for retries, fallbacks, schemas and runtime configurability
- easy deployment with [LangServe](https://github.com/langchain-ai/langserve)

For more, check out the [runnable docs](https://python.langchain.com/docs/expression_language/interface). Examples of components that implement the interface include: LLMs, Chat Models, Prompts, Retrievers, Tools, and Output Parsers.
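
For a concrete feel, here is a minimal sketch (not from this README) of the shared interface on a plain `RunnableLambda`, which needs no model provider; `with_retry` and `with_fallbacks` are among the built-in utilities mentioned above.

```python
from langchain_core.runnables import RunnableLambda

double = RunnableLambda(lambda x: x * 2)

print(double.invoke(3))         # single input -> 6
print(double.batch([1, 2, 3]))  # list of inputs -> [2, 4, 6]
for chunk in double.stream(4):  # streaming; a plain lambda yields one chunk: 8
    print(chunk)

# Built-in utilities: wrap any Runnable with retries or fallbacks.
with_retries = double.with_retry(stop_after_attempt=3)
with_backup = double.with_fallbacks([RunnableLambda(lambda x: 0)])
print(with_retries.invoke(5), with_backup.invoke(5))
```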

You can use LangChain Core objects in two ways:

1. **imperative**, i.e. call them directly, e.g. `model.invoke(...)`

2. **declarative**, with LangChain Expression Language (LCEL)

3. or a mix of both, e.g. one of the steps in your LCEL sequence can be a custom function

| Feature   | Imperative                      | Declarative    |
| --------- | ------------------------------- | -------------- |
| Syntax    | All of Python                   | LCEL           |
| Tracing   | ✅ – Automatic                  | ✅ – Automatic |
| Parallel  | ✅ – with threads or coroutines | ✅ – Automatic |
| Streaming | ✅ – by yielding                | ✅ – Automatic |
| Async     | ✅ – by writing async functions | ✅ – Automatic |
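
The sketch below contrasts the two styles. It assumes a `RunnableLambda` as a stand-in for a real chat model so it runs without provider credentials; the prompt text and stub reply are hypothetical.

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableLambda

prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")
fake_model = RunnableLambda(lambda prompt_value: "stub reply from a fake model")

# Imperative: call each component directly.
text = fake_model.invoke(prompt.invoke({"topic": "bears"}))

# Declarative (LCEL): compose the same steps with the | operator.
chain = prompt | fake_model | StrOutputParser()
print(chain.invoke({"topic": "bears"}))
```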

## ⚡️ What is LangChain Expression Language?

LangChain Expression Language (LCEL) is a _declarative language_ for composing LangChain Core runnables into sequences (or DAGs), covering the most common patterns when building with LLMs.

LangChain Core compiles LCEL sequences to an _optimized execution plan_, with automatic parallelization, streaming, tracing, and async support.

For more check out the [LCEL docs](https://python.langchain.com/docs/expression_language/).
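
As a small illustration (a sketch, not taken from the LCEL docs), a `RunnableParallel` step runs its branches concurrently, and `.stream()` yields output incrementally as branches finish:

```python
from langchain_core.runnables import RunnableLambda, RunnableParallel

branches = RunnableParallel(
    doubled=RunnableLambda(lambda x: x * 2),
    squared=RunnableLambda(lambda x: x * x),
)

print(branches.invoke(3))  # {'doubled': 6, 'squared': 9}

for chunk in branches.stream(3):  # partial dicts arrive as each branch completes
    print(chunk)
```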

![Diagram outlining the hierarchical organization of the LangChain framework, displaying the interconnected parts across multiple layers.](https://raw.githubusercontent.com/langchain-ai/langchain/master/docs/static/svg/langchain_stack_112024.svg "LangChain Framework Overview")

For more advanced use cases, also check out [LangGraph](https://github.com/langchain-ai/langgraph), which is a graph-based runner for cyclic and recursive LLM workflows.

## 📕 Releases & Versioning

`langchain-core` is currently on version `0.3.x`.

As `langchain-core` contains the base abstractions and runtime for the whole LangChain ecosystem, we will communicate any breaking changes with advance notice and version bumps. The exception to this is anything in `langchain_core.beta`. The reason for `langchain_core.beta` is that, given the rate of change of the field, being able to move quickly is still a priority, and this module is our attempt to do so.

Minor version increases will occur for:

- Breaking changes for any public interfaces NOT in `langchain_core.beta`

Patch version increases will occur for:

- Bug fixes
- New features
- Any changes to private interfaces
- Any changes to `langchain_core.beta`

## 💁 Contributing

As an open-source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infrastructure, or better documentation.

For detailed information on how to contribute, see the [Contributing Guide](https://python.langchain.com/docs/contributing/).

## ⛰️ Why build on top of LangChain Core?

The whole LangChain ecosystem is built on top of LangChain Core, so you're in good company when building on top of it. Some of the benefits:

- **Modularity**: LangChain Core is designed around abstractions that are independent of each other, and not tied to any specific model provider.
- **Stability**: We are committed to a stable versioning scheme, and will communicate any breaking changes with advance notice and version bumps.
- **Battle-tested**: LangChain Core components have the largest install base in the LLM ecosystem, and are used in production by many companies.
- **Community**: LangChain Core is developed in the open, and we welcome contributions from the community.


            
