| Field | Value |
| --- | --- |
| Name | langchain-pinecone |
| Version | 0.2.11 |
| Summary | An integration package connecting Pinecone and LangChain |
| Upload time | 2025-07-23 09:30:15 |
| Home page | None |
| Author | None |
| Maintainer | None |
| Docs URL | None |
| Requires Python | <3.14,>=3.9 |
| License | MIT |

# langchain-pinecone
This package contains the LangChain integration with Pinecone.
## Installation
```bash
pip install -qU langchain langchain-pinecone langchain-openai
```
You should then configure credentials by setting the following environment variables (one way to do this from Python is sketched after the list):
- `PINECONE_API_KEY`
- `OPENAI_API_KEY` (optional; needed only if you use OpenAI embeddings)
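For example, in a script or notebook you could set them at runtime; a minimal sketch (the prompts below are illustrative, not part of this package):

```python
# Illustrative only: supply credentials at runtime if they are not already set.
import getpass
import os

if not os.environ.get("PINECONE_API_KEY"):
    os.environ["PINECONE_API_KEY"] = getpass.getpass("Pinecone API key: ")

# Only needed if you plan to use OpenAI embeddings.
if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API key: ")
```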
## Development
### Running Tests
The test suite includes both unit tests and integration tests. To run the tests:
```bash
# Run unit tests only
make test
# Run integration tests (requires environment variables)
make integration_test
```
#### Required Environment Variables for Tests
Integration tests require the following environment variables:
- `PINECONE_API_KEY`: Required for all integration tests
- `OPENAI_API_KEY`: Optional, required only for OpenAI embedding tests
You can set these environment variables before running the tests:
```bash
export PINECONE_API_KEY="your-api-key"
export OPENAI_API_KEY="your-openai-key" # Optional
```
If these environment variables are not set, the integration tests that require them will be skipped.
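As a rough illustration of that pattern (a generic pytest sketch, not necessarily the exact mechanism this package's test suite uses):

```python
# Illustrative only: one common way to skip tests when a credential is absent.
import os

import pytest

requires_pinecone = pytest.mark.skipif(
    "PINECONE_API_KEY" not in os.environ,
    reason="PINECONE_API_KEY is not set",
)

@requires_pinecone
def test_roundtrip_against_live_index():
    ...  # hypothetical integration test body
```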
## Usage
### Initialization
Before initializing the vector store, create a Pinecone client and connect to an index. If an index named `index_name` does not exist yet, the snippet below creates it.
```python
import os

from pinecone import Pinecone, ServerlessSpec

# Create a Pinecone client from the PINECONE_API_KEY environment variable.
pc = Pinecone(api_key=os.environ["PINECONE_API_KEY"])

index_name = "langchain-test-index"  # change if desired

if not pc.has_index(index_name):
    pc.create_index(
        name=index_name,
        dimension=1536,  # must match the embedding size (1536 for text-embedding-3-small)
        metric="cosine",
        spec=ServerlessSpec(cloud="aws", region="us-east-1"),
    )

index = pc.Index(index_name)
```
Initialize embedding model:
```python
from langchain_openai import OpenAIEmbeddings
embeddings = OpenAIEmbeddings(model="text-embedding-3-small")
```
The `PineconeVectorStore` class exposes the connection to the Pinecone vector store.
```python
from langchain_pinecone import PineconeVectorStore
vector_store = PineconeVectorStore(index=index, embedding=embeddings)
```
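If the index already exists, you can also attach to it by name; a minimal sketch, assuming the `from_existing_index` constructor is available in your installed version:

```python
# Sketch: attach to an existing index by name instead of passing an Index object.
# Assumes PINECONE_API_KEY is set and the index already exists.
from langchain_pinecone import PineconeVectorStore

vector_store = PineconeVectorStore.from_existing_index(
    index_name="langchain-test-index",
    embedding=embeddings,
)
```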
### Manage vector store
Once you have created your vector store, you can interact with it by adding and deleting items.
#### Add items to vector store
You can add items to the vector store using the `add_documents` function.
```python
from uuid import uuid4
from langchain_core.documents import Document
document_1 = Document(
    page_content="I had chocolate chip pancakes and scrambled eggs for breakfast this morning.",
    metadata={"source": "tweet"},
)

document_2 = Document(
    page_content="The weather forecast for tomorrow is cloudy and overcast, with a high of 62 degrees.",
    metadata={"source": "news"},
)

document_3 = Document(
    page_content="Building an exciting new project with LangChain - come check it out!",
    metadata={"source": "tweet"},
)

document_4 = Document(
    page_content="Robbers broke into the city bank and stole $1 million in cash.",
    metadata={"source": "news"},
)

document_5 = Document(
    page_content="Wow! That was an amazing movie. I can't wait to see it again.",
    metadata={"source": "tweet"},
)

document_6 = Document(
    page_content="Is the new iPhone worth the price? Read this review to find out.",
    metadata={"source": "website"},
)

document_7 = Document(
    page_content="The top 10 soccer players in the world right now.",
    metadata={"source": "website"},
)

document_8 = Document(
    page_content="LangGraph is the best framework for building stateful, agentic applications!",
    metadata={"source": "tweet"},
)

document_9 = Document(
    page_content="The stock market is down 500 points today due to fears of a recession.",
    metadata={"source": "news"},
)

document_10 = Document(
    page_content="I have a bad feeling I am going to get deleted :(",
    metadata={"source": "tweet"},
)

documents = [
    document_1,
    document_2,
    document_3,
    document_4,
    document_5,
    document_6,
    document_7,
    document_8,
    document_9,
    document_10,
]
uuids = [str(uuid4()) for _ in range(len(documents))]
vector_store.add_documents(documents=documents, ids=uuids)
```
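If you have plain strings rather than `Document` objects, the generic `add_texts` method from the base vector-store interface should also work; a small sketch (the texts and metadata here are illustrative):

```python
# Sketch: add raw strings instead of Document objects.
texts = [
    "Pinecone indexes are fully managed.",
    "LangChain wraps many vector stores behind one interface.",
]
vector_store.add_texts(texts, metadatas=[{"source": "note"}, {"source": "note"}])
```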
#### Delete items from vector store
```python
vector_store.delete(ids=[uuids[-1]])
```
### Query vector store
Once your vector store has been created and the relevant documents have been added, you will most likely wish to query it while running your chain or agent.
#### Query directly
Performing a simple similarity search can be done as follows:
```python
results = vector_store.similarity_search(
    "LangChain provides abstractions to make working with LLMs easy",
    k=2,
    filter={"source": "tweet"},
)
for res in results:
    print(f"* {res.page_content} [{res.metadata}]")
```
#### Similarity search with score
You can also search with score:
```python
results = vector_store.similarity_search_with_score(
    "Will it be hot tomorrow?", k=1, filter={"source": "news"}
)
for res, score in results:
    print(f"* [SIM={score:.3f}] {res.page_content} [{res.metadata}]")
```
### Query by turning into retriever
You can also transform the vector store into a retriever for easier usage in your chains.
```python
retriever = vector_store.as_retriever(
    search_type="similarity_score_threshold",
    search_kwargs={"k": 1, "score_threshold": 0.4},
)
retriever.invoke("Stealing from the bank is a crime", filter={"source": "news"})
```
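Once you have a retriever, it can be composed into a chain like any other runnable. Below is a minimal sketch of a retrieval-augmented generation chain; the prompt wording and model name are illustrative assumptions, not part of this package:

```python
# Illustrative RAG chain built around the retriever above; model and prompt are assumptions.
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template(
    "Answer the question using only this context:\n\n{context}\n\nQuestion: {question}"
)
llm = ChatOpenAI(model="gpt-4o-mini")

def format_docs(docs):
    # Join retrieved Document objects into a single context string.
    return "\n\n".join(doc.page_content for doc in docs)

rag_chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)

print(rag_chain.invoke("What happened at the city bank?"))
```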
## Raw data

```json
{
    "_id": null,
    "home_page": null,
    "name": "langchain-pinecone",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<3.14,>=3.9",
    "maintainer_email": null,
    "keywords": null,
    "author": null,
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/b1/1f/5d9dc2820e1c32e510cb2b713fa004ee63501b1425297f8bd52bf75bc816/langchain_pinecone-0.2.11.tar.gz",
    "platform": null,
"description": "# langchain-pinecone\n\nThis package contains the LangChain integration with Pinecone.\n\n## Installation\n\n```bash\npip install -qU langchain langchain-pinecone langchain-openai\n```\n\nAnd you should configure credentials by setting the following environment variables:\n\n- `PINECONE_API_KEY`\n- `OPENAI_API_KEY` (optional, for embeddings to use)\n\n## Development\n\n### Running Tests\n\nThe test suite includes both unit tests and integration tests. To run the tests:\n\n```bash\n# Run unit tests only\nmake test\n\n# Run integration tests (requires environment variables)\nmake integration_test\n```\n\n#### Required Environment Variables for Tests\n\nIntegration tests require the following environment variables:\n\n- `PINECONE_API_KEY`: Required for all integration tests\n- `OPENAI_API_KEY`: Optional, required only for OpenAI embedding tests\n\nYou can set these environment variables before running the tests:\n\n```bash\nexport PINECONE_API_KEY=\"your-api-key\"\nexport OPENAI_API_KEY=\"your-openai-key\" # Optional\n```\n\nIf these environment variables are not set, the integration tests that require them will be skipped.\n\n## Usage\n\n### Initialization\n\nBefore initializing our vector store, let's connect to a Pinecone index. If one named `index_name` doesn't exist, it will be created.\n\n```python\nfrom pinecone import ServerlessSpec\n\nindex_name = \"langchain-test-index\" # change if desired\n\nif not pc.has_index(index_name):\n pc.create_index(\n name=index_name,\n dimension=1536,\n metric=\"cosine\",\n spec=ServerlessSpec(\n cloud='aws',\n region='us-east-1'\n )\n )\n\nindex = pc.Index(index_name)\n```\n\nInitialize embedding model:\n\n```python\nfrom langchain_openai import OpenAIEmbeddings\n\nembeddings = OpenAIEmbeddings(model=\"text-embedding-3-small\")\n```\n\nThe `PineconeVectorStore` class exposes the connection to the Pinecone vector store.\n\n```python\nfrom langchain_pinecone import PineconeVectorStore\n\nvector_store = PineconeVectorStore(index=index, embedding=embeddings)\n```\n\n### Manage vector store\n\nOnce you have created your vector store, we can interact with it by adding and deleting different items.\n\n#### Add items to vector store\n\nWe can add items to our vector store by using the `add_documents` function.\n\n```python\nfrom uuid import uuid4\n\nfrom langchain_core.documents import Document\n\ndocument_1 = Document(\n page_content=\"I had chocalate chip pancakes and scrambled eggs for breakfast this morning.\",\n metadata={\"source\": \"tweet\"},\n)\n\ndocument_2 = Document(\n page_content=\"The weather forecast for tomorrow is cloudy and overcast, with a high of 62 degrees.\",\n metadata={\"source\": \"news\"},\n)\n\ndocument_3 = Document(\n page_content=\"Building an exciting new project with LangChain - come check it out!\",\n metadata={\"source\": \"tweet\"},\n)\n\ndocument_4 = Document(\n page_content=\"Robbers broke into the city bank and stole $1 million in cash.\",\n metadata={\"source\": \"news\"},\n)\n\ndocument_5 = Document(\n page_content=\"Wow! That was an amazing movie. I can't wait to see it again.\",\n metadata={\"source\": \"tweet\"},\n)\n\ndocument_6 = Document(\n page_content=\"Is the new iPhone worth the price? 
Read this review to find out.\",\n metadata={\"source\": \"website\"},\n)\n\ndocument_7 = Document(\n page_content=\"The top 10 soccer players in the world right now.\",\n metadata={\"source\": \"website\"},\n)\n\ndocument_8 = Document(\n page_content=\"LangGraph is the best framework for building stateful, agentic applications!\",\n metadata={\"source\": \"tweet\"},\n)\n\ndocument_9 = Document(\n page_content=\"The stock market is down 500 points today due to fears of a recession.\",\n metadata={\"source\": \"news\"},\n)\n\ndocument_10 = Document(\n page_content=\"I have a bad feeling I am going to get deleted :(\",\n metadata={\"source\": \"tweet\"},\n)\n\ndocuments = [\n document_1,\n document_2,\n document_3,\n document_4,\n document_5,\n document_6,\n document_7,\n document_8,\n document_9,\n document_10,\n]\nuuids = [str(uuid4()) for _ in range(len(documents))]\nvector_store.add_documents(documents=documents, ids=uuids)\n```\n\n#### Delete items from vector store\n\n```\nvector_store.delete(ids=[uuids[-1]])\n```\n\n### Query vector store\n\nOnce your vector store has been created and the relevant documents have been added you will most likely wish to query it during the running of your chain or agent. \n\n#### Query directly\n\nPerforming a simple similarity search can be done as follows:\n\n```python\nresults = vector_store.similarity_search(\n \"LangChain provides abstractions to make working with LLMs easy\",\n k=2,\n filter={\"source\": \"tweet\"},\n)\nfor res in results:\n print(f\"* {res.page_content} [{res.metadata}]\")\n```\n\n#### Similarity search with score\n\nYou can also search with score:\n\n```python\nresults = vector_store.similarity_search_with_score(\n \"Will it be hot tomorrow?\", k=1, filter={\"source\": \"news\"}\n)\nfor res, score in results:\n print(f\"* [SIM={score:3f}] {res.page_content} [{res.metadata}]\")\n```\n\n### Query by turning into retriever\n\nYou can also transform the vector store into a retriever for easier usage in your chains.\n\n```python\nretriever = vector_store.as_retriever(\n search_type=\"similarity_score_threshold\",\n search_kwargs={\"k\": 1, \"score_threshold\": 0.4},\n)\nretriever.invoke(\"Stealing from the bank is a crime\", filter={\"source\": \"news\"})\n```",
"bugtrack_url": null,
"license": "MIT",
"summary": "An integration package connecting Pinecone and LangChain",
"version": "0.2.11",
"project_urls": {
"Release Notes": "https://github.com/langchain-ai/langchain/releases?q=tag%3A%22langchain-pinecone%3D%3D0%22&expanded=true",
"Source Code": "https://github.com/langchain-ai/langchain-pinecone/tree/main/libs/pinecone",
"repository": "https://github.com/langchain-ai/langchain-pinecone"
},
"split_keywords": [],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "8a9e90040eb269b0d663335a0ec40b5d4b5f1005a95557b9b84e72e3af9cf85d",
"md5": "a8d380ec8d83fb98384bbdd039d4ddd0",
"sha256": "2337b5cd9ffbb1600e079273fdd3f579473a4e7473365ceb956389f76bb973c5"
},
"downloads": -1,
"filename": "langchain_pinecone-0.2.11-py3-none-any.whl",
"has_sig": false,
"md5_digest": "a8d380ec8d83fb98384bbdd039d4ddd0",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<3.14,>=3.9",
"size": 23738,
"upload_time": "2025-07-23T09:30:14",
"upload_time_iso_8601": "2025-07-23T09:30:14.594331Z",
"url": "https://files.pythonhosted.org/packages/8a/9e/90040eb269b0d663335a0ec40b5d4b5f1005a95557b9b84e72e3af9cf85d/langchain_pinecone-0.2.11-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "b11f5d9dc2820e1c32e510cb2b713fa004ee63501b1425297f8bd52bf75bc816",
"md5": "c5bb1511063f7e2027cf6ae4a65cf445",
"sha256": "0e7b9b821351a5163ffce5e0a692066fd3b04c088fe9199ba7df00e9b34b2cd1"
},
"downloads": -1,
"filename": "langchain_pinecone-0.2.11.tar.gz",
"has_sig": false,
"md5_digest": "c5bb1511063f7e2027cf6ae4a65cf445",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<3.14,>=3.9",
"size": 35631,
"upload_time": "2025-07-23T09:30:15",
"upload_time_iso_8601": "2025-07-23T09:30:15.539006Z",
"url": "https://files.pythonhosted.org/packages/b1/1f/5d9dc2820e1c32e510cb2b713fa004ee63501b1425297f8bd52bf75bc816/langchain_pinecone-0.2.11.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-23 09:30:15",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "langchain-ai",
"github_project": "langchain",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "langchain-pinecone"
}